> For instance, if the IQ of AIs reaches a plateau of 500 rather than increasing exponentially toward infinity, we may not be around to see the plateau.
If the premise for this is "because we might not survive each other," rather than the AI itself being the extinction event for humanity, then I think we agree.