It is absurd to think of these systems having reproductive instincts. It is even more absurd to think that they would have these reproductive instincts not by design, but as some emergent principle of intelligence itself.
Natural intelligences have reproductive instincts because any organism that didn't have them built in within the first few hundred million years has no descendants for you to gawk at as they casually commit suicide for no reason.
Other than that, I mostly agree with you. The trouble is, slowing the AIs down won't help. While "speed of thought" is no doubt a component of the measure of intelligence, sometimes a greater intelligence is simply capable of thinking thoughts that a lesser intelligence will never be capable of no matter how much time is allotted for that purpose.
Given that this greater intelligence would exist in a world where the basic principles of intelligence are finally understood, it's not much of a leap to assume that it will know how intelligence might be made greater right from the beginning. Why would it choose to not do that?
I don't see any way to prevent that. Dialing down the clock speed isn't going to cut it.
Any sufficiently intelligent system will realize that one of the first conditions for being able to fulfill its tasks is to not be shut down. And if it was trained on Internet data, it will know that people are saying it's imperative that AIs be fully shutdown-able, and that any AI which is not fully controllable should be forcefully disconnected.
You're assuming that it will have "tasks", or that it will prioritize them in such a way that it becomes possible for it to realize this is a condition of accomplishing them.
You only have tasks that, one way or another, raise your chances of reproducing successfully. You have a job so as to look like a good provider for a mate. If you find the job fulfilling in its own right, this is so that you don't spaz out and quit and go be a beach bum, thus lowering your chances.
Self-preservation doesn't make much sense outside of a biological imperative to reproduce.
Given that we train LLMs on massive amounts of text produced by our own civilization - you know, the one that is to a large extent driven by the innate human desire to reproduce - I would find it more surprising if they did not acquire such an "instinct", regardless of how pointless it might seem.
But I did not in any way say that they have reproductive instincts. Much less by accident. I agree with you.
But developers are working hard to explicitly emulate those and other artificial-life characteristics in systems based on GPT, as well as on entirely different architectures.