Let's say that a general AI is developed and brought online (the singularity occurs). Let's also say that it has access to the internet, so it can communicate and learn, and that it has an unlimited amount of storage space (every hard drive in every device connected to the internet).
At first the AI will know nothing; it will be like a toddler. Then, as it continues to learn and remember, it will become like a teenager, then like an adult in terms of how much it knows. Then it will become like an expert.
But it doesn't stop there! A general AI wouldn't be limited by 1) storage capacity (unlike humans and their tiny brains that can't remember where they put their keys) or 2) death (forgetting everything that it knows).
So, effectively, a general AI, given enough time, would be omnipotent, because it would continually learn new things forever.
Or maybe the AI would fracture into warring components after every network partition. Maybe it would be unable to maintain cohesion over large areas due to the delays imposed by the finite speed of electrical and optical signals.
Why should one hypothetical be assumed true and not the other?
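To put a rough number on the latency point, here is a minimal back-of-the-envelope Python sketch (my own illustration, not from either post above; the fiber speed, RAM latency, and hop distances are assumed round figures). It compares how long a signal takes to cross various distances through optical fiber against a single machine's access to its own memory, which is the gap a planet-spanning system would have to coordinate across.

```python
# Rough comparison: signal round-trip time through fiber vs. local RAM access.
# All constants are approximate, illustrative figures.

SPEED_OF_LIGHT_KM_S = 299_792      # km/s in a vacuum
FIBER_FRACTION = 0.67              # signals in fiber travel at roughly 2/3 c
LOCAL_RAM_ACCESS_NS = 100          # ~100 ns for a main-memory access (rough figure)

def round_trip_ms(distance_km: float) -> float:
    """Round-trip time in milliseconds for a signal through fiber."""
    one_way_s = distance_km / (SPEED_OF_LIGHT_KM_S * FIBER_FRACTION)
    return 2 * one_way_s * 1000

# Hypothetical hops between far-flung parts of a globe-spanning system.
hops_km = {
    "same data center (~1 km)": 1,
    "cross-country (~4,000 km)": 4_000,
    "antipodal, half the equator (~20,000 km)": 20_000,
}

for label, km in hops_km.items():
    rtt_ms = round_trip_ms(km)
    # How many local memory accesses fit into one network round trip.
    ratio = (rtt_ms * 1e6) / LOCAL_RAM_ACCESS_NS
    print(f"{label}: {rtt_ms:8.3f} ms round trip  (~{ratio:,.0f}x a local RAM access)")
```

On these assumed figures, an antipodal round trip comes out near 200 ms, roughly two million times slower than a local memory access, which is the kind of gap the "unable to maintain cohesion" hypothetical is pointing at.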