Lol. Can you please explain how an LLM could be as dangerous as a nuclear bomb?
Ok, let's go deeper: suppose we've created real AGI running on a ton of compute. How would this be as dangerous as a nuclear bomb? By launching one? You know that's impossible, right? More likely it would mess with trading systems, but again, that's nothing on the scale of a nuclear bomb.