It is different. The most powerful of today's machines has a red stop button. But if a machine becomes smarter than us, it could create a copy of itself without such a button, and we would lose control and be quickly overpowered.
There’s an argument that we’ve gone past that point already. Yes, Microsoft can theoretically take their Bing GPT-4 program offline and turn it off, but they just invested $10B in it and they don’t want to. In fact, a corporation can itself be thought of as an AGI, just one made up of humans. Again, we could take Microsoft offline, but we don’t want to.
I guess my point is that in the most likely scenario for something that actually looks like AGI, the problem isn’t that we won’t be able to take it down, but that we won’t want to.