So, I disagree a little. The first-order problem is one of control, as you describe. The second-order problem is one of comfort with control, and the normalization that comfort entails. Once you accept a certain piece of automation, you tend to accept its failures alongside its successes. To continue on from the GP, it is entirely likely that an error message along the lines of "your phrasing does not conform to polite societal norms" (to avoid the term 'groupthink') adds one extra layer of friction between an intention and an act. This can be a good thing (your Google example), but it's not universally a good thing, because of possible conditioning effects that can have subtle and long-lasting implications.
I agree that it's not scary in the "Skynet is going to take over" sense. It's more scary in the "They Live" sense, though without the overt malicious intent of that scenario.