How much "real world experience" do you think I might have?
I'm not sure what your point is exactly.
I think my point is pretty clear, but here it is again just in case: "If high-dollar business deals are being made because of a CEO's personal likes or feelings toward a salesperson, then AI just might be more optimal."
> How much "real world experience" do you think I might have?
10 years or so.
> I'm not sure what your point is exactly.
I disagree with you and am attempting to explain why.
> "If high dollar business deals are being made because of personal likes or feelings of the CEO towards a salesperson, AI may just maybe be more optimal".
If. What percentage of deals do you think actually are? How big of a problem do you think this /actually/ is?
Personal likes and feelings may play a part, but are you suggesting they're the only metric being used in these circumstances? If a company has a product that just won't work and will cause tons of problems, are you saying a motorcycle ride could smooth that over?
What makes you think AI can optimize away a problem that might not even exist?
It's amazing to me that people think AGI is going to come about while simultaneously doing exactly what you tell it to do. I mean, I'm literally trying to picture the world you say you would prefer, and I just can't. Worse, even if I blur the definitions, I still can't see it being more "efficient."