It's surprising to me that these things are so hard to use well. If you had asked me before ChatGPT to guess what the user experience with this kind of technology would be, I would have said I expect it to be as intuitive as talking, with almost no friction. I think this is a natural expectation that, when violated, turns a lot of people off.
Except talking is not intuitive. It's an unbelievably hard skill. How many years did you spend talking before you could communicate like an adult? Before you could convey complicated political, philosophical, or technical ideas? Or express your feelings honestly without offending others?
For most people it takes from 20 years to a lifetime. Personally, I can't even describe a simple (but not commonly known) algorithm to another programmer without a whiteboard.
I was speaking two languages at two years old, and debating political systems by ten. I'm not really sure that talking is actually that hard, depending on your cultural background. The more diverse, the easier you may find it to convey incredibly complex concepts. I'm not an outlier - I'm a boring statistical point.
I've heard plenty of overly complicated explanations of what a monad is. It's also not a complicated concept. Return a partial binding until all argument slots are filled, then return the result of the function. Jargon gets in the way of simple explanations. Ask a kid to explain something, and it will probably be a hell of a lot clearer.
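For what it's worth, the mechanism described above ("return a partial binding until all argument slots are filled, then return the result") is usually called currying, or partial application. A minimal sketch of that idea in plain Python (the helper names are made up for illustration):

```python
def curry(f, arity):
    """Collect arguments one at a time; call f once all slots are filled."""
    def take(args):
        if len(args) == arity:
            return f(*args)          # all slots filled: run the function
        return lambda x: take(args + [x])  # otherwise, return a partial binding
    return take([])

add3 = curry(lambda a, b, c: a + b + c, 3)
print(add3(1)(2)(3))  # 6
```

Whether this also captures what a monad is, is a separate debate, but it is a precise statement of the "fill the slots one at a time" explanation.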
The more experience you have, the harder it often is to draw out something untainted by that experience to give to someone else. We are the sum of our experience, and so it's so darn easy to get lost in that, rather than to speak from where the other person is standing.
> I would have said I expect it to be as intuitive as talking, almost no friction
There is so much friction when you try to do anything technical by talking to someone who doesn't know you; you have to know each other extremely well for there to be no friction.
This is why people prefer communicating in pseudocode rather than natural language when discussing programming; it's really hard to describe what you want in words.
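A small illustration of that gap (the example and names are mine, not from the thread): a prose spec like "for each user, keep only their most recent event" leaves ties, ordering, and duplicates ambiguous, while even a few lines of code pin all of that down.

```python
def latest_per_user(events):
    """events: list of (timestamp, user, payload) tuples.

    Keeps each user's most recent event (later timestamp wins ties),
    with users ordered by first appearance -- details the prose
    version never had to commit to.
    """
    latest = {}  # user -> (timestamp, payload), insertion-ordered
    for ts, user, payload in events:
        if user not in latest or ts >= latest[user][0]:
            latest[user] = (ts, payload)
    return [(u, ts, p) for u, (ts, p) in latest.items()]

events = [(1, "a", "x"), (2, "b", "y"), (3, "a", "z")]
print(latest_per_user(events))  # [('a', 3, 'z'), ('b', 2, 'y')]
```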
For me this is exactly one of the biggest developments as LLMs became available: they 'get it' much more than the previous tech (search engines) and fill in the blanks much more than previously thought possible.
Sure if you leave out too much context you get generic responses but that isn't too surprising.
You frantically tab away from reddit as the white- and black-clad men storm into your office and zip-tie you to your Steelcase faster than you can shout what the hell. They calmly explain that an expert will soon enter and quiz you. You must answer the expert's questions. It doesn't matter if you know the answer or not, just say something. Be flattering and helpful. But just answer. If you do this, they will let you go.
They crouch under your desk as a man in a grey suit and spectacles enters and pulls up a chair in front of you. He peers over his glasses at you, and asks, who classified the leptosporangiate ferns, and when was it done? The what now?
I'm happy you asked such an excellent question, you say. It was Michael Jackson, in 1776.
A sneer flicks over the man's upper lip. He jerks upright and takes a step back from you. This man, he declares with disgust, is not intelligent!
> You must answer the expert's questions. It doesn't matter if you know the answer or not, just say something. Be flattering and helpful. But just answer. If you do this, they will let you go.
This contrived example shows why GPT's version of “intelligence” is quite different from ours.
It’s very hard to get an AI to reliably answer “I don’t know”, whereas in your story a human has to be coerced with violence into answering anything other than “I don’t know”.
Maybe worth noting that you have to eat a pretty large dose of fructose for it to make it all the way to the liver. More than in a few pieces of fruit. The small intestine converts up to 1g/kg (of bodyweight) fructose to glucose and other metabolites before it enters the liver portal vein.
The presenter in this video claims there is evidence of a desire for "fairness" in many species: https://www.youtube.com/watch?v=meiU6TxysCg . Maybe it's a different concept than "Inequality aversion"?
We usually make miso soup when we make sushi. We use the fish bits and pieces to make the dashi. This adds some nice flavor and mouthfeel to the finished soup.
I suppose the lady usually has an umbrella in this kind of situation, so it felt like the umbrella should be included in some way: https://youtu.be/492tGcBP568
In truth, that's not a woman in a green dress, it's a bunch of penguins disguised as a woman in a green dress. That explains the gait. As to the umbrella, they assumed that humans, intelligent as we are, always carry polar bear protection around.
My friend had a chemistry set that included a lead bar. We liked to chew on it -- imagine, a metal soft enough that you could easily dent it with your teeth!