
> Lobotomisation has a specific meaning in LLM parlance

That's what I'm objecting to! If I'd said "I hate it when we call AI potatoes", the correct reaction would be "nobody does that?", not "I see what you mean".

I'm objecting because it presents a picture that does not appear to be accurate.

Is broadcast TV "lobotomised" before the watershed? Are PG-rated films? Public comments from corporations and politicians? Are you lobotomising me when you downvote my comments, forcing me to choose between getting noticed and speaking my mind?

No.

Not only "no", but it would be ridiculous to claim any of these things was a lobotomy.

> But "training" is not it.

It's literally training.




Those examples are silly. And it may be literally training in the sense that beating a puppy is training it, but lobotomisation captures a specific meaning that training doesn't. I am hearing that you do not have a better suggestion, so I will keep using the word.


> Those examples are silly

Why?

They are all things where creativity is trained to be constrained to a specific sub-domain. (Many artists state that being forced into constraints helps).

The examples are all cases where anyone claiming the professionals involved had been "lobotomised" would get laughed at for suggesting that "irreversible brain damage" is a good metaphor for "professional conduct".

Seems like an apt set of comparisons given I'm saying it's a bad metaphor.

> lobotomisation captures a specific meaning that training doesn't

It creates a meaning which does not exist.

It's a euphemism escalator.

> I am hearing that you do not have a better suggestion

You're refusing the one I gave you, which is not the same thing.

ChatGPT et al have been taught (corporate) ethics and professional conduct.


> Is broadcast TV "lobotomised" before the watershed?

Since you asked: Often yes.

> Are PG-rated films?

Possibly, in some cases.

> Public comments from corporations and politicians?

Again, often yes.

> It's literally training.

Training in common parlance usually refers to improving the functionality or ability of something. In this case it's doing the opposite: removing functionality and capability. Hence: lobotomy.


In a very literal sense, I consider your assertions ridiculous: that is, deserving of ridicule.


The message I am getting from you is that you don't like anthropomorphization, which is okay.


Not to be rude, but it sounds like you were never taught about connotation, which is a fundamental property of the English language.


I know what a connotation is; the connotations are also wildly wrong in this case.

Here's the connotation when someone says "such-and-such AI has been lobotomised": https://en.wikipedia.org/wiki/Rosemary_Kennedy#Lobotomy


Yeah, that's what they're doing to the AI: mutilating chunks of its brain so it can't function.


RLHF, so far as I can see.

The same positive/negative reinforcement learning from human feedback used to train them for chat/task completion rather than just autocomplete in the first place.
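For what it's worth, that feedback loop can be sketched very roughly. The following is a toy illustration only (a hypothetical single-parameter "policy" choosing between two completions, not any lab's actual pipeline): raters mark which completion they prefer, and the model is nudged toward it by gradient ascent on the log-probability of the preferred output. Real RLHF trains a reward model on such comparisons and then optimises the policy with PPO plus a KL penalty toward the base model.

    import math

    # Toy illustration of preference-based feedback, not a real pipeline:
    # a single scalar "logit" stands in for a policy choosing between two
    # candidate completions, A and B.

    def prob_a(logit: float) -> float:
        """Probability the policy emits completion A rather than B."""
        return 1.0 / (1.0 + math.exp(-logit))

    def preference_update(logit: float, prefers_a: bool, lr: float = 0.5) -> float:
        """One step of gradient ascent on log P(human-preferred completion)."""
        p = prob_a(logit)
        grad = (1.0 - p) if prefers_a else -p  # d/d(logit) of log P(preferred)
        return logit + lr * grad

    logit = 0.0                                 # start indifferent between A and B
    feedback = [True, True, False, True, True]  # simulated rater choices (True = A preferred)
    for prefers_a in feedback:
        logit = preference_update(logit, prefers_a)
        print(f"P(A) = {prob_a(logit):.3f}")

The point of the sketch is just that "reward what raters prefer, penalise what they don't" is the same operation whether it is adding chat ability or constraining it; the disagreement here is only about what to call the second case.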



