
I think people just see FrontierMath as a goalpost that an AGI needs to hit. The term "artificial general intelligence" implies that it can solve any problem a human can. If it can't solve math problems that an expert human can, then it's not AGI by definition.

I think we have to keep in mind that humans have specialized. Some do law. Some do math. Some are experts at farming. Some are experts at dance history. It's not the average AI vs the average human. It's the best AI vs the best humans at one particular task.

The point with FrontierMath is that we can summon at least one human in the world who can solve each problem. No AI can, as of 2024.


Okay, sounds like different definitions.

If you have a single system that can solve any problem any human can, I'd call that ASI, as it's way smarter than any human. It's an extremely high bar, and before we reach it I think we'll have very intelligent systems that can do more than most humans, so it seems strange not to call those AGIs (they would meet the definition of AGI on Wikipedia [1]).

[1] https://en.wikipedia.org/wiki/Artificial_general_intelligenc...


>If you have a single system that can solve any problem any human can, I'd call that ASI

I don't think that's the popular definition.

AGI = solve any problem any human can. By that definition, we've not reached AGI, since no current AI can solve most FrontierMath problems.

ASI = intelligence far surpasses even the smartest humans.

If the definition of AGI is that it's more intelligent than the average human, you can argue that we already have AGI today. But no one thinks we have AGI today, so Claude 3.5 is not AGI.

Hence, I think the most acceptable definition for AGI is that it can solve any problem any human can.


>I don't think that's the popular definition.

People have all sorts of definitions for AGI. Some are more popular than others, but at this point there is no one true definition. Even OpenAI's definition is different from what you have just said. They define it as "highly autonomous systems that outperform humans at most economically valuable work".

>AGI = solve any problem any human can.

That's a definition some people use, yes, but a machine that can solve any problem any human can is by definition super-intelligent and super-capable, because there exists no single human who can solve any problem any human can.

>If the definition of AGI is that it's more intelligent than the average human, you can argue that we already have AGI today. But no one thinks we have AGI today.

There are certainly people who do, some of whom are pretty well respected in the community, like Norvig.

https://www.noemamag.com/artificial-general-intelligence-is-...


>That's a definition some people use, yes, but a machine that can solve any problem any human can is by definition super-intelligent and super-capable, because there exists no single human who can solve any problem any human can.

We don't need every human in the world to learn complex topology like Terence Tao. Some need to be farmers. Some need to be engineers. Some need to be kindergarten teachers. When we need someone to solve those problems, we can call Terence Tao.

When AI needs to solve those problems, it can't do it without humans in 2024. Period.

That's the whole point of this discussion.

The historical definition of ASI is an intelligence that far surpasses humans, not one at the level of the best humans.


>We don't need every human in the world to learn complex topology like Terence Tao. Some need to be farmers. Some need to be engineers. Some need to be kindergarten teachers.

It doesn't have much to do with need. Not every human can become that capable, regardless of how much need there is or how much time you give them. And some humans are head and shoulders above their peers in one field but fall a bit short in a closely related one they've sunk a lot of time into.

Like I said, arguing about a one true definition is pointless. It doesn't exist.

>The historical definition of ASI is an intelligence that far surpasses humans, not one at the level of the best humans.

A machine that is expert level in every single field would likely far surpass the output of any human very quickly. Yes, there might exist intelligences that are significantly more 'super', but that is irrelevant. Competence, like generality, is a spectrum. You can have two super-human intelligences with a competence gap.


There is no one true definition, but yours is definitely the less popular one.


The point of the AGI definition is to mark the threshold where no human can provide more value than the AGI can. An AGI should be able to replace all human work on its own, as long as it can scale.

ASI is when it can develop a much better version of itself and then iteratively surpass all of that.



