Genomic study: our capacity for language emerged at least 135k years ago (phys.org)
113 points by wglb 50 days ago | 91 comments



The article doesn't say what the headline says. At all. Nowhere does the article claim that "our capacity for language emerged 135,000 years ago". What it says is that they have genomically traced our species back to around 135,000 years ago, the point where we started splitting off into different groups, and since we all have the capacity for language, we must already have had it at the point of the split (before that, we could all be said to have been in the same local group; possible, what with those population bottlenecks and everything).

There's nothing in there saying there wasn't language before, nor why it shouldn't have been present much earlier. Early enough that e.g. Neanderthals could equally well be covered.


Theoretically, it is possible language developed later. Some people hold the position that there is no specific language hardware in our brains, and that language is the result of a socio-cultural process (which then would spread).


Some animals provably have some sort of language, and have most probably had it for tens of millions of years, or their ancestors did. There's no reason to believe there was no language before those 135k years. Clearly you don't need something very special and unique to us.

Now, where is the threshold at which we consider something a proper language and something... less, let's say? I don't know. Science probably doesn't have a clue either. The evolution was probably very gradual, i.e. every 1,000 years the complexity grew by a notch, on average. Compound that over a million years.


"some sort of language" covers a wide range. Some song birds learn their songs, but the songs are stereotyped--the bird sings pretty much the same thing over and over, and there's no evidence that Bird A is communicating anything to other birds beyond "This is my territory, stay away!" to males and "Hey girl!" to females. Modern human languages are several steps above this on the Chomsky hierarchy, with at least context-free phrase structure, and in some cases reportedly context-sensitive phrase structure.
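A toy sketch (mine, not the commenter's) of the step up from stereotyped birdsong to context-free structure: center-embedded clauses such as "the rat the cat the dog chased bit died" have matched NP...VP nesting of the a^n b^n kind, which a context-free grammar can generate but no finite-state system can:

```python
# Hypothetical illustration: build an n-deep center-embedded clause.
# Noun phrases open in order; verb phrases close in reverse order,
# giving the a^n b^n nesting characteristic of context-free languages.
def center_embed(n):
    nouns = ["the rat", "the cat", "the dog"]
    verbs = ["died", "bit", "chased"]
    nps = [nouns[i % 3] for i in range(n)]
    vps = [verbs[i % 3] for i in range(n)]
    return " ".join(nps + list(reversed(vps)))

print(center_embed(3))  # the rat the cat the dog chased bit died
```

No finite-state machine can keep the unbounded "memory" needed to close each opened noun phrase with the right verb, which is the sense in which human syntax sits above birdsong on the hierarchy.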


Any social species must have a language. Ours is much more complex than what we've encountered in other animals. Our language developed because of its ease of transmission, its capability to convey complex concepts, and because we discovered writing. But there probably are a few qualitative biological differences between us and other species that are essential for the difference in language capacity. Those haven't evolved much, I suspect.


I agree to a certain point, but writing (invented in the past few thousand years) has not led to more complex languages. I've worked on languages for which there was, until a few decades ago, no writing system, and they're every bit as complex as long-written languages.

The qualitative differences between human languages and any animal "language" (scare quotes intentional) include recursion and breadth of vocabulary, and likely a lot of other things too. Recursion and the ability to learn a large vocabulary are probably biological and unique to us; the other things might or might not be unique to us.


What about the speech centers? Broca's and Wernicke's areas are the closest thing humans have to language hardware.


I'm in the hard-wired camp, but the people who defend the 'blank slate' idea say those are just good places for language skills to develop during the normal learning process. Both theories have holes.


What distinction are you trying to draw? To me the headline and the claim the article makes are effectively the same, just with the article claim dotting some i's in terms of the assumptions being made (it seems reasonable to simplify a headline).


Thank you, I read the same thing! Glad folks are commenting on this.


The above comment feels really spammy/AI generated but I promise I'm a real human and just genuinely appreciated someone mentioning the disparity between the headline and the actual research statement.


That's what "at least" means in the title.


If the title cannot be edited then the post should be removed, IMO.


It looks like they actually say at least 135K years ago, as that is the latest it could have happened.


So this finding is in the same vein as mathematicians who can't yet prove or disprove a conjecture, but can prove tighter restrictions than we knew before. Still interesting, and, more importantly, often productive of new paths or methods of analysis.

As for the article itself, I'm not sold. The analysis was about narrowing down to find the most recent regional split in our genetics, with the assumption that language formed before humans split across regions. That would work if we assumed the capability to develop language rested largely on genetics. If a lot of animals could develop a language, then the explosion of human language could come from imitation when one regional group without a language meets another with a language.

>"Human language is qualitatively different because there are two things, words and syntax, working together to create this very complex system," Miyagawa says. "No other animal has a parallel structure in their communication system. And that gives us the ability to generate very sophisticated thoughts and to communicate them to others."

I have seen a lot of ideas about human mental exceptionalism fall apart. It wouldn't surprise me to find out in a few years that dolphins developed syntax.


I think we already have decent evidence of animals having syntax. Presumably Miyagawa means something like recursive syntax.


I agree - and some animals even have semantics, as evidenced by those which have some intuitive understanding of alarm calls, and even more so when they do so for the calls of other species. On the other hand, I doubt they know they have syntax and semantics.


> On the other hand, I doubt they know they have syntax and semantics.

A lot of humans probably ignore it too. Most have never even heard of phonology, which is more fundamental to language than syntax.


Humans cannot ignore syntax or semantics, if that's what you're saying. Every human language has both, and no one can speak a language without knowing (usually implicitly) both.

As for phonology, languages have that (even signed languages), but I don't see it as being more fundamental than syntax or semantics. Indeed, while some writing systems have something analogous to phonology (context-sensitive letter shapes, as in Arabic script, but also in Greek), some don't. And it's possible to learn written language without any spoken language--deaf people do that.


Exactly. The article literally says "I think we can say with a fair amount of certainty that the first split occurred about 135,000 years ago, so human language capacity must have been present by then, or before."

The title is wrong.


On that note, we know that interbreeding between Neanderthals and (the ancestors of) modern humans happened during at least two periods:

> Neanderthal-derived genes descend from at least 2 interbreeding episodes outside of Africa: one about 250,000 years ago, and another 40,000 to 54,000 years. Interbreeding also occurred in other populations which are not ancestral to any living person [0]

... which makes me think it's more likely that both populations had the genomic capacity for language all the way back in the first "episode" 250k years ago as well (but to be clear, I'm saying it seems more likely to me, not that we can safely assume it must be true).

Yes, I know that "humans are impossibly horny" is a bit of a meme, but we're not just talking about sex here, but about having offspring that is successful enough within then-existing society over many generations, to the point where the Neanderthal DNA was absorbed into the common gene pool and survived all the way to this day.

It seems very unlikely to me that that would happen unless both species were on a comparable level of mental capabilities. So we either both evolved language capabilities separately around similar times somewhere after 250k years ago, or already had the basis for it before our ancestral populations split and then later met again.

[0] https://en.wikipedia.org/wiki/Neanderthal#Interbreeding


Yeah, this is a common occurrence: something is reported to have happened at year X, but in practice it will have emerged roughly around that time, never at one fixed moment. Possibly re-emerging over and over again until it stuck.


Title was modified from what it originally said.


This appears to be an argument for terminus ante quem and useful in that sense but it ignores the possibility there is a far earlier terminus post quem when the actual language capacity emerges.

I think it's true(ish) that, in a model strongly aligned to a single root language, the point of segmentation is the last point at which a language seen in all post-fragmentation states can have existed. But I don't see why that is also the first point; it's just the one we can detect genetically. There may be some later genetic evidence, perhaps of a specific structural change in the brain, or the vocal cords, or something else, indicative of language emergence.

If well-defined families of tools pre-date this time, then abstract concepts were being communicated, even if not vocally. Show-and-tell has limits, and I would argue this strongly suggests that concepts inherent in language existed to communicate how to do the tool-making.


Abstract conceptual thinking cannot be separated from our language ability, both of which are dependent on our access to mind. [Our brains are the tuners we use to select which realms of thought we are attempting to focus on. Some people choose to tune into universal compassion, some prefer pack-mentality behaviors such as racism or religious bigotry (including against those who haven't any), some just choose to make money or pursue greater pleasure, the list goes on and on.]

Our bodies are mammalian in the basic pattern, but we are clearly distinct in ability. It wasn't 6000 frickin years ago, but there was a creation event where we instantaneously diverged from pure animal life.

Mitochondrial Eve traces us to our root. Thanks again, science!


The word "instantaneously" seems a bit strong, even assuming you don't mean it literally. Many different kinds of animals apparently exhibit many of the prerequisites or precursor traits that seem necessary for language, even if they don't posses all of them in order to make the leap humans apparently did. Communication is not unique to humans. Vocalization is not unique to humans. Awareness of self is likely not unique to humans. We are clearly unique in some way, but it still seems plausible that it was a gradual (if relatively rapid) process that could have occurred from the confluence of many unrelated circumstances and abilities that has simply only happened once on this planet (as far as we're aware). Language, and everything derived from it, is also a positive feedback loop that only accelerates its own advancement once it occurs. And by looking at the world today, it also seems to preclude similar advancements happening in other species once it's occurred in one.


> Abstract conceptual thinking cannot be separated from our language ability

1. Clearly false (what words do you use when you plan to throw a ball to a target?)

2. There is no reason to believe other animals are incapable of abstract thinking (although we can clearly see they are far less intelligent)

3. I don't know what your long screed about bigotry and religion has to do with language ability

Just imagine how slow university-level maths would be if you had to do all the thinking in words in your head.

Sorry if I sound rude, but this is such a common misconception, and it's so blatantly untrue, that I'm baffled people keep claiming it's 'scientific' to believe that language is identical to intelligence. No wonder some people insist LLMs are just weeks away from becoming AGIs.


> Clearly false (what words do you use when you plan to throw a ball to a target?)

If a monkey can do it, it's not very abstract.


When I drive to and from work, I pass through several places where I have to mentally simulate what several other drivers plan to do and, depending on what they actually do (and what problems they may run into due to the slippery road), what the traffic picture will look like in a few moments. That lets me decide, for example, whether to stop right here so that the car which would otherwise be blocked by me, and would itself block the traffic coming the other way, can pass in front of me and resolve the (future) situation. Things like that. I have to do the whole thought experiment in my brain in a second or two. There are no words at all involved in that. And LLMs can't do such things as of now. I doubt a monkey could either.

At work I do a lot of design thinking. I don't use words for that either, that comes later when I document the thing.


Spatial dynamics don't need abstract thinking because they're concrete. For example: throwing a ball, or catching a ball.

Understanding the physics or logic behind complex moving pieces does require abstract thinking, but that's not what your brain does to perform physical feats.

We saw them, practiced dealing with them, then became proficient, or not.


I'm talking about the time analyzing, not the time executing. And that's definitely abstract. And, as I mentioned, this applies to my design work too. And when programming. No words involved, until I write the documentation.


> I'm talking about the time analyzing, not the time executing.

I see, and I agree.

> And when programming. No words involved, until I write the documentation.

Your variable names must be horrible ;-)

I do get your point about visualizing software relationships in the design phase, though. I'd still say, as a 40+ year programmer, that we can never escape the concept-words that are the foundation of programming, e.g. types, variables, functions, classes, encapsulation, pipeline, executable, etc.

Even if we are not consciously thinking of the terms, my guess is that, at a certain level of proficiency, we are using them at a kindof subconscious level.

It's an interesting meta-topic, and I won't say you're wrong, but we've certainly stumbled into a mostly unexplored land where no map yet exists. That's why programming is so challenging, difficult, and fascinating.

Peace be with you, friend.


But it is abstract, without requiring language. Driving would be a better example, the other guy put it better.

Look at my other example - maths. How do you reason about solving mathematical problems rapidly with words? Many mathematicians visualise abstract problems without using words, only using words later to attempt to write more thorough proofs.


> 1. Clearly false (what words do you use when you plan to throw a ball to a target?)

That's physical, not abstract. Our entire body's neural net is used to develop those physical skills. When it comes time to catch or throw a ball, no abstract thinking is involved, just doing.

If you want to study the physics of throwing a ball, then you need abstract thought, but not for just practicing it.

> 2. There is no reason to believe other animals are incapable of abstract thinking (although we can clearly see they are far less intelligent)

And there is reason to believe they're capable of abstract thinking?

> 3. I don't know what your long screed about bigotry and religion has to do with language ability

Because we communicate abstract ideas to one another via language. Once we acquire a mind-topic, we can then focus on it, as per our desire and mental focusability. That's why programming is so difficult.

But we can also contemplate our moral basis with respect to another mind-topic that teaches us how to abstractly evaluate it. It can be called religion or ethics or whatever. But we navigate our choices against whatever we deem permissible or impermissible, desirable or repellant. And we can self-evolve that basis against which we make our choices. It's an essential aspect of human nature, and it requires abstract concepts such as compassion, harm, happiness, sadness, kindness and anger, to name but a few.

> Just imagine how slow university-level maths would be if you had to do all the thinking in words in your head.

Just imagine how uni-level maths would be if there were no abstract concepts being taught with actual words.

Are you saying that not only did your maths education only use diagrams, but that no verbal explanation was involved?

> Sorry if I sound rude, but this is such a common misconception people believe in, and it's so blatantly not true so I'm baffled people keep on claiming it is 'scientific' to believe that language is identical to intelligence.

It's not rude, but it is just wrong. For one, I never said language was identical to intelligence. Intelligence is traversing an abstract concept net to evaluate how a particular concept relates to it.

Abstract concepts are built on the same logical network that facilitates our ability to process and produce language. How we build and utilize our ever-evolving network results in abilities that we call intelligence.

> No wonder some people insist LLMs are just weeks away from becoming AGIs.

That's not me, brother. I'm not even on the LLM hype train, much less think they're going to facilitate AGI. They probably have a few niche uses, but that's my take on all the LLM stuff.


> That's physical, not abstract. Our entire body's neural net is used to develop those physical skills. When it comes time to catch or throw a ball, no abstract thinking is involved, just doing.

Abstraction is about generalising beyond the physical objects we experience. The ability to understand the physics behind how objects work is abstract, but it's a low enough level of abstraction that even simple animals are capable of it.

> Just imagine how uni-level maths would be if there were no abstract concepts being taught with actual words.

Language is pretty good for communication, obviously.

> Are you saying that not only did your maths education only use diagrams, but that no verbal explanation was involved?

My driving test also used language to convey the lesson. But I don't talk to myself before I decide which direction to turn the wheel...

> Abstract concepts are built on the same logical network that facilitates our ability to process and produce language.

Overlap, certainly. Anything else is conjecture.


> The ability to understand the physics behind how objects work is abstract, but it's a low enough level of abstraction that even simple animals are capable of it.

No, it's a physical coordination of our body's neural net that connects our senses, brain, nerves, and musculature to perform coordinated feats. It's the direct opposite of abstract: it's concretely physical. And we share this neural net wetware wiring with our cousins, the animals.

There is nothing abstract about pulling your hand off the stove, but describing anything requires abstract concept-nets.

> Language is pretty good for communication, obviously.

It's essential for communicating abstract concepts, as well as contemplating them.

> My driving test also used language to convey the lesson. But I don't talk to myself before I decide which direction to turn the wheel...

That's because you have taught your wetware how to physicalize the concept net's decision tree by learning how to recognize the inputs and coordinate the proper responses. Lather, rinse, repeat.

> Overlap, certainly. Anything else is conjecture.

Anything that can be named exists as an abstract mental concept; that's what a concept is. The map is not the territory, because it is abstract, because the map exists only in our brain, even if we learned its contours from a paper document. That's what makes it abstract, because we have abstracted it to a mental concept, where its name refers to a network of other names in a describable way.

That is how language is built up from literally nothing but our elders' repetitive descriptions of objects and their interactions, kind of like typed values and functions, where the typing describes their relationships to other typed values and functions. And the relationships are abstract concepts, too.

Thanks for this interesting philosophical discussion. Peace be with you.


The invention of writing in Ur has an uncanny alignment with the Jewish calendar.


Writing is far older than that.


Huh? How so?


I recommend the book “Why Only Us?” by Chomsky and Berwick on this topic. It gets quite technical in places but I still got quite a lot of out it.


If you’re interested in this topic then you should definitely know the Chomsky/Berwick position, but it’s not the only view and has never been the consensus. For an alternative argument, “The Origin of Human Communication” by Michael Tomasello is good.


Can someone recommend good reading on the "psychology of languages", i.e., how does a language develop in the brain; why is it easy for me (as someone who grew up in the East) to learn guttural sounds of Arabic, but it's nearly impossible for my kids raised in the West? Similarly, I could never hope to speak clicking sounds of Southern African languages. Does the writing system affect speech, and if yes, then how? Like for example, in some Central Asian countries the alphabet changed multiple times within a century - from Arabic-based script, to Latin, then Cyrillic, then Latin again. What are some known neuroscientific tricks (for adults, not kids) for acquiring specific languages?


The recommendations in other replies to your post look good.

If you grew up hearing certain sounds, it will usually be easy for you to hear and produce those sounds, whether it's the sounds of Arabic, click languages, or English (as native speakers of Chinese or Japanese know wrt the sounds made by 'l' and 'r'). Infants hear all the distinctions, but they tend to lose the ability as their brain learns their first language (or first and second in bilingual homes). To some extent you can, however, pick up the distinctions up to around puberty. And adults who have studied phonetics (all the sounds used by all the languages of the world) can generally learn to hear and produce those as well.

Writing systems generally do not change the spoken language, as evidenced by the way English is written, which bears only some relationship to the spoken language (and often only to the way English was spoken hundreds of years ago).

In order to learn another language, linguists study phonetics (as mentioned above), as well as phonology, morphology, syntax etc. I can say from experience that these help. I don't think I'd call these subjects "neuroscientific", just linguistic.


> how does a language develop in the brain

Huge question, but there's not really a lot of data to go on. There do seem to be some things that are a lot easier to learn when you are very young.

> why is it easy for me (as someone who grew up in the East) to learn guttural sounds of Arabic

It all has to do with the speech sounds you're exposed to when you're young.

> Does the writing system affect speech, and if yes, then how?

Probably not. Very little if so. The main effect would be preserving old speech forms of the language to some extent, if the writing is phonetically based.

> What are some known neuroscientific tricks (for adults, not kids) for acquiring specific languages?

The most important thing is lots of exposure and practice and also the way you're practicing the language should be as close as possible to how you will actually use the language. There aren't any massively effective shortcuts.


I asked an LLM the same questions, and that's what I got:

Here are some key resources on language psychology and acquisition:

1. "Language in Mind: An Introduction to Psycholinguistics" by Julie Sedivy - Covers basic concepts and neural basis of language

2. "The Language Instinct" by Steven Pinker - Classic work on language acquisition and development

3. "Foundations of Bilingual Memory" by Heredia & Altarriba - Explains why early exposure shapes sound perception abilities

Writing system changes mainly affect literacy, less so speech. The brain processes spoken and written language somewhat separately.

For deeper research, look up "phonological acquisition" and "critical period hypothesis" in scientific journals.

--

Would appreciate human input on these topics as well. Thanks.


#1 is a textbook that I've taught out of. It's a good summary of decently up-to-date psycholinguistics.

#2 is more of a polemical book, maybe not the best if you want a fair summary.

#3 I've never heard of.


Chomsky has that wrong theory about language innateness. In reality languages that don't fit child brains don't survive. It's the other way around!


Not sure why you think that would prove Chomsky wrong. First, languages that don't fit brains (children's or otherwise) aren't going to be spoken by humans in the first place. And second, languages that fit our brains--all 7000 of them spoken (or signed) today, as well as those that are now extinct--don't fit chimpanzees' brains. So there's something about our brains that makes it possible for us to learn language(s).


Are Chomsky's ideas on language still taken seriously these days?


Yes, although there are other theories. Some of those other theories actually retain some of Chomsky's ideas, such as our innate capability to learn language (which seems to be the idea most commonly disagreed with). Two examples are Lexical Functional Grammar (LFG) and Head-driven Phrase Structure Grammar (HPSG).


Why would that idea be disagreed with? Even if language learning started out piggybacking on some general mechanism, that general mechanism would have biases, and evolution would, I think, tend to optimize/specialize it over time.


I can't really speak for why it is disagreed with--I personally find the claim that language is innate completely convincing. But one reason I've heard for disagreeing is the notion that some animals have language (not true unless you water down the definition of "language"). Others seem to think that general learning mechanisms explain the acquisition of language. I find that outrageously naive, since virtually all children acquire language (and if they're raised bilingually, more than one language), whereas even good linguists struggle to define all the rules of a language.


Why not? Do you think they are somehow refuted?



That looks like a disagreement with him on the utility of machine learning; while I don’t know whether Chomsky is right about his more fundamental ideas on linguistics, that doesn’t seem relevant.


Ah, thanks. Makes sense. Norvig's opinions would be highly regarded here, and the existence of LLMs is strong evidence to the contrary.


Yes, Chomsky's earlier positions include the idea that recursive natural grammars cannot be learnt by machines because of the "poverty of the stimulus." But this result is only true if you ignore probabilistic grammars. (See the Norvig article for some footnotes linking to the relevant papers.)

And of course, LLMs generate perfectly reasonable natural language without any "universal grammar", or indeed, much structure beyond "predict the next token using a lot of transformer layers."

I'm pretty sure that most of Chomsky's theoretical model is dead at this point, but that's a long discussion and he's unlikely to agree.


Chomsky had a stroke a while back, which apparently left him unable to speak. But I guarantee that there are many linguists who would not agree that his model is dead.

As for LLMs, at present they require orders of magnitude more training data than children are exposed to, so it's unclear they have anything to say about how humans learn language.


The paper https://www.frontiersin.org/journals/psychology/articles/10.... appears in Frontiers in Psychology.


Does beginning a submission title with "Genomic study:" provide an air of credibility? As in, before any reference to what is being studied, genomic study is mentioned. I'm actually curious.


I think it's less about credibility in general rather than the source of the information. It could have been some sort of archaeological evidence instead.


I wonder if it is possible to detect what exactly changed in the brain, as this could be a clue to creating AGI.

Even just a list of the changed genes could be extremely helpful for AI progress.


Hey, heads up. It's now 2025, machines have mastered language, displaying better linguistic abilities than those of a large percentage of the population. They also do so using a small fraction of the neurons and synapses found in the human brain, despite being able to express themselves in tens or hundreds of languages and across an extremely wide range of subjects (much beyond the average general knowledge of human beings), dispelling the myth that artificial neurons are much less powerful than the real ones.


LLMs are an interesting case, but they are nearly flat. The nervous system of mammals is structured: as I understand it, the neocortex consists of about 100 million structures of roughly 100x400 neurons each, something like that. Anyway, just from calculating the delay of human thinking, it appears the human NN is only about 500 layers deep.

A second difference: natural NNs are feed-forward, not backpropagated like typical AI NNs, because nature uses some chemical method to "calculate" parameters.

I think that, with the right structure and some FF method, AGI is already technically achievable.

PS: FF NNs already exist, but the problem is that they currently use the classical method of differential equations, which prohibits real use because too much computation is needed.
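A hedged sketch of where a figure like "about 500 layers" could come from; the round numbers below are assumptions for illustration, not measurements:

```python
# Bound the serial depth of biological computation by dividing a
# response time by the per-synapse signaling latency (assumed values).
response_time_ms = 500   # assume ~0.5 s for a quick "thought"
synapse_delay_ms = 1     # ~1 ms per neuron-to-neuron step (rough)
serial_depth = response_time_ms // synapse_delay_ms
print(serial_depth)  # 500
```

This is the same style of argument as the classic "100-step" constraint on neural computation: whatever the brain does in half a second, it can use at most a few hundred serial steps to do it.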


> LLMs are an interesting case, but they are nearly flat.

They are less specialized than brains, but they compensate for millions of years of evolutionary adaptation with a larger corpus of text, about the same amount as 11,000 people use in their lifetimes, assuming 1B words/human.

Interestingly, humans needed 10 million times more text for their cultural evolution, based on an estimate of 110 billion humans having ever lived.

So making progress is millions of times harder than catching up.


> humans needed 10 million times more text for cultural evolution

Could you provide a source? Where does this 10 million times more text exist?

Text which is not stored on some material carrier is simply forgotten.

The largest old text system I know of is Confucianism. It is huge, but very far from 10 million times 1B words.

Other known text systems are all much smaller than Confucianism.

https://en.wikipedia.org/wiki/Confucianism


Estimating the number of words used by humanity: start from the total number of humans who ever lived, 110B, multiply by 80 years * 365 days * 35,000 words ≈ 1B words per human lifetime, and you get 1.1*10^20 words.

Now, the training set of GPT-4 was around 14T tokens, say 10T words. Divide, and you get around 10 million.
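Redoing the back-of-envelope arithmetic above as a sketch (all figures are the rough estimates already stated in the comment, not measurements):

```python
# Words used by humanity vs. an LLM training set, per the estimate above.
humans_ever = 110e9                  # rough count of humans who ever lived
words_per_life = 80 * 365 * 35_000   # 80 years * 365 days * 35k words/day ≈ 1B
total_words = humans_ever * words_per_life
gpt4_words = 10e12                   # ~14T tokens, taken as ~10T words
ratio = total_words / gpt4_words     # ≈ 1.1e7, i.e. "around 10 million"
print(f"{total_words:.1e} words, ratio {ratio:.1e}")
```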


> total number of humans who ever lived 110B

But why do you think that all 110B individuals made a significant contribution to overall intelligence?

I think that, at best, each generation has a few significant contributors, while all the others are just carriers of genome diversity and nothing more; everything they have done just disappears like a breath of wind.

So, over 200k years, at one generation per 40 years, that's just 5,000 generations. OK, let's count 1,000 significant persons per generation: that's just 5 million people, over four orders of magnitude less than your estimate, and BTW much closer to known estimates of humanity's knowledge and of the size of the datasets used to train the largest existing models.
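The alternative count in the comment above works out as follows (1,000 "significant" people per generation is the commenter's assumption):

```python
# Counting only a few significant contributors per generation.
generations = 200_000 // 40          # ~200k years at ~40 years per generation
contributors = generations * 1_000   # assumed 1,000 significant people each
shortfall = 110e9 / contributors     # factor vs. the 110B-humans estimate
print(generations, contributors, shortfall)  # 5000 5000000 22000.0
```

So the two estimates differ by a factor of about 22,000, i.e. a bit over four orders of magnitude.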


BTW, one of the largest collections of texts in the world, the Library of Congress, was once estimated at about 100B characters, or approximately 100B LLM tokens.

Plus, they hold thousands of multimedia carriers (cinema, music, etc.), and at one time all tweets were archived there for historical preservation.

All the multimedia and tweets are much smaller in volume than the texts, but they add a few extra dimensions that are hard to express with text.


No one said artificial neurons are “less powerful.” They just work differently, and the way we “train” them is much less efficient and extensible. So of course there’s value in trying to understand how our language mechanisms work.


“Express themselves”: such a wealth of wrongness packed into two short words. There is no self involved. There is no expression involved.


It is impossible to figure this out and it's dumb to try to give a number. We don't think Neanderthals could speak? They also had languages.


We already know that genes regulate speech - for example, FOXP2 [0] - and we have successfully sequenced the human genome and started similar initiatives for other archaic human and primate species.

Phylogenetic Analysis has been fairly successful already in analyzing our genetic history, so I'm not sure why you'd think it's impossible.

[0] - https://en.m.wikipedia.org/wiki/FOXP2


I found this comment interesting.

https://www.reddit.com/r/AskAnthropology/comments/78o048/did...

>As a quick point and this is quite a late response, having the same FOXP2 gene may not be enough. New evidence suggests

>Using statistical software that evaluates gene expression based on the type of gene, Vanderbilt graduate student Laura Colbran found that Neandertal versions of the gene would have pumped out much less FOXP2 protein than expressed in modern brains. In living people, a rare mutation that causes members of a family to produce half the usual amount of FOXP2 protein also triggers severe speech defects, notes Simon Fisher, director of the Max Planck Institute for Psycholinguistics in Nijmegen, the Netherlands, who discovered the gene.


There are many cases where animals (or birds) literally speak and even show very human-like behavior while talking.

But, as far as I can see, only humans have a language culture significant enough to have produced a great number of manuscripts.

I will even stress: only a small number of humans write books, but nearly no members of other species do this at all.


I'm not an expert, but I guess they mean "complex enough language" to discard isolated words and consider only "languages" that have "sentences that may have 5 words" or some more accurate criteria.


How do we know Neanderthals had languages? I'm curious what the evidence is for this.


Well, we know that they could communicate (or be attractive enough) to interbreed with Homo sapiens — presumably they weren’t the genetically stronger species because their DNA is generally 1-3% represented in ours.

We don’t know, as you note. But the genetic evidence combined with tool use and their large Supra-orbital indices (fairly big brains, if shaped differently than ours) all would make that the preferred prior I’d say.


It's not clear how much interbreeding there was. It doesn't take many introgression events over the millennia to show up in the genetic record. And, to put it delicately, it's not clear any communication was involved in these events.


We do not know whether Neanderthals could speak.


I'd say those guys back then started fuddling around with the definition of genocide about that time too.


[flagged]


We have observed that just knowing something doesn't make it true.


the one true salvation is jesus christ to the one true god repent evolution is sin people knew better during feudalism


The existence of spoken language requires two mutation trees: one for hearing, one for sound making. Could this be why it took so long?


Sign language doesn’t need hearing or sound making!

I suppose language requires a means of communication and someone to communicate to at a minimum. Although actually we think in a language so I guess we don’t need someone to communicate to.

Then maybe language creation just requires the right sort of thoughts?

Edit:

I should have read the article first LOL!

They say as much:

> "Language is both a cognitive system and a communication system," Miyagawa says. "My guess is prior to 135,000 years ago, it did start out as a private cognitive system, but relatively quickly that turned into a communications system."


A good question would be whether sign language can exist at all without spoken language (and its related adaptations) having first existed within the species.


There’s a lot of this in evolution. Look into how possibly unlikely the emergence of eukaryotes was.

The most likely explanation for the Fermi paradox is just that high intelligence is incredibly rare.

Why are we here? Because we are having this conversation over a global digital network. The universe is probably full of bubbling mats of bacteria.


Why is the emergence of bacteria not rare? We don't know in sufficient detail how the Origin of Life happened to constrain its probability very well.


We of course don't know. But I think the claimed evidence is that bacterial life arrived quite early in Earth's history, perhaps a few hundred million years or less after there were permanent basins of water. So the assumption is that it arrived almost as soon as it possibly could (whereas intelligent life (us) took a few billion years longer to show up).


But hearing and sound making both have many other uses. Jellyfish can "hear", crickets can make noises. I'm guessing we had both of these long before language.


I can't speak for the OP, but I think what he means is the ability to map sounds into a structured analysis of a sequence of words, and the ability to map thoughts into the same kind of structure.



