Hacker News

Not an expert here, but could the fact that character-based techniques work at all indicate that linguistics-inspired ML may be superfluous? The authors here argued that biological considerations should point to phoneme-based training. Since Chinese romanization corresponds tightly to phonemes (no irregular pronunciations as in English), the approach worked well with Chinese pinyin even though the native Chinese writing system is totally different, with thousands of characters.
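To illustrate that regularity: a toneless pinyin syllable splits mechanically into an initial (onset) and a final (rime). The sketch below is my own illustration, not anything from the paper; the initial list and the `split_syllable` helper are assumptions for demonstration only.

```python
# Pinyin syllables decompose regularly into initial + final, so
# mapping romanized text to phoneme-like units is mechanical,
# unlike English spelling. Illustrative sketch only.
PINYIN_INITIALS = sorted(
    ["b", "p", "m", "f", "d", "t", "n", "l", "g", "k", "h",
     "j", "q", "x", "zh", "ch", "sh", "r", "z", "c", "s", "y", "w"],
    key=len, reverse=True)  # try two-letter initials (zh, ch, sh) first

def split_syllable(syllable):
    """Split a toneless pinyin syllable into (initial, final)."""
    for ini in PINYIN_INITIALS:
        if syllable.startswith(ini):
            return ini, syllable[len(ini):]
    return "", syllable  # syllables like "er" or "an" have no initial

print(split_syllable("zhong"))  # ('zh', 'ong')
print(split_syllable("ma"))     # ('m', 'a')
```

English has no comparable rule: "though", "tough", and "through" share spelling but not pronunciation, which is the irregularity the comment contrasts pinyin against.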

What is interesting to me is that if ConvNets work well for both language and visual processing, that may well be because the human circuitry for processing both is very similar, while formalized grammar sits at a different level above speech (like logic), as opposed to the linguistic view of a universal grammar undergirding speech.
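For intuition on what a character-based ConvNet does, here is a minimal numpy sketch of a character-level 1-D convolution: the same sliding-filter operation used on images, applied to one-hot character windows. The alphabet, filter count, and window width are arbitrary assumptions for illustration, not the authors' architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

ALPHABET = "abcdefghijklmnopqrstuvwxyz "
CHAR_TO_IDX = {c: i for i, c in enumerate(ALPHABET)}

def one_hot(text):
    """Encode text as a (len(text), |alphabet|) one-hot matrix."""
    x = np.zeros((len(text), len(ALPHABET)))
    for i, c in enumerate(text):
        x[i, CHAR_TO_IDX[c]] = 1.0
    return x

def conv1d(x, filters, width=3):
    """Slide each filter over width-3 character windows; ReLU the responses."""
    n = x.shape[0] - width + 1
    out = np.empty((n, filters.shape[0]))
    for t in range(n):
        window = x[t:t + width].ravel()       # flatten the char window
        out[t] = np.maximum(filters @ window, 0.0)  # ReLU activation
    return out

# 4 random filters, each spanning 3 consecutive characters
filters = rng.standard_normal((4, 3 * len(ALPHABET)))
features = conv1d(one_hot("character based"), filters)
print(features.shape)  # (13, 4): 15 chars -> 13 windows, 4 filter responses
```

The point of the sketch is that nothing in it is language-specific: the same windowed dot-product works whether the rows are characters or image pixels, which is what makes the shared-circuitry speculation above at least plausible on the engineering side.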





