
Image captioning is a separate, albeit related, problem to what I'm talking about.

Ontologies are much the same; they are interesting for the problems they solve, but it's not clear how well those problems relate to the more general problem of language.

Word embeddings are also quite interesting, but again, they are typically based entirely on whatever emergent semantics can be gleaned from the structure of documents. It's not clear to me that this is any more than superficial understanding. Not that they aren't very cool and powerful: distributional semantics is a powerful tool for measuring certain characteristics of language. I'm just not sure how much more useful it will be in the future.
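To make the "emergent semantics from document structure" point concrete, here is a toy sketch of distributional semantics: count which words co-occur within a small window, then compare the resulting count vectors with cosine similarity. The corpus, window size, and function names are all made up for illustration; real embedding methods (word2vec, GloVe, etc.) are far more sophisticated, but rest on the same distributional signal.

```python
# Toy distributional semantics: words appearing in similar contexts
# end up with similar co-occurrence vectors. Corpus is illustrative.
from collections import Counter
from math import sqrt

corpus = [
    "the cat sat on the mat",
    "the dog sat on the rug",
    "stocks fell on the news",
]

def cooccurrence(sentences, window=2):
    # Map each word to a Counter of the words seen within `window` of it.
    vectors = {}
    for sentence in sentences:
        words = sentence.split()
        for i, w in enumerate(words):
            ctx = words[max(0, i - window):i] + words[i + 1:i + 1 + window]
            vectors.setdefault(w, Counter()).update(ctx)
    return vectors

def cosine(a, b):
    dot = sum(a[k] * b[k] for k in a if k in b)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

vecs = cooccurrence(corpus)
print(cosine(vecs["cat"], vecs["dog"]))     # high: nearly identical contexts
print(cosine(vecs["cat"], vecs["stocks"]))  # lower: different contexts
```

Note that "cat" and "dog" come out similar purely because their surrounding words overlap, with no grounding in what cats or dogs actually are, which is exactly the "superficial understanding" worry above.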

Unsupervised learning from video and images is a strictly different problem that seems to me to be much lower down the hierarchy of AI Hardness: more like a fundamental task that is solvable in its own universe, without requiring complete integration of multiple other universes. Whether the information extracted by these existing technologies is actually usefully semantic in nature remains to be seen.

I agree that we'll get there, somewhat inevitably; not trying to argue for any Searlian dualistic separation between what Machines can do and what Biology can do. I'm personally interested in the 'how'. Emergent Strong AI is the most boring scenario I can imagine; I want to understand the mechanisms at play. It may just be that we need to tie together everything you've listed and more, throw enough data at it, and wait for something approximating intelligence to grow out of it. We can also take the more top-down route, and treat this as a problem in developmental psychology. Are there better ways to learn than just throwing trillions of examples at something until it hits that eureka moment?




I think the key ingredient is going to be reinforcement learning and, more importantly, agents being embedded in the external world.
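The "agent embedded in a world, learning from reward" idea can be sketched in a few lines with tabular Q-learning. Everything here is a made-up minimal example: the "world" is a hypothetical one-dimensional corridor of five states with a reward only at the rightmost one, and the hyperparameters are arbitrary.

```python
# Minimal tabular Q-learning: an agent embedded in a tiny world,
# learning a policy purely from reward signals. Illustrative only.
import random

random.seed(0)

N_STATES, GOAL = 5, 4
ACTIONS = [-1, +1]  # step left or step right
q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}
alpha, gamma, eps = 0.5, 0.9, 0.1  # learning rate, discount, exploration

for _ in range(200):  # episodes
    s = 0
    while s != GOAL:
        # Epsilon-greedy action selection: mostly exploit, sometimes explore.
        if random.random() < eps:
            a = random.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=lambda act: q[(s, act)])
        s2 = min(max(s + a, 0), N_STATES - 1)  # walls clamp movement
        r = 1.0 if s2 == GOAL else 0.0
        # Q-learning update: move estimate toward reward + discounted best next value.
        q[(s, a)] += alpha * (r + gamma * max(q[(s2, b)] for b in ACTIONS) - q[(s, a)])
        s = s2

# The learned greedy policy: the best action from each non-goal state.
print([max(ACTIONS, key=lambda act: q[(s, act)]) for s in range(GOAL)])
```

The point of the sketch is that the agent's only access to semantics is through acting in its world and observing consequences, which is the embedding argument in miniature.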

Regarding the "internal world", we already see the development of AI mechanisms for attention, short-term memory (references to recently used concepts), episodic memory (autobiographical), and semantic memory (ontologies).




