
All machine learning is about finding hyperplanes.



This is wrong!

The term hyperplane already assumes that the hypothesis space your learning algorithm searches has some kind of dimension and is some variant of a Euclidean / vector space (or a generalisation thereof). This is not the case for many forms of ML, for example grammar induction (where the hypothesis space consists of Chomsky-style grammars), inductive logic programming (where the hypothesis space consists of Prolog, or similar, programs), or, more generally, program synthesis (where programs form the hypothesis space).
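To make that concrete, here is a rough sketch (a toy illustration in Python, not from any particular system) of a single hypothesis in a grammar-induction search space:

    # Toy illustration: one hypothesis in a grammar-induction search space,
    # encoded as a dict mapping non-terminals to lists of productions.
    hypothesis = {
        "S":  [["NP", "VP"]],
        "NP": [["Det", "N"]],
        "VP": [["V", "NP"], ["V"]],
    }

    # A learner moves through this space by adding, removing, or rewriting
    # productions. The hypotheses are discrete symbolic structures, so there
    # is no vector-space geometry and no hyperplane to speak of.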


It can also just be some sort of partitioning. I would be really surprised if there were no partitioning of some space.


Note that "some sort of partitioning" isn't a hyperplane. A partition is a set-theoretic concept. A hyperplane is (a generalisation of) a geometric concept, so has much more structure.


Alright, how about coalgebra?


Hyperplanes is all you need


Someday I'm going to write a paper that achieves SOTA results with a nigh-incomprehensible mishmash of diverse techniques and title it "All You Need Considered Harmful".


Hyperplanation is all you need


The high dimensionality seems to be what creates the need for heuristic designs rather than a generic approach.


Hyperplanes are the heuristics.


What is a hyperplane?


In 2D it's a line, in 3D a plane; in n dimensions, the (n-1)-dimensional analogue is called a hyperplane.

https://en.wikipedia.org/wiki/Hyperplane


A hyperplane is the set of points in a (possibly high-dimensional) space that satisfy a linear equation; it splits the space into two distinct regions. In the context of a classifier, it splits feature space into disjoint regions (one for each class). SVMs effectively place a hyperplane with maximum margin, thereby separating the classes in an optimal way.
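As a rough sketch of what that looks like in practice (assuming scikit-learn and NumPy; the toy data is purely illustrative), the learned hyperplane can be read off from a linear SVM directly:

    import numpy as np
    from sklearn.svm import SVC

    # Tiny linearly separable two-class data set in 2D.
    X = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 2.5], [3.0, 3.0]])
    y = np.array([0, 0, 1, 1])

    # A linear SVM fits the maximum-margin hyperplane w.x + b = 0.
    clf = SVC(kernel="linear")
    clf.fit(X, y)

    w, b = clf.coef_[0], clf.intercept_[0]  # normal vector w and offset b
    # Each point is classified by which side of the hyperplane it lies on.
    print(np.sign(X @ w + b))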


Worth keeping in mind that though it may be optimal according to some mathematical criterion, that is no guarantee that it's the best for the purposes you have in mind.


A subspace of dimension n-1 of an n-dimensional vector space (or a translate of such a subspace). It extends the well-known concept of a 2D plane in 3D space to n-dimensional spaces.


You could also describe a hyperplane as the set of solutions of a single (non-trivial) linear equation.
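For instance, in R^3 the single equation

    2x - y + 3z = 5

describes a 2-dimensional plane, i.e. a hyperplane, with normal vector (2, -1, 3).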


Or as the set of all vectors that are orthogonal to a given vector, or as the subspace spanned by an orthogonal basis with one basis vector removed, or as the kernel of a linear form, ... But a more visual explanation is probably better as a first foray into the question.


I agree that a more visual explanation is better in general.

I was trying to hint at how the visual explanation relates to the long vectors of numbers we actually feed into our machine learning contraptions. Not sure I was successful.


This is really good!


a partition of Euclidean space into two convex sets ;)


it's a word (a made up word)


All words are made up!


Yeah, but only a few are made up to seem like terms of art designed to obfuscate their actual meaning; and usually prepending "hyper-" to something is a signal that a clearer description of the thing doesn't yet exist.

Downvote away, fellas.



and multi-dimensional topological manifolds, maybe :)



