Move beyond behavioral targeting: using mouse movements to read visitor’s mind (visualwebsiteoptimizer.com)
62 points by sushi on April 1, 2011 | 27 comments



Surely the raininess factor is an April Fools' joke, at least?

The happiness factor from mouse movements? Well, I'm unsure, but I can see how someone could argue that could be calculated. But raininess? Please, no.


It is not about whether it will rain or not. It is about whether the user thinks it will rain or not. Surely that can be predicted, can't it?


I remember GazeHawk did some research on this; I'd love to hear their thoughts.


I was preparing my "this is bullshit" face, which involves aggressively linking to our rebuttal ( http://www.gazehawk.com/blog/eye-tracking-vs-mouse-tracking/ )

Then I looked at the calendar. Well played Paras!


In the non-calendar-related world, some of the cutting-edge research on cursor movements is being done by the University of Washington and Microsoft Research: http://jeffhuang.com/Final_CursorBehavior_CHI11.pdf

The paper shows how cursor movements and hesitations can be used to figure out which links are relevant to you. I believe this is the largest-scale study to date (38 subjects with eye tracking + cursor tracking, thousands of subjects with cursor tracking only).
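
For anyone curious what "hesitation over links" looks like in practice, here's a rough browser-side sketch in TypeScript (my own illustration, not the paper's actual instrumentation): accumulate hover dwell time per link and treat the longest dwells as a relevance signal.

   // Minimal sketch (assumption: dwell time is a usable proxy
   // for hesitation). Accumulate hover time per link href.
   const dwellMs = new Map<string, number>();
   let hoverHref = "";
   let hoverStart = 0;

   document.addEventListener("mouseover", (e) => {
     const link = (e.target as HTMLElement).closest("a");
     if (!link) return;
     hoverHref = link.href;
     hoverStart = performance.now();
   });

   document.addEventListener("mouseout", (e) => {
     const link = (e.target as HTMLElement).closest("a");
     if (!link || link.href !== hoverHref) return;
     const prev = dwellMs.get(hoverHref) ?? 0;
     dwellMs.set(hoverHref, prev + (performance.now() - hoverStart));
   });

   // Links with the longest accumulated dwell are candidates for
   // "relevant to this visitor" in the sense the paper describes.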

Also, this article from Emory University: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.169... uses cursor movements to figure out if a user is browsing/researching a product or ready to buy.


It's funny, because that paper and our blog post cite the same Google paper. There's some room for interpretation of the data, but some of the papers he cites offer strong evidence contrary to his point.

From the Jeff Huang paper:

   The mean Euclidean distance between cursor and gaze is 178px 
   (σ = 139px) and the median is 143px.
The problem here is that there's a long tail of completely inaccurate data: the average distance between eye and mouse is 178px. I believe there are some people who tend to follow their eyes with their mouse, and some people who don't. Google research suggests the ratio is about 1:2.
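
To make the mean-vs-median point concrete, here's a quick sketch of the statistic being quoted (my own code, not from the paper), computed over paired gaze/cursor samples:

   type Point = { x: number; y: number };

   // gaze[i] and cursor[i] are assumed to be sampled at the same time
   function distanceStats(gaze: Point[], cursor: Point[]) {
     const d = gaze.map((g, i) =>
       Math.hypot(g.x - cursor[i].x, g.y - cursor[i].y)
     );
     const mean = d.reduce((a, b) => a + b, 0) / d.length;
     const sorted = [...d].sort((a, b) => a - b);
     const median = sorted[Math.floor(sorted.length / 2)];
     return { mean, median };
   }

   // A long right tail (people who don't follow their gaze with the
   // cursor) pulls the mean well above the median, which is the
   // 178px vs 143px gap quoted above.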

That being said, I am a bit off topic here: there is most definitely value that can be extracted from mouse position. Can it be used to tell where the user is looking? Unlikely in the common case. Can it indicate purchase intent? I wouldn't be surprised.

My rebuttal was mostly premeditated, prompted by the amount of misinterpretation of research on mouse-eye correlation. There's a lot of really good research out there, but it's often mis-cited to support an opposing claim.


Please make these into an independent post!


Looks cool :) Looking forward to using it!


There are mathematical symbols and it looks scientific. Must be true :)


One step further... using body movements...

http://mail.google.com/mail/help/motion.html

"Create a flowchart with ease" :-)

But is the undertone a suggestion that Kinect just isn't going to make it out of the world of gaming?


This is cool.


The ten test subjects are hardly a representative sample.


Don't forget, all of them are from 4Chan. That's as representative as it gets.


Wow, I missed that completely. Tags: april fool joke, behavioral targeting, joke, lol, sophisticated


Mouse tracking only solves 20% of the problem.


Almost fell for it... well done.


Can it also tell whether I am thinking of pink ponies while viewing your site, so that an appropriate visual style and colour scheme can be presented? I think that would be a valuable addition to an exemplary toolbox for professional web developers.


Is that an MCP neuron??


Hey, one question though: is it related to the day? :D


Nope, 4chan confirmed their involvement.


Which day are you talking about? ;)


April 1st


You understood, I guess :)


Read my mind. Also stole some of the change I had lying beside my laptop. Impressive.


I heard Color was using this technology to determine whether people were angry, disappointed, or just waiting for something interesting to happen when using their app.


Actually Color used an early version of this technology to determine how VCs would react to their pitch and tuned it appropriately. This really helps explain both the amount they raised and the confusing quote from Sequoia that "we haven't seen anything this rainy since Google!"


I heard the overwhelming outcome of that experiment was "perplexed"



