Not sure how that would work in practice, but I personally work with a cup of tea between my hands most of the time. That could interfere. Leap Motion was popular some time ago, and it detected the cup as such, not something bigger. I don’t know if they used IR though.
As you said, when interpreting non-obvious outputs of probes, several detection elements presumably help disambiguate: say, pressure and conductivity for trackpads. I believe ‘Moves’ (the app recently bought by Facebook) uses both the phone’s cell-tower signal to triangulate ___location and the motion sensors to correct the occasional jump to the next neighbourhood when the nearest tower is suddenly busy.
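That kind of correction could be as simple as a plausibility filter: reject a new ___location fix if the speed it implies contradicts what the motion sensors report. This is only a guess at how such a filter might look; the function name, thresholds, and the `moving` flag are all illustrative, not Moves’ actual logic:

```python
import math

def plausible_fix(prev, cand, dt, moving,
                  max_walk_speed=2.5, max_vehicle_speed=45.0):
    """Reject a candidate ___location fix if the implied speed is implausible.

    prev, cand: (lat, lon) in degrees; dt: seconds between fixes;
    moving: whether the motion sensors report vehicle-like movement.
    Thresholds are illustrative guesses (m/s), not real app parameters.
    """
    # Rough equirectangular distance in metres (fine at city scale).
    lat1, lon1 = map(math.radians, prev)
    lat2, lon2 = map(math.radians, cand)
    x = (lon2 - lon1) * math.cos((lat1 + lat2) / 2)
    y = lat2 - lat1
    dist = 6_371_000 * math.hypot(x, y)
    # Allow faster jumps only when the accelerometer says we're moving.
    limit = max_vehicle_speed if moving else max_walk_speed
    return dist / dt <= limit
```

So a tower handoff that teleports you a few kilometres in ten seconds while the accelerometer says you are sitting still would simply be discarded.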
For that ‘virtual trackpad’, using several frequencies or motion sensors might help. I’m not a big fan of your suggestion of a button: it seems… cumbersome, but it might actually work. I was made to realise today how unconsciously I use three- to four-key shortcuts most of the time.