Any chance apps can use raw imagery from those cameras? I can imagine some pretty nifty uses (say, a true live 3D videophone: use the cameras to construct a 3D model from the low-res data, then map the color image onto the model; you'd only have to transmit the already-sent video feed plus modest model data).
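To gauge whether the model data really would be "modest," here's a back-of-envelope sketch. Every number in it is an illustrative assumption (mesh size, coefficient count, video bitrate), not a measurement of any real device, and the "send a base mesh once, then stream deformation coefficients" scheme is just one plausible encoding:

```python
# Back-of-envelope bandwidth for the "reuse the video feed + send model data"
# idea. All figures below are illustrative assumptions, not measurements.

# One-time cost: a low-poly head mesh sent at call start.
VERTICES = 500                      # assumed mesh size
mesh_once_bits = VERTICES * 3 * 16  # x/y/z at 16 bits each

# Per-frame cost: stream pose/deformation coefficients (morphable-model
# style) instead of re-sending geometry every frame.
COEFFS = 50                         # assumed coefficient count
UPDATE_HZ = 30                      # assumed model update rate
model_stream_bps = COEFFS * 32 * UPDATE_HZ  # 32-bit floats

# The color video feed is already being transmitted for the ordinary call.
VIDEO_BPS = 500_000                 # assumed ~500 kbps low-res video call

overhead = model_stream_bps / VIDEO_BPS
print(f"one-time mesh: {mesh_once_bits / 8000:.1f} kB")
print(f"model stream: {model_stream_bps / 1000:.0f} kbps "
      f"({overhead:.0%} of the video feed)")
```

Under these assumptions the model stream is on the order of tens of kbps, a small fraction of the video feed, which is what makes the idea attractive.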
Interesting! B&W makes total sense. Low res also makes sense for head tracking, but I guess that means pupil tracking is out of the question. Regarding the low framerate: lower than 30 fps? I would have thought continuous perspective shifting would require a framerate at or above the screen refresh rate to appear natural to the user.
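One way a low camera framerate could still look smooth (not confirmed to be what any particular device does, just a common technique): the renderer predicts the head pose each display frame by extrapolating from the last two camera samples. The rates and motion below are made-up numbers for illustration:

```python
# Sketch: driving 60 Hz perspective shifts from low-fps head tracking by
# extrapolating the head pose between camera samples. Rates are assumptions.

CAMERA_HZ = 15   # assumed head-tracking sample rate
DISPLAY_HZ = 60  # assumed screen refresh rate

def extrapolate(p0, t0, p1, t1, t):
    """Linearly extrapolate a 1-D head position from the last two samples."""
    velocity = (p1 - p0) / (t1 - t0)
    return p1 + velocity * (t - t1)

# Head moving steadily at 0.1 m/s along x; two camera samples.
t0, p0 = 0.0, 0.0
t1 = 1 / CAMERA_HZ
p1 = 0.1 * t1

# Render frames between camera samples at the display rate.
for frame in range(1, 4):
    t = t1 + frame / DISPLAY_HZ
    print(f"t={t:.3f}s predicted x={extrapolate(p0, t0, p1, t1, t):.4f} m")
```

For smooth head motion the prediction error stays small, so the rendered viewpoint can update every screen refresh even though the camera samples far less often; fast or jerky motion is where this breaks down.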