Personally, I can’t imagine using VR/AR headsets for development because of eventual eye strain and the sweatiness factor. But I’m willing to entertain that there are specific dev activities that could benefit from the enhanced immersion. Maybe a debugger that’s built from the ground up to support gestures and presents all of my breakpoints, watchpoints, stack frames, variables, etc. in a visually intuitive AR display. That could be amazing, even a killer app, but I think it would be ridiculously difficult to get right. How cool would it be to single-step through a function with a snap of the fingers, wave downward to jump to the return, and then hop over to an adjacent thread with a glance to the left?
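
Just to make the idea concrete, here's a toy sketch of what that gesture-to-debugger binding might look like. To be clear, every name in it is made up; there's no real headset SDK or debugger API here, just one possible shape for the dispatch layer (the real session could speak something like the Debug Adapter Protocol underneath):

```typescript
// Hypothetical sketch: mapping recognized gestures to debugger commands.
// All types are placeholders, not a real headset SDK or debugger API.

type Gesture = "finger_snap" | "wave_down" | "glance_left" | "glance_right";

// Commands loosely modeled on the gestures described above:
// snap = step over, wave down = step out ("jump to the return"),
// glance = switch to an adjacent thread.
type DebuggerCommand =
  | { kind: "stepOver" }
  | { kind: "stepOut" }
  | { kind: "switchThread"; offset: number };

const gestureBindings: Record<Gesture, DebuggerCommand> = {
  finger_snap: { kind: "stepOver" },
  wave_down: { kind: "stepOut" },
  glance_left: { kind: "switchThread", offset: -1 },
  glance_right: { kind: "switchThread", offset: +1 },
};

// Stand-in for an actual debug session.
interface DebugSession {
  stepOver(): void;
  stepOut(): void;
  switchThread(offset: number): void;
}

function dispatch(session: DebugSession, gesture: Gesture): void {
  const cmd = gestureBindings[gesture];
  if (!cmd) return; // unbound gesture: do nothing
  switch (cmd.kind) {
    case "stepOver":
      session.stepOver();
      break;
    case "stepOut":
      session.stepOut();
      break;
    case "switchThread":
      session.switchThread(cmd.offset);
      break;
  }
}

// Mock session that just logs what a real debugger would do:
const mockSession: DebugSession = {
  stepOver: () => console.log("step over"),
  stepOut: () => console.log("step out (jump to the return)"),
  switchThread: (offset) => console.log(`switch to thread at offset ${offset}`),
};

dispatch(mockSession, "finger_snap"); // step over
dispatch(mockSession, "wave_down");   // step out
dispatch(mockSession, "glance_left"); // adjacent thread to the left
```

Of course, the binding table is the trivial part; the ridiculously hard bit would be gesture recognition reliable enough that a stray snap doesn't step you past the bug you were chasing.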