Hey, something I know about! I actually worked on one of those research projects in college, programming the experiments. The idea is that there is a certain radius around the focal point where you stop being able to detect changes. I'm not sure what the final results were, but the theory was that you can calculate how blurry an image can be and still be discernible based on how far it is from the focal point of your vision. It was surprising how good people are at detecting changes in a blurry picture that's way out in their periphery.
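For anyone curious what "calculate how blurry an image can be" might look like: here's a minimal sketch using the standard linear acuity-falloff model, where the minimum angle of resolution grows roughly linearly with eccentricity. The parameter values (`mar0`, `e2`) are illustrative defaults, not the actual numbers from the study I worked on.

```python
def min_resolvable_arcmin(ecc_deg, mar0=1.0, e2=2.3):
    """Minimum angle of resolution (arcmin) at a given eccentricity (degrees).

    Uses the common linear falloff model MAR(e) = MAR0 * (1 + e / e2).
    mar0 (foveal MAR) and e2 (eccentricity at which MAR doubles) are
    illustrative values, not the study's actual parameters.
    """
    return mar0 * (1 + ecc_deg / e2)

def max_blur_deg(ecc_deg):
    """Rough blur tolerance in degrees of visual angle: blur finer than
    the local resolution limit should be undetectable."""
    return min_resolvable_arcmin(ecc_deg) / 60.0  # arcmin -> degrees

# How much blur the model says you can get away with, fovea to far periphery:
for e in (0, 5, 10, 20, 40):
    print(f"{e:>2} deg out: ~{max_blur_deg(e):.3f} deg of blur tolerated")
```

The point the sketch makes is the same one the experiments tested: tolerable blur grows with eccentricity, so a renderer (or an experimenter) can push progressively blurrier imagery the farther it sits from the focal point.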
My prof used to joke, whenever he'd discuss the fovea and periphery, that if we had a fovea the size of our full field of view, we'd need a brain the size of an elephant to process all that information. It's interesting how we're very sensitive to sudden changes (thus movement) in our periphery, but are so bad at classifying/identifying static imagery.
I remember reading about right-eye and left-eye dominance, where they'd keep an image on the screen saccade-invariant (i.e., compensate for any saccades that were made), slowly moving a letter/character/word/whatever to the edges of the participant's field of view and asking when the character was no longer legible. This happened surprisingly quickly, but at different positions for the left eye and right eye for pretty much all participants.