For what it's worth, this was with an HTC Vive of some kind. However, the screen pixel densities don't change when you do foveated rendering; it's more of a performance trick - the GPU focuses most of its compute power on what you are looking at.
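If it helps to see the idea, here's a toy numpy sketch of that trick: build a per-pixel shading-rate map centered on the tracked gaze, and spend less GPU work the further a pixel is from it. The resolution, foveal radius, and falloff below are all invented numbers, not anything HTC actually ships:

    import numpy as np

    def shading_rate_map(w, h, gaze_xy, fovea_radius=0.15):
        # 1.0 = full shading rate inside the foveal circle around the
        # gaze point, falling off linearly to a 1/16 rate at the
        # periphery. Real hardware (e.g. variable-rate shading)
        # quantizes to a few discrete rates; these numbers are invented.
        ys, xs = np.mgrid[0:h, 0:w]
        r = np.hypot(xs / w - gaze_xy[0], ys / h - gaze_xy[1])
        return np.clip(1.0 - (r - fovea_radius) * 2.0, 1.0 / 16.0, 1.0)

    rates = shading_rate_map(1440, 1600, gaze_xy=(0.5, 0.5))
    print(f"shading work relative to full-rate rendering: {rates.mean():.0%}")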
> the screen pixel densities don't change when you do foveated rendering
That's the limited kind of foveated rendering, yes.
Apple has a system of lenses on a gimbal inside this thing, which is precisely what's required to do the (so far hypothetical) "full" kind of foveated rendering: you bend the light coming off a regular-grid-of-pixels panel so that ~90% of the panel's pixels are "pulled in" to where your pupil is, while the last ~10% are "stretched out" to fill your peripheral vision. Perceptually, that gives you an irregular grid of pixels, where pixels near the edge of the view are very large and pixels at the center are very small.
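To make the geometry concrete, here's a toy version of that warp in Python: a power-law mapping from radius on the uniform panel to radius in the field of view. The exponent k is invented; it just shows how a stronger warp concentrates more of the panel's (area-uniform) pixels into a small central region:

    def panel_to_view(r_panel: float, k: float = 4.0) -> float:
        # Toy radial warp: radius on the uniform-pitch panel (0..1) to
        # radius in the field of view (0..1). k > 1 squeezes the
        # panel's inner pixels into a small central region of the view.
        return r_panel ** k

    def fraction_inside(view_r: float, k: float = 4.0) -> float:
        # Fraction of the panel's pixels that land inside a given view
        # radius: invert the warp, then square (pixel count ~ area).
        return view_r ** (2.0 / k)

    for k in (2.0, 4.0, 8.0):
        print(f"k={k}: {fraction_inside(0.3, k):.0%} of pixels land in "
              f"the central 30% of the view")

(Getting 90% of the pixels into the central 30% of the view would take k ≈ 23, which hints at how aggressive such optics would have to be.)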
The downside to this technique is that, given the mechanical nature of "lenses on a gimbal", they would take a moment to respond to eye tracking, so you wouldn't be able to resolve full textual detail right away after quickly moving your eyes. Everything would first re-paint with only "virtual" foveated rendering from the eye-tracking update, then gradually sharpen over the next few hundred frames as the gimbal brings the center of the lens to where your pupil now is.
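A sketch of what that fallback loop might look like, with the frame rate, gimbal slew rate, and convergence threshold all invented for illustration:

    FRAME_DT = 1.0 / 90.0       # assume a 90 Hz display
    LENS_DEG_PER_SEC = 10.0     # invented slew rate for the lens gimbal

    def step_frame(gaze_deg: float, lens_deg: float) -> tuple[float, bool]:
        # Move the lens center one frame's worth toward the tracked
        # gaze; report whether software ("virtual") foveation is still
        # needed to cover the remaining optical lag.
        max_step = LENS_DEG_PER_SEC * FRAME_DT
        error = gaze_deg - lens_deg
        lens_deg += max(-max_step, min(max_step, error))
        return lens_deg, abs(gaze_deg - lens_deg) > 0.25

    # A 20-degree saccade: count the frames spent on the software fallback.
    lens, frames, lagging = 0.0, 0, True
    while lagging:
        lens, lagging = step_frame(20.0, lens)
        frames += 1
    print(f"{frames} frames of software-foveated fallback")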
(Alternately, given that they mentioned that the pixels here are 1/8th the size in each dimension - i.e. 64 of them fit in the area of one regular pixel - they could have actually created a panel that is dense with tiny pixels in the center and sparse with fatter pixels around the edges. They did mention that the panel is "custom Apple silicon", after all. If they did this, they wouldn't have to move the lens, or even the panel; they could just use a DLP mirror-array to re-orient the light of the chip toward your eye, with the system of lenses existing to correct for the spherical aberration from the reflected rays not arriving parallel to one another.)
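For a sense of why such a panel would be attractive, here's a toy pixel budget, with every number invented purely for illustration: the tiny center pixels dominate the pixel count while the fat peripheral ones cheaply cover most of the area:

    # Toy pixel budget for a two-zone panel: a dense center whose pixels
    # are 1/8 the linear size of the sparse peripheral ones. Every number
    # here is invented; Apple has published nothing like this.
    PANEL_MM = 25.0                          # square panel side
    COARSE_PITCH_UM = 60.0                   # peripheral pixel pitch
    FINE_PITCH_UM = COARSE_PITCH_UM / 8.0    # 1/8 the size per dimension
    FOVEA_FRACTION = 0.3                     # central fraction of the side

    side_um = PANEL_MM * 1000.0
    fine_side_um = side_um * FOVEA_FRACTION
    fine_px = (fine_side_um / FINE_PITCH_UM) ** 2
    coarse_px = (side_um**2 - fine_side_um**2) / COARSE_PITCH_UM**2
    total = fine_px + coarse_px
    print(f"{total:,.0f} pixels, {fine_px / total:.0%} in the dense center")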
I'm not sure whether Apple have actually done this, mind you. I'm guessing they haven't, since if they had, they'd totally have bragged about it.
I'm guessing from this comment that you may not know much about optics or building hardware. Both of the solutions you have proposed here are incredibly bulky today, and would not fit in that form-factor.
> The custom micro‑OLED display system features 23 million pixels, delivering stunning resolution and colors. And a specially designed three‑element lens creates the feeling of a display that’s everywhere you look
They have advertised that there are 3 lenses per eye, which is about enough to magnify the screens, give the image a circular profile, and correct most distortion. That's it - no mention of gimbals or anything optically crazy.
I'm thinking the confusion is with the system used to set the PD (pupillary distance - the distance between the eyes). Of course there are not many details, but it does look like there's a motorized system that moves the optics and screens outward to match the user's PD.