TL;DR: Reality Labs Research has been on a mission to pass the visual Turing test—attempting to create virtual experiences that are indistinguishable from the physical world. While it’s a subjective rubric, no present-day VR system has met the mark. But with our latest research prototype headsets being presented next week at SIGGRAPH 2025, it’s an achievement that may be closer than you think.

It’s not often that you get your mind well and truly blown. But in the past five years or so, I’ve managed a bit of a hat trick.
First, I witnessed the magic of true audio presence and perceptual superpowers—two ways in which the human sense of sound could shape the future of the next computing platform. In 2023, I got to try out Butterscotch Varifocal and Flamera: two groundbreaking research prototypes that took the SIGGRAPH community by storm. And last year, I got to try Orion, our first AR glasses prototype, for the first time, which was pretty damn special—there’s something about sEMG as a control scheme that just hits different.
These were defining moments, up there with the first time I strapped on an Oculus DK2 and got my first taste of modern-day virtual reality. And on my latest trip to the Reality Labs Research offices in Redmond, Washington, I had my mind blown once again.
If you’re on the ground at SIGGRAPH 2025 in Vancouver next week, you can get hands-on with three of our latest research prototypes as part of the Emerging Technologies program. If not, read on for the full story.
Tiramisu: Hyperrealistic VR
Tiramisu is an industry first. With contrast roughly 3x that of Meta Quest 3, an angular resolution of 90 pixels per degree (PPD, 3.6x that of Quest 3), and brightness of up to 1,400 nits (14x that of Quest 3), it combines the benefits of above-retinal resolution* with significantly enhanced brightness to set a new milestone for realism in VR.
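The multipliers above can be sanity-checked from the raw specs. As a rough sketch, the Quest 3 baselines are back-calculated from the ratios quoted in this post: 25 PPD is stated later in the article, while the ~100-nit brightness baseline is an assumption inferred from the quoted 14x figure.

```python
# Sanity-check the Tiramisu vs. Quest 3 spec multipliers quoted in the post.
TIRAMISU_PPD = 90      # pixels per degree, per the post
TIRAMISU_NITS = 1400   # peak brightness, per the post
QUEST3_PPD = 25        # stated later in this article
QUEST3_NITS = 100      # assumption: inferred from the quoted 14x figure

print(f"Angular resolution: {TIRAMISU_PPD / QUEST3_PPD:.1f}x")   # 3.6x
print(f"Brightness:         {TIRAMISU_NITS / QUEST3_NITS:.0f}x")  # 14x
```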
Tiramisu is a time machine, a glimpse of what could be possible with years of research and development. And there are definite trade-offs—the form factor is bulkier and heavier than today’s consumer VR headsets, and the field of view (FOV) is quite limited at 33° x 33°—but it’s the closest we’ve yet come to an experience that passes the visual Turing test.
In other words, it’s a small window onto a virtual world that rivals the intricate detail and believability of the physical one.
With dual high-resolution µOLED displays and specially designed refractive viewing optics, text is sharper, the brights are brighter, and the darks are darker, comparable to what you get with some HDR TVs. The demo incorporates the best real-time graphics that Unreal Engine 5 can produce, with the high rendering cost offset by NVIDIA’s Deep Learning Super Sampling (DLSS) 3.
“Our mission for this project was to provide the best image quality possible,” explains OPALS Optical Research Scientist Xuan Wang. “We deprioritized the form factor and used glass instead of plastic lenses that you find in most consumer headsets. This provides much better image quality—albeit at a heavier weight—and minimizes any aberrations and pupil swim across the FOV.”
Even though Tiramisu is a time machine, it incorporates a lot of existing consumer technology, including the inside-out tracking system of Quest 2.
“Obviously, we wanted to focus on the optical performance, which is why we used custom glass lenses and the µOLED panels that give us high contrast,” says Wang. “And we built the custom optical module based on existing system architectures like what you find in Quest 2. We tried to leverage existing products as much as we could.”
Developed by Reality Labs Research’s Optics, Photonics, and Light Systems (OPALS) team, Tiramisu has been wowing folks on sister teams like Display Systems Research (DSR) during the run-up to SIGGRAPH.
“The OPALS team is researching new ways to radically change the way that people see and interact with light,” notes OPALS Optical Science Manager Ying “Melissa” Geng. “Our mission is to create a virtual display that is almost indistinguishable from being in a place physically, allowing for social presence using a compact and comfortable VR headset. We want to make something worth making—to build toward a product that we’d all want to use ourselves.”
“Tiramisu isn’t something you can really understand just by reading an abstract,” adds DSR Director Douglas Lanman. “It really is like the first time you see 4K TV or an HDR TV, or the first time you go on a really immersive theme park ride that sprays water in your face. It’s something new, and it’s hard to understand this combination of numbers on a page really feeling different. Honestly, it’s the first headset in a while that really gives me a sense of wonder. It’s the most realistic VR image I’ve seen yet, with very impressive specs that add up to something that does look more realistic than anything we’ve seen in VR before.”
Boba 3: Ultrawide Field of View Virtual & Mixed Reality
Where Tiramisu sacrifices on FOV, Boba 3 fills the gap.
Most consumer VR headsets have a horizontal FOV of ~110°, while the human visual system’s horizontal FOV is roughly 200°. Boba 3, a pair of research prototype headsets developed by DSR, boasts a horizontal FOV of 180° and vertical FOV of 120°—compared to 110° and 96° respectively for Quest 3—with a form factor and resolution comparable to today’s consumer VR headsets.

The world got a sneak peek at our work on ultrawide FOV VR in October 2024. And as Meta CTO & Head of Reality Labs Andrew “Boz” Bosworth noted in an Instagram AMA, there was a very real tradeoff at the time. Not only was the form factor far bulkier than something like a Quest 3, the resolution per eye was roughly 1080p.
What the world didn’t know was that the prototype Boz showed off had an optical and display stack that was invented about nine years ago.

“The constraint is basically the total bandwidth that you can provide to your visual system,” says DSR Optical Scientist Yang Zhao. “If you cover a large field of view, you’ll see very sparse sampling of the world. If you want to see both very dense sampling and a wide range FOV, then you require a lot of bandwidth—a lot of pixels—and that was not available at the time. With Boba 1’s optical stack, the display resolution per eye was about 2K by 1K. Boba 2 from last year was 3K by 3K, and Boba 3, which we’re showing at SIGGRAPH this year, is 4K by 4K. That’s more than 7x the number of pixels compared to Boba 1. That’s the advancement we’ve seen in the display. And we needed optics to be able to resolve that resolution, which is only possible because Meta has invested in pancake lenses over the last 10 years.”
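Zhao’s “more than 7x” figure checks out with a quick per-eye pixel count. The exact panel dimensions aren’t given in the post, so the values below are assumptions: “2K by 1K” read as 2048x1080 and “4K by 4K” as 4096x4096.

```python
# Rough per-eye pixel-count comparison across the Boba prototypes.
boba1 = 2048 * 1080   # ~2.2 MP per eye (assumed panel dimensions)
boba3 = 4096 * 4096   # ~16.8 MP per eye (assumed panel dimensions)

ratio = boba3 / boba1
print(f"Boba 3 drives ~{ratio:.1f}x the pixels of Boba 1 per eye")  # ~7.6x
```

Under these assumed dimensions the ratio lands at roughly 7.6x, consistent with the “more than 7x” quoted above.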

The supply chain also matured during that time, with our industry partners investing in fundamental technologies, components such as films and lens material resins, and more.
“And of course, there’s silicon,” adds Zhao. “Thanks to AI and gaming, crazy-spec GPUs exist on the consumer market and let us power that many pixels.”
Quest 3 is an amazing piece of hardware, letting people step inside and interact with virtual worlds like never before. Boba 3 feels like breaking through to the next level of immersion, opening up the periphery and showing off what you didn’t even know was missing.
And it’s important to note that Boba 3 is not a time machine. Rather than requiring years of additional R&D, it leverages displays already in mass production and lens technologies similar to those found in Quest 3. And while the mixed reality Boba 3 prototype weighs in at 840 grams, the VR-only Boba 3 VR prototype is just 660 grams, a modest reduction compared to Quest 3 with the Elite Strap at 698 grams.
“It’s something that we wanted to send out into the world as soon as possible, but it’s not for everyone,” Zhao notes. “It’s not going to easily hit a mass-market price point. And it requires a top-of-the-line GPU and PC system.”
With a custom optical design leveraging high-curvature reflective polarizers, Boba 3 sets a new standard for state-of-the-art entertainment and telepresence in both VR and MR. While Quest 3’s FOV covers approximately 46% the FOV of the human visual system, both the Boba 3 and Boba 3 VR research prototypes cover roughly 90%—and they do it at a resolution slightly greater than that of Quest 3 (30 PPD for Boba 3 compared to 25 PPD for Quest 3).

Boba 1, Boba 2, and Boba 3.
“If I had the headset for Boba 3 in 2025 and I teleported back to 2017, I wouldn’t be able to run it,” Zhao explains. “It’s just way too many pixels. But now, all the ingredients are there, and our job is to put them together and cook a nice meal. We don’t usually go out and grow new vegetables—we identify the latest and greatest and put them together. We also had the ability to bootstrap based on our earlier projects, like Butterscotch, which kind of compounded our success. It was a matter of iterating and finding the right recipe with the supply chain.”
The Visual Turing Test Redux
At Meta, data wins arguments. And for Reality Labs Research, and within DSR in particular, that ethos is summed up by the words “demo or die.”
While a project may sound good on paper, it’s only once the experience is realized through a full-fledged demo that our teams are able to truly understand and rally around the why. And although Tiramisu and Boba 3 are purely research prototypes, with novel technologies that may never make their way into a consumer product, they’re important steps on the road to the next computing platform—allowing Reality Labs Research to demonstrate the tangible value of these innovations and what matters most when delivering a truly immersive experience.
“We’re working to figure out what lies beyond television and laptop screens,” says Lanman. “Most of the people I work with are makers at heart. We’re just trying to make something awesome—not just as a demo, but as something that we’d use every day. And that really is the true Turing test: Are we excited about what we’re doing after a decade of doing it? That really is the mission, and we try to do it as best we can.”
“Both OPALS and DSR thrive on innovation, and we have the best collaboration that I could imagine between the two teams,” adds Geng. “Also, I feel lucky to have autonomy to find our own research charters. Amazing innovations happen when passionate inventors take on a new challenge together. Turning inventions into demos is deeply rewarding, and this collaborative spirit also drives me personally. Autonomy, innovation, and collaboration with wonderful partner teams such as DSR—these are the things that get me excited to come to work every day. We basically get paid to do our hobbies with the smartest and kindest people around. It’s hard for me to think of anything better.”

Left to right: DSR Optical Scientist Yang Zhao, OPALS Optical Science Manager Ying “Melissa” Geng, DSR Director Douglas Lanman, and OPALS Optical Research Scientist Xuan Wang.
If you’re attending SIGGRAPH 2025, check out Tiramisu and Boba 3 from 10:30 am on August 11 – 3:00 pm on August 14 in the West Building Exhibit Hall 3. And if you’re not, consider booking your ticket next year for the latest taste of what our teams have been cooking up.