Meta has finally launched the long-awaited Passthrough Camera API for Quest, which gives developers direct access to the headset's passthrough RGB cameras for better scene understanding, kickstarting the next generation of more immersive mixed reality experiences on Quest.
Until now, Quest's passthrough cameras have been mostly locked down, limiting what developers could do beyond Meta's built-in functionality. The company said back at Connect in September that it would eventually release Quest's Passthrough Camera API, although it wasn't certain when.
Now, with the Meta XR Core SDK v74, Meta has released the Passthrough Camera API as a Public Experimental API, providing access to the Quest 3 and Quest 3S's forward-facing RGB cameras.
Passthrough camera access will primarily help developers improve lighting and effects in their mixed reality apps, but it also lets them apply machine learning and computer vision to the camera feed for things like detailed object recognition, making mixed reality content less of a guessing game about what's in a user's environment.
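For developers curious what that might look like in practice, below is a minimal Unity C# sketch of reading the passthrough feed for computer vision work. It assumes the headset cameras are surfaced through Unity's standard WebCamTexture interface once camera permission has been granted; the device index, resolution, and frame rate here are illustrative assumptions, not confirmed API details.

```csharp
using UnityEngine;

// Minimal sketch: reading a passthrough camera feed in Unity for
// computer-vision work. Assumes (not confirmed) that the Passthrough
// Camera API exposes the headset cameras via Unity's standard
// WebCamTexture interface and that camera permission is already granted.
public class PassthroughFeedReader : MonoBehaviour
{
    private WebCamTexture _cameraFeed;

    void Start()
    {
        // Enumerate available cameras; on Quest 3/3S the passthrough
        // RGB cameras should appear here once permission is granted.
        WebCamDevice[] devices = WebCamTexture.devices;
        if (devices.Length == 0)
        {
            Debug.LogWarning("No camera devices found. Check camera permission.");
            return;
        }

        // Open the first available camera (assumed device index) at an
        // illustrative resolution and frame rate.
        _cameraFeed = new WebCamTexture(devices[0].name, 1280, 960, 30);
        _cameraFeed.Play();
    }

    void Update()
    {
        if (_cameraFeed == null || !_cameraFeed.didUpdateThisFrame)
            return;

        // Pull the latest frame as raw pixels; this buffer is what an
        // object-recognition or other ML/CV pipeline would consume.
        Color32[] pixels = _cameraFeed.GetPixels32();
        // ... run inference on `pixels` here ...
    }

    void OnDestroy()
    {
        if (_cameraFeed != null)
            _cameraFeed.Stop();
    }
}
```

In a real app, camera access on an Android-based headset would also need to be declared in the manifest and requested at runtime via Unity's Android permission flow before any devices show up in the enumeration.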
When it was announced last year, former Meta VP of VR/AR Mark Rabkin said the release of Quest's Passthrough API would enable "all sorts of cutting-edge MR experiences," including things like tracked objects, AI applications, "fancy" overlays, and scene understanding.
This marks the first time the API has been publicly available, although Meta has previously released early builds to select partners, including Niantic Labs, Creature, and Resolution Games, which are presenting today at GDC 2025 in a Meta talk entitled 'Merge Realities, Multiply Wonder: Expert Guidance on Mixed Reality Development'.
Granted, as an experimental feature, developers can't publish apps built using the Passthrough Camera API just yet, although Meta is likely once again taking an iterative approach to the feature's full release.
The v74 release also includes Microgestures for intuitive thumb-based input (e.g., thumb taps and swipes), an Immersive Debugger that lets developers view and inspect the Scene Hierarchy directly within the headset, and new building blocks, such as friends matchmaking and local matchmaking.