Camera access on passthrough headsets is currently sparking quite a debate in the XR world. We know where Meta, Apple, and Pico stand, but the big question remains: what will Google do with Android XR? Well, after having a chat with Google, I can share that its approach will resemble the one already used on phones. Stick around to learn more!
## The Camera Access Conundrum
If you’re scratching your head about this, let’s take a moment to break it down. These days, the standalone VR headsets we see are also MR devices, which present an RGB passthrough view via front cameras. This feature paves the way for a host of mixed reality apps, like Cubism, Starship Home, and Pencil, just to name a few.
Through these cameras, the operating system provides users with passthrough views. As developers, many of us are keen to tap into these camera frames to apply AI and computer vision algorithms, enhancing users’ experiences by making applications more context-aware. In previous work, I’ve even managed to whip up an AI+MR prototype that aids in interior design, all thanks to a clever workaround on the Quest headset.
While this is exciting, it also opens up a can of worms: privacy concerns. A developer with malicious intent could potentially capture images without users’ consent, extracting sensitive info like ID documents or banking details in the process. Then there’s the risk of snapping pictures of faces or bodies for unsavory purposes.
Balancing user privacy with the potential to revolutionize mixed reality isn’t easy. It’s a tightrope that needs careful navigation.
## The Stance of XR Companies
In the early days, camera access was open, with no major restrictions. Long-time followers might recall some fun experiments my team at NTW and I ran back in 2019 on the Vive Focus: diminished reality, Aruco marker tracking, sound reactivity, and more.
However, as mixed reality caught on, companies started tightening the reins due to privacy concerns. Meta, Pico, HTC, and Apple have all restricted developers’ access to camera frames.
This blockade stayed in place until developers, recognizing how necessary the feature is, began to push back. Pioneers like Cix Liv, Michael Gschwandtner, and I began advocating for transparent camera access, emphasizing the potential for user-approved enhancements like object recognition via computer vision. We questioned why XR devices should be treated differently from phones, where users can freely grant apps camera access.
This advocacy led to companies like Meta announcing a “Passthrough API” release. But what’s Google’s play with Android XR?
## Android XR’s Approach: A Phone-Like Experience
Google’s Android dominates the global smartphone scene. Right now, if you’re creating an Android app, you simply request camera permission from the user and, once it’s granted, specify which camera you need (e.g., the rear one). Since Google is aiming to align Android XR with its phone counterpart, it’s planning something similar for the new OS. After some discussion with a Google spokesperson, here’s the gist of how camera access will work on Android XR:
1. Developers, with user consent, can use camera frames just like any Android app.
2. A detailed developer guide covers additional app permissions.
3. Developers can access the main camera stream by requesting camera_id=0, which acts like a regular Android rear camera.
4. Similarly, camera_id=1 will give developers access to a selfie-camera stream showing the user’s avatar, generated from OpenXR API tracking data.
In essence, Android developers will continue using familiar camera management classes (like CameraX) for XR headsets, allowing for frame capture, video recording, and ML analysis. It’s excellent news!
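To make this more concrete, here is a minimal Kotlin sketch of what grabbing passthrough frames could look like, assuming the world-facing stream really is exposed as a standard rear camera reachable through CameraX, as described above. Everything in it is plain Android (a runtime permission request plus an ImageAnalysis use case); none of it is a confirmed Android XR-specific API, so treat it as an illustration rather than official guidance.

```kotlin
// Minimal sketch: a standard CameraX ImageAnalysis use case bound to the
// default back camera, which on Android XR should map to the world-facing
// passthrough stream (camera_id = 0) IF it really behaves like a phone's
// rear camera. The CAMERA permission must also be declared in the manifest.

import android.Manifest
import android.content.pm.PackageManager
import android.os.Bundle
import androidx.activity.ComponentActivity
import androidx.activity.result.contract.ActivityResultContracts
import androidx.camera.core.CameraSelector
import androidx.camera.core.ImageAnalysis
import androidx.camera.lifecycle.ProcessCameraProvider
import androidx.core.content.ContextCompat
import java.util.concurrent.Executors

class PassthroughFrameActivity : ComponentActivity() {

    private val analysisExecutor = Executors.newSingleThreadExecutor()

    // Standard runtime permission request, exactly as on a phone.
    private val requestCamera =
        registerForActivityResult(ActivityResultContracts.RequestPermission()) { granted ->
            if (granted) startCamera()
        }

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        if (ContextCompat.checkSelfPermission(this, Manifest.permission.CAMERA)
            == PackageManager.PERMISSION_GRANTED
        ) {
            startCamera()
        } else {
            requestCamera.launch(Manifest.permission.CAMERA)
        }
    }

    private fun startCamera() {
        val providerFuture = ProcessCameraProvider.getInstance(this)
        providerFuture.addListener({
            val cameraProvider = providerFuture.get()

            // Receive every frame on a background thread, e.g. to feed an ML model.
            val analysis = ImageAnalysis.Builder()
                .setBackpressureStrategy(ImageAnalysis.STRATEGY_KEEP_ONLY_LATEST)
                .build()
                .also { useCase ->
                    useCase.setAnalyzer(analysisExecutor) { frame ->
                        // 'frame' is the latest camera image (YUV_420_888 by default).
                        // Run your computer vision / AI here, then release it.
                        frame.close()
                    }
                }

            // DEFAULT_BACK_CAMERA should correspond to camera_id = 0, i.e. the
            // world-facing view, if Android XR follows phone conventions.
            cameraProvider.unbindAll()
            cameraProvider.bindToLifecycle(this, CameraSelector.DEFAULT_BACK_CAMERA, analysis)
        }, ContextCompat.getMainExecutor(this))
    }

    override fun onDestroy() {
        super.onDestroy()
        analysisExecutor.shutdown()
    }
}
```

If this holds, the same code that analyzes rear-camera frames on a phone today should run unchanged on an Android XR headset, which is exactly the kind of consistency Google seems to be aiming for.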
But there’s a minor twist: while the “rear camera” gives developers the actual view of the world, the “front camera” doesn’t show the user’s real face; it shows an avatar, much like Apple does on Vision Pro. This mimics the phone experience: the “rear camera” sees the world in front of the user, while the “selfie camera” focuses on the user (here, their avatar).
## Android XR’s Consistency with Phone Permissions
Google’s design choices mean Android apps, including those accessing cameras, will operate seamlessly on Android XR. As a consistency enthusiast, especially in how permissions are granted across devices, I’m a fan of their approach.
Some might wonder about access to raw camera streams. Unfortunately, that’s still out of reach:
> Currently, applications cannot access non-standard sensor data (e.g., forward-facing or inward-facing camera).
This stance might change in the future, particularly for enterprise applications.
For Unity aficionados asking about development, Camera2 and CameraX are native Android APIs, so they can’t be called directly from C#. However, if Android XR mirrors standard Android behaviors, Unity developers can likely use WebcamTexture to capture frames. If that doesn’t pan out, JNI techniques can bridge Unity and native CameraX (or Camera2) functions, as sketched below.
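As an example of that second route, here is a hypothetical Kotlin plugin class (the name XrCameraBridge and its package are mine, not an official API) that could be compiled into an .aar and called from Unity through AndroidJavaObject. It only enumerates the cameras reported by the OS, which would already tell you whether the headset exposes the expected IDs, assuming Android XR follows the phone-like mapping described earlier.

```kotlin
// Hypothetical bridge class (compiled into an .aar) that a Unity app could call
// through AndroidJavaObject if WebcamTexture turns out not to work. It only
// enumerates the cameras exposed by the OS, so you can check whether the headset
// reports the expected IDs ("0" world-facing, "1" avatar) before going further.
package com.example.xrbridge

import android.content.Context
import android.hardware.camera2.CameraCharacteristics
import android.hardware.camera2.CameraManager

class XrCameraBridge(private val context: Context) {

    // Returns one "id:facing" entry per camera, e.g. "0:BACK" or "1:FRONT".
    fun listCameras(): Array<String> {
        val manager = context.getSystemService(Context.CAMERA_SERVICE) as CameraManager
        return manager.cameraIdList.map { id ->
            val facing = when (
                manager.getCameraCharacteristics(id).get(CameraCharacteristics.LENS_FACING)
            ) {
                CameraCharacteristics.LENS_FACING_BACK -> "BACK"
                CameraCharacteristics.LENS_FACING_FRONT -> "FRONT"
                else -> "EXTERNAL"
            }
            "$id:$facing"
        }.toTypedArray()
    }
}
```

On the Unity side, you would create it with new AndroidJavaObject("com.example.xrbridge.XrCameraBridge", activityContext) and read the result with Call&lt;string[]&gt;("listCameras"); the same bridge could then be extended to stream actual CameraX frames back into a texture.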
## A Heads-Up on Android XR
Remember, Android XR is still in the preview stage, and no official headset launch has happened yet. So, while these insights seem promising, they could evolve before the final release.
## The Shift Toward Camera Access
With Google and Meta embracing camera access, it’s likely the other XR players will follow. I expect 2025 to bring a wave of new possibilities in mixed reality, and I’m eagerly anticipating what the developer community will build with this newfound openness!