News


The Best Thing About Apple Vision Pro? Meta Finally Has Big Competition

Meta has undeniably been the lone looming Goliath in a field of smaller Davids in the XR scene for years now. With Apple finally making its entrance into the market, Meta won’t be able to go at its own pace.

Apple’s new headset might be an absurd $3,500, putting it in a completely different class than Meta’s upcoming Quest 3 at $500, let alone the Quest 2 now at $300. But the pressure will still be on as comparisons are made between the experience Apple has crafted and what Meta offers.

After all, there’s no denying that while the Vision Pro is packed full of hardware, and has the benefit of Apple’s proprietary and powerful M2 chips, so much of what the headset is doing right is about the software experience rather than the fidelity that’s unlocked with the hardware.

Great Hardware, Struggling Software

The thing is, Meta’s headsets are plenty capable. Quest 2 remains a solid product that is in many ways still best in class, and Quest 3 only promises to up the ante later this year with more power, higher resolution, improved lenses, and better passthrough AR. Meta’s hardware has always been quite impressive, even as far back as the original Oculus Rift CV1.

But on the software side, the company has seriously struggled to make usability a priority. For all the lessons Meta learned about the power of reducing friction in VR—by building a standalone headset that doesn’t need a computer or external tracking beacons—there has been seemingly little emphasis on reducing friction in the same way through a cohesive interface spanning Quest’s system software and Meta’s own first-party apps, let alone a set of clear and useful guidelines so that developers and users alike can benefit from a common user experience.

Lean on Me

Meta has leaned substantially on third-party developers to make its headsets worthwhile to use. Game developers have done the painstaking work of refining how users should control their apps and interact with their worlds in entertaining ways. When you’re inside a VR game, the developer is fully controlling the experience to make it cohesive and enjoyable, while sussing out the pitfalls that would turn off users—like bugs, convoluted menus, and inconsistent interactions.

If Meta’s headsets didn’t have games—but still did everything else they’re capable of—they would be dead in the water because of how painful it can be to use the headset outside of carefully crafted game experiences designed to entertain. On the other hand, Apple Vision Pro has a minimal emphasis on gaming (at least at the outset), but is spending significant effort to make everything else the headset does easy and consistent. By doing so, Apple is ensuring that the headset will be great for more than just gaming.

Despite the price difference between Vision Pro and Quest headsets, Meta is still going to have to stare this thing in the face and come to grips with what it could be doing better—for users, developers, and itself. The good news, at least, is that much of the room for improvement is in the software side of things.

The Vacuum

Until now, Meta has had no serious competition in this space. Its headsets—despite the criticisms I’ve laid out here—have consistently offered the best value in their class, with great hardware and a great game library, all at a very attractive price that others have largely been unable to match.

That’s made it hard for other headset makers to compete and left Meta little need to respond even if other companies do something better or innovative. It’s also meant that developers and users have very little leverage over what Meta decides to do—after all, where else are they going to go if they want an affordable standalone headset with the best library of content?

Meta has been able to create a vacuum in the consumer VR space which on the surface might look like success… but in reality, it has left Meta unfocused on what it needs to do to make its headsets appeal to a broader audience.

Better for Everyone

Now we have Apple in the game, ready to challenge Meta on both hardware and the software experience. The price advantage is clearly in Meta’s favor, but it’s going to need to up its game, otherwise it risks losing not just customers but, more importantly, developers, who might see greener grass on the other side—especially if they’re looking forward to a future where Apple’s headset comes down in price.

Apple’s entrance into the market might seem like a threat, but ultimately Meta now gets to sit back and examine all the hard work Apple has done over the years, then choose the best ideas to incorporate into its own offerings, while ignoring what it sees as missteps by Apple.

In the end, Apple’s headset is going to force Meta’s headsets to get better, faster. And that’s good for everyone, including Meta.


‘Sairento’ Follow-up ‘Hellsweeper’ Coming to Major VR Headsets in September

Mixed Realms, the studio behind the samurai-style action-adventure game Sairento VR (2018), announced that its follow-up, Hellsweeper VR, is set to release on major VR headsets in September.

Update (June 8th, 2023): Hellsweeper VR is coming to Quest, PSVR 2, and SteamVR headsets on September 21st, 2023. You can pre-order now on Steam, and wishlist on PSVR 2 and Quest.

The studio also released a new trailer showing off some of the high-intensity gameplay, embedded below this update.

Original Article (June 9th, 2022): The game, which is currently planned for Early Access release on Steam sometime later this year, is also coming to Quest 2, slated to arrive on the standalone “soon,” the studio says.

Published by Vertigo Games, Hellsweeper VR is a roguelike first-person combat game where you take on the role of an undead immortal.

“Traverse the underworld where every step brings a challenge or a chance. Gain mastery of your weapons and elemental magic, or fall to the unrelenting onslaught of dark creatures,” Mixed Realms says.

In it, you’re tasked with gaining mastery of a wide range of weapons and elemental magic. As you’d imagine, upgrading your gear as you fight through the underworld’s dark creatures along the way is supposedly a big part of the experience.

Here’s what the studio says about Hellsweeper VR:

“Hellsweeper VR came about from our desire to improve on what made our first game, Sairento VR, a huge hit with fans,” Mixed Realms says. “We wanted to build upon a core tenet of Sairento – an intense no holds barred locomotion system that offered wall-running, power-sliding, backflips and more – all while improving its arcade-style action with semi realistic physics, allowing you to pull off even crazier moves. Juggle enemies in the air. Lop off a limb and use it as a club. Land on an enemy and use them as a bloody surfboard – yes, you read that right.”

Check out the new trailer below:


Zuckerberg Gives His First Reaction to Apple’s Vision Pro

Meta founder and CEO Mark Zuckerberg hasn’t been shy about addressing the elephant in the room: with Apple Vision Pro, the Cupertino tech giant is officially entering a market that, up until now, Meta has basically owned. In a meeting with Meta employees, Zuckerberg said that while Apple Vision Pro “could be the vision of the future of computing […] it’s not the one that I want.”

As reported by The Verge, Zuckerberg seems very confident in the company’s XR offerings, and is less impressed with Apple’s design tradeoffs. During a companywide meeting, Zuckerberg said that with Vision Pro, Apple has “no kind of magical solutions” and that the company hasn’t bypassed “any of the constraints on laws of physics that our teams haven’t already explored and thought of.” He calls that “the good news.”

Largely, Zuckerberg says Apple is making some telling design tradeoffs, as its higher resolution displays, advanced software, and external battery come alongside a $3,500 price tag—seven times more than Meta’s upcoming Quest 3 mixed reality standalone.

Photo by Road to VR

But it’s also about ethos. Zuckerberg says the companies’ respective headsets represent a divide in company philosophy, as Apple products are typically developed to appeal to high income consumers. “We innovate to make sure that our products are as accessible and affordable to everyone as possible, and that is a core part of what we do. And we have sold tens of millions of Quests,” he said.

“More importantly, our vision for the metaverse and presence is fundamentally social. It’s about people interacting in new ways and feeling closer in new ways,” Zuckerberg continued. “Our device is also about being active and doing things. By contrast, every demo that they showed was a person sitting on a couch by themself. I mean, that could be the vision of the future of computing, but like, it’s not the one that I want.”

The Meta chief echoed some of these statements on the Lex Fridman podcast, where he spoke about his opinions on Apple Vision Pro, noting that Apple’s mixed reality headset offers a “certain level of validation for the category.” Because Vision Pro will cost so much, though, Zuckerberg maintains Quest 3 will benefit overall as people inevitably gravitate toward the cheaper, more consumer-friendly option.

Here’s Zuckerberg’s full statement, sourced from the companywide address:

Apple finally announced their headset, so I want to talk about that for a second. I was really curious to see what they were gonna ship. And obviously I haven’t seen it yet, so I’ll learn more as we get to play with it and see what happens and how people use it.

From what I’ve seen initially, I’d say the good news is that there’s no kind of magical solutions that they have to any of the constraints on laws of physics that our teams haven’t already explored and thought of. They went with a higher resolution display, and between that and all the technology they put in there to power it, it costs seven times more and now requires so much energy that now you need a battery and a wire attached to it to use it. They made that design trade-off and it might make sense for the cases that they’re going for.

But look, I think that their announcement really showcases the difference in the values and the vision that our companies bring to this in a way that I think is really important. We innovate to make sure that our products are as accessible and affordable to everyone as possible, and that is a core part of what we do. And we have sold tens of millions of Quests.

More importantly, our vision for the metaverse and presence is fundamentally social. It’s about people interacting in new ways and feeling closer in new ways. Our device is also about being active and doing things. By contrast, every demo that they showed was a person sitting on a couch by themself. I mean, that could be the vision of the future of computing, but like, it’s not the one that I want. There’s a real philosophical difference in terms of how we’re approaching this. And seeing what they put out there and how they’re going to compete just made me even more excited and in a lot of ways optimistic that what we’re doing matters and is going to succeed. But it’s going to be a fun journey.


Apple’s Computer Vision Tool for Developers Now Tracks Dogs & Cats

Would reality really be complete without our beloved four-legged friends? Certainly not. Luckily the latest update to Apple’s ‘Vision’ framework—which gives developers a bunch of useful computer vision tools for iOS and iPad apps—includes the ability to identify and track the skeletal position of dogs and cats.

At Apple’s annual WWDC, the company posted a session introducing developers to the new animal tracking capabilities in the Vision developer tool, and explained that the system works on photos as well as on videos in real time.

The system, which is also capable of tracking the skeletal position of people, gives developers six tracked ‘joint groups’ to work with, which collectively describe the position of the animal’s body.

Image courtesy Apple

Tracked joint groups include:

  • Head: Ears, Eyes, Nose
  • Front Legs: Right leg, Left leg
  • Hind Legs: Right rear leg, Left rear leg
  • Tail: Tail start, Tail middle, Tail end
  • Trunk (neck)
  • All (contains all tracked points representing a complete skeletal pose)

Yes, you read that right, the system has ‘tail tracking’ and ‘ear tracking’ so your dog’s tail wags and floppy ears won’t be missed.

The system supports up to two animals in the scene at one time and, in addition to tracking their positions, can also distinguish a cat from a dog… just in case you have trouble with that.
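For developers curious how this is used, here’s a minimal sketch in Swift of requesting an animal body pose from a still image. It’s based on the Vision framework additions Apple presented at WWDC, but treat the exact call details and the 0.3 confidence threshold as our assumptions and check Apple’s documentation.

```swift
import Vision

// A minimal sketch: detect animal body poses in a still image and read
// out the tail joints. Other joint groups include .head, .forelegs,
// .hindlegs, and .trunk, or .all for the complete skeleton.
func printTailPose(in cgImage: CGImage) throws {
    let request = VNDetectAnimalBodyPoseRequest()
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try handler.perform([request])

    // Up to two animals may be reported at once.
    for observation in request.results ?? [] {
        let tailPoints = try observation.recognizedPoints(.tail)
        for (joint, point) in tailPoints where point.confidence > 0.3 {
            // Locations are normalized image coordinates (0...1).
            print("\(joint): \(point.location)")
        }
    }
}
```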

Image courtesy Apple

Despite the similarity in name to the Vision Pro headset, it isn’t yet clear if Apple will expose the ‘Vision’ computer vision framework to developers of the headset, but it may well be the same foundation that allows the device to identify people in the room around you and fade them into the virtual view so you can talk to them.

That may have also been a reason for building out this animal tracking system in the first place—so you don’t trip over Fido when you’re dancing around the room in your new Vision Pro headset—though we haven’t been able to confirm that the system will work with pets just yet.


Apple Vision Pro Debrief on the Voices of VR Podcast

Apple’s announcement of Vision Pro is reverberating throughout the industry. Beyond just a new headset, the company’s entrance into the space introduces new ideas that are now being discussed around the tech-sphere. To dig further into what Apple Vision Pro means for the XR industry more broadly, I spoke with host Kent Bye on the Voices of VR podcast.

Kent Bye has been consistently documenting the XR space since 2014 through his prolific podcast, Voices of VR, which now spans more than 1,200 episodes.

Over the years I’ve had the fortune of joining Bye on the podcast during pivotal moments in the XR industry. With the long-awaited unveiling of Apple Vision Pro, it was once again time for a check-in; you can listen to episode #1,217 of the Voices of VR podcast here.

Beyond my previously published hands-on impressions with the headset, our discussion on the podcast covers some of the broader implications of Apple Vision Pro, including how the company’s ecosystem plays a major role in the value of the headset, whether or not the headset’s ergonomics are aligned with its use-case vision, and the ways in which Apple’s entrance into the space feels like a reboot of the industry at large.

Bye also interviewed several others for their takes and impressions of Apple Vision Pro. You can check out episode #1,216 to hear from Sarah Hill, CEO of Healium, and Raven Zachary, COO of ARound; episode #1,218 with Ian Hamilton, Editor at UploadVR; and episode #1,219 with Scott Stein, Editor at CNET.

Voices of VR is a listener-supported podcast; if you like what you hear, you can support Bye’s work on Patreon.


Apple Vision Pro to Support One of VR’s Most Prominent Social Apps

On Monday Apple unveiled Vision Pro, its long-awaited standalone headset capable of both virtual and augmented reality. While the Cupertino tech giant seems to be emphasizing Vision Pro’s AR capabilities thanks to its color passthrough cameras, it’s also going to pack one of VR’s most prominent social apps, Rec Room.

Apple’s launch of Vision Pro is still a good bit away—it’s coming first to the US in early 2024 at the hefty price of $3,500. Still, what apps the Fruit Company will allow on the undoubtedly very curated Vision App Store will be telling.

As first noted by UploadVR, among them will be the hit social VR game Rec Room, which is currently cross-compatible across SteamVR, Meta Quest, Meta PC VR, PSVR, PlayStation 4/5, Xbox, iOS, Android, and standard monitors via Steam.

Rec Room was the only native VR app shown during the part of the keynote discussing third-party apps, which are coming to the headset via Apple’s integration of the Unity game engine.

Notably, Vision Pro doesn’t offer any sort of motion controller, instead relying on hand-tracking, eye-tracking, and voice input. In the past, Rec Room has primarily targeted motion controllers for VR input, however the app is also set to bring both full-body avatars and new hand models to the platform, which will seemingly do away with the game’s wristless mitten-hands.


‘Escape Simulator’ is Bringing Its 8-player Co-op Escape Rooms to VR

Pine Studios, the team behind the Escape Simulator franchise, announced an upcoming VR port of the studio’s hit escape rooms.

Called Escape Simulator VR, the game is slated to bring both solo and online co-op, the latter of which supports up to eight players. Pine Studios says, however, that two to three players is the ideal number for its swath of escape rooms, which is said to include 25 original rooms made in collaboration with real-world escape room designers.

Here’s how Pine Studios describes it:

Following the unprecedented success of the original version, Escape Simulator VR was rebuilt from the ground up to be a comfortable and highly immersive VR experience. Pick up and examine everything, break objects, solve locks, and decipher puzzles to escape! After finishing the main game, watch for free content updates and explore 3000+ rooms built by the community.

The game is said to include all standard locomotion types, such as teleport, smooth movement, and controller-based movement, and it can be played in a full room-scale setting. There are also snap-turning, seated, and stationary modes.

Since it’s a VR version of the studio’s original Escape Simulator (2021), the studio is also promising cross-platform co-op. Escape Simulator VR is coming to SteamVR headsets and Quest 2/3 “soon,” the studio says. You can wishlist it now on Steam here.


Hands-on: Apple Vision Pro isn’t for Gaming, But it Does Everything Else Better

While Apple’s new Vision Pro headset isn’t going to satisfy the existing base of consumer VR users, it’s mastering the rest of the basics better than anyone else.

Probably 90% of what consumers are using VR headsets for today is entertainment, and most of that entertainment is gaming. And if you’re among those people using such headsets today, you’ll reasonably be disappointed that Apple Vision Pro lacks controllers and isn’t going to be playing top VR games anytime soon. But for everyone else, it’s a back-to-basics approach that’s laying a sturdy foundation to build upon in the future.

Today at Apple’s headquarters I got to check out Vision Pro for myself. Unfortunately the company didn’t permit any photos or footage during the demo, but the clips below are a fair representation of what I saw.

Photo by Road to VR

Apple Vision Pro (AVP, let’s call it) is doing what only Apple can: carving out a subset of what other devices do, and making sure that subset of things is done really well. And given the current state of UX on most other headsets, this is a reckoning that was a long time coming.

Look & Tap

It starts with the input. Apple is leaning heavily into using your eyes as a cursor, and a pinch gesture as a click. The headset has cameras on the bottom that face downward so that even subtle pinches from your hand in your lap are visible and detected. But you don’t see a floating cursor where your eyes are, nor do you see a laser pointer shooting out of your hand. You just look at the thing you want to press, then do a quick pinch.

On paper you might think this sounds shoddy. But remember, this is Apple. They’ve tested and refined this system six ways from Sunday, and it works so well that after a minute or two you hardly think about how you’re interacting with the headset; you just are.

The pinch input is responsive and reliable. It felt so natural that the two or three times the headset missed my pinch during a 30-minute demo, it felt really weird, because my brain was already convinced of the system’s reliability.

This look-and-pinch system is so simple for the headset’s basic input that I won’t be surprised if we see other companies adopt it as soon as possible.
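For developers wondering what adopting this input model takes, here’s a minimal SwiftUI sketch assuming the visionOS conventions Apple has described: standard controls receive gaze as a hover highlight and a pinch as an ordinary tap, with no gesture code required. The view and its names here are hypothetical.

```swift
import SwiftUI

// A hedged sketch, assuming visionOS conventions: a plain SwiftUI Button
// is activated by looking at it and pinching -- no pointer, no controller,
// and no explicit gesture handling in app code.
struct LookAndTapDemo: View {
    @State private var pinchCount = 0

    var body: some View {
        VStack(spacing: 20) {
            Text("Pinched \(pinchCount) times")
            Button("Look at me, then pinch") {
                pinchCount += 1  // delivered as a normal tap action
            }
        }
        .padding()
    }
}
```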

Reality First

So there’s the simple input and then there’s a passthrough-by-default view. This is an MR headset after all, meaning it can easily do augmented reality—where most of your view is of the real world, with some virtual content; or virtual reality—where all of your view is virtual content.

When you put AVP on your head, you instantly see the outside world first. In fact, the way that Apple defers to the passthrough view shows that the company wants to treat fully immersive experiences as the exception rather than the rule. Generally you won’t pop into a fully immersive scene unless you actively make the decision to do so.

The passthrough view is certainly best-in-class, but we’re still probably two generations away from it truly feeling like there’s nothing separating your eyes from the real world. Granted, I was able to read all the text on my phone with no issue, which has been the ‘bar’ for passthrough quality that I’ve been waiting to see exceeded.

Beautiful Virtual Displays

The imperfect passthrough resolution somewhat betrays the exceptional display resolution, which exhibits not even a hint of screen-door effect. It may not be ‘retina resolution’ (generally agreed to be around 60 pixels per degree), but it’s good enough that I won’t know how far off it is from retina resolution until I sit down with an objective test target to find out.

That’s a long way of saying that the headset’s display has excellent resolution with great clarity across the lens. Top of the class.

This clarity is helped by the fact that Apple has done its Apple-y thing and ensured that panels, text, and images consistently render with superb quality. The entire interface feels iOS-polished, with animations and easy-to-use buttons and controls. The interface was so simple to use that the demo chaperones had a hard time keeping me on task, as I wanted to flick through menus and move floating apps around the room.

But here’s the thing: probably 75% of what Apple showed me was essentially just floating screens. Whether it was videos or a floating iMessage app or the web browser, it’s clear that Apple wants Vision Pro, first and foremost, to be great at displaying flat content to the user.

The other 25% of what I saw, while very impressive all around, felt like just the start of a journey for Apple to build out a broader library of immersive experiences.

Record & Rewatch Memories

AVP might not be a VR gaming headset, but it does at least one thing that no other headset does: capture volumetric memories using its on-board cameras. A button on the top of the headset lets you capture volumetric photos and videos with just a press.

Apple showed me a demo of a volumetric video capture of a group of kids blowing out candles on a birthday cake. It was like they were right in front of me. I’d never even seen these kids before, but I could immediately feel their giddy emotions as they giggled and bounced around… as if I was sitting right there while it was happening. Not to mention that the quality was good enough, at least in this best-case-scenario demo capture, that my first thought had nothing to do with the framerate or quality or dynamic range, but purely with the emotion of the people in front of me.

That instant connection—to people I don’t even know—was a clear indicator that there’s something special to this. I can already imagine watching a volumetric video of a cherished memory, or of a loved one that has passed, and I know it would be a powerful experience.

Doing it Right

And here’s the thing: I’ve seen plenty of volumetric video demos before. This isn’t a new idea, not even close. The thing that’s novel here is that everyday users could potentially shoot these videos on their own, and readily watch, share, and store them for later. On other headsets you’d need a special camera for capturing, special software for editing, a player app, and a sharing app to make the same thing happen.

This is the ‘ecosystem’ part of XR that’s missing from most other headsets. It’s not about what’s possible—it’s about what’s easy. And Apple is focused on making using this headset easy.

Continue on Page 2: Immersion Isn’t Off the Table »


Apple to Open Locations for Devs to Test Vision Pro This Summer, SDK This Month

Ahead of the Apple Vision Pro’s release in ‘early 2024’, the company says it will open several centers in a handful of locations around the world, giving some developers a chance to test the headset before it’s released to the public.

It’s clear that developers will need time to start building Apple Vision Pro apps ahead of its launch, and it’s also clear that Apple doesn’t have heaps of headsets on hand for developers to start working with right away. In an effort to give developers the earliest possible chance to test their immersive apps, the company says it plans to open ‘Apple Vision Pro Developer Labs’ in a handful of locations around the world.

Starting this summer, the Apple Vision Pro Developer Labs will open in London, Munich, Shanghai, Singapore, Tokyo, and Cupertino.

Apple also says developers will be able to submit a request to have their apps tested on Vision Pro, with testing and feedback being done remotely by Apple.

Image courtesy Apple

Of course, developers still need new tools to build for the headset in the first place. Apple says devs can expect a visionOS SDK and updated versions of Reality Composer and Xcode by the end of June to support development on the headset. That will be accompanied by new Human Interface Guidelines to help developers follow best practices for spatial apps on Vision Pro.

Additionally, Apple says it will make available a Vision Pro simulator, which allows developers to see how their apps would look through the headset.

Developers can find more info when it’s ready at Apple’s developer website. Closer to launch, Apple says, Vision Pro will be available for the public to test in stores.


Apple Unveils Vision Pro, Its First XR Headset

Today at Apple’s Worldwide Developers Conference (WWDC) the Cupertino tech giant unveiled its long-awaited XR headset, dubbed Vision Pro.

Similar to Meta’s Quest Pro and newly unveiled Quest 3 headset, Apple’s first mixed reality headset is capable of both virtual reality and augmented reality thanks to its color passthrough cameras; however, it appears the company is focusing much more on AR tasks.

Called a “spatial computer” by Apple, the device is in large part targeting general computing tasks such as content consumption, video chatting, and productivity apps–the sort you might find on the company’s iPads and Macs, albeit available through its own device-specific App Store.

Apple Vision Pro’s input is based on optical hand tracking, eye-tracking, and voice input, and doesn’t feature controllers like headsets decidedly more dedicated to gaming.

Image courtesy Apple

Here’s a brief breakdown of the specs shared with us today: Apple’s M2 chipset runs the standalone headset, while its new R1 chip processes input from 12 cameras, five sensors, and six microphones. R1 streams new images to the displays “within 12 milliseconds — 8x faster than the blink of an eye,” the company says. That cable and pack you see in the image above is actually a battery, which the company says provides two hours of use.

Image courtesy Apple, via ArsTechnica

Vision Pro features a custom micro‑OLED display system which the company says packs in 23 million pixels, or more than a 4K TV. We’re still learning about more specific hardware specs, such as field-of-view (FOV) and more concrete numbers for its displays.
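Both of those marketing claims hold up to simple arithmetic. Here’s a quick sketch; the ~100 ms blink duration is a commonly cited average and our assumption, since Apple didn’t state its reference figure:

```swift
// Back-of-the-envelope checks on Apple's claims (our arithmetic).
let photonLatencyMs = 12.0
let blinkMs = 100.0                  // assumed average blink duration
print(blinkMs / photonLatencyMs)     // ≈ 8.3 -- consistent with "8x faster than a blink"

let totalPixels = 23_000_000.0
let perEyePixels = totalPixels / 2   // ≈ 11.5M pixels per eye
let fourKTVPixels = 3840.0 * 2160.0  // ≈ 8.3M pixels on a single 4K panel
print(perEyePixels / fourKTVPixels)  // ≈ 1.39 -- each eye alone exceeds a 4K TV
```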

The headset also features an exterior display to show the user’s eyes. The system, called EyeSight, can either obscure the digital version of your eyes from other people in the room, or show them to indicate you’re ready to talk face-to-face.

Image courtesy Apple

Vision Pro is coming to the US first in early 2024, priced starting at $3,500, putting it clearly in the “enthusiast” camp. Apple calls it its “most advanced personal electronics device ever.”

We’re at Apple’s campus for WWDC and are going hands-on with Vision Pro today. Check back soon for our full impressions, and to find out if Apple’s first big entry into XR was worth the wait.


This story is breaking. Check back soon for more info.


Watch Apple’s WWDC Keynote Right Here at 10AM PT

Apple’s WWDC keynote is today, and the company is widely expected to reveal an immersive headset for the first time. Here’s where to see the action live.

Apple’s WWDC keynote will be held at 10AM PT on June 5th (your timezone here). You can catch the official livestream from Apple embedded below:

Follow for Up-to-the-minute Updates

I’ll be on-site at Apple Park for the WWDC keynote, and maybe more than that… if you want the most up-to-the-minute updates for what comes after the keynote, follow along on Twitter: @benz145.

What to Expect

We’re expecting that Apple’s WWDC keynote will focus first on its existing products, including major updates to its mobile and desktop operating systems, with the potential for a revamped 15-inch MacBook Air.

But of course the thing we’re looking for is the rumored announcement of Apple’s first XR headset, which we expect will come at the end of the keynote—though we’re still 50/50 on whether or not it’ll be preceded by the words “one more thing,” which the company hasn’t dropped since 2020.

Rumors for what an Apple XR headset might actually do or look like have varied substantially over the years, though recent leaks suggest the following:

  • Resolution: Dual Micro OLED displays at 4K resolution (per eye)
  • FOV: 120-degrees, similar to Valve Index
  • Chipset: Two 5nm chips. Includes a main SoC (CPU, GPU, and memory) and a dedicated image signal processor (ISP). Chips communicate via a custom streaming codec to combat latency.
  • Battery: Waist-mounted battery, connected via MagSafe-like power cable to the headset’s headband. Two-hour max battery life, although hot-swappable for longer sessions.
  • Passthrough: ISP chip contains custom high-bandwidth memory made by SK Hynix, providing low latency color passthrough
  • Audio: H2 chip, providing ultra-low latency connection with the second-generation AirPods Pro and future AirPods models. No 3.5mm jack and possibly no support for non-AirPods BT headphones.
  • Controller: Apple is said to favor hand-tracking and voice recognition to control the headset, but it has tested a “wand” and a “finger thimble” as alternative control input methods.
  • Prescription Lenses: Magnetically attachable custom prescription lenses for glasses-wearers.
  • IPD Adjustment: Automatic, motorized adjustment to match the wearer’s interpupillary distance.
  • Eye Tracking: At least one camera per eye for things like avatar presence and foveated rendering
  • Face & Body Tracking: More than a dozen cameras and sensors capture both facial expressions and body movements, including the user’s legs.
  • Room Tracking: Both short- and long-range LiDAR scanners to map surfaces and distances in three dimensions.
  • App Compatibility: Said to have the ability to run existing iOS apps in 2D.

It’s very likely that this is only an initial announcement of the company’s headset, with a heavy focus on what developers will be able to do with it (need we remind you, this is Apple’s Worldwide Developers Conference). We don’t expect it to launch until later this year at the earliest, but when it does it’s not clear if Apple will position the device as a sort of early adopter development kit, or market it to consumers outright. The latter seems less likely considering the rumored $1,500–$3,000 price.

While Apple pretty much never launches any product as a ‘dev kit’, an XR headset might be such a shift for the company and its army of iOS developers that they will need that interim step to hone the experience ahead of a full-blown push to consumers. We’ll find out soon enough.


Hands-on: CREAL’s Light-field Display Brings a New Layer of Immersion to AR

More than four years after I first caught wind of their tech, CREAL’s light-field display continues to be one of the most interesting and promising solutions for bringing light-fields to immersive headsets. At AWE 2023 I got to check out the company’s latest tech and saw first hand what light-fields mean for immersion in AR headsets.

More Than One Way to Focus

So first, a quick recap. A light-field is a fundamentally different way of showing light to your eyes compared to the typical displays used in most headsets today. The key difference is about how your eyes can focus on the virtual scene.

Your eyes have two focus methods. The one most people are familiar with is vergence (the mechanism behind stereoscopy), where both eyes point at the same object to bring overlapping views of that object into focus. This is also what makes things look ‘3D’ to us.

But each individual eye is also capable of focusing in a different way, by bending the lens of the eye to focus on objects at different distances—the same way that a camera with only one lens focuses. This is called accommodation.

Vergence-Accommodation Conflict

Most XR headsets today support vergence (stereoscopic focus), but not accommodation (single-eye focus). You may have heard this called the Vergence-Accommodation Conflict, known to the industry as ‘VAC’; it’s a pervasive challenge for immersive displays.

The reason for the ‘conflict’ is that normally the vergence and accommodation of your eyes work in tandem to achieve optimal focus on the thing you want to look at. But in a headset that supports vergence but not accommodation, your eyes need to break these typically synchronous functions into independent ones.

It might not be something you ‘feel’ but it’s the reason why in a headset it’s hard to focus on things very near to you—especially objects in your hands that you want to inspect up close.

The conflict between vergence and accommodation isn’t just uncomfortable for your eyes; surprisingly, it can also rob the scene of immersion.
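To put rough numbers on the conflict, here’s a small sketch comparing the two focal demands, measured in diopters (1/distance in meters). The ~1.5 m fixed focal distance and the ~0.5 D comfort threshold are commonly cited values, not figures from Creal:

```swift
import Foundation

// In a fixed-focus headset, accommodation is pinned to the display optics
// while vergence follows the virtual depth -- the gap is the VAC.
let headsetFocalDistanceM = 1.5        // assumed optical focus of a typical HMD
let virtualObjectDistanceM = 0.4       // an object "held" near your hand

let accommodationDemandD = 1.0 / headsetFocalDistanceM   // ≈ 0.67 D
let vergenceDemandD = 1.0 / virtualObjectDistanceM       // 2.5 D

// Mismatches beyond roughly 0.5 D are commonly cited as uncomfortable.
print("VAC mismatch: \(vergenceDemandD - accommodationDemandD) D")  // ≈ 1.83 D
```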

Creal’s Solution

And this is where we get back to Creal, a company that wants to solve the Vergence-Accommodation Conflict with a light-field display. Light-field displays structure light in the same way that we see it in the real world, allowing both of the focus functions of the eyes—vergence and accommodation—to work in tandem as they normally do.

At AWE 2023 this week, I got to check out the company’s latest light-field display tech, and came away with an added sense of immersion that I haven’t felt in any other AR headset to date.

I’ve seen Creal’s bench-top demos before, which show static floating imagery through the lens to a single eye, demonstrating that you can indeed focus (accommodate) at different depths. But you won’t really see the magic until you see a light-field with both eyes and head-tracking—which is exactly what I got to do this week at AWE.

Photo by Road to VR

On an admittedly bulky proof-of-concept AR headset, I got to see the company’s light-field display in its natural habitat—floating immersively in front of me. What really impressed me was when I held my hand out and a little virtual turtle came floating over to the palm of my hand. Even though it was semi-transparent, and not exceptionally high resolution or accurately colored, it felt… weirdly real.

I’ve seen all kinds of immersive XR experiences over the years, and holding something in your hand sounds like a banal demo at this point. But there was just something about the way this little turtle looked—thanks to the fact that my eyes could focus on it in the same way they would in the real world—that made it feel more real than I’ve ever really felt in other headsets. Like it was really there in my hand.

Photo by Road to VR

The trick is that, thanks to the light-field, when I focused my eyes on the turtle in my hand, both the turtle (virtual) and my hand (real) were each in proper focus—something that isn’t possible with conventional displays—making both my hand and the turtle feel more like they were inhabiting the same space right in front of me.

It’s frustratingly impossible to explain exactly how it appeared via text alone; this video from Creal shot through-the-lens gives some idea of what I saw, but can’t quite show how it adds immersion over other AR headsets:

It’s a subtle thing, and the added immersion probably only meaningfully impacts objects within arm’s reach or closer—but then again, that distance is where things have the potential to feel most real to us, because they’re in our carefully watched personal space.

Digital Prescriptions

Beyond just adding a new layer of visual immersion, light-field displays stand to solve another key problem: vision correction. Most XR headsets today don’t support any kind of prescription vision correction, which means that perhaps more than half of the population must either wear their corrective lenses while using these devices, buy some kind of clip-on lens, or just suffer through a blurry image.

But the nature of light-fields means you can apply a ‘digital prescription’ to the virtual content that exactly matches the user’s corrective prescription. And because it’s digital, this can be done on-the-fly, meaning the same headset could have its digital corrective vision setting changed from one user to the next. Doing so means the focus of the virtual image can match the real-world image for users with and without glasses.
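As a purely hypothetical sketch (Creal hasn’t published an API), the core idea can be expressed as a diopter offset applied at render time; the names and sign convention below are our own:

```swift
// Hypothetical sketch: fold a spherical prescription into rendering as a
// diopter offset, since a light-field encodes focus digitally.
struct DigitalPrescription {
    var sphereDiopters: Double  // e.g. -2.0 for a myopic user
}

// Focal demand (in diopters) at which to render content that should appear
// at `trueDistanceM`, so it lands where this user's uncorrected eye focuses.
func renderedFocalDemand(trueDistanceM: Double,
                         rx: DigitalPrescription) -> Double {
    (1.0 / trueDistanceM) - rx.sphereDiopters
}

// A -2.0 D myope: content at optical infinity is rendered at 2 D (0.5 m),
// matching the eye's far point -- and the offset can change per user.
print(renderedFocalDemand(trueDistanceM: .infinity,
                          rx: DigitalPrescription(sphereDiopters: -2.0)))
```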

Continue on Page 2: A More Acceptable Form-factor »
