Apple has sent invitations to its upcoming WWDC keynote to select media outlets, including Road to VR.
Apple has historically not invited XR media to its events, let alone commented in any way on its XR R&D that has been reportedly happening behind the scenes at the company for years.
Road to VR is among the XR media outlets who have received an invitation to Apple’s WWDC keynote for the first time. Our friends at UploadVR confirmed the same.
It’s difficult not to interpret the invitations in light of the growing number of reports that Apple plans to reveal its first XR device at the keynote, which takes place on June 5th at 10AM PT.
Apple’s official entrance into the XR space has been rumored for years, with many expecting it to be a boon for the industry thanks to Apple’s penchant for solving usability challenges, one of the core issues that has held XR back from more mainstream usage. Much speculation has happened about whether the company will lean most into AR, VR, or MR.
Alas, we’ll have to wait until we’re there to find out for ourselves just what Apple has up its sleeve.
Independent tech analyst Ming-Chi Kuo says Apple’s highly anticipated mixed reality headset is very likely set for its reported Worldwide Developers Conference (WWDC) unveiling in June. Another generation is also in the pipeline, Kuo maintains, which he suggests may come at some point in 2025.
Kuo, a long-time Apple analyst and respected figure in supply chain leaks, says in a Medium post it’s “highly likely” we’ll see an unveiling at WWDC. This comes despite earlier reports of supply chain delays that would ultimately see the headset launch later this year. He says Apple is “well prepared” for the announcement of the headset, which is rumored to cost $3,000.
Should Apple’s MR headset announcement surpass expectations, Kuo suggests the device will pave the way for a transformative investment trend in the industry, as other makers follow suit to jump on the trend.
A positive announcement at WWDC could be a promising development for the share prices of companies involved in the headset’s production, Kuo maintains. Apart from Luxshare-ICT, which the analyst says has an exclusive assembly agreement for the headset, companies such as Sony (micro-OLED display), TSMC (dual processors), Everwin Precision (primary casing supplier), Cowell (12 camera modules), and Goertek (external power supply) may greatly benefit from their involvements as exclusive component suppliers.
Furthermore, Kuo claims a second-generation Apple headset is expected to go into mass production in 2025, which will be offered in both a high and low-end version.
“Shipments of the 2nd generation in 2025 are expected to be around ten times those of the 1st generation in 2023,” Kuo says in a separate Medium post.
Outside of the avalanche of leaks, and even a brief tweet by Oculus founder Palmer Luckey stating Apple’s headset was “so good”, the whole industry is waiting for the June 5th keynote at the company’s annual developer conference. One thing is for sure: hit or miss, Apple’s headset will be pivotal for the XR industry as a whole.
Palmer Luckey, the founder of Oculus who left the company in 2017, appears to have insider knowledge of the upcoming Apple XR headset, which is expected to be unveiled at its Worldwide Developers Conference (WWDC) next month. To Luckey, Apple’s hotly awaited entrance into the space is apparently “so good.”
Luckey hasn’t quantified his experience beyond this, or even said that his impressions indeed come from a personal demo of Apple’s long-rumored mixed reality headset, which, like Meta Quest Pro, is thought to be capable of both virtual reality and passthrough augmented reality thanks to outward facing cameras. Whatever the case, the VR pioneer is sufficiently impressed with whatever the fruit company has in store.
Luckey, who founded defense company Anduril after his 2017 Facebook departure, is no stranger to candidly voicing his opinions on headset design. When unicorn AR startup Magic Leap released its ML1 headset in mid-2018, he called it a “tragic heap,” further stating the AR headset was “a tragedy in the classical sense.”
At the time, Magic Leap was just as secretive about its hardware as Apple is today. And Luckey’s opinion was undoubtedly tinged by the company’s self-generated hype which grew in the shadow of that secrecy.
“Magic Leap needed to really blow people away to justify the last few years,” Luckey wrote in his review of the headset. “The product they put out is reasonably solid, but is nowhere close to what they had hyped up, and has several flaws that prevent it from becoming a broadly useful tool for development of AR applications. That is not good for the XR industry.”
Does this mean Apple is actually delivering on the hype and pushing the ball forward with the reported $3,000 headset? Even with an avalanche of patently unverifiable leaks to go on and Luckey’s word, we truly won’t know until that ‘one more thing’ is announced on stage. Then again, you simply never can tell with Apple. We have our calendars marked for the June 5th WWDC keynote, so join us then to find out.
Eye-tracking—the ability to quickly and precisely measure the direction a user is looking while inside of a VR headset—is often talked about within the context of foveated rendering, and how it could reduce the performance requirements of XR headsets. And while foveated rendering is an exciting use-case for eye-tracking in AR and VR headsets, eye-tracking stands to bring much more to the table.
Updated – May 2nd, 2023
Eye-tracking has been talked about with regards to XR as a distant technology for many years, but the hardware is finally becoming increasingly available to developers and customers. PSVR 2 and Quest Pro are the most visible examples of headsets with built-in eye-tracking, along with the likes of Varjo Aero, Vive Pro Eye and more.
With this momentum, in just a few years we could see eye-tracking become a standard part of consumer XR headsets. When that happens, there’s a wide range of features the tech can enable to drastically improve the experience.
Foveated Rendering
Let’s first start with the one that many people are already familiar with. Foveated rendering aims to reduce the computational power required for displaying demanding AR and VR scenes. The name comes from the ‘fovea’—a small pit at the center of the human retina which is densely packed with photoreceptors. It’s the fovea which gives us high resolution vision at the center of our field of view; meanwhile our peripheral vision is actually very poor at picking up detail and color, and is better tuned for spotting motion and contrast than seeing detail. You can think of it like a camera which has a large sensor with just a few megapixels, and another smaller sensor in the middle with lots of megapixels.
The region of your vision in which you can see in high detail is actually much smaller than most think—just a few degrees across the center of your view. The difference in resolving power between the fovea and the rest of the retina is so drastic, that without your fovea, you couldn’t make out the text on this page. You can see this easily for yourself: if you keep your eyes focused on this word and try to read just two sentences below, you’ll find it’s almost impossible to make out what the words say, even though you can see something resembling words. The reason that people overestimate the foveal region of their vision seems to be because the brain does a lot of unconscious interpretation and prediction to build a model of how we believe the world to be.
Foveated rendering aims to exploit this quirk of our vision by rendering the virtual scene in high resolution only in the region that the fovea sees, and then drastically cut down the complexity of the scene in our peripheral vision where the detail can’t be resolved anyway. Doing so allows us to focus most of the processing power where it contributes most to detail, while saving processing resources elsewhere. That may not sound like a huge deal, but as the display resolution of XR headsets and field-of-view increases, the power needed to render complex scenes grows quickly.
Eye-tracking of course comes into play because we need to know where the center of the user’s gaze is at all times, quickly and with high precision, in order to pull off foveated rendering. While it’s difficult to do this without the user noticing, it has been demonstrated quite effectively on recent headsets like Quest Pro and PSVR 2.
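The core idea can be sketched in a few lines: given the gaze direction, shade pixels near the gaze center at full rate and progressively coarser farther out. The thresholds and shading rates below are illustrative values for the sketch, not figures from any shipping headset.

```python
import math

def shading_rate(gaze_dir, pixel_dir, inner_deg=5.0, mid_deg=15.0):
    """Pick a coarse shading rate from a pixel's angular distance to the gaze point.

    gaze_dir and pixel_dir are unit 3D vectors; the degree thresholds are
    illustrative assumptions, not real headset parameters.
    """
    dot = max(-1.0, min(1.0, sum(g * p for g, p in zip(gaze_dir, pixel_dir))))
    ecc = math.degrees(math.acos(dot))  # eccentricity from the gaze center
    if ecc < inner_deg:
        return 1   # foveal region: shade every pixel
    elif ecc < mid_deg:
        return 2   # 2x2 blocks: roughly 1/4 the shading work
    else:
        return 4   # 4x4 blocks: roughly 1/16 the shading work
```

Real implementations do this on the GPU (e.g. via variable-rate shading), but the savings come from exactly this kind of eccentricity-based falloff.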
Automatic User Detection & Adjustment
In addition to detecting movement, eye-tracking can also be used as a biometric identifier. That makes eye-tracking a great candidate for multiple user profiles across a single headset—when I put on the headset, the system can instantly identify me as a unique user and call up my customized environment, content library, game progress, and settings. When a friend puts on the headset, the system can load their preferences and saved data.
Eye-tracking can also be used to precisely measure IPD (the distance between one’s eyes). Knowing your IPD is important in XR because it’s required to move the lenses and displays into the optimal position for both comfort and visual quality. Unfortunately, many people understandably don’t know their IPD off the top of their head.
With eye-tracking, it would be easy to instantly measure each user’s IPD and then have the headset’s software assist the user in adjusting the headset’s IPD to match, or warn users that their IPD is outside the range supported by the headset.
In more advanced headsets, this process can be invisible and automatic—IPD can be measured invisibly, and the headset can have a motorized IPD adjustment that automatically moves the lenses into the correct position without the user needing to be aware of any of it, like on the Varjo Aero, for example.
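In software terms this is a simple measure-then-act loop: compute the distance between the tracked pupil centers, then either drive a motorized adjustment or warn the user. A minimal sketch, assuming a headset-frame coordinate system in millimeters and a made-up supported range:

```python
def measure_ipd_mm(left_pupil, right_pupil):
    """Estimate IPD as the distance between tracked pupil centers (mm)."""
    return sum((l - r) ** 2 for l, r in zip(left_pupil, right_pupil)) ** 0.5

def ipd_guidance(ipd_mm, supported=(58.0, 72.0)):
    """Decide what to do with a measured IPD.

    The 58-72 mm supported range is a hypothetical example, not any
    real headset's spec.
    """
    lo, hi = supported
    if ipd_mm < lo or ipd_mm > hi:
        return "warn: IPD outside supported range"
    return f"set lens separation to {ipd_mm:.1f} mm"
```

On a headset with motorized lenses, the returned target would feed the actuator instead of a user-facing prompt.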
Varifocal Displays
A prototype varifocal headset | Image courtesy NVIDIA
The optical systems used in today’s VR headsets work pretty well but they’re actually rather simple and don’t support an important function of human vision: dynamic focus. This is because the display in XR headsets is always the same distance from our eyes, even when the stereoscopic depth suggests otherwise. This leads to an issue called vergence-accommodation conflict. If you want to learn a bit more in depth, check out our primer below:
Accommodation
Accommodation is the bending of the eye’s lens to focus light from objects at different distances. | Photo courtesy Pearson Scott Foresman
In the real world, to focus on a near object the lens of your eye bends to make the light from the object hit the right spot on your retina, giving you a sharp view of the object. For an object that’s further away, the light is traveling at different angles into your eye and the lens again must bend to ensure the light is focused onto your retina. This is why, if you close one eye and focus on your finger a few inches from your face, the world behind your finger is blurry. Conversely, if you focus on the world behind your finger, your finger becomes blurry. This is called accommodation.
Vergence
Vergence is the inward rotation of each eye to overlap each eye’s view into one aligned image. | Photo courtesy Fred Hsu (CC BY-SA 3.0)
Then there’s vergence, which is when each of your eyes rotates inward to ‘converge’ the separate views from each eye into one overlapping image. For very distant objects, your eyes are nearly parallel, because the distance between them is so small in comparison to the distance of the object (meaning each eye sees a nearly identical portion of the object). For very near objects, your eyes must rotate inward to bring each eye’s perspective into alignment. You can see this too with our little finger trick as above: this time, using both eyes, hold your finger a few inches from your face and look at it. Notice that you see double-images of objects far behind your finger. When you then focus on those objects behind your finger, now you see a double finger image.
The Conflict
With precise enough instruments, you could use either vergence or accommodation to know how far away an object is that a person is looking at. But the thing is, both accommodation and vergence happen in your eye together, automatically. And they don’t just happen at the same time—there’s a direct correlation between vergence and accommodation, such that for any given measurement of vergence, there’s a directly corresponding level of accommodation (and vice versa). Since you were a little baby, your brain and eyes have formed muscle memory to make these two things happen together, without thinking, anytime you look at anything.
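The vergence-accommodation correlation is plain geometry: for an object straight ahead, the vergence angle follows from the IPD and the object's distance, while the matching accommodation demand is just the reciprocal of that distance (in diopters). A small sketch, assuming symmetric fixation and a typical 64 mm adult IPD:

```python
import math

def vergence_angle_deg(distance_m, ipd_m=0.064):
    """Vergence angle (degrees) for an object straight ahead at distance_m.

    Assumes symmetric fixation; 0.064 m is a typical adult IPD used
    as an example value.
    """
    return math.degrees(2 * math.atan((ipd_m / 2) / distance_m))

def accommodation_diopters(distance_m):
    """The corresponding focus demand: reciprocal distance, in diopters."""
    return 1.0 / distance_m
```

At arm's length (~0.25 m) the eyes converge by roughly 14–15 degrees; at 100 m they are nearly parallel — which is why a fixed-focus display that still asks the eyes to converge creates the conflict described above.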
But when it comes to most of today’s AR and VR headsets, vergence and accommodation are out of sync due to inherent limitations of the optical design.
In a basic AR or VR headset, there’s a display (which is, let’s say, 3″ away from your eye) which shows the virtual scene, and a lens which focuses the light from the display onto your eye (just like the lens in your eye would normally focus the light from the world onto your retina). But since the display is a static distance from your eye, and the lens’ shape is static, the light coming from all objects shown on that display is coming from the same distance. So even if there’s a virtual mountain five miles away and a coffee cup on a table five inches away, the light from both objects enters the eye at the same angle (which means your accommodation—the bending of the lens in your eye—never changes).
That comes into conflict with vergence in such headsets, which—because we can show a different image to each eye—is variable. Being able to adjust the image independently for each eye, such that our eyes need to converge on objects at different depths, is essentially what gives today’s AR and VR headsets stereoscopy.
But the most realistic (and arguably, most comfortable) display we could create would eliminate the vergence-accommodation issue and let the two work in sync, just like we’re used to in the real world.
Varifocal displays—those which can dynamically alter their focal depth—are proposed as a solution to this problem. There’s a number of approaches to varifocal displays, perhaps the most simple of which is an optical system where the display is physically moved back and forth from the lens in order to change focal depth on the fly.
Achieving such an actuated varifocal display requires eye-tracking because the system needs to know precisely where in the scene the user is looking. By tracing a path into the virtual scene from each of the user’s eyes, the system can find the point that those paths intersect, establishing the proper focal plane that the user is looking at. This information is then sent to the display to adjust accordingly, setting the focal depth to match the virtual distance from the user’s eye to the object.
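Because tracker noise means the two gaze rays rarely intersect exactly, a common approach is to take the midpoint of their closest approach as the fixation point. A sketch of that calculation (the eye origins, directions, and tolerances here are illustrative, not any vendor's pipeline):

```python
import numpy as np

def fixation_depth(origin_l, dir_l, origin_r, dir_r):
    """Estimate the fixation point as the midpoint of the closest approach
    of the two gaze rays. Returns (point, depth from between the eyes)."""
    o1, d1 = np.asarray(origin_l, float), np.asarray(dir_l, float)
    o2, d2 = np.asarray(origin_r, float), np.asarray(dir_r, float)
    d1 /= np.linalg.norm(d1)
    d2 /= np.linalg.norm(d2)
    # Solve for t1, t2 minimizing |(o1 + t1*d1) - (o2 + t2*d2)|
    w = o1 - o2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w, d2 @ w
    denom = a * c - b * b
    if abs(denom) < 1e-9:  # rays (nearly) parallel: user is looking far away
        return None, float("inf")
    t1 = (b * e - c * d) / denom
    t2 = (a * e - b * d) / denom
    p = ((o1 + t1 * d1) + (o2 + t2 * d2)) / 2
    depth = np.linalg.norm(p - (o1 + o2) / 2)
    return p, depth
```

The resulting depth is what an actuated varifocal system would feed to the display-positioning mechanism (and what a simulated depth-of-field effect would use to pick its focal plane).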
A well implemented varifocal display could not only eliminate the vergence-accommodation conflict, but also allow users to focus on virtual objects much nearer to them than in existing headsets.
And well before we’re putting varifocal displays into XR headsets, eye-tracking could be used for simulated depth-of-field, which could approximate the blurring of objects outside of the focal plane of the user’s eyes.
As of now, there’s no major headset on the market with varifocal capabilities, but there’s a growing body of research and development trying to figure out how to make the capability compact, reliable, and affordable.
Foveated Displays
While foveated rendering aims to better distribute rendering power between the part of our vision where we can see sharply and our low-detail peripheral vision, something similar can be achieved for the actual pixel count.
Rather than just changing the detail of the rendering on certain parts of the display vs. others, foveated displays are those which are physically moved (or in some cases “steered”) to stay in front of the user’s gaze no matter where they look.
Foveated displays open the door to achieving much higher resolution in AR and VR headsets without brute-forcing the problem by trying to cram pixels at higher resolution across our entire field-of-view. Doing so would not only be costly, but would also run into challenging power and size constraints as the number of pixels approaches retinal resolution. Instead, foveated displays would move a smaller, pixel-dense display to wherever the user is looking based on eye-tracking data. This approach could even lead to higher fields-of-view than could otherwise be achieved with a single flat display.
A rough approximation of how a pixel-dense foveated display looks against a larger, much less pixel-dense display in Varjo’s prototype headset. | Photo by Road to VR, based on images courtesy Varjo
Varjo is one company working on a foveated display system. They use a typical display that covers a wide field of view (but isn’t very pixel dense), and then superimpose a microdisplay that’s much more pixel dense on top of it. The combination of the two means the user gets both a wide field of view for their peripheral vision, and a region of very high resolution for their foveal vision.
Granted, this foveated display is still static (the high resolution area stays in the middle of the display) rather than dynamic, but the company has considered a number of methods for moving the display to ensure the high resolution area is always at the center of your gaze.
Though delayed from its commitment last year, Magic Leap today announced that ML2 now fully supports OpenXR. The timing might have something to do with Apple’s looming entrance into the XR space.
OpenXR is an open standard that aims to standardize the development of VR and AR applications, making hardware and software more interoperable. The standard has been in development since 2017 and is backed by virtually every major hardware, platform, and engine company in the XR industry.
“The adoption of OpenXR as a common AR ecosystem standard ensures the continual growth and maturation of AR,” Magic Leap said in its announcement. “Magic Leap will continue to advance this vision as Vice Chair of the OpenXR Working Group. In this role, Magic Leap provides technical expertise and collaborates with other members to address the needs of developers and end-users, the scope of the standard, and best practices for implementation.”
It’s true that Magic Leap has been part of the OpenXR Working Group—a consortium responsible for developing the standard—for a long time, but we can’t help but feel like Apple’s heavily rumored entrance into the XR space lit a bit of a fire under the company to get the work across the finish line.
In doing so, Magic Leap has strengthened itself—and the existing XR industry—against what could be a standards upheaval by Apple.
Apple is well known for ignoring certain widely adopted computing standards and choosing to use its own proprietary technologies, in some cases causing a technical divide between platforms. You may well have experienced this yourself if you’ve ever found yourself in a conversation about ‘blue bubbles and green bubbles’ when it comes to texting.
With an industry as young as XR—and with Apple being so secretive about its R&D in the space—there’s a good chance the company will have its own way of doing things, especially when it comes to how developers and their applications are allowed to interact with the headset.
Apple declining to support OpenXR is likely the biggest risk for the industry: if developers have to change their development processes for Apple’s headset, that would create a divide between Apple and the rest of the industry, making applications less portable between platforms.
And while OpenXR-supporting incumbents have the upper hand for the time being (because they have all the existing XR developers and content on their side), one would be foolish to forget the army of experienced iOS developers that are used to doing things the ‘Apple way’. If those developers start their XR journey with Apple’s tools, it will be less likely that their applications will come to OpenXR headsets.
On the other hand, it’s possible that Apple will embrace OpenXR because it sees the value that has already come from years of ironing out the standard—and the content that already supports it. Apple could even be secretly part of the OpenXR Working Group, as companies aren’t forced to make their involvement known.
In the end it’s very likely that Apple will have its own way of doing things in XR, but whether that manifests more in the content running on the headset or down at the technical level, remains to be seen.
If the avalanche of recent reports can indicate anything at all, it seems Apple is entering the VR/AR headset market fairly soon, bringing along with it the most inflated expectations the industry has ever seen. It’s probably going to be expensive, but whether it flops or becomes a big hit, the mere existence of Apple in the space is set to change a lot about how things are done.
The iPhone wasn’t the first smartphone. That award goes to an obscure PDA device called the IBM Simon, released in limited numbers in 1994. The Apple Watch wasn’t the first smartwatch either. That was debatably the Seiko Raputer, which was released in 1998 in Japan. Its monochrome LCD wasn’t capable of touch, instead offering up a tiny eight-direction joystick and six function buttons to browse files, play games, and set calendar appointments. Similarly, iPad wasn’t the first tablet. Mac wasn’t the first home computer. iPod wasn’t the first MP3 player. But all of these products have become nothing short of iconic. There’s very little benefit to being first, at least as far as Apple is concerned.
And while it seems the company’s first mixed reality headset could finally debut at its Worldwide Developers Conference (WWDC) in June, like all of its other products, it won’t be the first MR headset. Just the same, like everything else the fruit company makes, it’s going to be the one everyone is talking about—for better or worse.
In case you haven’t noticed, Apple is a big deal. It has an ecosystem of products which connect to each other, design-forward hardware that has helped it maintain brand-name cachet, and a philosophy that puts user-friendliness at the core of its software experience. Oh, and it’s the most valuable company in the world.
And while the irrational exuberance for successive device generations has mostly petered out since its heyday in the early 2000s, reducing its famed long-line launch extravaganzas to more chill online pre-order releases, becoming an Apple apostate is still unthinkable to many. Once you’re in, you’re in. You buy the phone, the laptop, the headphones, and now, maybe you’ll get the newfangled headset too. Maybe. Let’s put aside the rumors for now. Forget about the spec breakdowns, hardware design leaks, software capabilities, etc. There are plenty of them out there, and you can read about those here. The only thing we know for sure is Apple is… well… Apple. Here’s what you, and probably everyone else, are expecting.
Apple’s BKC Store in Mumbai, India | Image courtesy Apple
For Better: What Should Happen
Unless the company is making a drastic departure here, its first mixed reality headset should be built with the same level of user-friendliness as all of its other devices, which means it should connect to the Apple ecosystem easily, and have a simple and intuitive UI. Log in with Apple ID. No muss, no fuss (whatever ‘muss’ is). Privacy should be a giant focus for the headset from the outset, since it will almost certainly pack eye-tracking in addition to a host of cameras to get a glimpse of your immediate surroundings, messiness and all. Apple has had its fair share of data collection scandals, yet it inspires enough confidence that privacy has historically been a big selling point for its devices.
If you want to avoid drawing the ire of tech reviewers everywhere though, wearing it should be fairly simple and very comfortable, and the experiences within should be of high enough value to overcome that inherent friction of charging it, putting it on, setting up a tracking volume, and wearing it for extended periods of time—everything we expect from any mixed reality headset at this point. It should fit most people, and offer up a clear picture to people with heads and eyes of all shapes and sizes.
Meta Quest Pro | Image courtesy Meta
An obvious analogue here is Meta Quest Pro, which is relatively low friction, but things like a halo strap that forces too much weight on your brow, or a passthrough that’s just a little too grainy, or a display that doesn’t have a high enough pixel per degree (ppd) for staring at text—all of these things make it less appealing to users in the day-to-day, introducing what you might call accumulative friction. You use it a bunch at first until you figure out all of the niggles, at which point you may revert to traditional computing standbys like a laptop or smartphone. The thing isn’t really the all-purpose device you hoped it would be, and the company thinks twice about when to send the better, more improved version down the pipeline.
One would hope that Apple’s headset, on the other hand, should have a mature design language and have obviously useful features from day one. While there’s bound to be some stutters, like with the first Apple Watch, which was critiqued for its slow software, short battery life, and lack of customization, it should all be there, and not require a ton of feature updates to enhance after the big launch day.
It should sell well out of the gate—at least by the standards of the existing XR industry—even if everything isn’t perfect. And it should be so cool that it’s copied. Like a lot. And it should drag top-level studios into the XR scene to start making innovative and useful apps that aren’t just straight ports of ARKit or ARCore apps made for mobile, but things people need and want to use in-headset. A big win from Apple should not only spur its new mixed reality product category, but kick off a buzz among developers, which would include those who currently work in the XR industry and Apple’s existing cohort of dedicated iOS developers.
But more than merely being the latest shiny new headset within the existing XR industry, Apple’s entrance into the field has a real chance of radically expanding the industry itself, by showing that the world’s most iconic tech company now thinks the medium is worth pursuing. That’s the way it happened when Apple jumped into MP3 players, smartphones, tablets, wireless earbuds, and more.
As the saying goes, a rising tide lifts all boats. The inverse is also true though….
For Worse: What Could Happen
Apple’s headset is reportedly (okay, maybe just one rumor) priced somewhere near $3,000, so it probably won’t be the sort of accessory that initially attracts people to the ecosystem; that would be the job of a peripheral like Apple Watch. It will likely rely on the pool of built-in Apple users. Despite the price, the first iteration very likely won’t offer the sort of power you’d expect from a workhorse like Apple MacBook Pro either.
At the outset, any sustained draw from prosumers will invariably hinge on how well it can manage general computing tasks, like you might have with an iPad or MacBook, plus everything else a current mixed reality headset should do, namely VR and AR stuff. That includes a large swath of things like fitness apps, both AR and VR games and experiences, productivity apps, standard work apps, everything. Basically, it has to be the Quest Pro that Meta wanted to release but didn’t.
AR turn-by-turn directions on an iPhone | Image courtesy Apple
And if not, it leaves Apple in a pretty precarious situation. If its headset can’t find a proper foothold within the ecosystem and attract enough users, it could lead to low adoption rates and a lack of interest in the technology as a whole. Mixed reality is largely seen as a valuable steppingstone to what many consider the true moneymaker: all-day AR glasses. And despite some very glasses-shaped AR headsets out there, we’re still not there yet. Even if Apple is willing to take a hit with a bulky device in service of pushing use cases for its AR glasses yet to come, the short term may not look very bright.
And perhaps most importantly for the industry as a whole are the (metaphorical) optics.
After all, if the iconic Apple can’t manage to make MR something that everybody wants, the rest of the world watching from the sidelines may think the concept just can’t be conquered. In turn, it may mean capital investment in the space will dry up until ‘real’ AR headsets are a thing—the all-day glasses that will let you play Pokémon Go in the park, do turn-by-turn directions, and remind you the name of that person you met last week. The steppingstone of mixed reality may get waterlogged. Those are a lot of ifs, coulds, shoulds, and won’ts though. The only thing truly certain is we’re in for a very interesting few months, which you can of course follow at Road to VR.
Apple’s entrance into XR has the potential to expand the industry by demonstrating its viability, just as Apple has done with previous technologies. It stands a good chance at carving out a sizeable claim in the space, but it’s a gamble that could equally backfire if both sales and public perception aren’t on their side.
Is Apple’s XR headset going to be the ‘one more thing’ we’ve all been waiting for at WWDC this year? Will it live up to the Apple name, or be an expensive dev kit? Let us know in the comments below!
Apple appears to be getting ready to unveil its first mixed reality headset at its Worldwide Developers Conference (WWDC) in June. Now a report from Bloomberg maintains the Cupertino tech giant is also prepping a dizzying number of first-party apps, including gaming, fitness, video and collaboration tools.
Bloomberg’s Mark Gurman is a lightning rod for all things Apple, and in his new info dump it appears we now have a pretty sizable list of first-party content coming to the still very much under-wraps mixed reality headset.
Here’s all of the major apps and features mentioned in the report:
iPad apps adapted for mixed reality: Calendars, Contacts, Files, Home control, Mail, Maps, Messaging, Notes, Photos, Reminders, Music, News, Stocks, and Weather apps.
FaceTime: conferencing service will generate 3D versions of users in virtual meeting rooms.
Freeform collaboration app: will let users work on virtual whiteboards together while in mixed reality.
Work apps: Pages word processing, Numbers spreadsheet and Keynote slide deck apps, as well as iMovie and GarageBand for video and music production.
Apple TV: both immersive sports content and traditional video content – the latter presented in virtual environments, such as a desert or the sky.
Apple Books: will allow users to read in virtual reality.
Fitness+: will let users exercise while watching an instructor in VR.
Multitasking & Siri: will be able to run multiple apps at the same time, floating within the mixed reality space. Siri voice control is also present.
Camera app: can take pictures from the headset.
Provided all of the above is accurate, Apple may be releasing the industry’s most feature-rich headset out of the gate, as it appears to be hauling in a ton of its mature and battle-tested ecosystem of apps.
It’s also said that gaming will be a major focus—a reversal from previous reports. This could mean we’ll see a wider push for Apple to court third-party developers soon after release. The headset is said to release a few months after its June unveiling, priced at somewhere around $3,000.
As for hardware, as many suggested in the past, Gurman reconfirms the existence of a crown-style dial similar to Apple Watch’s Digital Crown, which will let the wearer seamlessly switch between virtual and augmented reality views.
Here’s a compilation list of alleged Apple MR headset features scavenged from previous reports—all of which you should take with a heaping handful of salt. We’ve broken them down into specs and design rumors:
Rumored Apple MR Specs
Resolution: Dual Micro OLED displays at 4K resolution (per eye)
FOV: 120-degrees, similar to Valve Index
Chipset: Two 5nm chips. Includes a main SoC (CPU, GPU, and memory) and a dedicated image signal processor (ISP). Chips communicate via a custom streaming codec to combat latency.
Battery: Waist-mounted battery, connected via MagSafe-like power cable to the headset’s headband. Two-hour max battery life, although hotswappable for longer sessions.
Passthrough: ISP chip contains custom high-bandwidth memory made by SK Hynix, providing low-latency color passthrough.
Audio: H2 chip, providing ultra-low latency connection with the second-generation AirPods Pro and future AirPods models. No 3.5mm jack and possibly no support for non-AirPods Bluetooth headphones.
Controller: Apple is said to favor hand-tracking and voice recognition to control the headset, but it has tested a “wand” and a “finger thimble” as alternative control input methods.
Prescription Lenses: Magnetically attachable custom prescription lenses for glasses-wearers.
IPD Adjustment: Automatic, motorized adjustment to match the wearer’s interpupillary distance.
Eye Tracking: At least one camera per-eye for things like avatar presence and foveated rendering
Face & Body Tracking: More than a dozen cameras and sensors capture both facial expressions and body movements, including the user’s legs.
Room Tracking: Both short- and long-range LiDAR scanners to map surfaces and distances in three dimensions.
App Compatibility: Said to have the ability to run existing iOS apps in 2D.
Price: $3,000 – $4,000
Design Rumors
Outer Shell: Aluminum, glass, and carbon fiber to reduce its size and weight. Cameras are largely concealed for aesthetic reasons.
Presence Displays: Outward-facing display can show the user’s facial expressions and presumably eye movements. Said to be an always-on display similar in latency and power draw to the displays on Apple Watch or iPhone 14 Pro.
Dedicated Passthrough Switch: Digital Crown-like dial on its right side to switch between VR and AR.
Headstrap: Various options available, including a consumer-focused headstrap similar in material to Apple Watch sport bands, with built-in speakers, and a different, unspecified headstrap targeted at developers.
As you’d imagine, Apple has confirmed absolutely nothing, so we’ll be tuning in for the June 5th keynote to see whether we’ll finally get a big “one more thing” moment we’ve been waiting for.
In an interview ahead of Apple’s upcoming Worldwide Developers Conference event, CEO Tim Cook talks about the potential of XR and why elements of it may be “even better than the real world.”
In an interview with GQ’s Zach Baron, Apple CEO Tim Cook explained that he first joined Apple—which at the time was nearly bankrupt—because Steve Jobs convinced him the company could really change the world.
And change the world it has, with products like the iPhone that have fundamentally altered the way much of the world goes about its daily business.
The next shot the company is rumored to take has a chance to do more than change the world—it could change everyday reality itself.
While Apple remains secretive about its plans for an XR device—which is rumored to be revealed at WWDC in June—Cook said in the interview that in some ways the technology could be “even better than the real world.”
“If you think about the technology itself with augmented reality, just to take one side of the AR/VR piece, the idea that you could overlay the physical world with things from the digital world could greatly enhance people’s communication, people’s connection,” Cook told GQ. “It could empower people to achieve things they couldn’t achieve before.”
“We might be able to collaborate on something much easier if we were sitting here brainstorming about it and all of a sudden we could pull up something digitally and both see it and begin to collaborate on it and create with it. And so it’s the idea that there is this environment that may be even better than just the real world—to overlay the virtual world on top of it might be an even better world,” said Cook. “And so this is exciting. If it could accelerate creativity, if it could just help you do things that you do all day long and you didn’t really think about doing them in a different way.”
When prompted about the company’s criticism of Google Glass around the time the device was introduced back in 2013—saying that head-worn devices would feel too invasive—Cook suggests he may have changed his mind on that point.
“My thinking always evolves. Steve [Jobs] taught me well: never to get married to your convictions of yesterday. To always, if presented with something new that says you were wrong, admit it and go forward instead of continuing to hunker down and say why you’re right.”
Just as Apple was skeptical of Google Glass, Cook knows Apple will always be in a similar boat when launching new products.
“Pretty much everything we’ve ever done, there were loads of skeptics with it,” Cook said. “If you do something that’s on the edge, it will always have skeptics.” When entering new markets, Cook said he considers a handful of questions: “Can we make a significant contribution, in some kind of way, something that other people are not doing? Can we own the primary technology? I’m not interested in putting together pieces of somebody else’s stuff. Because we want to control the primary technology. Because we know that’s how you innovate.”
Ming-Chi Kuo, a respected supply chain analyst, reports that Apple is tamping down enthusiasm for its upcoming mixed reality headset, which was rumored to see its big announcement at Apple’s Worldwide Developers Conference (WWDC) in June.
In a tweet, Kuo reports Apple is delaying release of its MR headset due to decreased optimism in recreating the “iPhone moment” the company was hoping to achieve with the device.
Kuo, an analyst at Asia-Pacific financial services group TF International Securities, is widely considered one of the most accurate voices in predicting Apple releases. Kuo has made many predictions in the past based on supply chain movements, including Apple’s 2020 switch to its own custom ARM-based processors for Mac computers, the 2019 release of a new MacBook Pro with a 16-inch display, and the release of the entry-level iPad with an A13 chip in 2021—just to name a few.
Kuo says Apple’s MR headset, which is reportedly codenamed N301, is being pushed back “another 1-2 months to mid-to-late 3Q23,” noting that the assembly line delay could mean we won’t see the new device at WWDC 2023 in early June, as previously reported by The Financial Times earlier this month.
It was said Apple CEO Tim Cook was a leading force in pushing for the device’s launch this year, something that has reportedly been a source of tension between the Apple chief and the industrial design team since the company began its efforts in 2016.
Furthermore, Kuo says that due to the device’s delay in mass production, “the shipment forecast this year is only 200,000 to 300,000 units, lower than the market consensus of 500,000 units or more.”
“The main concerns for Apple not being very optimistic regarding the market feedback to the AR/MR headset announcement include the economic downturn, compromises on some hardware specifications for mass production (such as weight), the readiness of the ecosystem and applications, a high selling price (USD 3,000-4,000 or even higher), etc,” Kuo concludes.
If you’ve been following the Apple rumor mill for the past few years, you’ll know there are almost too many reports to name at this point. To simplify, we’ve included a list of the headset’s rumored features and specs, collated from those reports.
Take note, none of the info below has been confirmed by Apple, so please take it with a large grain of salt.
Rumored Apple MR Specs
Resolution: Dual Micro OLED displays at 4K resolution (per eye)
FOV: 120-degrees, similar to Valve Index
Chipset: Two 5nm chips. Includes a main SoC (CPU, GPU, and memory) and a dedicated image signal processor (ISP). Chips communicate via a custom streaming codec to combat latency.
Battery: Waist-mounted battery, connected via MagSafe-like power cable to the headset’s headband. Two-hour max battery life, although hotswappable for longer sessions.
Passthrough: ISP chip contains custom high-bandwidth memory made by SK Hynix, providing low-latency color passthrough.
Audio: H2 chip, providing ultra-low latency connection with the second-generation AirPods Pro and future AirPods models. No 3.5mm jack and possibly no support for non-AirPods Bluetooth headphones.
Controller: Apple is said to favor hand-tracking and voice recognition to control the headset, but it has tested a “wand” and a “finger thimble” as alternative control input methods.
Prescription Lenses: Magnetically attachable custom prescription lenses for glasses-wearers.
IPD Adjustment: Automatic, motorized adjustment to match the wearer’s interpupillary distance.
Eye Tracking: At least one camera per-eye for things like avatar presence and foveated rendering
Face & Body Tracking: More than a dozen cameras and sensors capture both facial expressions and body movements, including the user’s legs.
Room Tracking: Both short- and long-range LiDAR scanners to map surfaces and distances in three dimensions.
App Compatibility: Said to have the ability to run existing iOS apps in 2D.
Price: $3,000 – $4,000
Design Rumors
Outer Shell: Aluminum, glass, and carbon fiber to reduce its size and weight. Cameras are largely concealed for aesthetic reasons.
Presence Displays: Outward-facing display can show the user’s facial expressions and presumably eye movements. Said to be an always-on display similar in latency and power draw to the displays on Apple Watch or iPhone 14 Pro.
Dedicated Passthrough Switch: Digital Crown-like dial on its right side to switch between VR and passthrough.
Headstrap: Various options available, including a consumer-focused headstrap similar in material to Apple Watch sport bands, with built-in speakers, and a different, unspecified headstrap targeted at developers.
Apple’s upcoming mixed reality headset has been the subject of many reports and rumors over the past few years—that’s just the nature of the Cupertino-based black box. Now a new report from the Financial Times alleges we may see the company’s first XR device unveiled this summer.
The headset, which is still unnamed, is allegedly nearing its big unveiling, which is said to take place in June 2023, or around when the company traditionally holds its Worldwide Developers Conference (WWDC).
The report maintains Apple CEO Tim Cook is the principal force pushing the device’s launch forward this year, which has apparently been a source of tension since as far back as early 2016.
The company’s operations team reportedly found itself at odds with the industrial design team, the former wanting to push out an early version of the headset while the latter hoped to delay in order to slim down the device.
Cook is ostensibly backing the operations team, as he allegedly hopes to push out the first version of the company’s XR headset, which is said to be targeting enthusiasts at an eye-watering $3,000 price point.
Citing sources familiar with Apple’s plans, the report says the company expects to sell only around one million units of the XR headset over the course of 12 months.
Relatively low sales targets notwithstanding—Apple sells 200 million iPhones per year—the company is expected to go in for a “marketing blitz” to attract prospective users.
According to a Bloomberg report earlier this year, Apple may be putting its plans to release a full AR headset on hold, as the company is allegedly planning what is described as a “lower-cost version” of its MR headset first. That cheaper version is said to target a 2024 or early 2025 launch window.
Note: This list of the headset’s prospective features and specs have been gathered from a few disparate reports. None of the below has been confirmed by Apple, so please take anything you read here with a large grain of salt:
Reported Apple MR Specs
Resolution: Dual Micro OLED displays at 4K resolution (per eye)
FOV: 120-degrees, similar to Valve Index
Chipset: Two 5nm chips. Includes a main SoC (CPU, GPU, and memory) and a dedicated image signal processor (ISP). Chips communicate via a custom streaming codec to combat latency.
Battery: Waist-mounted battery, connected via MagSafe-like power cable to the headset’s headband. Two-hour max battery life, although hotswappable for longer sessions.
Passthrough: ISP chip contains custom high-bandwidth memory made by SK Hynix, providing low-latency color passthrough.
Audio: H2 chip, providing ultra-low latency connection with the second-generation AirPods Pro and future AirPods models. No 3.5mm jack and possibly no support for non-AirPods Bluetooth headphones.
Controller: Apple is said to favor hand-tracking and voice recognition to control the headset, but it has tested a “wand” and a “finger thimble” as alternative control input methods.
Prescription Lenses: Magnetically attachable custom prescription lenses for glasses-wearers.
IPD Adjustment: Automatic, motorized adjustment to match the wearer’s interpupillary distance.
Eye Tracking: At least one camera per-eye for things like avatar presence and foveated rendering
Face & Body Tracking: More than a dozen cameras and sensors capture both facial expressions and body movements, including the user’s legs.
Room Tracking: Both short- and long-range LiDAR scanners to map surfaces and distances in three dimensions.
App Compatibility: Said to have the ability to run existing iOS apps in 2D.
Design Rumors
Outer Shell: Aluminum, glass, and carbon fiber to reduce its size and weight. Cameras are largely concealed for aesthetic reasons.
Presence Displays: Outward-facing display can show the user’s facial expressions and presumably eye movements. Said to be an always-on display similar in latency and power draw to the displays on Apple Watch or iPhone 14 Pro.
Dedicated Passthrough Switch: Digital Crown-like dial on its right side to switch between VR and passthrough.
Headstrap: Various options available, including a consumer-focused headstrap similar in material to Apple Watch sport bands, with built-in speakers, and a different, unspecified headstrap targeted at developers.
Apple’s long-awaited mixed reality headset is still deep under wraps, although a recent report from The Information has shed what appears to be new light on some of the features coming to the fruit company’s first AR/VR headset.
There’s a lot of new information here, and of course, we can’t substantiate it even if we tried. We’ve restructured the main takeaways, courtesy of MacRumors, into a sort of fantasy spec sheet:
Reported Apple MR Specs
Resolution: Dual Micro OLED displays at 4K resolution (per eye)
FOV: 120-degrees, similar to Valve Index
Chipset: Two 5nm chips. Includes a main SoC (CPU, GPU, and memory) and a dedicated image signal processor (ISP). Chips communicate via a custom streaming codec to combat latency.
Battery: Waist-mounted battery, connected via MagSafe-like power cable to the headset’s headband. Two-hour max battery life, although hotswappable for longer sessions.
Passthrough: ISP chip contains custom high-bandwidth memory made by SK Hynix, providing low-latency color passthrough.
Audio: H2 chip, providing ultra-low latency connection with the second-generation AirPods Pro and future AirPods models. No 3.5mm jack and possibly no support for non-AirPods Bluetooth headphones.
Controller: Apple is said to favor hand-tracking and voice recognition to control the headset, but it has tested a “wand” and a “finger thimble” as alternative control input methods.
Prescription Lenses: Magnetically attachable custom prescription lenses for glasses-wearers.
IPD Adjustment: Automatic, motorized adjustment to match the wearer’s interpupillary distance.
Eye Tracking: At least one camera per-eye for things like avatar presence and foveated rendering
Face & Body Tracking: More than a dozen cameras and sensors capture both facial expressions and body movements, including the user’s legs.
Room Tracking: Both short- and long-range LiDAR scanners to map surfaces and distances in three dimensions.
App Compatibility: Said to have the ability to run existing iOS apps in 2D.
Then there are some design rumors, which don’t fit so well into our fantasy spec sheet. The Information says it has reconfirmed these previously reported rumors.
Design Rumors
Outer Shell: Aluminum, glass, and carbon fiber to reduce its size and weight. Cameras are largely concealed for aesthetic reasons.
Presence Displays: Outward-facing display can show the user’s facial expressions and presumably eye movements. Said to be an always-on display similar in latency and power draw to the displays on Apple Watch or iPhone 14 Pro.
Dedicated Passthrough Switch: Digital Crown-like dial on its right side to switch between VR and passthrough.
Headstrap: Various options available, including a consumer-focused headstrap similar in material to Apple Watch sport bands, with built-in speakers, and a different, unspecified headstrap targeted at developers.
Apple supplier Pegatron is said to have already assembled “thousands of prototype units of the headset” over the course of 2022 at its Shanghai-based facility.
Citing four people with knowledge of the matter, The Information reports that Apple could price its MR headset at around $3,000 or more depending on its configuration.
The report maintains the headset was initially supposed to launch in 2022, though it has obviously been delayed since. A previous Bloomberg report attributed the delay to “overheating, cameras and software” issues along the way to launch.