News


Report: Apple Mixed Reality Headset Delayed to Late 2023 Amid Decreased Confidence in Market Appeal

Ming-Chi Kuo, a respected supply chain analyst, reports that Apple is tamping down enthusiasm for its upcoming mixed reality headset, which was rumored to see its big announcement at Apple’s Worldwide Developers Conference (WWDC) in June.

In a tweet, Kuo reports that Apple is delaying the release of its MR headset due to decreased optimism about recreating the “iPhone moment” the company was hoping to achieve with the device.

Kuo, an analyst at Asia-Pacific financial services group TF International Securities, is widely considered one of the most accurate voices in predicting Apple releases. Kuo has made many predictions in the past based on supply chain movements, including Apple’s 2020 switch to its own custom ARM-based processors for Mac computers, the 2019 release of a new MacBook Pro with a 16-inch display, and the release of the entry-level iPad with an A13 chip in 2021—just to name a few.

Kuo says Apple’s MR headset, which is reportedly codenamed N301, is being pushed back “another 1-2 months to mid-to-late 3Q23,” noting that the assembly line delay could mean we won’t see the new device at WWDC 2023 in early June, as The Financial Times reported earlier this month.

It was said Apple CEO Tim Cook was a leading force in pushing for the device’s launch this year, something that has reportedly been a source of tension between the Apple chief and the industrial design team since the company began its efforts in 2016.

Furthermore, Kuo says that due to the device’s delay in mass production, “the shipment forecast this year is only 200,000 to 300,000 units, lower than the market consensus of 500,000 units or more.”

“The main concerns for Apple not being very optimistic regarding the market feedback to the AR/MR headset announcement include the economic downturn, compromises on some hardware specifications for mass production (such as weight), the readiness of the ecosystem and applications, a high selling price (USD 3,000-4,000 or even higher), etc,” Kuo concludes.

If you’ve been following the Apple rumor mill for the past few years, you’ll know there are almost too many reports to name at this point. To simplify, we’ve included a list of the headset’s rumored features and specs, collated from those reports.

Take note: none of the info below has been confirmed by Apple, so please take it with a large grain of salt.

Rumored Apple MR Specs

  • Resolution: Dual Micro OLED displays at 4K resolution (per eye)
  • FOV: 120 degrees, similar to Valve Index
  • Chipset: Two 5nm chips. Includes a main SoC (CPU, GPU, and memory) and a dedicated image signal processor (ISP). Chips communicate via a custom streaming codec to combat latency.
  • Battery: Waist-mounted battery, connected via a MagSafe-like power cable to the headset’s headband. Two-hour max battery life, though hot-swappable for longer sessions.
  • Passthrough: ISP chip contains custom high-bandwidth memory made by SK Hynix, providing low-latency color passthrough.
  • Audio: H2 chip, providing an ultra-low latency connection with the second-generation AirPods Pro and future AirPods models. No 3.5mm jack and possibly no support for non-AirPods Bluetooth headphones.
  • Controller: Apple is said to favor hand-tracking and voice recognition to control the headset, but it has tested a “wand” and a “finger thimble” as alternative control input methods.
  • Prescription Lenses: Magnetically attachable custom prescription lenses for glasses-wearers.
  • IPD Adjustment: Automatic, motorized adjustment to match the wearer’s interpupillary distance.
  • Eye Tracking: At least one camera per-eye for things like avatar presence and foveated rendering
  • Face & Body Tracking: More than a dozen cameras and sensors capture both facial expressions and body movements, including the user’s legs.
  • Room Tracking:  Both short- and long-range LiDAR scanners to map surfaces and distances in three dimensions.
  • App Compatibility: Said to have the ability to run existing iOS apps in 2D.
  • Price: $3,000 – $4,000

Design Rumors

  • Outer Shell: Aluminum, glass, and carbon fiber to reduce its size and weight. Cameras are largely concealed for aesthetic reasons.
  • Presence Displays: Outward-facing display can show the user’s facial expressions and presumably also eye movements. Said to be an always-on display similar in latency and power draw to Apple Watch or iPhone 14 Pro.
  • Dedicated Passthrough Switch: Digital Crown-like dial on its right side to switch between VR and passthrough.
  • Headstrap: Various options available, including a consumer-focused headstrap similar in material to Apple Watch sport bands with built-in speakers, and a different, unspecified headstrap targeted at developers.



Croquet for Unity: A New Era for Multiplayer Development With “No Netcode” Solution

Croquet, the multiplayer platform for web and gaming, which took home the WebXR Platform of the Year award at this year’s Polys WebXR Awards, recently announced Croquet for Unity.

Croquet for Unity is an innovative JavaScript multiplayer framework for Unity – a platform for creating interactive, real-time 3D content – that simplifies development by eliminating multiplayer code and server setup. It connects developers with the distinct global architecture of the Croquet Multiplayer Network. The framework was demonstrated at GDC last week, while early access beta is arriving in April 2023.

Effortless Networking for Developers

Croquet for Unity eliminates the need for developers to write and maintain networking code. By employing Croquet’s Synchronized Computation Architecture, server-side programming and traditional servers become unnecessary.

Users connect through the Croquet Multiplayer Network, which consists of Reflectors—stateless microservers located across four continents—that guarantee smooth and uniform experiences for gamers.

Synchronizing Computation for Flawless Multiplayer

At its essence, Croquet focuses on synchronizing not only the state but also its progression over time. By harmonizing computation, Croquet eliminates the need to transmit the outcomes of intricate computations like physics or AI.

It also eliminates the necessity for particular data structures or sync indicators for designated objects. As a result, crafting multiplayer code becomes akin to creating single-player code, with the full game simulation executing on-device.

Shared Virtual Computers for Perfect Sync

A shared virtual computer runs identically on all clients, providing perfect synchronization and giving each player a unique perspective. Lightweight reflectors can be positioned at the edge of the cloud or in a 5G network’s MEC, offering lower latency than older architectures.

In addition, synchronized calculations performed on each client will replace traditional server computations, resulting in reduced bandwidth and improved latency.
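The synchronized-computation idea described above can be illustrated with a minimal deterministic-lockstep sketch (a simplified analogy, not Croquet's actual API): every client runs the identical simulation, and only the ordered input events are relayed, so the results of physics or AI computations never need to cross the network.

```python
# Illustrative sketch of synchronized computation (deterministic lockstep).
# This is NOT Croquet's real API -- just the underlying principle: identical
# code plus an identical, ordered input log yields identical state everywhere.

from dataclasses import dataclass, field

@dataclass
class Simulation:
    """A deterministic game simulation: same inputs always => same state."""
    positions: dict = field(default_factory=dict)
    tick: int = 0

    def apply_input(self, player: str, dx: int) -> None:
        # Inputs are the ONLY data that travels over the network.
        self.positions[player] = self.positions.get(player, 0) + dx

    def step(self) -> None:
        # Physics/AI would run here -- deterministically, on every client.
        self.tick += 1

def run_client(input_log):
    """Each client replays the shared, ordered input log independently."""
    sim = Simulation()
    for tick_inputs in input_log:
        for player, dx in tick_inputs:
            sim.apply_input(player, dx)
        sim.step()
    return sim

# Two clients replaying the same log converge on identical state,
# without ever exchanging simulation results.
shared_log = [[("alice", 2)], [("bob", -1), ("alice", 1)], []]
client_a = run_client(shared_log)
client_b = run_client(shared_log)
assert client_a == client_b
```

In this analogy, Croquet’s Reflectors play the role of the shared input log: they only order and broadcast events, never compute game state, which is why no server-side game logic is needed.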

Unprecedented Shared Multiplayer Simulations

Croquet not only facilitates multiplayer development but also enables previously unfeasible shared multiplayer simulations. Examples include real-time interactive physics as a fundamental game feature, fully reproduced non-player character behaviors, and sophisticated interactions between players while the game is live.

Due to bandwidth limits and intrinsic complexity, traditional networks are incapable of supporting these simulations.

“Innately Multiplayer” Games With No Netcode

“Multiplayer games are the most important and fastest-growing part of the gaming market. But building and maintaining multiplayer games is still just too hard,” said David A. Smith, founder and CTO of Croquet, in a press release shared with ARPost. “Croquet takes the netcode out of creating multiplayer games. When we say, ‘innately multiplayer,’ we mean games are multiuser automatically from the first line of code and not as an afterthought writing networking code to make it multiplayer.”

Croquet’s goal is to simplify developing multiplayer games, making it as easy as building single-player games. By removing netcode creation and administration, developers can concentrate on improving player experiences while benefiting from reduced overall creation and distribution costs, a speedier time to market, and enhanced player satisfaction.

Opening Doors for Indie Developers

Croquet for Unity is designed for a wide range of game developers, but it is especially advantageous for small, independent developers, who often find it more difficult to create multiplayer games because they lack in-house networking and backend expertise.

Secure Your Spot on the Croquet for Unity Beta Waitlist

Developers can sign up for the beta waitlist to access the Croquet for Unity beta, launching in April. The Croquet for Unity package will be available for free in the Unity Asset Store upon commercial release, requiring a Croquet gaming or enterprise subscription and a developer API key for access to the global Croquet Multiplayer Network.



Hands-on: Bigscreen Beyond – A Little Headset That Could be a Big Deal

It’s exceedingly rare to see a VR software startup transition to making hardware, let alone decent hardware. But that’s exactly what Bigscreen—creators of the long-running social VR theater app of the same name—has done with its upcoming Beyond headset.

Bigscreen has clearly targeted PC VR enthusiasts who are willing to pay for the best hardware they can get their hands on. And with major players like Meta and HTC focusing heavily on standalone headsets, Bigscreen Beyond could prove to be the best option they’ll find any time soon.

Photo by Road to VR

The company has set out to make a headset that’s not just better than what’s out there, but one that’s much smaller too. And while it remains to be seen if the headset will hit all the right notes, my initial hands-on shows plainly the company knows what it’s doing when it comes to building a VR headset.

Bigscreen Beyond Specs
  • Resolution: 2,560 × 2,560 (6.5MP) per-eye
  • Display: microOLED (2x, RGB stripe)
  • Pixels Per-Degree (claimed): 28
  • Refresh Rate: 75Hz, 90Hz
  • Lenses: Tri-element pancake
  • Field-of-view (claimed): 93°H × 90°V
  • Optical Adjustments: IPD (fixed, customized per customer), eye-relief (fixed, customized per facepad)
  • IPD Adjustment Range: 58–72mm (fixed, single IPD value per device)
  • Connectors: DisplayPort 1.4, USB 3.0 (2x)
  • Accessory Ports: USB-C (1x)
  • Cable Length: 5m
  • Tracking: SteamVR Tracking 1.0 or 2.0 (external beacons)
  • On-board Cameras: None
  • Input: SteamVR Tracking controllers
  • On-board Audio: None
  • Optional Audio: Audio Strap accessory, USB-C audio output
  • Microphone: Yes (2x)
  • Pass-through View: No
  • Weight: 170–185g
  • MSRP: $1,000
  • MSRP (with tracking & controllers): $1,580

Custom-made

Bigscreen is building something unique, quite literally—every Beyond headset comes with a custom-made facepad. And this isn’t a ‘choose one of three options’ situation, Bigscreen has a sleek app that walks buyers through the process of capturing a 3D scan of their face so the company can create a completely unique facepad that conforms to each specific customer.

And it really makes a difference. The first thing that Bigscreen CEO Darshan Shankar showed me during a demo of the Beyond headset was the difference between my personal facepad (which the company created for me prior to our meetup) and someone else’s facepad. The difference was instantly obvious; where mine fit against my face practically like two connected puzzle-pieces, the other facepad awkwardly disagreed with my face in various places. While I’ve recognized for a long time that different facial topology from person-to-person is a real consideration for VR headsets, this made me appreciate even more how significant the differences can be.

The facepad may look rough, but it’s actually made of a soft rubber material | Photo by Road to VR

Shankar says the custom-fit facepad is an essential part of making such a small headset. It ensures not only that the headset is as comfortable as it can be, but also the user’s eyes are exactly where they’re supposed to be with regard to the lenses. For a headset like Beyond, which uses high magnification pancake optics with a small sweet spot, this is especially important. And, as Shankar convincingly demonstrated by shining a flashlight all around the headset while I was wearing it, the custom-fit facepad means absolutely no external light can be seen from inside.

And the custom facepad isn’t the only way each headset is dialed in for each specific customer; instead of wasting weight and space with the mechanics for an IPD adjustment, the headset ships with one of 15 fixed IPD distances, ranging from 58–72mm. The company selects the IPD based on the same face scan that allows them to make the custom facepad. And given the size of the Beyond headset, there’s no way that glasses will fit inside; luckily the company will also sell magnetically attached prescription inserts for those who need them, up to −10 diopter.

Diving In

With my custom facepad easily snapped onto the headset with magnets, it was time to dive into VR.

The baseline version of the $1,000 Bigscreen Beyond headset has a simple soft strap, which I threw over the back of my head and tightened to taste. I felt I had to wear the strap very high on the back of my head for a good hold; Shankar says an optional top-strap will be available, which ought to allow me to wear the rear strap in a lower position.

Photo by Road to VR

As I put on the headset I found myself sitting in a dark Bigscreen theater environment, and the very first thing I noticed was the stellar darks and rich colors that are thanks to the headset’s OLED displays. The second thing I noticed was there was no sound! That’s because the baseline version of the headset doesn’t have on-board audio, so I still had to put on a pair of headphones after the headset was donned.

While the baseline headset lacks on-board audio, Bigscreen is offering a $100 ‘Audio Strap‘, which is a rigid headstrap with built-in speakers. As someone who really values rigid straps and on-board audio, I’m glad to see this as an option—for me it would be the obvious choice. Unfortunately the company wasn’t ready to demo the Audio Strap.

Shankar toured me around a handful of VR environments that showed off the headset’s 2,560 × 2,560 (6.5MP) per-eye displays, which offered a level of clarity similar to that of Varjo’s $2,000 Aero headset, but with a notably smaller field-of-view (Bigscreen claims 93°H × 90°V).

On many current-gen headsets like Quest 2 you can’t quite see the individual lines of the screen-door effect, but it’s still clear that it’s there in aggregate. While the Beyond headset isn’t ‘retina resolution’ there’s essentially no evidence of any screen-door effect. Everything looks really sharp. This was best demonstrated when I ran around in Half-Life: Alyx and the game felt like it had instantly upgraded graphics compared to a headset like Valve’s Index.

There is, however, some persistence blurring and glare. Shankar openly demonstrated how the brightness of the display directly relates to the level of persistence. While there’s some noticeable persistence at the default brightness, when overdriving the display’s brightness the persistence becomes entirely unbearable. The reverse is true; turning the brightness down below the default cuts the persistence down noticeably. While it would be nice if the default brightness had less persistence, at least users will be able to trade brightness for lower persistence based on their specific preference.

Continue on Page 2: Dialing In



Nreal to Support Windows Computers With Nebula App

Nreal Air is largely a virtual screen viewer. While it does have native apps designed for AR, it becomes a lot more versatile when displaying content from a tethered device like a mobile phone or game console. Nebula, the app that allows these features, will also now be available for Windows computers.

Changes Coming to Nreal’s Nebula Ecosystem

The job of Nebula is to “project 2D content into an interactive 3D space.” To get much use out of the company’s AR glasses like Nreal Air, you need to have Nebula installed on the device. The glasses launched with support for the Android operating system and subsequent updates brought compatibility to Mac and a number of game consoles.

A recent release from the company confirms that Nebula is coming to Windows. The Windows version of Nebula also comes with enhanced tracking, an optimized aspect ratio, and a curved virtual screen. The 3DoF tracking was specifically touted for helping gamers playing simulation-type games.

A Boon for Gaming (and Maybe Productivity)

As of this writing, the Windows version of the Nebula app is not yet available and no rollout date was included in the release shared with ARPost. Still, the announcement brings some excitement both in productivity and gaming applications.

PC Gaming on a Virtual Screen

The main drive of the update, according to the release, was to capture PC gamers. A recent user survey found that console gaming is the second highest use case, just behind streaming media.

use cases of Nreal Air

“We are thrilled to see the growing popularity of Nreal Air among the gaming community, and we are committed to providing gamers with the best possible experience,” co-founder Peng Jin said. “As the gaming industry continues to evolve, we believe that Nebula for Windows is a game-changer for the desktop gaming market.”

I don’t always game, but when I do I use a Windows PC. Many a time I’ve resorted to connecting my laptop to a TV via an HDMI cable to get a bigger screen. Having used Nreal Air for watching videos online, I can definitely see the draw that the giant virtual screen can have for gaming.

Multiple Virtual Screens for Productivity

The third leading use case is productivity. Even though this was downplayed in the release, it’s something that I’m excited to try out.

In my review of the Nreal Air I said that reading fine text in the glasses was still a bit of a chore. However, with the changes to screen aspect and other updates to the Windows version of Nebula, I’d be willing to revisit the glasses for productivity. Being able to glance back and forth between multiple screens instead of opening and closing tabs on my laptop would be great.

Any Day Now

The only thing not to like about the announcement is the lack of a release date. I’m still a little skeptical of how conducive Nreal Air can be to productivity tasks like writing, but one way or another this opens up a significant market for these already popular AR glasses.



Spatial Releases Toolkit for “Gaming and Interactivity”

Spatial started out as an enterprise remote collaboration solution. Then, it changed lanes to offer virtual worlds for consumer social uses. Now, it could become an immersive gaming platform. At least, in part.

A Look at the Toolkit

The new “Spatial Creator Toolkit” is a Unity-powered interface that allows users to create custom avatars, items, and “quests.” The quests can be “games and immersive stories” as well as “interactive exhibitions” according to a release shared with ARPost.

Spatial Creator Toolkit

“This evolution to gamified and interactive co-experiences is a natural expansion for the platform and the internet,” said Jinha Lee, CPO and co-founder. “With more than 1 million registered creators on the platform today, and almost 2 million worlds, we are committed to empowering all creators.”

The toolkit also features advanced tools for linking virtual worlds together. All of it is powered by visual scripting as opposed to conventional coding, which the company says allows “zero learning curve and instant scalability.” During a closed alpha phase that began in December, companies with advance access, including Vogue and McDonald’s, broke in the toolkit.

Spatial’s Room to Grow

According to the release, the company hopes to become the YouTube of 3D games. “As Adobe is for 2D video, Unity is the software unlocking 3D games and the new medium of the internet. Spatial is like the YouTube for these games, enabling instant publishing to the mass market,” said CEO and co-founder of Spatial, Anand Agarawala. “Anyone can build, the key is unlocking the capabilities to allow the magic to happen.”

Considering Spatial’s plans for a creator marketplace by the end of the year, the new business model also resembles platforms like Roblox. That platform is a flagship of the gaming creator economy but has so far stayed away from NFTs.

Having fully embraced NFTs, along with other Web3 building blocks like cross-platform avatar compatibility through Ready Player Me, Spatial has a lot of opportunities and tools at its disposal that platforms like Roblox don’t. These include partnerships in the larger Web3 community, and at least some level of interoperability with other immersive platforms.

In short, we still have to see where this direction takes the company. But, it looks like calling the platform a “YouTube” or a “Roblox” might be selling it short. Both of those are massive creator-driven online marketplaces and communities, but both of them are limited by their own walls and that might not be true of this new side of Spatial.

Let’s See How Far it Goes

Skepticism about what may seem like another blockchain game drive is understandable. However, blockchain games that have let users down in the past were largely trying to shill their own products with questionable infrastructure. Spatial is a proven company with an open ecosystem that has nothing to gain by anyone losing. This should be fun.



NVIDIA CloudXR 4.0 Enables Developers to Customize the SDK and Scale XR Deployment

In January, NVIDIA announced new products and innovations at CES 2023. At this year’s NVIDIA GTC, “the developer conference for the era of AI and the metaverse,” NVIDIA announced the latest release of CloudXR. Businesses can look forward to boosting their AR and VR capabilities with the new release, which brings more flexibility and scalability to XR deployments.

The latest release augurs well for developers looking to improve the customer experience of their apps, whether delivered from the cloud, through 5G Mobile Edge Computing, or over corporate networks.

In CloudXR 4.0, new APIs allow flexibility in the development of client apps as well as in using various distribution points to deliver XR experiences. Multi-platform scalability is another plus, as broader options for the CloudXR interface are likewise made available. The new release also makes it possible for developers to create custom user interfaces through the new Unity plug-in architecture.

Among the benefits that developers can enjoy with the new NVIDIA CloudXR 4.0 are:

  • No Need for OpenVR or OpenXR Runtime – The CloudXR Server API lets developers build CloudXR directly into their applications, although the OpenVR API via the SteamVR runtime continues to be fully supported by the new version.
  • More Deployment Options With the Use of the Unity Plug-in – Developers can build on the Unity engine and create a full-featured CloudXR Client using Unity APIs.

NVIDIA CloudXR 4.0 - Unity Plug-in

  • Reduced Lag and Delay Through L4S Technology – Lag in interactive cloud-based video streaming is reduced as the new NVIDIA CloudXR release implements the advanced 5G packet delivery optimization (L4S) behind a convenient toggle.

More Immersive Experiences With the New NVIDIA CloudXR Developments

The new NVIDIA CloudXR developments now make it possible to provide more immersive high-fidelity XR experiences to the users. Developers and businesses can offer high-performance XR streaming to their customers through the most accessible platforms and devices. They can now customize their applications to give the kind of XR experiences their customers are looking for.

“At VMware we’re using NVIDIA CloudXR to enable our customers to stream high-fidelity XR experiences from platforms, like VMware Horizon, to standalone VR devices running VMware Workspace ONE XR Hub,” said VMware Director of Product Management, Matt Coppinger, in a press release shared with ARPost. “This gives our customers the power of a graphics workstation along with the mobility of a standalone VR device.” 

With CloudXR 4.0, developers are able to improve integrations and consequently the overall performance of their apps and solutions.

NVIDIA also revealed strategic partnerships with tech companies like Ericsson and Deutsche Telekom to ensure that integrations, particularly of L4S, into the new CloudXR release are implemented seamlessly.

The availability of high-bandwidth, low-latency networks for optimal streaming performance is also assured through these alliances. Dominik Schnieders, Deutsche Telekom’s Head of Edge Computing, reiterated the company’s belief that CloudXR with L4S optimization is a critical component of streaming XR for both enterprises and consumers on the public 5G network.

Most Requested Features on the New NVIDIA CloudXR Developments

The new version of CloudXR brings together more in-demand features, among them generic controller support, callback-based logging, and flexible stream creation. This demonstrates responsiveness to the needs of the XR developer community and is perceived as a significant improvement in the distribution of enterprise XR software.



Hands-on: HTC’s New Standalone Vive Tracker Effortlessly Brings More of Your Body Into VR

With three versions of SteamVR trackers under its belt, HTC has been a leading enabler of full-body tracking in VR. Now the company’s latest tracker could make it even easier to bring your body into VR.

HTC’s new standalone Vive tracker (still unnamed) has a straightforward goal: work like the company’s existing trackers, but easier and on more platforms.

The ‘easier’ part comes thanks to inside-out tracking—using on-board cameras to allow the device to track its own position, rather than external beacons like those used by the company’s prior trackers.

Photo by Road to VR

To that end, things seem really promising so far. I got to demo the new Vive tracker at GDC 2023 this week and was impressed with how well everything went.

Photo by Road to VR

With two of the new Vive trackers strapped to my feet, I donned a Vive XR Elite headset and jumped into a soccer game. When I looked down at my feet, I saw a pair of virtual soccer shoes. And when I moved my feet in real-life, the soccer shoes moved at the same time. It took less than two seconds for my mind to say ‘hey those are my feet!’, and that’s a testament to both the accuracy and latency being very solid with the new tracker.

That’s not a big deal for older trackers that use SteamVR Tracking, which has long been considered the gold standard for VR tracking. But to replicate a similar level of performance in a completely self-contained device that’s small and robust enough to be worn on your feet… that’s a big deal for those who crave the added immersion that comes with bringing more of your body into VR.

Throughout the course of my demo, my feet were always where I expected to see them. I saw no strange spasms or freezing in place, no desync of coordinate planes between the tracker and the headset, and no drifting of the angle of my feet. That allowed me to easily forget that I was wearing anything special on my feet and simply focus on kicking soccer balls into a goal.

While the tracker worked well throughout, the demo had an odd caveat—I had feet but no legs! That makes it kind of weird to try to juggle a soccer ball when you expect to be able to use your shin as a backboard but watch as the ball rolls right over your virtual foot.

Ostensibly this is the very thing that trackers like this should be able to fix; by attaching two more trackers to my knees, I should be able to have a nearly complete representation of my leg movements in VR, making experiences like ‘soccer in VR’ possible when they simply wouldn’t work otherwise.

I’m not sure if the demo app simply wasn’t designed to handle additional tracking points on the knees, or if the trackers are currently limited to just two, but HTC has confirmed the final inside-out Vive tracker will support up to five trackers in addition to the tracked headset and controllers.

Trackers can, of course, be used to track more than just your body, though apps that support these kinds of tracked accessories are rare | Photo by Road to VR

So the inside-out factor is the ‘easier’ part, but what about the other goal of the tracker—to be available on more platforms than just SteamVR Tracking?

Well, the demo I was playing was actually running purely on the standalone Vive XR Elite. To connect the trackers, a small USB-C dongle needs to be connected to the headset to facilitate the proprietary wireless connection between the dongle and the trackers. HTC says the same dongle can plug into a PC and the trackers will work just fine through SteamVR.

The company also says it’s committed to making the trackers OpenXR compatible, which means (in theory) any headset could support them if they wanted.

– – — – –

I only got to use it in one configuration (on my feet) and in one environment (a large office space). So there’s still the question of how robust they will be. For now though, I’m suitably impressed.

If these trackers really work as well as they seem from this first impression, it could open the door to a new wave of people experiencing the added immersion of full-body tracking in VR… but there’s still the lingering question of price, which historically never seems to be quite right for the consumer VR market when it comes to HTC. Until then, our fingers shall remain crossed.



‘Another Fisherman’s Tale’ Shows Off More Mind-bending Puzzles in New Gameplay Trailer

InnerspaceVR is soon bringing its sequel to the VR puzzle adventure A Fisherman’s Tale, aptly named Another Fisherman’s Tale. And now both InnerspaceVR and publisher Vertigo Games have released a new gameplay video showing off just what awaits: detachable body parts galore.

Revealed today at the Future Games Show (FGS) Spring Showcase, the new trailer shows off some of the upcoming VR puzzle game’s mind-bending universe, which this time is said to use the player’s own body as a core puzzle mechanic, tasking you with detaching and replacing key body parts to solve puzzles.

Check out the trailer below:

InnerspaceVR says the sequel brings a new chapter to the story of Bob the Fisherman, “weaving a magical and moving narrative about the meaning we create through building and rebuilding our authentic selves.”

In it, the studio says players will do things like throw Bob’s hand across a ravine and make it crawl to retrieve an object, or send his head elsewhere for a different point of view.

Limbs are also modular: you can replace them with a variety of objects to unlock new skills, such as a pirate hook to scale walls, a crab’s claw to cut through rope, and a fish’s tail to improve your swimming. Puppeteering hands will also let you pick up distant items and tools.

InnerspaceVR says Another Fisherman’s Tale will be a five to six hour adventure, putting you in the shoes of Nina, the daughter of the series protagonist. Here’s how InnerspaceVR describes it:

“Recollecting Bob’s grandiose stories of pirates, sunken ships, treasures and mystical locations, Nina begins re-enacting his adventures and dives head-first into an imaginative world of memory and fantasy. Will she be able to separate fact from fiction and uncover the hidden truth behind the fisherman’s tale?”

And yes, it appears French comedian Augustin Jacob is reprising his role as the game’s smoky, baritone narrator.

Another Fisherman’s Tale is slated to launch in Q2 of this year, coming to PSVR 2, Meta Quest 2, and PC VR.


coach-partners-with-zero10-on-ar-try-on-tech-for-metaverse-fashion-week

Coach Partners With ZERO10 on AR Try-On Tech for Metaverse Fashion Week

The second edition of Metaverse Fashion Week (MVFW) is set to take place at the end of this month in Decentraland’s Luxury District, where global brands will feature their digital wearables. MVFW is a four-day-long event that combines fashion and AR try-on technology to offer a unique, immersive experience to attendees.

Metaverse Fashion Week 2023 – Arena

Metaverse Fashion Week, which will run from March 28–31 this year, will see the participation of luxury brand Coach for the first time. The event brings together top designers and brands, making it an exciting opportunity for Coach to showcase its signature leather goods in the metaverse.

ZERO10’s AR Try-On Tech Highlights Coach’s Iconic Tabby Bag

In collaboration with ZERO10, Coach will introduce its iconic Tabby bag with a unique AR enhancement as part of its upcoming activation during MVFW. The feature will be accessible via the ZERO10 app, allowing users in Decentraland to try on the product virtually, providing a new and engaging way to experience the brand.

COACH - Tabby bag
Source: Coach

The AR enhancement effect, built on cutting-edge technology, adds a unique touch to the virtual fashion event and gives visitors a dynamic way to interact with Coach’s products. Using AR try-on, shoppers can virtually try on clothes, accessories, and even cosmetics, seeing how they might look in a product before making a purchase.

As a global digital fashion platform, ZERO10 offers AR try-on technology to brands and independent creators. Through its iOS app, users can try on digital clothing in real time using their phone camera, collect items in a virtual wardrobe, and create shareable content for social media.

The digital collections are collaborations with both emerging and established fashion brands, designers, musicians, and artists and are released in limited drops within the app. The app’s cloth simulation technology simulates fabric flow, while the body tracking technology lets users try on virtual outfits for unique social media photos and videos.

Blending Tradition and Innovation

This year’s Metaverse Fashion Week theme, “Future Heritage,” encourages both traditional and emerging fashion designers to engage and work together. As part of the upcoming event, brands will conduct interactive virtual experiences both on and off the runway.

Dolce & Gabbana plans to exhibit pieces from its Future Reward digital design competition. Tommy Hilfiger intends to launch new wearables on a daily basis, along with products powered by artificial intelligence. DKNY will have a pop-up art gallery and restaurant called DKNY.3. Adidas, like Coach, will make its MVFW debut this year. For owners of its “Into the Metaverse” non-fungible token (NFT) collection, the sports brand will debut its first set of digital wearables.

Metaverse Fashion Week 2023 brands

Coach will also participate in Brand New Vision (BNV), a Web3 fashion ecosystem that enables attendees to try on wearables from various global brands seamlessly and instantly. BNV has created specifically designed stations to showcase the digital clothing collections created in partnership with top brands such as Tommy Hilfiger, Carolina Herrera, Michael Kors, and Vivienne Tam. Moreover, a newly built “Fashion Plaza” will also exhibit emerging digital fashion possibilities.

MVFW: Open Metaverses and Web3 Interoperability

Dr. Giovanna Graziosi Casimiro, Decentraland’s head of MVFW, remarked that they are honored to carry on the Metaverse Fashion Week tradition this year. “We are seeing the return of many luxury fashion houses, and also the emergence and elevation of digitally native fashion. We are excited to see the world’s greatest fashion minds engaging in digital fashion and exploring what it can mean for their brands, and for their communities,” she said.

This year’s MVFW will highlight the power of interoperability across open metaverses while expanding the boundaries of what digital fashion can be. MVFW23, organized by Decentraland and UNXD in association with the Spatial and OVER metaverses, is an immersive art and culture event that welcomes fashionistas from all over the globe to gather, mingle, and witness the most recent breakthroughs in digital fashion.

Fashion brands experimenting with virtual technologies like AR try-on is a testament to their commitment to staying at the forefront of the latest technology trends and providing their customers with unique, immersive experiences.


pico’s-big-announcement-at-gdc?-nothing-in-particular

Pico’s Big Announcement at GDC? Nothing in Particular

Pico Interactive took to the Game Developers Conference (GDC) in San Francisco this week with a massive booth, hyping the event as a “treat” that would be a “Journey to Infinity.” As it turns out, there’s nothing in particular to announce.

Pico, the creator of the Pico 4 standalone, is arguably one of the biggest untapped threats to Meta’s market supremacy in the consumer VR standalone segment. At least for now.

Pico is owned by TikTok parent company ByteDance, and many speculated that the China-based company was finally ready to announce the consumer launch of Pico 4 in the United States, a step that many (including us) have been waiting for following a US hiring spree last summer.

At the time, a Protocol report maintained the move would usher in “a major focus on content licensing as well as marketing its hardware to U.S. consumers.”

Image courtesy Pico

So, is Pico 4 coming to US consumers? We spoke to the company at GDC this week, and despite a flashy ‘save the date’ countdown to its ‘Journey to Infinity’ and a massive booth on the show floor, there’s simply nothing to report.

In for a treat at #GDC23? Come say hi at Booth S627, Moscone Center and take a dive into the #JourneytoInfinity with #PICO #PICO4 #VirtualReality #VR #GDC23 #JourneytoInfinity pic.twitter.com/Julujtyilc

— PICO XR (@PICOXR) March 22, 2023

To boot, Pico’s Twitter presence isn’t very large—the account has fewer than 10,000 followers at the time of this writing—but the initial countdown tweet promising to kick off “a new journey” became the company’s most-viewed tweet, with more than 67,000 impressions. People were expecting something big from Pico at GDC, and it simply didn’t manifest.

Pico is probably the only company right now with both a capable device and the market stability to directly compete with Meta, the undisputed champion of the consumer standalone segment with Quest 2. Under the wing of the Chinese ByteDance media empire, Pico not only has the sort of cash reserves to subsidize hardware, but also a growing ability to attract developer interest.

Launched in October 2022 and priced at €420 (~$455), Pico 4 is available across Europe, China and a number of APAC countries, including Japan, Singapore, and South Korea. The missing puzzle piece is undoubtedly a North American release.

But if you’re still waiting to hear about the much-anticipated US rollout, you should adjust your expectations for this week, because the only things going on at the Pico booth are demos of the headset.


We have boots on the ground at GDC this week, so make sure to check back for all things AR/VR as we dive into developer sessions and see everything on the show floor.


one-of-vr’s-most-veteran-studios-has-grown-to-200-employees-while-continuing-to-double-down-on-vr

One of VR’s Most Veteran Studios Has Grown to 200 Employees While Continuing to Double-down on VR

Having been exclusively building VR games since 2013, nDreams stands as one of the most veteran VR-exclusive game studios to date. And with more than 200 people, one of the largest too. The studio’s CEO & founder, Patrick O’Luanaigh, continues to bet his company’s future on the success of VR.

Speaking exclusively to Road to VR ahead of a presentation at GDC 2023, O’Luanaigh talked about the growing success of nDreams and why he’s still doubling down on VR.

Starting in 2013, O’Luanaigh has navigated his company from the earliest days of the modern VR era to now, which he believes is VR’s biggest moment so far—and growing.

Between the company’s own internal data and some external sources, O’Luanaigh estimates that VR’s install base is around 40 million headsets across the major platforms, excluding the recently launched PSVR 2. At least half of that, he estimates, is made up of 20 million Quest headsets.

While it’s been a challenge to keep all those headsets in regular use, O’Luanaigh says the size of the addressable VR market today is bigger than ever.

That’s why he’s bulked up the company to some 200 employees, nearly doubling over the course of 2022 through hiring and studio acquisitions.

O’Luanaigh says, “this is the biggest we’ve ever been and it’s showing no signs of slowing down. […] In a decade of exclusively making VR games, we’ve never seen that growth before.”

O’Luanaigh knows well that content is key to getting players into their headsets; to that end, he says, his efforts to scale the company are about building bigger and better VR content to keep up with the growth and expectations of the install base.

“Setting up our fully-remote nDreams studios, Orbital and Elevation, was significant for us in establishing a powerful basis for developing multiple projects in parallel,” he says. “It gives us the specialism to develop the variety of VR titles, across multiple genres, that the growing market now demands.”

O’Luanaigh points to nDreams developed and published titles Phantom: Covert Ops (2020), Shooty Fruity (2020), Fracked (2021), and Little Cities (2022) as some of the most successful VR games the studio has launched thus far, with Phantom: Covert Ops specifically finding “important commercial success” on Quest 2.

With the release of those titles over the years and their ongoing sales, O’Luanaigh shares that nDreams doubled its year-over-year revenue over the last 12 months. And with multiple new projects in the works, including Synapse, Ghostbusters: Rise of the Ghost Lord, and other (unannounced) projects, he believes the company is on track to more than double annual revenue again by 2024.

Phantom: Covert Ops | Image courtesy nDreams

Though he’s leading a company of 200 employees, O’Luanaigh calls himself a “massive VR enthusiast,” and is still very clearly in touch with what makes VR such a unique and compelling medium.

He says his studio aims to build around five key pillars that make for compelling VR content:

  1. Aspirational roleplay – first-person embodiment of appealing roles or characters
  2. High-agency interaction – tactile 1:1 mechanics in a freely explorable world
  3. Empowering wielding – feeling, holding, and using visceral weapons, tools, and abilities
  4. Emotional amplification – immersive situations that provoke strong, diverse feelings
  5. Fictional teleportation – presence within desirable locations inaccessible in real life

And while O’Luanaigh could easily steer this studio away from VR—to chase a larger non-VR market—he continues to double down on VR as the studio’s unique advantage. Far from moving away from VR, his company is actively trying to bring others into the fold; O’Luanaigh says nDreams continues to expand its publishing operations.

“The success of Little Cities, which has just launched its free ‘Little Citizens’ update, has been a great validation of our investments into third-party publishing and we are actively on the lookout for more amazing indie developers to work with.”

With the scale that VR has now reached, O’Luanaigh believes the market is truly viable for indie developers. That’s why he’s glad to see the rise of VR publishers (and not just his own company); longstanding expertise in the medium is crucial to shipping a quality VR title, and that’s why O’Luanaigh believes VR-specific publishers like nDreams will play an important role in bringing more developers and great content to VR.


That expertise increasingly builds upon itself: the company’s VR games have shown impressive mechanical exploration, giving the studio the chance to test a wide range of VR gameplay and find out what works.

Few in VR have had the gall to prove out something as seemingly wacky as a ‘VR kayak shooter’ and actually take it to market in a large-scale production like Phantom: Covert Ops. And you can clearly see the lineage of a game like nDreams’ Fracked shining through in upcoming titles like Synapse. Though the game is an entirely new IP with a new visual direction, the unique Fracked cover system is making the leap to Synapse, a clear example of leveraging a now battle-tested mechanic to enhance future titles. But more than just a reskin of a prior shooter, nDreams continues to experiment with unique VR mechanics, this time promising to harness PSVR 2’s eye-tracking to give players compelling telekinetic powers.

Synapse | Image courtesy nDreams

To that end, the studio’s lengthy experience in the medium is clearly an asset—and one that can only be earned rather than bought. Where exactly that experience will take them in the long run is unclear, but even after all the ups and downs the industry has seen, O’Luanaigh and nDreams remain all-in on VR.


meta-keeps-the-oculus-name-alive-as-third-party-vr-publisher-becomes-‘oculus-publishing’

Meta Keeps the Oculus Name Alive as Third-party VR Publisher Becomes ‘Oculus Publishing’

Meta has scrubbed the Oculus name from nearly all of its products; however, the company today announced that its third-party publishing wing is getting a sort of rebrand that will see the Oculus name live on.

Meta announced at the Game Developers Conference (GDC) that it’s naming its third-party publishing arm Oculus Publishing. The company tells us Oculus Studios, its first-party studio arm, will continue to exist.

To date, Meta’s growing fleet of acquired first-party studios includes Beat Games (Beat Saber), Sanzaru Games (Asgard’s Wrath), Ready at Dawn (Lone Echo & Echo VR), Downpour Interactive (Onward), BigBox VR (Population: One), and Within (Supernatural).

Third-party titles under Oculus Publishing include Among Us VR (Innersloth, Schell Games), Bonelab (Stress Level Zero), The Walking Dead: Saints & Sinners (Skydance Interactive), and Blade & Sorcery: Nomad (Warpfrog).

Notably, little is left that sports the Oculus brand since the company made its big metaverse pivot in October 2021. Besides older hardware, the only things most people see with the ‘Oculus’ moniker are the Oculus PC app and Meta’s Oculus web portal, where the company still lists game libraries for Quest, Rift, Go, and Gear VR.

“This year marks a full decade since the inception of the original Oculus Content Team,” the company says in a developer blog post. “From Kickstarter to Quest, Meta has committed hundreds of millions of dollars in third-party content funding and specialized development support to help make the VR games landscape what it is today. Now, we’re excited to unveil an official name for one of the world’s largest VR games programs for developers: Oculus Publishing.”

The company says Oculus Publishing will continue to directly partner with development teams on conceptualization, funding, production, technology advancement, game engineering, promotion and merchandising.

The company says it’s contributed funding to “more than 300 titles,” and that there are another 150 titles in active development today.
