News


Eye-tracking is a Game Changer for XR That Goes Far Beyond Foveated Rendering

Eye-tracking—the ability to quickly and precisely measure the direction a user is looking while inside of a VR headset—is often talked about within the context of foveated rendering, and how it could reduce the performance requirements of XR headsets. And while foveated rendering is an exciting use-case for eye-tracking in AR and VR headsets, eye-tracking stands to bring much more to the table.

Updated – May 2nd, 2023

Eye-tracking has been talked about with regards to XR as a distant technology for many years, but the hardware is finally becoming increasingly available to developers and customers. PSVR 2 and Quest Pro are the most visible examples of headsets with built-in eye-tracking, along with the likes of Varjo Aero, Vive Pro Eye and more.

With this momentum, in just a few years we could see eye-tracking become a standard part of consumer XR headsets. When that happens, there’s a wide range of features the tech can enable to drastically improve the experience.

Foveated Rendering

Let’s first start with the one that many people are already familiar with. Foveated rendering aims to reduce the computational power required for displaying demanding AR and VR scenes. The name comes from the ‘fovea’—a small pit at the center of the human retina which is densely packed with photoreceptors. It’s the fovea which gives us high resolution vision at the center of our field of view; meanwhile our peripheral vision is actually very poor at picking up detail and color, and is better tuned for spotting motion and contrast than seeing detail. You can think of it like a camera which has a large sensor with just a few megapixels, and another smaller sensor in the middle with lots of megapixels.

The region of your vision in which you can see in high detail is actually much smaller than most think—just a few degrees across the center of your view. The difference in resolving power between the fovea and the rest of the retina is so drastic, that without your fovea, you couldn’t make out the text on this page. You can see this easily for yourself: if you keep your eyes focused on this word and try to read just two sentences below, you’ll find it’s almost impossible to make out what the words say, even though you can see something resembling words. The reason that people overestimate the foveal region of their vision seems to be because the brain does a lot of unconscious interpretation and prediction to build a model of how we believe the world to be.

Foveated rendering aims to exploit this quirk of our vision by rendering the virtual scene in high resolution only in the region that the fovea sees, and then drastically cut down the complexity of the scene in our peripheral vision where the detail can’t be resolved anyway. Doing so allows us to focus most of the processing power where it contributes most to detail, while saving processing resources elsewhere. That may not sound like a huge deal, but as the display resolution of XR headsets and field-of-view increases, the power needed to render complex scenes grows quickly.
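As a rough illustration of the idea, here's a minimal sketch (not any engine's actual API, and the angular thresholds are illustrative, not from any shipping headset) of how a renderer might pick a coarser shading rate the further a pixel is from the tracked gaze direction:

```python
import math

def shading_rate(pixel_dir, gaze_dir, inner_deg=5.0, mid_deg=15.0):
    """Pick a shading rate from a pixel's angular distance to the gaze.

    Both directions are unit vectors; 1 means full resolution, larger
    values mean one shaded sample covers that many pixels per axis.
    The 5 and 15 degree thresholds are illustrative only.
    """
    cos_angle = sum(p * g for p, g in zip(pixel_dir, gaze_dir))
    angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_angle))))
    if angle <= inner_deg:
        return 1   # foveal region: full detail
    if angle <= mid_deg:
        return 2   # near periphery: half resolution per axis
    return 4       # far periphery: quarter resolution per axis
```

Real implementations (for instance, variable rate shading on modern GPUs) work per screen tile rather than per pixel, but the principle is the same: spend samples where the fovea is pointed.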

Eye-tracking of course comes into play because we need to know where the center of the user’s gaze is at all times quickly and with high precision in order to pull off foveated rendering. While it’s difficult to pull this off without the user noticing, it’s possible and has been demonstrated quite effectively on recent headsets like Quest Pro and PSVR 2.

Automatic User Detection & Adjustment

In addition to detecting movement, eye-tracking can also be used as a biometric identifier. That makes eye-tracking a great candidate for multiple user profiles across a single headset—when I put on the headset, the system can instantly identify me as a unique user and call up my customized environment, content library, game progress, and settings. When a friend puts on the headset, the system can load their preferences and saved data.

Eye-tracking can also be used to precisely measure IPD (the distance between one’s eyes). Knowing your IPD is important in XR because it’s required to move the lenses and displays into the optimal position for both comfort and visual quality. Unfortunately, many people understandably don’t know their IPD off the top of their head.

With eye-tracking, it would be easy to instantly measure each user’s IPD and then have the headset’s software assist the user in adjusting the headset’s IPD to match, or warn users that their IPD is outside the range supported by the headset.

In more advanced headsets, this process can be automatic and invisible: the headset measures IPD and a motorized adjustment moves the lenses into the correct position without the user needing to be aware of any of it, as on the Varjo Aero, for example.
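As a sketch of how little code this takes (the 58–72 mm range below is a made-up example, not any real headset's spec), measuring IPD from tracked pupil positions and checking it against the adjustment range looks something like:

```python
import math

def measure_ipd_mm(left_pupil, right_pupil):
    """IPD from the two tracked 3D pupil centers (headset coordinates, in mm)."""
    return math.dist(left_pupil, right_pupil)

def lens_target_mm(ipd_mm, min_mm=58.0, max_mm=72.0):
    """Lens separation to motor the lenses to, or None if the IPD is unsupported."""
    if not (min_mm <= ipd_mm <= max_mm):
        return None
    return ipd_mm
```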

Varifocal Displays

A prototype varifocal headset | Image courtesy NVIDIA

The optical systems used in today’s VR headsets work pretty well but they’re actually rather simple and don’t support an important function of human vision: dynamic focus. This is because the display in XR headsets is always the same distance from our eyes, even when the stereoscopic depth suggests otherwise. This leads to an issue called vergence-accommodation conflict. If you want to learn a bit more in depth, check out our primer below:

Accommodation

Accommodation is the bending of the eye’s lens to focus light from objects at different distances. | Photo courtesy Pearson Scott Foresman

In the real world, to focus on a near object the lens of your eye bends to make the light from the object hit the right spot on your retina, giving you a sharp view of the object. For an object that’s further away, the light is traveling at different angles into your eye and the lens again must bend to ensure the light is focused onto your retina. This is why, if you close one eye and focus on your finger a few inches from your face, the world behind your finger is blurry. Conversely, if you focus on the world behind your finger, your finger becomes blurry. This is called accommodation.

Vergence

Vergence is the inward rotation of each eye to overlap each eye’s view into one aligned image. | Photo courtesy Fred Hsu (CC BY-SA 3.0)

Then there’s vergence, which is when each of your eyes rotates inward to ‘converge’ the separate views from each eye into one overlapping image. For very distant objects, your eyes are nearly parallel, because the distance between them is so small in comparison to the distance of the object (meaning each eye sees a nearly identical portion of the object). For very near objects, your eyes must rotate inward to bring each eye’s perspective into alignment. You can see this too with our little finger trick as above: this time, using both eyes, hold your finger a few inches from your face and look at it. Notice that you see double-images of objects far behind your finger. When you then focus on those objects behind your finger, now you see a double finger image.

The Conflict

With precise enough instruments, you could use either vergence or accommodation to know how far away an object is that a person is looking at. But the thing is, both accommodation and vergence happen in your eye together, automatically. And they don’t just happen at the same time—there’s a direct correlation between vergence and accommodation, such that for any given measurement of vergence, there’s a directly corresponding level of accommodation (and vice versa). Since you were a little baby, your brain and eyes have formed muscle memory to make these two things happen together, without thinking, anytime you look at anything.
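That fixed pairing can be made concrete with two textbook formulas: for a fixation at distance d, the vergence angle is 2·atan((IPD/2) / d) and the accommodation demand is 1/d diopters. A quick sketch (63 mm is a typical adult IPD, assumed here):

```python
import math

def vergence_deg(distance_m, ipd_m=0.063):
    """Vergence angle in degrees for a fixation at distance_m meters."""
    return math.degrees(2.0 * math.atan((ipd_m / 2.0) / distance_m))

def accommodation_diopters(distance_m):
    """Accommodation demand in diopters: the reciprocal of the distance."""
    return 1.0 / distance_m
```

At half a meter the eyes converge about 7 degrees and accommodate 2 diopters; at 100 meters both values are effectively zero, which is why distant gaze lines are nearly parallel.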

But when it comes to most of today’s AR and VR headsets, vergence and accommodation are out of sync due to inherent limitations of the optical design.

In a basic AR or VR headset, there’s a display (which is, let’s say, 3″ away from your eye) which shows the virtual scene, and a lens which focuses the light from the display onto your eye (just like the lens in your eye would normally focus the light from the world onto your retina). But since the display is a static distance from your eye, and the lens’ shape is static, the light coming from all objects shown on that display is coming from the same distance. So even if there’s a virtual mountain five miles away and a coffee cup on a table five inches away, the light from both objects enters the eye at the same angle (which means your accommodation—the bending of the lens in your eye—never changes).

That comes in conflict with vergence in such headsets which—because we can show a different image to each eye—is variable. Being able to adjust the image independently for each eye, such that our eyes need to converge on objects at different depths, is essentially what gives today’s AR and VR headsets stereoscopy.

But the most realistic (and arguably, most comfortable) display we could create would eliminate the vergence-accommodation issue and let the two work in sync, just like we’re used to in the real world.

Varifocal displays—those which can dynamically alter their focal depth—are proposed as a solution to this problem. There’s a number of approaches to varifocal displays, perhaps the most simple of which is an optical system where the display is physically moved back and forth from the lens in order to change focal depth on the fly.

Achieving such an actuated varifocal display requires eye-tracking because the system needs to know precisely where in the scene the user is looking. By tracing a path into the virtual scene from each of the user’s eyes, the system can find the point that those paths intersect, establishing the proper focal plane that the user is looking at. This information is then sent to the display to adjust accordingly, setting the focal depth to match the virtual distance from the user’s eye to the object.
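A common way to implement that intersection (a hypothetical sketch, not any particular headset's code) is to find the point of closest approach between the two gaze rays, since noisy rays rarely intersect exactly:

```python
def _dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def fixation_point(p_l, d_l, p_r, d_r):
    """Midpoint of closest approach between two gaze rays (origin, direction).

    Returns None when the rays are nearly parallel, i.e. gaze at infinity.
    """
    w = [a - b for a, b in zip(p_l, p_r)]
    a, b, c = _dot(d_l, d_l), _dot(d_l, d_r), _dot(d_r, d_r)
    d, e = _dot(d_l, w), _dot(d_r, w)
    denom = a * c - b * b
    if abs(denom) < 1e-9:
        return None
    t = (b * e - c * d) / denom   # parameter along the left ray
    s = (a * e - b * d) / denom   # parameter along the right ray
    q_l = [p + t * v for p, v in zip(p_l, d_l)]
    q_r = [p + s * v for p, v in zip(p_r, d_r)]
    return [(x + y) / 2.0 for x, y in zip(q_l, q_r)]
```

The distance from the eyes to that point is the focal depth handed off to the display actuator.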

A well implemented varifocal display could not only eliminate the vergence-accommodation conflict, but also allow users to focus on virtual objects much nearer to them than in existing headsets.

And well before we’re putting varifocal displays into XR headsets, eye-tracking could be used for simulated depth-of-field, which could approximate the blurring of objects outside of the focal plane of the user’s eyes.

As of now, there’s no major headset on the market with varifocal capabilities, but there’s a growing body of research and development trying to figure out how to make the capability compact, reliable, and affordable.

Foveated Displays

While foveated rendering aims to better distribute rendering power between the part of our vision where we can see sharply and our low-detail peripheral vision, something similar can be achieved for the actual pixel count.

Rather than just changing the detail of the rendering on certain parts of the display vs. others, foveated displays are those which are physically moved (or in some cases “steered”) to stay in front of the user’s gaze no matter where they look.

Foveated displays open the door to achieving much higher resolution in AR and VR headsets without brute-forcing the problem by trying to cram higher-resolution pixels across our entire field-of-view. Doing so would not only be costly, but would also run into challenging power and size constraints as the number of pixels approaches retinal resolution. Instead, foveated displays would move a smaller, pixel-dense display to wherever the user is looking based on eye-tracking data. This approach could even lead to higher fields-of-view than could otherwise be achieved with a single flat display.

A rough approximation of how a pixel-dense foveated display looks against a larger, much less pixel-dense display in Varjo’s prototype headset. | Photo by Road to VR, based on images courtesy Varjo

Varjo is one company working on a foveated display system. They use a typical display that covers a wide field of view (but isn’t very pixel dense), and then superimpose a microdisplay that’s much more pixel dense on top of it. The combination of the two means the user gets both a wide field of view for their peripheral vision, and a region of very high resolution for their foveal vision.

Granted, this foveated display is still static (the high resolution area stays in the middle of the display) rather than dynamic, but the company has considered a number of methods for moving the display to ensure the high resolution area is always at the center of your gaze.

Continued on Page 2: Better Social Avatars »



The Park Playground’s Newest VR Experience for Battle Game Fans Is Here

Europe-based virtual reality experiences provider The Park Playground recently launched its latest offering, a new esports-inspired VR experience called NanoClash Focus. A first in the industry, this new virtual reality game allows two opposing teams to battle it out simultaneously on two independent fields. It’s a fully mobile, free-roam game that aims to provide an immersive and engaging VR experience for players.

An Exciting Combo of VR Experience and Esports

For NanoClash Focus, The Park Playground partnered with HTC and Triangle Factory. The company was encouraged by the success of one of its past VR experiences that allowed players to compete remotely in a virtual battlefield in separate cities. This led to the development of NanoClash Focus, which used elements from esports in its design and artificial intelligence technology to ensure an enhanced user experience.


Players of NanoClash Focus are virtually transported to a hanging platform in a futuristic setting, where they compete against another team in a laser shoot-out, sports battle-style. Each team consists of four to eight people, and the goal is to be the first to reach the arena floor and prevent opponents from doing so by shooting with either laser guns or laser cannons.

Using HTC VIVE Focus 3 wireless headsets that offer greater mobility, players can freely roam the playing field. “Power-ups” are up for grabs in the game, giving teams a solid advantage when they utilize them strategically. The game is a VR experience combined with the exciting features of esports, a move that The Park Playground wants to pursue.


“With two teams positioned on two independent free roam fields and rewards given for teamwork and strategic thinking, NanoClash Focus is an exciting example of how we’re driving technological innovations in LBE VR alongside our partners,” The Park Playground CEO Peter Vindevogel said in a press release shared with ARPost.

According to Vindevogel, the company plans to develop more experiences that use elements of other gaming formats. NanoClash Focus is an example of a location-based VR experience that is inspired by esports for a more immersive, interactive, and inclusive experience. “We’ll be seeing this cross-pollination between LBE VR and elements more traditionally associated with gaming surfacing much more frequently in the future,” Vindevogel further explained.

Developing More VR Experiences with AI

The release of NanoClash Focus is an exciting achievement for The Park Playground. The company has only recently begun experimenting with AI technology and data management in its game development process, resulting in a more efficient way of developing the seamless and innovative VR experiences it aims to deliver.

“Tapping into emerging technologies like AI is something LBE VR providers must consider doing to remain relevant and drive innovative VR experiences that keep people coming back for more,” said the company’s CTO Gilles-Adrien Cenni.

With the launch of NanoClash Focus, The Park Playground seems poised to enter more markets in Europe, Australia, and the US. Headquartered in the Netherlands, the VR experience company currently has 13 owned and franchised locations around the world.

It recently opened two new locations in Brisbane, Australia, and Leeds, UK, and is set to open another venue in Birmingham this year. NanoClash Focus is available for players to try out in all of The Park Playground’s locations globally.




New Promo Video Suggests Standalone ‘Asgard’s Wrath’ Game Coming to Quest

Meta is putting on its Quest Gaming Showcase in June, and while we’re not certain what standalone goodies the company has in store, a promo video seems to suggest we’ll be getting something from the universe of hit Rift exclusive Asgard’s Wrath (2019).

We say ‘universe’ and not ‘direct port’ because we simply can’t tell for now based on the few seconds of footage, which seems to show Loki’s helmet with what appears to be a shadowy god-like figure in the background.

What suggests the promo may not be a flat-out Rift to Quest port is the desert environment. If you’ve played Asgard’s Wrath on Rift, you may remember some post-credits sequel bait, where you find an Egyptian ankh that suggests a follow-up will take place in an Egyptian-inspired environment.

Meta largely abandoned PC VR gaming almost immediately after releasing Rift S and Asgard’s Wrath in 2019, devoting its clutch of VR gaming studios to producing content for Quest and putting the kibosh on a direct-to-Rift sequel in the process. Maybe the next in the series will live on as a Quest native from the get-go?

It would certainly make more sense than Meta’s Sanzaru Games going back and completely overhauling the original Asgard’s Wrath for Quest, although we haven’t heard anything from the studio since it was acquired by Meta in early 2020. It’s not inconceivable that the original and a sequel could be in the works for Quest.

Meanwhile, we’ll be waiting to hear about the other rash of long-promised Quest content yet to come, including Grand Theft Auto: San Andreas, Assassin’s Creed Nexus, and Vertigo Games’ upcoming work with Deep Silver’s IP, which could be anything from Metro to Dead Island.

Follow along with us on June 1st at 10 AM PT to find out, as Meta is slated to share over 40 minutes of content, including new game announcements, gameplay first-looks, updates to existing games, and more.



The Hidden Design Behind the Ingenious Room-Scale Gameplay in ‘Eye of the Temple’

Eye of the Temple is one of the rare VR games that focuses not just on pure room-scale movement, but on dynamic room-scale movement. The result is a uniquely immersive experience that required some clever design behind the scenes to make it all work. This guest article by developer Rune Skovbo Johansen explains the approach.

Guest Article by Rune Skovbo Johansen

Rune Skovbo Johansen is a Danish independent game developer based in Turku, Finland. His work spans games and other interactive experiences, focused on tech, wonder, and exploration. After positive reception of the 2016 VR game jam game Chrysalis Pyramid, he started working on a more ambitious spiritual successor, Eye of the Temple, and at the end of 2020 he quit his day job to pursue indie game development full-time.

In Eye of the Temple, you move through a vast environment, not by teleportation or artificial locomotion, but by using your own feet. It makes unique use of room-scale VR to deliver an experience of navigating an expansive space.

In Eye of the Temple you move around large environments using your own feet

But how does it work behind the scenes? To mark the upcoming release of Eye of the Temple on Quest 2, I wanted to take the time to explain these aspects of the game’s design that I’ve never fully gone into detail on before. In this article we’ll go over a variety of the tricks the game uses to make it all work. Let’s start with the basics of keeping the player in the play area.

Keeping the Player in the Play Area

Say you need to go from one tall pillar in the game to another via a moving platform. You step forward onto the platform, the platform moves, and then you step forward onto the next pillar. But now you’re outside your physical play area.

Moving platforms are positioned in a way to keep players inside the play area

If we instead position the moving platform to the side, it goes like this: You sidestep onto the platform, it moves, and you sidestep onto the next pillar. Since you took a step right, and then left, you’re back where you started in the center of the play area. So the game’s tricks are all about how the platforms are positioned relative to each other.

Now, to get a better sense for it, let’s look at some mixed reality footage (courtesy of Naysy) where a grid representing the play area is overlaid on top.

Mixed reality footage with a grid overlaid on top which represents the play area

Keeping an Overview in the Level Design

Now that we’ve seen how the trick works, let’s take a look at how I keep track of it all when doing the level design for the game. First things first – I made this pattern, which represents the player’s entire play area – or the part of it the game takes advantage of anyway:

A pattern representing the physical play area

As you can see, there’s a thick white border along the edge, and a thick circle in the center.

Every platform in the game has a designated spot in the play area and a pattern overlay that shows what that spot is. For platforms that are a single tile large, it’s generally one of nine positions. The overlay makes it easy to see if a given platform is positioned in the center of the play area, or at an edge or corner.

The play area pattern overlaid on each platform and its end positions make it easy to see if they are lined up correctly in the level design

Additional overlays show a ghostly version of the pattern at both the start and end positions of a moving platform. This is the real trick of keeping track of how the platforms connect together, because these ghostly overlays at the end positions make it trivial to see if the platforms are lined up correctly in the level design when they touch each other. If the adjacent ghostly patterns are continuous like puzzle pieces that fit together, then the platforms work correctly together.
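The continuity rule those overlays encode can be stated in one line. In this hypothetical model (a sketch of the idea, not the game's actual code), each platform has a slot in the play-area grid and a tile position in the virtual world, and a step between two platforms is consistent only when the physical offset matches the virtual one:

```python
def step_is_aligned(a_slot, b_slot, a_tile, b_tile):
    """True when stepping from platform A to B moves the player the same
    amount in the physical play area (slot offset) as in the virtual
    world (tile offset), so tracking stays consistent after the step."""
    slot_delta = (b_slot[0] - a_slot[0], b_slot[1] - a_slot[1])
    tile_delta = (b_tile[0] - a_tile[0], b_tile[1] - a_tile[1])
    return slot_delta == tile_delta
```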

It still took a lot of ingenuity to work out how to position all the platforms so they both fit correctly together and also take the player where they need to go in the virtual world, but now you know how I kept the complexity of it manageable.

Getting the Player’s Cooperation

The whole premise of getting around the world via these moving platforms rests on an understanding that the player should step from one platform to another when they’re lined up, and not at other times. The most basic way the game establishes this is by simply telling the player outright, in safety instructions displayed prior to starting the game.

One of the safety instructions shown before the game begins

This instruction is shown for two reasons:

One is safety. You should avoid jumping over gaps, otherwise you would risk jumping right out of your play area and into a wall, for example.

The other is that the game’s system of traversal only works correctly when the player steps from one platform to another while they line up. This is not as critical (I’ll get back later to what happens if you step onto a misaligned platform), but it still provides the best play experience.

Apart from the explicit instructions, the game also employs more subtle tricks to help ensure the player only steps over when blocks are correctly aligned. Consider the following example of a larger 2 x 2 tile static platform the player can step onto. A moving platform arrives from the side in a way that would allow the player to step off well before the platform has stopped moving, but that would break the game’s traversal logic.

In this room, ‘foot fences’ are used to discourage the player from stepping from one platform to another when they are not correctly aligned

To avoid this, “foot fences” were placed to discourage the player from stepping over onto the static platform (or away from it) at incorrect positions. The fences are purely visual and don’t technically prevent anything. The player can still step over them if they try, or right through them for that matter. However, psychologically it feels like less effort to not step over or through a fence and instead step onto the static platform where there’s a gap in the fence. In this way, a purely non-technical solution is used as part of the game’s arsenal of tricks.

Continued on Page 2: Correcting for Unaligned Platforms »



Immerse Yourself in Sandbox VR’s New Multi-Story VR Game “Dragonfire”

Sandbox VR, known for its location-based VR games, announced today the launch of its newest VR experience. Seekers of the Shard: Dragonfire is the company’s first VR game that features multiple storylines, created by the company’s AAA in-house gaming studio.

VR game “Dragonfire”: A New Immersive Experience Each Time

After a successful partnership with Netflix that will see its hit series Squid Game developed into an immersive VR experience later this year, Sandbox VR continues to explore new ways to develop VR worlds for its customers. Dragonfire’s multiple storylines allow players to experience something different each time they play, even after completing it.

Developed by experts in the VR industry, Dragonfire was helmed by the former lead designer on Assassin’s Creed franchise and Sandbox VR’s VP of Content and Creative Director, Michael Hampden.

The new VR game makes full use of Sandbox VR’s proprietary technology that lets players experience a fully immersive, VR-powered adventure. It’s the company’s first game to feature branching storylines, making each playthrough different from the last.

Sandbox VR Develops Unique VR Experiences

For Sandbox VR CEO Steve Zhao, developing Dragonfire is a step toward the company’s goal of providing unique VR experiences. “With each experience we create, our goal is to push ourselves to invent new ways to immerse players in virtual worlds,” he stated in a press release shared with ARPost.

Set in a castle in a fantasy world, VR game Seekers of the Shard: Dragonfire sees players come together as a group of explorers. They encounter a variety of enemies and mysteries and go through several different areas to accomplish their mission.


Because of the branching story arcs within the game, players can choose from many options during gameplay. In essence, players can experience different versions of Dragonfire, depending on their choices. In fact, younger gamers will be able to play an age-appropriate version of the game that does not contain violence.

Such a complex design naturally posed a big challenge for Sandbox VR in the development of Dragonfire.

“There’s a reason why there isn’t anything like Dragonfire available out there, because so many features have to be designed from the ground up,” Hampden said.  “Melee combat, magic weapons and spells, unlockable items, and choosing where to go next are just a few of the new features we have added to make this perhaps the deepest and most replayable location-based VR experience yet.” 

Expanding Sandbox VR’s Virtual Worlds

Sandbox VR is a location-based VR startup with over 35 locations around the world. It provides immersive VR experiences to guests, which the company describes as similar to the fictional “holodecks” popularized by the Star Trek franchise. Up to six guests can participate in each experience.

Sandbox VR uses motion capture technology and high-quality haptics to give players a sense of realism while they explore virtual worlds. Each gameplay is designed to be a social experience where friends and family work together to complete game objectives.

Seekers of the Shard: Dragonfire is the seventh exclusive immersive experience developed by Sandbox VR, along with Amber Sky 2088, Star Trek: Discovery, Deadwood Mansion, Deadwood Valley, UFL: Unbound Fighting League, and Curse of Davy Jones. Squid Game, the highly anticipated VR experience in partnership with Netflix, is currently under development, set to launch in late 2023.



Meta to Host Quest Gaming Showcase Just Days Ahead of Rumored Apple Headset Announcement

Meta announced its third annual Quest Gaming Showcase is arriving next month, coming only a few days before Apple’s rumored XR headset announcement at Worldwide Developers Conference (WWDC).

Meta is livestreaming the Quest Gaming Showcase on June 1st, a bit unusual for the company, as it traditionally holds the annual event in late April.

Calling it their “biggest celebration of the depth and breadth of content across the Meta Quest Platform yet,” Meta is slated to share over 40 minutes of content, including a brand-new pre-show covering game updates and debut trailers, starting 15 minutes before the show begins.

Meta says to expect new game announcements, gameplay first-looks, updates to existing games, and more. There’s also set to be a post-show developer roundtable, which will feature conversation around upcoming games.

There could be at least one clue to what’s in store, as we get a brief glimpse at a horned helmet in the showcase’s promo video, which seems very much like Loki’s helmet from Rift exclusive Asgard’s Wrath (2019). Maybe Meta’s Sanzaru Games has slimmed down the Norse-inspired RPG?

Meanwhile, previous reports maintain Apple is finally set to unveil its long rumored mixed reality headset during the company’s WWDC keynote, taking place on Monday, June 5th.

Provided Apple indeed plans to announce its headset at WWDC, Meta could be looking to generate so-called ‘strategic noise’ to better manage market reactions and potentially offset negative sentiment prior to Apple’s expected announcement, which is undoubtedly slated to be a pivotal moment for the entire XR industry.

Meta recently released its Q1 2023 earnings report, showing a consistent investment of around $4 billion per quarter into its XR division Reality Labs. With Apple rumored to be unveiling their own XR headset and a host of apps, reportedly set to include everything from fitness to VR/AR gaming, Meta may want to showcase where some of that investment is going.

Who knows? We may even hear more about Meta’s promised Quest 3 at the gaming showcase, which the company has confirmed will “fire up enthusiasts” when it’s released at some point this year, notably targeting a higher price point than its Quest 2 headset.

To find out, tune into the Quest Gaming Showcase on June 1st at 10AM PT (local time here), livestreamed across the company’s various channels, including Twitch, Facebook, YouTube, and in Meta Horizon Worlds.



Visually Stunning Adventure ‘Hubris’ is Coming to PSVR 2 Soon, Promising Much Needed Improvements

Cyborn is bringing Hubris (2022), the sci-fi adventure for PC VR headsets, to PSVR 2 this month, promising a host of improvements we hope will buff out more than a few dull spots in the otherwise visually impressive game.

There’s no precise release date yet; Cyborn says in a PS blog post that Hubris will land on PSVR 2 at some point in May, notably bringing along with it “enhanced graphics and gameplay.”

This includes foveated rendering for sharper resolution, haptics & adaptive triggers for swimming and shooting, revamped reloading and aiming, new enemy variations, refined difficulty levels, and 3D audio.

Many of the things above were sorely lacking when we reviewed the game at launch in late 2022, cementing it as a visually stunning, but ultimately pretty flawed shooter from the get-go.

Some of those improvements will eventually be added to the PC VR version, but “probably not on the same day as the PS VR 2 release as it requires a lot more testing on different headsets when it is stable enough,” the studio says in the game’s official Discord. Cyborn has also confirmed Hubris will launch on Quest 2 at some point, although the studio hasn’t mentioned specifics.



“Operation Money Grab”: Get Your Spy Gear Ready for the Greatest Heist in ValoArena

Operation Money Grab is all systems go. The heist is officially on. Valo Motion has launched the latest addition to its growing library of multiplayer mixed reality games for ValoArena. This heist-themed game is set to challenge players physically and mentally as they become immersed in a spy adventure that’s almost straight out of a Hollywood movie. To complete all the missions, you’ll need to keep your wits about you and work well with your team.

Grab Your Squad and Pull Off the Greatest Heist in MR

Recruit a team of up to six players and attempt to pull off the greatest heist in ValoArena. Be spies or thieves and grab as much money and valuables as you can while outmaneuvering high-tech security systems and evading capture.


Upon entering the game arena, you’re transported to the lobby of the Museum of Money. Enter the elevator and brace yourself for a highly interactive and mentally stimulating spy adventure. As the elevator door opens, figure out how to get past the unique security system and snatch valuable items. Work as a team to solve puzzles and overcome physical obstacles on every floor of the building. Once all floors are cleared, head to the roof, then board the helicopter on standby for your grand escape.

Throughout the game, references to popular spy and heist movies add humor and make you feel like the lead star in a Hollywood film. Download the Valo Motion app to create shareable play videos and release your “trailer” on social media.

Powered by Valo Motion Technology

Like all other mixed reality games for ValoArena, Operation Money Grab is powered by Valo Motion’s proprietary motion tracking technology. With full body tracking, players can roam freely around the game arena and play untethered without using any wearables.

Large digital screens, visually stunning graphics, and spatial audio create a hyper-realistic virtual environment that immerses players in the game.


According to Lauri Lehtonen, Lead Developer at Valo Motion, the popularity of the game Groundfall, inspired by The Floor is Lava, made the team think of other playground games they could draw inspiration from. Lehtonen recounts,

“We noticed that quite a few of them are played by moving back and forth in a limited space. This gave rise to the idea of designing a game with a series of different challenges players must go through in the ValoArena play area,” Lehtonen said in a press release shared with ARPost. “That idea eventually developed into the espionage and robbery-themed game Operation Money Grab, where the game challenges vary a lot during one game, and the players are taken on a kind of mini-adventure through a building to be robbed.”

Innovative Games That Keep People Active

Along with Astro Blade, Runway Zero, Toywatch Island, and Groundfall, Operation Money Grab highlights Valo Motion’s aim to encourage people to lead active and healthy lives through interactive experiences. “Valo Motion takes great pride in developing ValoArena experiences that get people on their feet and moving,” said Raine Kajastila, CEO and founder of Valo Motion.

By creating innovative games that are entertaining and highly interactive, Valo Motion hopes to make it easier and much more fun for people to develop and maintain healthy habits. We expect to see its game library grow with more themed games that would pique the different interests of players of all ages.



Popular Quest 2 PC Streaming Software Adds ‘Super Resolution’ Feature for Enhanced Visuals

Virtual Desktop has collaborated with Qualcomm to integrate the company’s Snapdragon Game Super Resolution, a software enhancement squarely targeted at improving the quality and latency of PC visuals wirelessly streamed to Quest 2 and Pico devices.

Virtual Desktop is a great tool not only because it provides standalone headset users wireless access to their computers, but because its developer, Guy Godin, is constantly adding new features to tempt users away from built-in solutions, e.g. Air Link.

That’s a tall order, since built-in solutions like Air Link are typically free and usually pretty great, letting Quest and Pico users connect to their VR-ready PCs to play games like Half-Life: Alyx. But Virtual Desktop goes a few steps further: with its native PC application developed for high quality wireless streaming, you can do things like cycle through multiple physical monitors and even connect to up to four separate computers—a feature set you probably won’t see on the Air Link change log.

Now Godin has worked with Qualcomm to integrate the company’s Snapdragon Game Super Resolution for built-in upscaling, essentially creating higher resolution images from lower resolution inputs so it can be served up to standalone headsets in higher fidelity. Check out the results below:

Because producing clearer visuals with fewer resources is the name of the game, Qualcomm says in a blog post that its techniques can also reduce wireless bandwidth, system pressure, memory usage, and power requirements.

Godin says in a Reddit post that the new upscaling works with “Potato, Low, Medium quality (up to 120fps) and High (up to 90fps), and it upscales to Ultra resolution under the hood. It can work with SSW enabled as well and doesn’t introduce any additional latency.”

You can get Virtual Desktop on Quest over at the Quest Store, priced at $20. It’s also available on Pico Neo 3 and Pico 4, and will soon arrive on Vive Focus 3 and XR Elite too, Godin says.

Update (10:30 ET): Guy Godin reached out to Road to VR to correct that the new Snapdragon Game Super Resolution is available on Quest, Pico, and will soon come to Vive Focus 3 and XR Elite. We’ve included that in the body of the article.



Meta Reaffirms Commitment to Metaverse Vision, Has No Plans to Slow Billions in Reality Labs Investments

Meta announced its latest quarterly results, revealing that the company’s Reality Labs metaverse division is again reporting a loss of nearly $4 billion. The bright side? Meta’s still investing billions into XR, and it’s not showing any signs of stopping.

Meta revealed in its Q1 2023 financial results that its family of apps is now being used by over 3 billion people, an increase of 5% year-over-year, but its metaverse investments are still operating at heavy losses.

Reality Labs is responsible for R&D on the company’s most forward-looking projects, including the Quest virtual reality headset platform and its work in augmented reality and artificial intelligence. Meta CEO Mark Zuckerberg has warned shareholders in the past that Meta’s XR investments may not flourish until 2030.

Here’s a look at the related income losses and revenue for Reality Labs since it was formed as a distinct entity in Q4 2020:

Image created by Road to VR using data courtesy Meta

Meta reports Reality Labs generated $339 million in revenue during the first quarter of the year, a small fraction of the company’s $28.65 billion quarterly revenue, the bulk of which was generated by its family of apps—Facebook, Messenger, Instagram, and WhatsApp.

While the $3.99 billion loss may show the company is tightening its belt in contrast to Q4 2022, which was at an eye-watering $4.28 billion, Meta says we should still expect those losses to continue to increase year-over-year in 2023.

This follows the company’s second big round of layoffs, the most recent of which this month has affected VR teams at Reality Labs, Downpour Interactive (Onward) and Ready at Dawn (Lone Echo, Echo VR). The company says a third round is due to come in May, which will affect the company’s business groups.

Dubbed by Zuckerberg as the company’s “year of efficiency,” the Meta founder and chief said this during the earnings call regarding the company’s layoffs:

“This has been a difficult process. But after this is done, I think we’re going to have a much more stable environment for our employees. For the rest of the year, I expect us to focus on improving our distributed work model, delivering AI tools to improve productivity, and removing unnecessary processes across the company.”

Beyond its investment in AI, Zuckerberg says the recent characterization claiming the company has somehow moved away from focusing on the metaverse is “not accurate.”

“We’ve been focusing on both AI and the metaverse for years now, and we will continue to focus on both,” Zuckerberg says, noting that breakthroughs in both areas are essentially shared, such as computer vision, procedurally generated virtual worlds, and its work on AR glasses.

Notably, Zuckerberg says the number of titles in the Quest store with at least $25 million in revenue has doubled since last year, with more than half of Quest daily actives now spending more than an hour using their device.

The company previously confirmed a Quest 3 headset is set to release this year, which is said to be slightly pricier than the $400 Quest 2 headset with features “designed to appeal to VR enthusiasts.”



One of VR’s Smartest Room-scale Games Finally Comes to Quest 2

Room-scale puzzle Eye of the Temple (2021) is available on Quest 2 starting today, bringing one of VR’s most clever room-scale experiences to a platform where it probably makes the most sense.

Update (April 27th, 2023): Eye of the Temple is now live on the Quest Store for Quest 2, bringing its innovative room-scale puzzling to the standalone headset.

Ported to Quest with the help of Salmi Games, Eye of the Temple lets you explore a vast and treacherous temple and uncover the ancient legend of the Eye. Just make sure to have plenty of space in your room for walking, whipping, and hopefully no tripping.

Check out the new launch trailer, linked below:

Original Article (April 13th, 2023): Released on SteamVR headsets in 2021 by indie developer Rune Skovbo Johansen, Eye of the Temple is a unique puzzle game the likes of which we haven’t seen before or since.

The game’s innovative locomotion style lets you explore a massive temple complex with your own two feet, ushering you to jump onto moving platforms of all shapes and sizes, which importantly takes place within a 2×2m physical space.

What results is a mechanically pleasing and immersive experience that teleportation or even joystick-controlled smooth locomotion simply can’t provide. We liked it so much at the time, we even gave it Road to VR’s 2021 Excellence in Locomotion award.

Skovbo Johansen says the secret to the unique locomotion style is keeping the player in the center of the play area, a trick he says is “all about how the platforms are positioned relative to each other.”

Take a look at how it works in the explainer video below:

While most PC VR tethers provide enough slack to get around the required 2×2m play area, the amount of turning and jumping you’ll do in the physical space really pushes the user’s ability to ‘tune out’ the cable to the limit, as you have to unwind yourself and hop over the tether constantly—something you might not notice as much in less physical games.

There’s no word on when we can expect Eye of the Temple to release on Quest 2, which critically removes any cable faffing woes you may have.

In the meanwhile, catch the trailer below, and follow along with Skovbo Johansen on Twitter where he regularly posts updates on the game’s development.



‘Propagation VR’ Sequel Coming to Quest & SteamVR Next Week, Gameplay Trailer Here

Propagation VR (2020), the VR survival horror game for PC VR headsets, is getting a sequel called Propagation: Paradise Hotel, and it’s coming next week.

Update (April 27th, 2023): WanadevStudio announced Propagation: Paradise Hotel is coming on May 4th to Quest 2 and SteamVR headsets. You can now wishlist it on the Quest Store and Steam.

In Propagation: Paradise Hotel you are a solo adventurer taking on the role of Emily Diaz, who must explore the Paradise Hotel’s dark surroundings to find her lost twin sister Ashley. Use items, weapons, and tools as you progress through the story, which is filled with savage creatures thanks to a strange illness.

Check out the final gameplay trailer below:

Original Article (December 3rd, 2021): During Upload VR’s showcase, developer WanadevStudio unveiled the upcoming sequel, which promises to be an “intense VR survival horror adventure with thrilling storytelling, in which you will explore dark environments, make terrifying encounters and get your adrenaline pumping.”

WanadevStudio says the sequel will be a single-player adventure taking place in the Propagation universe, which will serve up a story that focuses on exploration, stealth, and action. And plenty of zombies and mutants.

Propagation VR launched for free on Steam back in September 2020, garnering it an ‘Overwhelmingly Positive’ user rating on the platform for its visceral zombie-shooting experience.

Wanadev estimates a late 2022 release on SteamVR headsets for Paradise Hotel (see update). The studio hasn’t mentioned whether the game is coming to other platforms besides SteamVR, though it did so with its previous title Ragnarock (2021), a Viking-themed rhythm game launched for both SteamVR and Oculus Quest.
