News


‘Another Fisherman’s Tale’ Review – A Captivating Sequel with More of Everything

We’re back for another dose of mind-bending puzzles à la Bob the Fisherman. The sequel offers up a longer, more emotionally complex story while packing in a ton of new puzzle mechanics that make Another Fisherman’s Tale feel leagues ahead of the original in almost every sense.

Another Fisherman’s Tale Details:

Available On:  SteamVR, Quest 2, PSVR 2

Release Date:  May 11th, 2023

Price: $30

Developer: Innerspace VR

Publisher: Vertigo Games

Reviewed On:  Quest 2

Gameplay

Bob’s tall tales are taller than ever this time around, as the adventure swells to new emotional depths that reveal more about the real Bob and his family. I won’t spoil the story behind Another Fisherman’s Tale, because it’s really something you should unfold yourself. It deals with love, loss, responsibility, freedom—it’s way heavier than the first, and often strays outside of the original’s safe storybook narration. Where you might have ignored some of the angsty dad drama of the original Fisherman’s Tale and just gotten on with the game’s smorgasbord of mind-bending puzzles, this time around the narrative takes center stage, all while presenting new and innovative mechanics to keep you guessing.

The most prominent mechanic on display is the new ability to detach, replace, and control your hands—like physically pop off your hands, trade them for more useful ones, and shoot them out to solve the sort of puzzles that only a little crawling (or swimming) remote-controlled hand-beast could. Although you really only have two other hand styles regularly at your disposal besides your wooden digits, a hook for climbing and a claw for snipping, the puzzle variations are impressively wide.

Image courtesy Innerspace VR, Vertigo Games

While there are a ton of one-off puzzles to complement it, a constant throughout the game is the need to remotely control your hands, which is done by twisting your motion controller in the correct relative direction and pressing the trigger to move them forward. You’ll be pulling levers, crawling your digits through circuitous routes, and grabbing key items before snapping them back to your arms where they belong. This took some getting used to, as oftentimes you’ll need to control your hands from afar while actually moving your body to a different location for a better viewpoint, which can be confusing since your body’s relative position changes and your hands sort of wig out.

And with separable hands, you might as well be able to pop your head off too. Simply press two buttons (‘B’ and ‘Y’ on Quest) and you’ll launch your head forward to reach far-flung areas for a better point of view. It basically plays out like dash teleportation that follows a predictable arc, as opposed to physically picking up your head and tossing it around, which wouldn’t be terribly comfortable. More on comfort below though.

Image courtesy Innerspace VR, Vertigo Games

While I wouldn’t call any of the puzzles particularly hard, they’re always creative and rewarding. I had hints enabled, although you can turn them off in the settings, which mutes a few of Bob’s timely and helpful lines. Still, Bob isn’t overbearing in how or when he delivers hints, making him feel much more like a dad who wants you to figure something out on your own than a ‘helpful robot’ that just wants you to get on with the puzzle already.

My personal playtime was just under four hours, which makes it nearly four times longer than the original game. I didn’t feel like any of it was filler either, which is a testament to the game’s deeper story and puzzle variations that require the player to develop skills that stay useful throughout—essentially everything I wanted from the original but didn’t get when it initially released in early 2019.

Immersion

The star of the show is undoubtedly Bob, who is brought to life by the whisky-soaked tones of the probably never-not-smoking French comedian Augustin Jacob. In my review of the first game, I called Jacob’s interpretation more akin to a kitschy short that you typically see before a proper Pixar movie—charming, but not enough.

Image courtesy Innerspace VR, Vertigo Games

Here we get a full-fat dose of Bob as well as a new cast of characters that are equally engaging, making it feel more like that Pixar adventure I wished it could have been in the first place. One thing that hasn’t changed though is the inclusion of a sweeping score, which perfectly frames the game’s linear, sometimes storybook style adventure.

While the story goes off the rails at points before snapping you back to reality, this isn’t a large, open world with a ton of freedom of movement, or even puzzle creativity. Another Fisherman’s Tale is compartmentalized into chapters, each of which has a number of linear areas to unlock. The physical variety of the spaces, though, makes it feel less like a long series of closed escape rooms, which might otherwise feel a little too repetitive. That’s simply not the case here, as you’re always left guessing at what your next adventure will be, and where you’ll go next.

Image courtesy Innerspace VR, Vertigo Games

Set pieces are thoughtfully designed, and the cartoony nature of the game looks generally very good, even on the game’s most humble target platform, Quest 2. Object interaction is very basic, although that doesn’t detract too much since it’s mostly levers and smaller momentary puzzle bits we’re dealing with here. As a result, there’s no inventory to speak of, since all tools are presented to you as needed, and thoughtfully poofed back into existence in case you lose them.

Comfort

Another Fisherman’s Tale has what we’d now consider the standard swath of comfort settings, which will let most anyone play with relative ease. There are moments though that personally make me feel a little iffy—not ‘hang my head in the toilet’ bad, but I’ve played VR long enough to know my triggers.

In lieu of cutscenes, the game loves to sweep your POV around slowly, which is mostly fine, although much of the time you’re being shrunk down so scenes can appear bigger. I generally dislike this lack of control, even if it only happens maybe once per chapter. There are also moments when your POV will be upside-down, however this too is a pretty rare occurrence. All things considered, it’s a pretty comfortable game that basically anyone can play without feeling anything but a few momentary bits of weirdness.

‘Another Fisherman’s Tale’ Comfort Settings – May 11th, 2023

Turning
  • Artificial turning
  • Snap-turn
  • Quick-turn
  • Smooth-turn

Movement
  • Artificial movement
  • Teleport-move
  • Dash-move
  • Smooth-move

Blinders
  • Head-based
  • Controller-based
  • Swappable movement hand

Posture
  • Standing mode
  • Seated mode
  • Artificial crouch
  • Real crouch

Accessibility
  • Subtitles (languages: English, Italian, German, French, Spanish, Japanese, Korean, Traditional Chinese, Simplified Chinese)
  • Dialogue audio (languages: English, French)
  • Adjustable difficulty
  • Two hands required
  • Real crouch required
  • Hearing required
  • Adjustable player height


Social VR Platform ‘VRChat’ to Lose Quest 1 Support Next Month

As part of its big send-off, Meta has already deprecated first-party social features on the original 2019 Quest, which includes access to Parties and Meta Horizon Home. If you thought you could keep using Quest 1 with other social VR platforms though, your choices are about to get even more limited, as VRChat is soon dropping support too.

The studio announced it was dropping support in a recent developer update, stating that VRChat will no longer provide support for the Meta Quest 1 headset after June 30th, 2023.

“This is primarily due to Meta’s deprecation of the Quest 1 SDK, which will prevent us from keeping VRChat updated properly on the device,” the studio says. “You can continue to use Quest 1 with Quest Link, Virtual Desktop, ALVR, or other tethering methods to connect your Quest 1 to a Windows PC. Please note that those other applications may also be deprecating the Quest 1, so keep an eye on their announcements and news posts.”

VRChat isn’t the only app losing Quest 1 support. One of the first to go was Meta-owned BigBox VR, which dropped Quest 1 support for its battle royale shooter Population: One late last year. Rec Room followed in early January 2023.

Meta-owned Downpour Interactive announced in February that its mil-sim shooter Onward will drop Quest 1 support on July 31st. Other games seeing upcoming support freezes include Myst, Zenith: The Last City, and Synth Riders. We expect to see many more in the coming months as the back catalogue of games is slowly phased out or dropped entirely.


Spring Has Sprung for Niantic and 8th Wall

It’s already been a year since Niantic acquired 8th Wall. While acquisitions can be a scary thing in the tech world, both companies are growing and strengthening through their partnership.

Pillars of the Earth

Niantic and 8th Wall are both AR companies that might be bigger and more important than some realize. However, they come at AR architecture and accessibility from different perspectives. Their coming together was a game changer that’s hard to overstate.

Niantic Senior Director of AR Product Marketing, Caitlin Lacey, helps us understand what the companies are doing in their own products and projects as well as how they are helping each other grow and develop.

“I joined Niantic a year ago primarily to focus on Lightship, and one of the things that I was really excited about coming in was the acquisition,” said Lacey. “Having 8th Wall as part of the Niantic family has definitely made it better.”

Niantic

For some readers, Niantic is synonymous with Pokémon Go. If you Ctrl+K “Niantic”, Google Docs suggests the Pokémon Go website as a link option. Other readers will recognize this as a gross misrepresentation. Pokémon Go may have made Niantic a household name, but it only scratches the surface of what the historic and storied company actually does.

In addition to games (including the just-released real-world AR pet game Peridot), Niantic has probably the largest and most detailed working virtual map of the world ever assembled. A few years ago, that was a neat trick. As devices become more powerful and AR gains traction, it’s increasingly becoming something a lot more than that.

Niantic Peridot AR pet game

Niantic games gather data for this virtual map of the world, but they also have a dedicated platform called Lightship that developers use to fill in the empty spots, add detail, and create their own experiences. Whether you’re building or playing, you’re using an app.

8th Wall

As with its parent company, readers have probably seen the 8th Wall logo on an AR experience but might not realize the magnitude of the operation. And as with its parent company, users can encounter 8th Wall either through experiences that they enjoy or through its developer tools.

Over the years, 8th Wall has been building out its developer tools and experiences, making them easier to use and accessible on more devices. The company has tools for augmenting the world around a user, as well as for augmenting users themselves through lenses and filters.

8th Wall’s experiences and developer tools are web-based. With no app installation required, they’re well-positioned to run on pretty much any connected device.

Web and Apps

Apps have a certain gravity, bringing both obstacles and opportunities. People know how apps work and they know what to expect. Apps can run larger and more in-depth experiences, but they only do one thing at a time. These two strengths are at odds when people expect an experience to do everything and do it well – an unrealistic expectation called “the metaverse.”

“It took a long time to train people how to use apps, but now they’re trained,” said Lacey. However, as she points out, “if you’re thinking about a future where all of these mobile technologies have AR capabilities”, opening and switching apps can become a hassle.

WebAR is getting better all the time, but it’s still limited in the experiences it can run. Out in the world, the problem compounds, as people are away from stable home networks and relying on burdened public networks or potentially spotty data coverage.

“There are still limitations to experience and file size that the web just can’t handle,” said Lacey. “As computing power continues to grow and get stronger, we’ll see better experiences across platforms.”

In the meantime, both companies are working to leverage their respective strengths in apps and WebAR, trying to achieve the best of both worlds in both worlds.

“On the Lightship side, there was tons of tech that was very app-based … we took that and asked, ‘What do you want, and how do we bring it to the web?’” said Lacey. “And then, on the other side, bringing things from the web to Lightship.”

Updates and Releases From Niantic and 8th Wall

In the last few weeks, some exciting changes have come for developers using both developer platforms – including some of those updates that look a lot like a cross-pollination between the two platforms.

Sky and World Effects

First, Sky Effects and World Tracking came to 8th Wall. These are two separate developer tools that allow an AR experience to augment the sky itself, or to help AR elements realistically appear in the physical world. However, when used together, a single experience can bridge the earth and heavens in new and immersive ways.

“With sky and world effects, an object drops from the sky, recognizes the environment, and can interact with that environment,” said Lacey. “We’re seeing that happen across the board and there’s more coming.”

To celebrate the launch, 8th Wall held the “Sky Effects Challenge” which invited developers to use the new technology in interesting and inventive ways. Creators turned the sky into a canvas, mapped the planets, and more.

“We are consistently amazed by what our community builds,” said Lacey.

A Cross-Device Scanning Framework

A new Scanning Framework for Lightship AR Developer’s Kit 2.5 allows users to virtually reconstruct physical spaces and objects without LiDAR. LiDAR is one of two common methods for capturing spatial data on mobile devices, but it’s only available on higher-end iOS devices. Opening the Scanning Framework to other methods greatly increases accessibility.

“We’ve continually heard the feedback, and we’re listening,” said Lacey. “We really want to be a consistent partner to developers in the AR space. We do believe that AR can help make the world more interesting and fun.”

Two New Games

8th Wall doesn’t do so much in the games category – again, games still work better as full apps for now. However, a big theme in this article is that the line between the two companies can be a little foggy these days – at least in terms of user experience. These apps likely benefited from 8th Wall technology and 8th Wall will likely benefit from what the apps learn and earn for Niantic.

Early this year, Niantic launched NBA All World. The app includes basketball mechanics and an NBA partnership, and grows to incorporate elements that make it more than just a game.

“Our version of an NBA basketball game starts with exciting one-on-one gameplay and expands from there to include the major elements of basketball culture, including music, fashion, sneakers, and more, all of which are integrated into real-world locations,” Niantic founder and CEO John Hanke said in a blog post.

If that wasn’t enough, by the time you read this, Peridot will be live. The highly anticipated game encourages players to nurture an AI-powered virtual pet, including feeding it, petting it, and playing with it. Players can also use Niantic’s social platform Campfire to meet with other players and breed new and unique Peridots (or Dots).

Spring Has Sprung for Niantic and 8th Wall

I’m not a huge basketball fan and Pokémon is a chapter of my life that closed a long time ago, but I’ve had my Dot Erin for a few days now. Erin mainly hangs out by my desk eating sandwiches, but was pretty excited to see the spring flowers in my backyard the other day.

Peridot AR pet game Niantic - Jon's Dot Erin

Much More to Come

Lacey advised that a lot more updates to Niantic and 8th Wall will continue to reinforce both platforms for the benefit of developers and end-users alike. There are also some interesting artistic activations coming in the next few weeks. And, of course, we’re excited about Peridot becoming publicly available. There’s definitely a lot more to come from this power pair.


Wonderland Engine Is Here to Make WebXR Development Faster and Easier

WebXR development is increasingly popular. Developers want to create content that users can enjoy without having to install apps or check the compatibility of their devices.
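That no-install premise rests on the browser itself advertising what it can do. As a rough sketch (this uses only the standard WebXR Device API available in WebXR-capable browsers, not anything specific to Wonderland Engine), a page can check for VR support before offering an entry point:

```javascript
// Hypothetical illustration of the no-install premise: a plain web page
// feature-detects VR support through the standard WebXR Device API before
// showing an "Enter VR" button. None of this is Wonderland Engine API.
async function describeXrSupport() {
  // navigator.xr only exists in WebXR-capable browsers; optional chaining
  // keeps this safe to run anywhere (it returns undefined elsewhere).
  const xr = globalThis.navigator?.xr;
  if (!xr) return "WebXR not available";
  const vrOk = await xr.isSessionSupported("immersive-vr");
  return vrOk ? "immersive-vr supported" : "inline sessions only";
}

describeXrSupport().then((msg) => console.log(msg));
```

Since `isSessionSupported()` resolves to a boolean, the same page can fall back to flat 3D on devices without a headset, which is how a single WebXR build ends up covering headsets, phones, and desktops alike.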

One of the companies working to advance immersive technologies, Wonderland GmbH, based in Cologne, Germany, has announced a giant leap forward in this process: the release of Wonderland Engine 1.0.0, a WebXR development platform already vouched for by top content creators.

Wonderland Engine 1.0.0 – Bringing Native XR Performance to WebXR Development

What is special about the new engine launched by Wonderland? Its first benefit is the ability to mimic native XR performance. Before its launch, Wonderland Engine 1.0.0 passed the test of content creators.

WebXR development platform Wonderland Engine editor vr website with browser

Vhite Rabbit XR and Paradowski Creative, two companies creating XR games, used the engine to develop content. The Escape Artist, an upcoming title by Paradowski Creative, was created with Wonderland Engine 1.0.0, and its developers say that it matches native games in terms of polish and quality.

“We’re excited to announce this foundational version of Wonderland Engine, as we seek to bridge the gap between native XR app development and WebXR,” said the CEO and founder of Wonderland, Jonathan Hale, in a press release shared with ARPost. “We see a bright future for the WebXR community, for its developers, hardware, support, and content.”

Top Features of Wonderland Engine 1.0.0

The developers who choose Wonderland GmbH’s WebXR development platform to create content will be able to use the following:

  • Full 8th Wall integration – complete integration of 8th Wall AR tracking features such as face tracking, image tracking, SLAM, and VPS;
  • Runtime API rewrite – better code completion, static checks for bugs before running the code, and complete isolation for integration with other libraries;
  • Translation tools – necessary for the localization of WebXR content;
  • Benchmarking framework – to check for content performance on various devices.

Developers can find the complete list of features and bug fixes on the official release page.

According to the company, Wonderland Engine users can launch their first running app into the browser in less than two minutes. With a bit of experience, users can build a multi-user environment that supports VR, AR, and 3D in 10 minutes, as demonstrated in this video.

The XR Development Platform Is Optimized for VR Browsers

To demonstrate its commitment to helping content creators, Wonderland GmbH is optimizing the tool specifically for the most popular VR browsers: Meta Quest Browser, Pico Browser, and Wolvic.

Wonderland Engine WebXR meta browser

Wonderland Engine-based apps support any headset that has a browser available, and any headset released in the future will automatically be supported, provided it has a browser. Apps created with Wonderland Engine can also run on mobile devices through the browser as Progressive Web Apps (PWAs), which also allows them to run offline.

Apart from the two game development companies mentioned above, the company is also working with various content creators.

“It was crucial to bring the whole ecosystem with us to test and validate the changes we made. This resulted in a highly reliable base to build upon in upcoming versions,” Hale said. “By making it easier to build XR on the web we hope to attract developers and content creators to WebXR. We see WebXR truly being able to rival native apps and offer consumers a rich world of rapidly accessible content to enjoy.”

Meet the Wonderland Team at AWE USA 2023

The creators of Wonderland Engine 1.0.0 will present the WebXR development platform at AWE USA 2023 (use ARPost’s discount code 23ARPOSTD for 20% off your ticket), which is taking place in Santa Clara, CA between May 31 and June 2.

The company is one of the sponsors of the event and will also be present at the event in booth no. 605.


Arcade Boxer ‘Creed: Rise to Glory’ Takes Top Spot in PSVR 2 Download Chart

PlayStation VR 2 is a little over two months old now, and the charts are still very much in flux thanks to a rash of new and upgraded games. Last month, Survios’ high-profile boxing title took the top spot in the US and Canada, and fared pretty well across other regions too.

Creed: Rise to Glory – Championship Edition is an overhauled version of Creed: Rise to Glory (2018) for PSVR 2, bringing new characters and a new location from the Creed III film in addition to new features, quality of life upgrades and PvP cross-platform support.

There’s a ton of movement in the charts, so we’ve included some new symbols to help show just how the games are faring month-to-month.

The chart below is counting PS Store purchases and not bundled or upgraded content, a category that notably includes big titles such as Horizon Call of the Mountain, Gran Turismo 7, and Resident Evil Village.

PSVR 2 April Top Downloads

US/Canada

1. Creed: Rise to Glory – Championship Edition (new)
2. Pavlov (↓1)
3. The Walking Dead: Saints & Sinners – Chapter 2: Retribution (↑1)
4. Job Simulator (↑6)
5. Kayak VR: Mirage (↓2)
6. The Dark Pictures: Switchback (↓4)
7. Star Wars: Tales from the Galaxy’s Edge (↓2)
8. Synth Riders (↑?)
9. PISTOL WHIP (↓3)
10. Swordsman VR (↓2)

EU

1. Pavlov (≡)
2. Creed: Rise to Glory – Championship Edition (new)
3. Kayak VR: Mirage (↓1)
4. The Walking Dead: Saints & Sinners – Chapter 2: Retribution (↓1)
5. Job Simulator (↑5)
6. The Dark Pictures: Switchback (↑?)
7. Star Wars: Tales from the Galaxy’s Edge (↓3)
8. Synth Riders (↓1)
9. Swordsman VR (↓1)
10. PISTOL WHIP (↓4)

Japan

1. Kayak VR: Mirage (≡)
2. Creed: Rise to Glory – Championship Edition (new)
3. Onogoro Monogatari ~The Tale of Onogoro~ (↑4)
4. LES MILLS BODYCOMBAT (↑?)
5. Horizon Call of the Mountain (↓2)
6. The Dark Pictures: Switchback VR (↓4)
7. After the Fall – Complete Edition (↑?)
8. Dyschronia: Chronos Alternate Episode I (↓4)
9. Swordsman VR (↑1)
10. Drums Rock (↓6)


Talespin Releases AI-powered, Web-Accessible No-Code Creator Platform

To prepare professionals for tomorrow’s workplace, you need to be able to leverage tomorrow’s technology. Talespin was already doing this with their immersive AI-powered VR simulation and training modules.

Now, they’re taking it a step further by rolling out a web-based no-code creator tool. To learn more, we reconnected with Talespin CEO Kyle Jackson to talk about the future of his company and the future of work.

The Road So Far

Talespin has existed as an idea for about ten years. That includes a few years before they started turning out experiences in 2015. In 2019, the company started leveraging AI technology for more nuanced storytelling and more believable virtual characters.

CoPilot Designer 3.0 Talespin

CoPilot Designer, the company’s content creation platform, released in 2021. Since then, it’s gone through big and small updates.

That brings us to the release of CoPilot Designer 3.0 – probably the biggest single change that’s come to the platform so far. This third major version of the tool is accessible on the web rather than as a downloaded app. We’ve already seen what the designer can do, as Talespin has been using it internally, including in its recent intricate story world in partnership with Pearson.

“Our North Star was how do you get the ability to create content into the hands of people who have the knowledge,” Jackson told ARPost this March. “The no-code platform was built in service of that but we decided we had to eat our own dogfood.”

In addition to being completely no-code, CoPilot Designer 3.0 has more AI tools than ever. It also features direct publishing to Quest 2, PC VR headsets, and Mac devices via streaming with support for Lenovo ThinkReality headsets and the Quest Pro coming soon.

Understanding AI in the Designer

The AI that powers CoPilot Designer 3.0 comes in two flavors – the tools that help the creator build the experience, and the tools that help the learner become immersed in the experience.

More generative 3D tools (tools that help the creator build environments and characters) are coming soon. The tools really developing in this iteration of CoPilot Designer are large language models (LLMs) and neural voices.

Talespin CoPilot Designer 3.0

Jackson described LLMs as the context of the content and neural voices as the expression of the content. After all, the average Talespin module could exist as a text-only interaction. But, an experience meant to teach soft skills is a lot more impactful when the situations and characters feel real. That means that the content can’t just be good, it has to be delivered in a moving way.

The Future of Work – and Talespin

While AI develops, Jackson said that the thing that he’s waiting for the most isn’t a new capability of AI. It’s trust.

“Right now, I would say that there’s not much trust in enterprise for this stuff, so we’re working very diligently,” Jackson told ARPost. “Learning and marketing have been two areas that are more flexible … I think that’s going to be where we really see this stuff break out first.”

Right now, that diligence includes maintaining the human component and limiting AI involvement where necessary. Where AI might help creators apply learning material, that learning material is still originally authored by human experts. One day AI might help to write the content too, but that isn’t happening so far.

“If our goal is achieved where we’re actually developing learning on the fly,” said Jackson, “we need to be sure that what it’s producing is good.”

Much of the inspiration behind Talespin in the first place was that as more manual jobs get automated, necessary workplace skills will pivot to soft skills. In short, humans won’t be replaced by machines, but the work that humans do will change.

As his own company relies more on AI for content generation, Jackson has already seen this prediction coming true for his team. As they’ve exponentially decreased the time that it takes for them to create content, they’re more able to work with customers and partners as opposed to largely serving as a platform to create and host content that companies made themselves.

Talepsin CoPilot Designer 3.0 - XR Content Creation Time Graph

Solving the Content Problem

To some degree, Talespin being a pioneer in the AI space is a necessary evolution of the company’s having been an XR pioneer. Some aspects of XR’s frontier struggles are already a thing of the past, but others have a lot to gain from leaning on other emerging technologies.

“At least on the enterprise side, there’s really no one doubting the validity of this technology anymore … Now it’s just a question of how we get that content more distributed,” said Jackson. “It feels like there’s a confluence of major events that are driving us along.”


An Effort to Hack PSVR 2 to Support PC VR Has Been Put on Indefinite Hold

The creator of a PC VR driver which includes support for the original PSVR headset says it is stepping away from hacking PSVR 2 to work with PC VR, citing frustrating technical, financial, and social challenges.

Mediator Software, the developer of a PSVR-to-PC SteamVR driver called iVRy, says it is putting efforts to hack PSVR 2 for PC VR compatibility on hold. Just days after saying it had managed to authenticate PSVR 2 on PC, the developer says the project is now on ice.

“I’m walking away from this project for the time being. Between spiralling costs, a never ending set of obstacles put forward by the PSVR2, unrealistic hype in blogs, abusive commenters and accusations of fraud, it has ceased to be fun. I’ll be back. Some time,” reads the announcement.

The developer also shared screenshots showing the kind of social media strife it was facing, apparently with regards to Mediator Software seeking financial support for the project from the community.

Aside from the social challenges, the struggle to get PSVR 2 working on SteamVR isn’t surprising. Despite their best efforts, the iVRy developers themselves previously said it was “unlikely” that PSVR 2 would be usable for PC VR “within five years of its release,” if ever.

That’s a shame considering PSVR 2 is one of the market’s best consumer headsets to date, and even has the basic ability to act like a proper display when plugged into a PC.

While we’d love to see PSVR 2 work with PC VR, the reality is that Sony has little incentive to let it happen.


Thanks to our pal Daniel Fearon for the tip!


EA’s ‘F1 23’ Racer Coming to PC VR Headsets Next Month, PSVR 2 Still Uncertain

Codemasters, the EA-owned developer behind the F1 racing franchise, announced F1 23 is coming to consoles and PC next month, again bringing its high-profile racing game to VR.

F1 23 is coming to PlayStation 4|5, Xbox Series X|S, Xbox One, and PC on June 16th, and the PC version is confirmed to include VR support.

Codemasters hasn’t said whether it’s also coming to PSVR 2 on PS5, so we’ll just have to wait and see. As it is now, F1 22 only supports PC VR headsets, and not PSVR.

Here’s how the studio describes the upcoming installment:

A new chapter in the thrilling “Braking Point” story mode delivers high-speed drama and heated rivalries. Race wheel-to-wheel at new Las Vegas and Qatar circuits, and earn rewards and upgrades in F1 World. New Red Flags add an authentic strategic element, and the 35% Race Distance feature delivers more action and excitement. Drive updated 2023 cars with the official F1 lineup of your favorite 20 drivers and 10 teams. Create your dream team and race to win in My Team Career Mode, compete in split-screen or in the expanded cross-platform multiplayer, and be more social with new Racenet Leagues.

Preorders are now available, priced at $70 across Steam, Epic Games, and EA Play.


One of VR’s Most Hardcore Apocalyptic Survival Games is Getting a Sequel

Into the Radius is a cult favorite for a reason: it offers up some of the most hardcore gameplay around in a very Stalker-inspired post-apocalyptic world, making for an absolutely unforgiving experience in the anomaly-ridden wasteland. Now developer CM Games says a sequel is in the works.

In a community update, CM Games says a second chapter to Into the Radius is currently being developed.

“We are in the pre-production phase, and will follow an Early Access development model like before when the time is right,” the studio says. “The original [Into the Radius] is a testament to how much our community has helped us in developing the game, and we want to continue this trend in the sequel.”

The studio says many user suggestions and ideas are currently being considered for the newest installment, although they’re not revealing anything beyond that right now. Into the Radius is currently available on SteamVR and Quest 2. The developers have said in the past that it’s also in development for other headsets, although there’s still no word on whether it’s coming to PSVR 2.

We’ll be following Into the Radius and its upcoming sequel via the game’s Discord (invite link), as CM Games is due to publish an FAQ soon that may answer more questions about the next chapter.



‘Propagation: Paradise Hotel’ Review – A Pretty Ok Impression of ‘Resident Evil’

Propagation: Paradise Hotel offers some patently terrifying moments of horror, but between the ever-lingering danger of zombified attacks and a few giant bosses—making for a very Resident Evil-inspired experience—there’s a bit of clunk that tarnishes what could have been a more memorable and cohesive experience. Still, it’s functionally a pretty solid zombie adventure that makes a clear departure from the franchise’s roots as a static wave shooter.

Propagation: Paradise Hotel Details:

Available On:  SteamVR, Quest

Release Date:  May 4th, 2023

Price: $20

Developer: WanadevStudio

Reviewed On:  Quest 2 via PC Link

Gameplay

It’s the zombie apocalypse, and you’re bumming around the bowels of a nondescript hotel in some nondescript part of the world. You won’t have a lot of time to hang with your fatherly security guard pal though, because you have to go and find your sister, who has gone missing for some reason. Okay, so the setup isn’t spectacular, but at least the zombie kill’n is pretty good, right? I’ll lead with an emphatic “yeah, mostly!”

Rule number one of zombies: shoot them in the head. That’s the ironclad directive you’re probably most familiar with, but there are some caveats in Propagation: Paradise Hotel. Shooting zombies in the head multiple times with a pistol makes them very sleepy. No, really. Shoot a zombie three times in the head and they’ll quietly lie down on the ground for a while. Sometime later, usually when you’ve triggered another lurch forward in the narrative, they’ll pop back up at a patently inopportune time to bother you once again.

As clear a departure from zombie orthodoxy as this is, the effect it had on me was something I can’t say I’ve felt in a zombie shooter before. Instead of worrying about walkers popping out from the ceiling (there are a few) or shambolically oozing out from closed doors or windows, you become much more fixated on every single corpse lying in the hallway, of which there are many. You aren’t roaming through an infinite hellscape either, as you’ll be backtracking, learning the layout of the hotel, and tip-toeing around zombies whilst pointing a gun in their face, lest they reawaken and start harassing you again.

Image courtesy WanadevStudio

In effect, any one of them could be waiting for you to lower your guard, open their eyes and grab onto your ankle. Sadly, a preemptive shot in the head is completely ineffectual, which is a letdown in the Immersion department, but more on that below. Just the same, you’ll be cautious because you can’t discount a single corpse, which is a new type of creepy that really kept me on edge. Knowing this, I would have loved the option to cut off some heads to put an end to the constant revisitations, but that’s just not in the cards.

That’s basically the case until you get a shotgun in the latter half of the game, and then those walkers stay down for good because you’ve effectively stumpified their infected brains (finally). What were previously one-on-one battles ramp up to three-on-one battles, putting the game’s only other gun (and its most powerful) to the test. You’ll also start to run into a few new classes of zombie in addition to some more difficult baddies, which offers some interesting variety in difficulty. Will you run into three walkers? One scorpion-style zombie? A ripped dude that can take a ton of shots to the face?

Image courtesy WanadevStudio

While there are a few difficult and unique zombies, there’s really only one true boss in the game. Full disclosure: I disliked it, and while I won’t spoil anything here, rest assured you’ll probably be frustrated too with how to take him down. He is ultra lame, and you’ll want to mute the game just so you don’t have to hear your character shout ad nauseam “I need to knock him out!”

Ok. No. I will spoil the boss. Skip this paragraph if you want to avoid the spoiler: What does “I need to knock him out” even mean?? Don’t I need to kill this bastard? Do I need to knock him out before I kill him? Is that a hint? Do I need something to do that? Maybe I need to call the elevator and rig up something to knock him out? Maybe I need to escape the lobby and head up the stairs to get something I missed? Maybe I need to explode a fire extinguisher in his face to knock him out? Nope. My hand phases right through those, so it can’t be that. Maybe I need to die a dozen times before I learn he has a specific attack pattern with a singular weak point, hit it three times and meander my way to sequel-bait then the end credits? Yup, that’s it.

Anyway, many of the mechanical bits of Propagation: Paradise Hotel are very functional and work well. The body-based inventory system isn’t overloaded, so you always have what you need, like a medical spray on your left flank, a flashlight that you can clip to your chest or hold in your hand, your 9mm pistol on your right, or the shotgun over your shoulder. It’s all there and easy to grab. This complements a 2D menu that you can pull up, which has mission-essential items, the map, settings, etc.

It’s not easy changing up a user’s expectation of level design when we all know more or less what to expect in a space as familiar as a hotel. Still, the game throws a few curveballs your way to keep you from mechanically looking through every room in the hotel, which spans seven levels. That said, the story itself didn’t feel like a compelling enough driver to keep you moving forward. The found notes add a little flavor, but don’t do enough to flesh out the background of what’s actually going on, leaving you to mostly just bump your head against each task until it’s complete so you can move on to the next.

Finally, the game, which took me around 3.5 hours to complete, also includes a few puzzles, although all solutions are published in found notes, so you just have to be thorough in your shelf-opening game.

Immersion

Everything about Propagation looks the part, but very little is actually interactive, making it feel more like a flatscreen game than it probably should. There are a few key items you can pick up and use, but everything else is pure set dressing. I don’t want to underplay just how good the game looks, as it offers a visual fidelity and variety that makes each room unique, and not at all the sort of copy-paste experience you’d logically expect from a motif that is basically supposed to look extremely uniform. Still, you can’t grab that fire extinguisher, or even pick up a bottle of detergent. You can only open doors and drawers, and interact with keys, key cards, and important notes.

That already feels pretty gamey enough, but just as things start getting good, you grab for an item and a big achievement pops up to ruin the atmosphere—because apparently you need to be constantly reminded that you just collected nine out of 30 secret items. I’d like my full field of view please, since I’m under constant threat of death and everything.

Image captured by Road to VR

One of the big narrative drivers is the game’s found notes, and I generally like the mechanic for its ability to either drive the narrative or unobtrusively flavor its back story. In VR, they can be especially immersive since you’re handling something that’s more of a physical artifact than just a bunch of text on a screen. This is where Propagation fails somewhat, as all notes feature a physical ‘next’ button at the bottom that you have to click, making it more like interacting with an eReader than something that was actually written by someone who lived, survived, and maybe even died in the hotel.

Maybe the notes could be shorter? Maybe they could have used the back of the paper? Maybe a different font? Whatever the case, interacting with a piece of paper shouldn’t feel this unnatural in a VR game.

Image captured by Road to VR

And the wacky unorthodoxy doesn’t stop there. While reloading weapons is a pretty standard experience, the gun’s ammo counter system is definitely not standard. The number indicated isn’t how many bullets you have in the gun; it’s how many are in the magazine. So, if you have 15 bullets in a fresh mag, as soon as you chamber one, the counter says 14.

That’s all well and good for the pistol, but if you forget it when using the shotgun, you may find yourself in deep dog doo-doo as you unintentionally cycle a live round out of the weapon by mistake. Provided you’ve loaded up the shotgun and chambered a round, you may have 3/4 rounds displayed. Once you’re in a tense battle though, and you’re displaying 0/4, you simply can’t be sure whether that 0/4 means you still have one in the chamber, or you don’t. You’ll load back up, shell by shell, until you’re at 4/4 again, but you don’t have any discernible visual indication whether you still have an empty chamber or not, so you cycle the pump just in case. An unspent shell flies out, lands on the floor, and disappears.

While it’s visually interesting and a mostly serviceable shooter despite those inherent flaws, the cherry on the cake is undoubtedly the game’s voice acting, which was clearly farmed out to native French speakers putting on their best American accents. This ranges from “I went to high school in Ohio for a year and picked up the accent pretty well,” to “How do you do, fellow American?” It’s more of an eccentricity than a knock per se, but it leaves me questioning where the hell I am on planet Earth.

Comfort

As a 100 percent walking-based experience that doesn’t include forced locomotion, like on a rollercoaster or similar vehicle, the game proves to be very comfortable, save a single moment when there is some camera shaking. With a wide range of standard comfort options, most everyone will be able to play Propagation: Paradise Hotel without too much issue.

‘Propagation: Paradise Hotel’ Comfort Settings – May 8th, 2023

Turning
- Artificial turning
- Snap-turn
- Quick-turn
- Smooth-turn

Movement
- Artificial movement
- Teleport-move
- Dash-move
- Smooth-move

Blinders
- Head-based
- Controller-based
- Swappable movement hand

Posture
- Standing mode
- Seated mode
- Artificial crouch
- Real crouch

Accessibility
- Subtitles: Yes
- Subtitle languages: English, Italian, German, French, Spanish, Japanese, Korean, Portuguese, Russian, Simplified Chinese, Polish
- Dialogue audio: Yes
- Dialogue languages: English
- Adjustable difficulty
- Two hands required
- Real crouch required
- Hearing required
- Adjustable player height



Digital Artist Behind Iconic PS5 Campaign Launches Evolving VR Art Gallery

You might not recognize the name Maxim Zhestkov, but if you paid any attention to the launch of PlayStation 5, you’ll almost certainly recognize his iconic digital art which accompanied the reveal of the console. Now Zhestkov has launched a virtual gallery that he says will feature an ever-growing collection of his digital works.

Maxim Zhestkov is the artist behind the satisfying swarm of particles that accompanied the reveal of PS5 back in 2020.

Much of Zhestkov’s work similarly employs space, motion, shapes, and sound, which makes virtual reality the perfect medium for others to experience it.

To that end Zhestkov has released a new VR experience called Modules, a virtual gallery where he’s shared 11 different works which users can explore at their own pace and from any angle, complete with artist commentary on each piece.

Modules is rendered in real-time and available on both Quest headsets and PC VR (as well as non-VR via Steam). Ironically, despite Zhestkov’s work on the PS5 reveal, the project isn’t available on PSVR 2.

Zhestkov says that Modules will “expand to contain [my] entire body of work.”

One of the scenes in ‘Modules’ | Image courtesy Maxim Zhestkov

“Over the course of years, the project will grow as the artist grows, expanding into new territories and blurring the boundaries between art, games, and reality,” he says.

The project’s website contains a roadmap of future expansions, with an ‘Interactive’ segment coming in Fall 2023, followed by ‘Collaborative’ and ‘Creative’ segments next year.



Eye-tracking is a Game Changer for XR That Goes Far Beyond Foveated Rendering

Eye-tracking—the ability to quickly and precisely measure the direction a user is looking while inside of a VR headset—is often talked about within the context of foveated rendering, and how it could reduce the performance requirements of XR headsets. And while foveated rendering is an exciting use-case for eye-tracking in AR and VR headsets, eye-tracking stands to bring much more to the table.

Updated – May 2nd, 2023

Eye-tracking has been talked about with regards to XR as a distant technology for many years, but the hardware is finally becoming increasingly available to developers and customers. PSVR 2 and Quest Pro are the most visible examples of headsets with built-in eye-tracking, along with the likes of Varjo Aero, Vive Pro Eye and more.

With this momentum, in just a few years we could see eye-tracking become a standard part of consumer XR headsets. When that happens, there’s a wide range of features the tech can enable to drastically improve the experience.

Foveated Rendering

Let’s first start with the one that many people are already familiar with. Foveated rendering aims to reduce the computational power required for displaying demanding AR and VR scenes. The name comes from the ‘fovea’—a small pit at the center of the human retina which is densely packed with photoreceptors. It’s the fovea which gives us high resolution vision at the center of our field of view; meanwhile our peripheral vision is actually very poor at picking up detail and color, and is better tuned for spotting motion and contrast than seeing detail. You can think of it like a camera which has a large sensor with just a few megapixels, and another smaller sensor in the middle with lots of megapixels.

The region of your vision in which you can see in high detail is actually much smaller than most people think—just a few degrees across the center of your view. The difference in resolving power between the fovea and the rest of the retina is so drastic, that without your fovea, you couldn’t make out the text on this page. You can see this easily for yourself: if you keep your eyes focused on this word and try to read just two sentences below, you’ll find it’s almost impossible to make out what the words say, even though you can see something resembling words. The reason that people overestimate the foveal region of their vision seems to be because the brain does a lot of unconscious interpretation and prediction to build a model of how we believe the world to be.

Foveated rendering aims to exploit this quirk of our vision by rendering the virtual scene in high resolution only in the region that the fovea sees, and then drastically cutting down the complexity of the scene in our peripheral vision where the detail can’t be resolved anyway. Doing so allows us to focus most of the processing power where it contributes most to detail, while saving processing resources elsewhere. That may not sound like a huge deal, but as the display resolution and field of view of XR headsets increase, the power needed to render complex scenes grows quickly.
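To make the idea concrete, here’s a minimal sketch (not any particular engine’s API) of how a renderer might map angular distance from the tracked gaze point to a shading-rate multiplier. The region thresholds here are illustrative assumptions, not values from a shipping headset:

```python
import math

def shading_scale(pixel_dir, gaze_dir):
    """Pick a render-resolution multiplier from the angular distance
    (eccentricity) between a pixel's view direction and the user's
    gaze direction. Both arguments are unit-length 3D vectors."""
    cos_angle = sum(p * g for p, g in zip(pixel_dir, gaze_dir))
    eccentricity = math.degrees(math.acos(max(-1.0, min(1.0, cos_angle))))
    if eccentricity < 5.0:    # foveal region: full resolution
        return 1.0
    if eccentricity < 20.0:   # near periphery: half resolution
        return 0.5
    return 0.25               # far periphery: quarter resolution
```

In practice engines apply this per tile or through variable-rate shading hardware rather than per pixel, but the principle is the same: spend full resolution only where the fovea is pointed.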

Eye-tracking of course comes into play because we need to know where the center of the user’s gaze is at all times, quickly and with high precision, in order to pull off foveated rendering. While it’s difficult to pull this off without the user noticing, it’s possible and has been demonstrated quite effectively on recent headsets like Quest Pro and PSVR 2.

Automatic User Detection & Adjustment

In addition to detecting movement, eye-tracking can also be used as a biometric identifier. That makes eye-tracking a great candidate for multiple user profiles across a single headset—when I put on the headset, the system can instantly identify me as a unique user and call up my customized environment, content library, game progress, and settings. When a friend puts on the headset, the system can load their preferences and saved data.

Eye-tracking can also be used to precisely measure IPD (the distance between one’s eyes). Knowing your IPD is important in XR because it’s required to move the lenses and displays into the optimal position for both comfort and visual quality. Unfortunately, many people understandably don’t know their IPD off the top of their head.

With eye-tracking, it would be easy to instantly measure each user’s IPD and then have the headset’s software assist the user in adjusting the headset’s IPD to match, or warn users that their IPD is outside the range supported by the headset.

In more advanced headsets, this process can be invisible and automatic—IPD can be measured invisibly, and the headset can have a motorized IPD adjustment that automatically moves the lenses into the correct position without the user needing to be aware of any of it, like on the Varjo Aero, for example.
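The measurement itself is simple once the tracker reports pupil positions. Here’s a sketch under the assumption (hypothetical, not a real SDK interface) that we receive pairs of 3D pupil-center positions in headset-space millimeters:

```python
def measure_ipd(samples):
    """Estimate IPD in mm from (left, right) pairs of 3D pupil-center
    positions, averaging across samples to smooth out tracking jitter."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return sum(dist(left, right) for left, right in samples) / len(samples)

# Two slightly noisy samples around a 63 mm IPD:
samples = [((-31.4, 0.0, 0.0), (31.5, 0.0, 0.0)),
           ((-31.6, 0.0, 0.0), (31.5, 0.0, 0.0))]
ipd_mm = measure_ipd(samples)  # averages to 63.0 mm
```

A headset with motorized lenses could feed a value like this straight into its adjustment routine; one with a manual dial would instead prompt the user.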

Varifocal Displays

A prototype varifocal headset | Image courtesy NVIDIA

The optical systems used in today’s VR headsets work pretty well but they’re actually rather simple and don’t support an important function of human vision: dynamic focus. This is because the display in XR headsets is always the same distance from our eyes, even when the stereoscopic depth suggests otherwise. This leads to an issue called vergence-accommodation conflict. If you want to learn a bit more in depth, check out our primer below:

Accommodation

Accommodation is the bending of the eye’s lens to focus light from objects at different distances. | Photo courtesy Pearson Scott Foresman

In the real world, to focus on a near object the lens of your eye bends to make the light from the object hit the right spot on your retina, giving you a sharp view of the object. For an object that’s further away, the light is traveling at different angles into your eye and the lens again must bend to ensure the light is focused onto your retina. This is why, if you close one eye and focus on your finger a few inches from your face, the world behind your finger is blurry. Conversely, if you focus on the world behind your finger, your finger becomes blurry. This is called accommodation.

Vergence

Vergence is the inward rotation of each eye to overlap each eye’s view into one aligned image. | Photo courtesy Fred Hsu (CC BY-SA 3.0)

Then there’s vergence, which is when each of your eyes rotates inward to ‘converge’ the separate views from each eye into one overlapping image. For very distant objects, your eyes are nearly parallel, because the distance between them is so small in comparison to the distance of the object (meaning each eye sees a nearly identical portion of the object). For very near objects, your eyes must rotate inward to bring each eye’s perspective into alignment. You can see this too with our little finger trick as above: this time, using both eyes, hold your finger a few inches from your face and look at it. Notice that you see double-images of objects far behind your finger. When you then focus on those objects behind your finger, now you see a double finger image.

The Conflict

With precise enough instruments, you could use either vergence or accommodation to know how far away an object is that a person is looking at. But the thing is, both accommodation and vergence happen in your eye together, automatically. And they don’t just happen at the same time—there’s a direct correlation between vergence and accommodation, such that for any given measurement of vergence, there’s a directly corresponding level of accommodation (and vice versa). Since you were a little baby, your brain and eyes have formed muscle memory to make these two things happen together, without thinking, anytime you look at anything.
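The geometry behind that correlation is simple to write down: for an object straight ahead at a given distance, the total vergence angle follows directly from the viewer’s IPD. A quick sketch (distances in meters, with a typical ~63 mm IPD assumed for the example values):

```python
import math

def vergence_angle_deg(ipd_m, distance_m):
    """Total vergence angle in degrees for an object straight ahead:
    each eye rotates inward by atan((ipd / 2) / distance)."""
    return math.degrees(2.0 * math.atan((ipd_m / 2.0) / distance_m))

near = vergence_angle_deg(0.063, 0.25)  # object 25 cm away: ~14.4 degrees
far = vergence_angle_deg(0.063, 10.0)   # object 10 m away: ~0.36 degrees
```

The steep falloff is why near objects are where vergence-accommodation conflict is felt most: beyond a few meters the eyes are nearly parallel, so the mismatch with a fixed-focus display shrinks accordingly.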

But when it comes to most of today’s AR and VR headsets, vergence and accommodation are out of sync due to inherent limitations of the optical design.

In a basic AR or VR headset, there’s a display (which is, let’s say, 3″ away from your eye) which shows the virtual scene, and a lens which focuses the light from the display onto your eye (just like the lens in your eye would normally focus the light from the world onto your retina). But since the display is a static distance from your eye, and the lens’ shape is static, the light coming from all objects shown on that display is coming from the same distance. So even if there’s a virtual mountain five miles away and a coffee cup on a table five inches away, the light from both objects enters the eye at the same angle (which means your accommodation—the bending of the lens in your eye—never changes).

That comes in conflict with vergence in such headsets which—because we can show a different image to each eye—is variable. Being able to adjust the image independently for each eye, such that our eyes need to converge on objects at different depths, is essentially what gives today’s AR and VR headsets stereoscopy.

But the most realistic (and arguably, most comfortable) display we could create would eliminate the vergence-accommodation issue and let the two work in sync, just like we’re used to in the real world.

Varifocal displays—those which can dynamically alter their focal depth—are proposed as a solution to this problem. There’s a number of approaches to varifocal displays, perhaps the most simple of which is an optical system where the display is physically moved back and forth from the lens in order to change focal depth on the fly.

Achieving such an actuated varifocal display requires eye-tracking because the system needs to know precisely where in the scene the user is looking. By tracing a path into the virtual scene from each of the user’s eyes, the system can find the point that those paths intersect, establishing the proper focal plane that the user is looking at. This information is then sent to the display to adjust accordingly, setting the focal depth to match the virtual distance from the user’s eye to the object.
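That ray-intersection step can be sketched directly. Since the two gaze rays rarely intersect exactly, a common approach is to take the point of closest approach between them; this is a geometric illustration, not any headset SDK’s actual API:

```python
def focal_depth(left_origin, left_dir, right_origin, right_dir):
    """Estimate focal depth from two gaze rays: find the midpoint of
    their closest approach, then return its distance from the point
    between the eyes. Inputs are 3D tuples; origins in meters."""
    def sub(a, b): return tuple(x - y for x, y in zip(a, b))
    def dot(a, b): return sum(x * y for x, y in zip(a, b))

    w0 = sub(left_origin, right_origin)
    a, b, c = dot(left_dir, left_dir), dot(left_dir, right_dir), dot(right_dir, right_dir)
    d, e = dot(left_dir, w0), dot(right_dir, w0)
    denom = a * c - b * b
    if abs(denom) < 1e-12:          # parallel rays: gaze at infinity
        return float('inf')
    s = (b * e - c * d) / denom     # closest-approach parameter, left ray
    t = (a * e - b * d) / denom     # closest-approach parameter, right ray
    p_l = tuple(o + s * v for o, v in zip(left_origin, left_dir))
    p_r = tuple(o + t * v for o, v in zip(right_origin, right_dir))
    midpoint = tuple((p + q) / 2.0 for p, q in zip(p_l, p_r))
    eye_center = tuple((p + q) / 2.0 for p, q in zip(left_origin, right_origin))
    offset = sub(midpoint, eye_center)
    return dot(offset, offset) ** 0.5
```

For example, eyes 63 mm apart both aimed at a point 1 m straight ahead yield a depth of 1.0 m, which the display system would then target as its focal plane.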

A well implemented varifocal display could not only eliminate the vergence-accommodation conflict, but also allow users to focus on virtual objects much nearer to them than in existing headsets.

And well before we’re putting varifocal displays into XR headsets, eye-tracking could be used for simulated depth-of-field, which could approximate the blurring of objects outside of the focal plane of the user’s eyes.

As of now, there’s no major headset on the market with varifocal capabilities, but there’s a growing body of research and development trying to figure out how to make the capability compact, reliable, and affordable.

Foveated Displays

While foveated rendering aims to better distribute rendering power between the part of our vision where we can see sharply and our low-detail peripheral vision, something similar can be achieved for the actual pixel count.

Rather than just changing the detail of the rendering on certain parts of the display vs. others, foveated displays are those which are physically moved (or in some cases “steered”) to stay in front of the user’s gaze no matter where they look.

Foveated displays open the door to achieving much higher resolution in AR and VR headsets without brute-forcing the problem by trying to cram pixels at higher resolution across our entire field of view. Doing so would not only be costly, but would also run into challenging power and size constraints as the number of pixels approaches retinal resolution. Instead, foveated displays would move a smaller, pixel-dense display to wherever the user is looking based on eye-tracking data. This approach could even lead to higher fields of view than could otherwise be achieved with a single flat display.

A rough approximation of how a pixel-dense foveated display looks against a larger, much less pixel-dense display in Varjo’s prototype headset. | Photo by Road to VR, based on images courtesy Varjo

Varjo is one company working on a foveated display system. They use a typical display that covers a wide field of view (but isn’t very pixel dense), and then superimpose a microdisplay that’s much more pixel dense on top of it. The combination of the two means the user gets both a wide field of view for their peripheral vision, and a region of very high resolution for their foveal vision.

Granted, this foveated display is still static (the high resolution area stays in the middle of the display) rather than dynamic, but the company has considered a number of methods for moving the display to ensure the high resolution area is always at the center of your gaze.

Continued on Page 2: Better Social Avatars »
