There are two displays inside the Vision Pro, one for each eye. Each offers just under 4K resolution.
Samuel Axon
This is the infamous battery pack. It’s about the size of an iPhone (but a little thicker), and it has a USB-C port for external power sources.
There are two buttons for the Vision Pro, both on the top.
You can see the front-facing cameras that handle passthrough video just above the downward-facing cameras that read your hand gestures here.
Apple offers several variations of the light seal to fit different face shapes.
You can get a lot of work done while wearing Apple’s Vision Pro and have fun doing it—but it’s not yet at the stage where most of us will want to fully embrace spatial computing as the new way of working.
I spent more than a week working almost exclusively in the Vision Pro. I carried on Slack conversations, dialed into Zoom video calls, edited Google Docs, wrote articles, and did everything else I do within my day-to-day responsibilities as an editor at Ars Technica.
Throughout the experience, I never stopped thinking about how cool it was, like I was a character in a cyberpunk novel. The Vision Pro opens some new ways of approaching day-to-day work that could appeal to folks with certain sensibilities, and it offers access to some amenities that someone who hasn’t already invested a lot into their home office setup might not already have.
At the same time, though, I never quite zeroed in on a specific application or use case that made me think my normal habit of working on a MacBook Pro with three external monitors would be replaced. If you don’t already have a setup like that—that is to say, if you’ve just been working on a laptop on its own—then the Vision Pro can add a lot of value.
I plan to explore more use cases in the future, like gaming, but this is the last major piece in a series of sub-reviews I’ve done on the Vision Pro’s various applications, like entertainment and use as an on-the-go mobile device.
My goal has been to see if the Vision Pro’s myriad use cases add up to $3,500 of value for today’s computing enthusiast. Productivity is front and center in how Apple markets the device, so this is an important one. Let’s see how it holds up.
The basics
Outside the realm of entertainment, visionOS and its apps are mostly about flat windows floating in 3D space. There are very few apps that make use of the device’s 3D capabilities in new ways that are relevant to productivity.
There are two types of visionOS apps: spatial apps and “Compatible Apps.” The former are apps designed to take advantage of the Vision Pro’s spatial computing capabilities, whereas Compatible Apps are simply iPad apps that work just fine as flat windows within the visionOS environment.
Let’s find out if the Vision Pro can be an adequate replacement for this, my usual work space.
In either case, though, you’re usually just getting the ability to put windows around you. For example, I started out by sitting at my kitchen table and putting my writing app in front of me, Slack and my email app off to the side, and a browser window with a YouTube video playing on the other side. This felt a bit like using several large computer monitors, each with an app maximized. It’s cool, and the ability to shift between your real environment and fully immersive virtual ones can help with focus, especially if you do intensive creative work like writing.
If there’s one thing Apple has nailed better than any of its predecessors in the mixed reality space, it’s the interface. Wherever your eyes are looking, a UI element will glow to let you know it’s the item you’ll interact with if you click. Clicking is done by simply tapping two of your fingers together almost anywhere around your body; the headset has cameras all over, so you don’t have to hold your hands up or in front of you to do this. There are also simple pinching-and-moving gestures for scrolling or zooming.
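The gaze-then-pinch model described above can be sketched as a simple two-step state machine: eye tracking continuously selects a focus target, and the pinch fires whatever currently has focus. The sketch below is purely illustrative Python (not Apple's API); all names are hypothetical.

```python
# Illustrative sketch of gaze-highlight + pinch-select, assuming a UI where
# exactly one element can hold gaze focus at a time. Not visionOS code.
from dataclasses import dataclass


@dataclass
class UIElement:
    name: str
    highlighted: bool = False


def update_focus(elements, gaze_target_index):
    """Glow whichever element the eyes rest on; dim all the others."""
    for i, el in enumerate(elements):
        el.highlighted = (i == gaze_target_index)


def on_pinch(elements):
    """A finger pinch activates the currently gazed-at element, if any."""
    for el in elements:
        if el.highlighted:
            return f"activated {el.name}"
    return "no target"


ui = [UIElement("Slack"), UIElement("Safari"), UIElement("Mail")]
update_focus(ui, 1)   # eyes resting on the browser window
print(on_pinch(ui))   # -> activated Safari
```

The key design point is that selection (gaze) and confirmation (pinch) are decoupled, which is why the hands can stay in your lap: the pinch carries no positional information of its own.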
Dive into the comments on those videos and you’ll see a consistent ratio: about 20 percent of the commenters herald this as the future, and the other 80 mock it with vehement derision. “I’ve never had as much desire to disconnect from reality as this guy does,” one reads.
Over the next few weeks, I’m going all-in on trying the Vision Pro in all sorts of situations to see which ones it suits. Last week, I talked about replacing a home theater system with it—at least when traveling away from home. Today, I’m going over my experience trying to find a use for it out on the streets of Chicago.
I’m setting out to answer a few questions here: Does it feel weird wearing it in public spaces? Will people judge you or react negatively when you wear it—and if so, will that become less common over time? Does it truly disconnect you from reality, and has Apple succeeded in solving virtual reality’s isolationist tendencies? Does it provide enough value to be worth wearing?
As it turns out, all these questions are closely related.
The potential of AR in the wild
I was excited about the Vision Pro in the lead-up to its launch. I was impressed by the demo I saw at WWDC 2023, even though I was aware that it was offered in an ideal setting: a private, well-lit room with lots of space to move around.
Part of my excitement was about things I didn’t see in that demo but that I’ve seen developers explore in smartphone augmented reality (AR) and on niche platforms like HoloLens and Xreal. Some smart folks have already produced a wide variety of neat tech demos showing what you can do with a good consumer AR headset, and many of the most exciting ideas work outside the home or office.
I’ve seen demonstrations of real-time directions provided with markers along the street while you walk around town, virtual assistant avatars guiding you through the airport, menus and Yelp reviews overlaid on the doors of every restaurant on a city strip, public art projects pieced together by multiple participants who each get to add an element to a virtual statue, and much more.
Of course, all those ideas—and most others for AR—make a lot more sense for unintrusive glasses than they do for something that is essentially a VR headset with passthrough. Nonetheless, I was hoping to get a glimpse at that eventuality with the Vision Pro.
The most interesting thing about AMD’s Ryzen 7 8700G CPU is the Radeon 780M GPU that’s attached to it.
Andrew Cunningham
Put me on the short list of people who can get excited about the humble, much-derided integrated GPU.
Yes, most of them are afterthoughts, designed for office desktops and laptops that will spend most of their lives rendering 2D images to a single monitor. But when integrated graphics push forward, they can open up possibilities for people who want to play games but can only afford a cheap desktop (or who have to make do with whatever their parents will pay for, which was the big limiter on my PC gaming experience as a kid).
That, plus a separate but complementary interest in building small mini-ITX-based desktops, has kept me interested in AMD’s G-series Ryzen desktop chips (which it sometimes calls “APUs,” to distinguish them from the other Ryzen CPUs). And the Ryzen 8000G chips are a big upgrade from the 5000G series that immediately preceded them (this makes sense, because as we all know the number 8 immediately follows the number 5).
We’re jumping up an entire processor socket, one CPU architecture, three GPU architectures, and to a new generation of much faster memory; especially for graphics, it’s a pretty dramatic leap. It’s an integrated GPU that can credibly beat the lowest tier of currently available graphics cards, replacing a $100–$200 part with something a lot more energy-efficient.
As with so many current-gen Ryzen chips, still-elevated pricing for the socket AM5 platform and the DDR5 memory it requires limit the 8000G series’ appeal, at least for now.
From laptop to desktop
AMD’s first Ryzen 8000 desktop processors are what the company used to call “APUs,” a combination of a fast integrated GPU and a reasonably capable CPU.
AMD
The 8000G chips use the same Zen 4 CPU architecture as the Ryzen 7000 desktop chips, but the way the rest of the chip is put together is pretty different. Like past APUs, these are actually laptop silicon (in this case, the Ryzen 7040/8040 series, codenamed Phoenix and Phoenix 2) repackaged for a desktop processor socket.
Generally, the real-world impact of this is pretty mild; in most ways, the 8700G and 8600G will perform a lot like any other Zen 4 CPU with the same number of cores (our benchmarks mostly bear this out). But to the extent that there is a difference, the Phoenix silicon will consistently perform just a little worse, because it has half as much L3 cache. AMD’s Ryzen X3D chips revolve around the performance benefits of tons of cache, so you can see why having less would be detrimental.
The other missing feature from the Ryzen 7000 desktop chips is PCI Express 5.0 support—Ryzen 8000G tops out at PCIe 4.0. This might, maybe, one day in the distant future, eventually lead to some kind of user-observable performance difference. Some recent GPUs use an 8-lane PCIe 4.0 interface instead of the typical 16 lanes, which limits performance slightly. But PCIe 5.0 SSDs remain rare (and PCIe 4.0 peripherals remain extremely fast), so it probably shouldn’t top your list of concerns.
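To put numbers on why the PCIe 4.0 ceiling rarely matters in practice, it helps to compute per-direction link bandwidth from the spec'd transfer rates. This is a rough sketch using published per-lane rates (PCIe 4.0: 16 GT/s with 128b/130b encoding; PCIe 5.0 doubles that); real-world throughput is a bit lower due to protocol overhead.

```python
# Back-of-the-envelope PCIe bandwidth, per direction.
# Per-lane transfer rates come from the PCIe spec; 128b/130b is the
# line encoding used by gen 3 and later.
def pcie_bandwidth_gbps(gen: int, lanes: int) -> float:
    """Approximate usable bandwidth in GB/s for one direction of a link."""
    transfer_rate = {3: 8.0, 4: 16.0, 5: 32.0}[gen]  # GT/s per lane
    encoding = 128 / 130  # 128b/130b line-code efficiency
    return transfer_rate * encoding / 8 * lanes  # bits -> bytes

print(f"PCIe 4.0 x16: {pcie_bandwidth_gbps(4, 16):.1f} GB/s")
print(f"PCIe 4.0 x8:  {pcie_bandwidth_gbps(4, 8):.1f} GB/s")
print(f"PCIe 5.0 x16: {pcie_bandwidth_gbps(5, 16):.1f} GB/s")
```

Even the halved x8 link that some budget GPUs use still moves roughly 15–16 GB/s each way on PCIe 4.0, which is why the performance penalty is slight rather than crippling.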
The Ryzen 5 8500G is a lot different from the 8700G and 8600G, since some of the CPU cores in the Phoenix 2 chips are based on Zen 4c rather than Zen 4. These cores have all the same capabilities as regular Zen 4 ones—unlike Intel’s E-cores—but they’re optimized to take up less space rather than hit high clock speeds. They were initially made for servers, where cramming lots of cores into a small amount of space is more important than having a smaller number of faster cores, but AMD is also using them to make some of its low-end consumer chips physically smaller and presumably cheaper to produce. AMD didn’t send us a Ryzen 8500G for review, so we can’t see exactly how Phoenix 2 stacks up in a desktop.
The 8700G and 8600G chips are also the only ones that come with AMD’s “Ryzen AI” feature, the brand AMD is using to refer to processors with a neural processing unit (NPU) included. Sort of like GPUs or video encoding/decoding blocks, these are additional bits built into the chip that handle things that CPUs can’t do very efficiently—in this case, machine learning and AI workloads.
Most PCs still don’t have NPUs, and as such they are only barely used in current versions of Windows (Windows 11 offers some webcam effects that will take advantage of NPU acceleration, but for now that’s mostly it). But expect this to change as they become more common and as more AI-accelerated text, image, and video creation and editing capabilities are built into modern operating systems.
The last major difference is the GPU. Ryzen 7000 includes a pair of RDNA2 compute units that perform more or less like Intel’s desktop integrated graphics: good enough to render your desktop on a monitor or two, but not much else. The Ryzen 8000G chips include up to 12 RDNA3 CUs, which—as we’ve already seen in laptops and portable gaming systems like the Asus ROG Ally that use the same silicon—is enough to run most games, if just barely in some cases.
That gives AMD’s desktop APUs a unique niche. You can use them in cases where you can’t afford a dedicated GPU—for a time during the big graphics card shortage in 2020 and 2021, a Ryzen 5700G was actually one of the only ways to build a budget gaming PC. Or you can use them in cases where a dedicated GPU won’t fit, like super-small mini ITX-based desktops.
The main argument that AMD makes is the affordability one, comparing the price of a Ryzen 8700G to the price of an Intel Core i5-13400F and a GeForce GTX 1650 GPU (this card is nearly five years old, but it remains Nvidia’s newest and best GPU available for less than $200).
Let’s check on performance first, and then we’ll revisit pricing.
There are a lot of possible interactions in virtual reality. The standard Quest 2 controllers just don’t always cut it anymore. Fortunately, there’s a large market of accessories manufacturers making adapters for different games and use cases. Not least among them is YOGES.
YOGES at It Again
YOGES specializes in accessories for the Meta Quest 2 headset and Quest 2 controllers. We’ve already reviewed one of their head strap alternatives for the device and found it to be comfortable and competitively priced. So when they invited us to try out their “handle attachments,” of course we were curious.
Before we jump into the playthroughs, let’s look at what’s in the box.
Unboxing
The minimal YOGES packaging for the handle attachments packs one handle for each controller, one detachable lanyard for each controller, and a connector piece turning the whole set into one two-headed controller. There are also two extra velcro ties to hold the controllers into the adapters – just in case. A set of directions is included as well, but it’s a simple setup.
Each standard Quest 2 controller slots into an adapter labeled “L” or “R.” A velcro tab then secures the controller to the adapter via the tracking ring – so the set is likely not compatible with the ringless Quest Pro controllers. The bottom of each adapter is threaded: screw on a lanyard attachment, or screw the adapter into either end of the connector piece.
The lightweight adapters have a hollow core encased in durable-feeling molded foam. That hollow core keeps the weight and probably the cost down, but it also means that you can insert your Quest 2 controllers without removing the lanyards from them. That’s a handy feature because you might not want these adapters for everything that you do in VR.
The full rig measures almost exactly two feet. Each controller in a separate adapter with the lanyard attachment comes in at about ten inches – some five and a half inches longer than the Quest 2 controller by itself.
The adapters extend the Quest 2 controllers but don’t allow you to interact with them in any way. That is, you’ve still got to be holding the controller to press buttons and triggers. Fortunately, the lanyard on the end is long enough that you can put it around your wrist and still stretch past the entire adapter to reach the controller.
Playtesting the Adapters for Quest 2 Controllers
I was worried that that length was going to throw off my game. It seems to me that if the adapter adds a few inches, the Quest 2 thinks my arm is a few inches longer than it is – right? This shouldn’t make much difference when slashing sabers or playing Gorilla Tag, but I was all set for pickleball to be a nightmare.
Playin Pickleball
But then, it wasn’t. I don’t know if the Quest 2 is smarter than I gave it credit for or if my brain was a lot more ready to accept the extended controller as a part of my arm, but I had no trouble hitting the ball reliably into targets in a practice mode.
Playin Pickleball also might be the game that has seen the most flying Quest 2 controllers in my home – lanyards are a must. However, I didn’t use the lanyards to play with the YOGES adapter – the extra length and the molded foam made it significantly easier to hold onto a paddle.
Kizuna AI – Touch the Beat!
I had a bit more of a time getting used to the adapters when I played a round of Kizuna AI – Touch the Beat!. If you haven’t played the game, it’s very similar to Beat Saber but with smaller targets, smaller sabers, and different motion challenges.
Things took some more getting used to, possibly because the sabers are narrower than a pickleball paddle so my movements needed to be even more precise. I did also hit my overhead light at least once, though I’m not entirely sure that that was because of the adapter. Still, by the end of the first song, I had a pretty memorable streak going.
Bait!
From here, I really wanted to use the adapter as a sword handle in Battle Talent, but in Battle Talent you need to hold the trigger to hold the weapon, so that was a no-go. You also pump both arms and use the joysticks to run, so I couldn’t just leave a controller down and dedicate myself to two-handed weapons. I wondered about how the handle might work as a fishing rod in Bait!.
In Bait! you hold the rod and cast with one hand but use the trigger on the other controller to reel it in. I let the left-hand controller (sans adapter) hang off of my left wrist as I used the right controller (with adapter) to do a double-handed cast. It was a little awkward because Bait! was still tracking the left-hand controller as it flopped through the air, but the cast was beautiful.
Is it Worth the Price?
Depending on where, when, and how you buy the YOGES Handle Attachments, they run between $18.58 (the price on Amazon at the time of writing) and $33.98 (the price currently listed on the YOGES website). That’s fairly competitive for adapters of this kind – and most adapter sets don’t include the connector piece.
As always, whether or not that’s worth the price depends on the games that you play. For as many games as I found improved by the adapters, I have at least as many that wouldn’t work. Maybe that’s not the case for you. Or maybe it is but you feel really passionate about improving your VR fishing cast or your virtual pickleball game.
I will say that on all of the games that were compatible with these adapters for Quest 2 controllers (and Bait!) my game was improved – or at least felt improved.
Parting Thoughts
So far, I continue to be pleased with YOGES. The Quest 2 Controller Handle Attachments, like the headset strap, are lightweight and low-cost comfortable adapters. While they may not be for all people or in all cases, they certainly have their place in the VR accessories ecosystem.
What better place to play a game about an alien invasion in your backyard than in your backyard? When a game studio offered to stage an alien invasion right here in my neck of the woods, I shelved my concerns about violent video games and picked up my mobile phone to see what Alien Invasion AR FPS is all about.
Resisting an Alien Invasion in Augmented Reality
Set in the not-too-distant future, Alien Invasion AR FPS by Stary tells the story of an insidious and subtle alien foe. The aliens, nicknamed “Jackers,” came in peace and even brought gifts. However, the gifts were sabotaged, and the aliens quickly showed their true colors, effectively taking over the planet.
In Alien Invasion AR FPS, you play the part of a resistance fighter in this sort of Sci-Fi “Red Dawn” situation. Use limited resources and unlimited resourcefulness to take back your home from the Jackers. But, how does it all play out?
Narrative and Gameplay
Alien Invasion AR FPS unlocks level-by-level in an unfolding linear narrative starring you and your “commanding officer” in the resistance. The introductory video as well as your mission brief at the beginning of each stage involves some compelling art but some humdrum voicework.
As a resistance fighter, you spend most of the early missions on tasks like planting explosives or setting up defensive positions. Each mission brief starts out by explaining how the success of the previous mission shifted the balance of the overarching conflict, which helps give a sense of purpose to gameplay that can otherwise feel repetitive.
As the game progresses, your victories unlock more resources for the resistance, including new weapons. The beginning of many of the early levels has a brief tutorial on how to use any new equipment that you have unlocked. You have unlimited ammunition, but health and grenades are limited and need to be sourced from throughout the levels.
The game currently consists of four levels of four stages each plus the intro video. I haven’t beaten the whole game yet, but the names of the levels and material provided by the game’s publisher suggest that the resistance does eventually succeed in driving the Jackers from Earth.
Playing Alien Invasion AR FPS
Alien Invasion AR FPS is a free app download for iOS 12 and newer, and for Android 8.0 and newer, and it’s surprisingly agile. The app is still in its early days – maybe one day it will have a marketplace for buying extra supplies, or maybe it will use the AR ad formats Niantic is exploring. But for now, it’s really just free.
From the technical perspective, the game plays out in a series of digital sets that you place in your physical environment. The game recommends a play area of almost 50 square feet, so it recommends playing outside. Even outside, I don’t think that I ever played in an area that big, but my backyard was big enough.
Once your mobile device recognizes that you’re in a large enough space, you tap the ground to place the virtual elements. Getting the angle exactly right is tricky and if you don’t figure it out pretty well, those virtual elements can be too high or too low, which kind of ruins the effect and impacts playability.
Once the stage is set, you navigate through the space by physically moving through your environment. If the area isn’t large enough, you can pause the game, move to a new position, and resume the game. Typically, you perform some initial task, move to cover, and confirm that you’re in place. Then, the wave of Jackers comes for you.
Buttons on the screen manage your various healing kits, your weapons and firing, and additional equipment that you gradually unlock and use, like hand grenades.
Letdowns and Triumphs
Unfortunately, what the stage looks like doesn’t change based on your physical environment. My backyard has a shed and some stone retaining walls, so it would have been cool if the game had recognized these and incorporated them into the stage design – but I understand that that’s a huge ask for a free mobile app.
Ducking and moving from cover to cover is effective and feels right. You also have to explore each stage a little if you want to collect resources like health kits. And your health kits don’t replenish at the beginning of each stage, so at least taking a good look around before the first wave comes is highly recommended.
My general strategy was to hunker down wherever I started the level and fight in place. Although, at one point, the last Jacker in a stage refused to leave his cover, so I got up and charged through the map firing my SMG. There was definitely a moment of thinking “This is exactly the way that an AR FPS is supposed to feel.”
Speaking of “feel,” Alien Invasion AR FPS doesn’t have haptic support – the phone doesn’t vibrate when I fire a gun or get shot. This feels like a huge missed opportunity, but it can’t simply be an oversight, so I’m confident that it will come in an update at some point.
Compromises Paid Off Overall
We’ve already seen one area where the choice to make the AR FPS affordable and accessible might have meant going without some potentially more immersive features. There’s one more big thing about this app that I didn’t mention that likely fits in the same camp: it doesn’t require data or Wi-Fi. At least, not yet. The game’s roadmap includes multiplayer that probably will.
For me, this is a huge win – and it makes a lot of sense for a game that was designed to be played outdoors. As someone who’s seen too many Pokémon trainers throwing balls into their bathtubs because they didn’t have connections outside of their homes, an AR game that doesn’t require connectivity feels like a breath of fresh air.
Again, that’s with the understanding that other AR games can do things that this one can’t. As a technical showpiece for AR, this game might not blow picky critics out of the water. But, as an artistic showcase for AR, this game elevates an enjoyable and well-executed first-person shooter onto a new level of play.
But How Did it Make Me Feel?
I mentioned at the top of this piece that I’m historically not a fan of violence in video games – particularly XR video games. It was something that I struggled with as I approached Peaky Blinders: The King’s Ransom. In my playthrough, I found that that game managed graphic content in such a way that it was able to be a part of the story without overwhelming the player.
I feel similarly about AR use in Alien Invasion AR FPS. It also helps that in Alien Invasion I’m killing aliens instead of Englishmen – that sits better with me. But, the aliens aren’t rendered in such quality that I have to intimately consider their death – they don’t even bleed like the gang members and political agitators that I virtually shot down in London and Birmingham.
Returning to Alien Invasion’s use of AR as an artistic medium rather than strictly as a game development tool, there’s a lot to be said for the way that AR tells this story about, well, an alien invasion.
Early in the game, I load an anti-aircraft gun that shoots down an alien ship – and it happens over my backyard. As I watched the airship go down behind my laundry line, I imagined it crashing down the road from my house and blocking traffic. It was another one of those moments that felt like a win for the development studio: this is what an AR FPS can do.
It’s Free
Are there things that I would like to see in updates to Alien Invasion AR FPS? Yes. Are there things that I can complain about from the game? Not really. As a lightweight, connection-optional mobile-based AR FPS that you can download and play for free, I really can’t think of any reason not to recommend that you at least give the game a try.
Amidst the sea of booths, one exhibit captured sustained attention—the Spacetop laptop by Sightful. Throughout the day, from early morning until the closing hours, its stand was constantly buzzing with activity.
Long lines to try Sightful’s Spacetop AR; Source: AWE
Face-To-Face With The Spacetop
Spacetop’s uniqueness stems from its design—it shuns the traditional physical screen and employs a pair of AR glasses as the display medium. The glasses are not proprietary but are a product of Sightful’s collaboration with XREAL (formerly Nreal), who provided an existing AR solution tailored specifically for Spacetop.
Source: Sightful – Spacetop press kit
Field of View
With its sleek and futuristic design, the laptop certainly looks promising at a glance. However, a set of issues quickly surfaced during my hands-on experience. The most significant one is the limited field of view that’s insufficient to accommodate the entire screen.
The glasses’ restricted field of view necessitates constant head tilting which undermines the entire purpose of having large virtual monitors and results in what is known as “windowing”—a term used in spatial computing when virtual objects fail to fully overlay and appear cut off.
Attempted solutions like moving the virtual monitor further away were not effective due to the glasses’ 1080p (1920×1080) resolution. Push the screen too far back and the text becomes difficult to read. Therefore, users are forced to deal with near-placed screens that, while clear and readable, outsize Spacetop’s field of view.
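The trade-off here comes down to pixels per degree: a 1080p panel spread across the glasses' field of view has a fixed angular resolution, and pushing a virtual monitor further away shrinks the number of pixels available to render it. A rough sketch of the math, assuming (my estimate, not a Sightful spec) about a 42-degree horizontal FOV for the glasses:

```python
import math

# Rough angular-resolution math for a fixed-resolution AR display.
# Assumption (not from the article): ~42 degrees horizontal FOV.
H_PIXELS = 1920
H_FOV_DEG = 42.0

ppd = H_PIXELS / H_FOV_DEG  # pixels per degree across the display
print(f"~{ppd:.0f} pixels per degree (human acuity is ~60 ppd)")

def virtual_monitor_width_deg(width_m: float, distance_m: float) -> float:
    """Angular width of a virtual monitor at a given simulated distance."""
    return math.degrees(2 * math.atan(width_m / (2 * distance_m)))

# A simulated 27-inch monitor (~0.6 m wide) at 1 m vs. 3 m:
for d in (1.0, 3.0):
    deg = virtual_monitor_width_deg(0.6, d)
    print(f"at {d} m: {deg:.0f} deg wide -> ~{deg * ppd:.0f} horizontal pixels")
```

Under these assumptions, the same virtual monitor drops from roughly 1,500 horizontal pixels at arm's length to around 500 at three meters, which is why distant text turns to mush while nearby screens overflow the FOV.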
Input Solutions and Design
The laptop also lacks hand tracking, a disappointing omission considering the advancements in the field. Users are left with a trackpad, navigating a vast spatial spectrum with a traditional cursor, a process that can feel slow and inadequate. Monica Chin from The Verge has reported instances of losing the cursor among the screens, then struggling to locate it – a problem no doubt amplified by the limited FOV.
Low-precision tasks such as moving tabs or resizing windows, which could be done in a fraction of a second with either a touchscreen or hand tracking, took far longer here. It made the whole experience of using Spacetop feel frustrating.
There are also other less obvious quibbles. For example, no screen means the webcam must be positioned down on the keyboard. This suboptimal positioning creates an unflattering, spycam-like angle.
Although users can lower their virtual screen to align with the webcam, mitigating gaze-switching between the screen and camera, ultimately the very design of the Spacetop laptop necessitates certain compromises.
Sightful in It for the Long Haul
I asked a Sightful representative about the low field of view and was informed that the company is aware of these display limitations. They assured me that they are prepared to iterate in tandem with the industry.
It seems Sightful is conscious not to portray Spacetop as a purely AR device. More than anything else, Spacetop is a screen-less laptop with a proprietary operating system, Spacetop OS (based on Android), and a unique set of AR-specific features.
In the future, the team may design the laptop to work with any glasses they deem suitable for their purpose. This is their first product and instead of playing catch-up, Sightful is eager to start early and keep perfecting the experience as better, newer glasses come into the market.
However, as things stand today, it’s hard to avoid the obvious question: Why would one choose to splash $2,000 on a Spacetop when one could simply spend $379 on the XREAL glasses (or $488 bundled with the XREAL Beam) and use them to stream from any device? The Spacetop team attempts to answer this by emphasizing their AR-first design and focus.
For instance, executing a three-finger swipe on the touchpad moves screens spatially between closer and further planes. There is also a Reality Mode button that turns the AR off allowing for full pass-through, and a range of shortcuts that enable you to snap screens in place, re-center them, and more. While these improvements and enhancements are handy, they don’t quite seem to justify the substantial premium.
The author believes that Spacetop’s form factor makes it socially acceptable.
Potential Is There
Initially, I had planned to log into my Twitter account from within the Spacetop, take a screenshot with its webcam, and do a live tweet, heralding the dawn of a new era in spatial laptop computing.
However, the realization that the Spacetop still has some distance to cover before it can be deemed fully user-friendly made it challenging to compose a strictly positive and genuine tweet (time constraints and burdensome trackpad navigation played a role as well).
The potential is undoubtedly there. Large field-of-view, high-resolution AR displays, along with some ultralight tracking solutions, were already being showcased at this year’s AWE and might be integrated into the next generation of glasses.
During my brief encounter with the Spacetop, I could easily envision it becoming a preferred work tool for many, not just for those working from home, but also in cafes or co-working spaces. Moreover, there’s an inherent benefit of privacy. For stock traders, artists, or anyone who values personal workspace, the ability to work on non-public screens adds a lot of appeal.
Its form factor is among the most socially acceptable options available – there’s something about having AR glasses paired with a clearly visible laptop or tablet that makes the entire setup immediately understandable to onlookers. It doesn’t seem to invite confusion or ridicule; if anything, it might invite desirability.
The author thinks that promotional materials feel misleading; Source: Spacetop press kit
For now, however, Spacetop’s primary promise of being a superior alternative to traditional laptops falls short. Its promotional materials, which depict users encircled by screen panels, feel misleading.
The current iteration is hampered by a lack of hand-tracking, a limited field of view, and clunky user interface solutions. Moreover, the price point does not seem to correspond with the value provided. However, with improvements and upgrades coming, it’s worth keeping an eye on Sightful.
Guest Post
About the Guest Author(s)
Mat Pawluczuk
Mat Pawluczuk is an XR / VR writer and content creator.
Before my life as a technology journalist, I worked in a university’s biomedical engineering research lab. Every now and then, in my current career, I encounter something that I wish had been around 10 years ago. Nanome, an app for spatially visualizing molecules in MR and VR, is exactly such an experience.
Meet Nanome
Nanome is a visualization and collaboration platform available on all major VR headsets. It’s partially funded by Meta, but its founders got in with Oculus co-founder Michael Antonov long before Facebook bought Oculus (and subsequently changed both companies’ names to “Meta”).
“Because we were part of Oculus for Business as an ISV [Independent Software Vendor], our relationship has deepened and we have co-authored multiple case studies together, including for Nimbus and Novartis,” Nanome co-founder and CEO Steve McCloskey told ARPost.
Nanome was a launch title on the Quest Pro, but it is also available on Viveport and Steam. The platform runs in VR on most headsets, but also makes full use of the full-color passthrough on the Quest Pro. The company is looking at the emerging AR glasses hardware market, but still needs controllers for the time being.
“Current hand tracking technology does not meet the needs that 6DoF controllers can provide, which consumer AR glasses don’t,” said McCloskey. “Additionally, the limited FOV makes it challenging to get a closer view of molecules in the context of a protein binding pocket which is essential for many of our users.”
If you don’t know what a “protein binding pocket” is, don’t feel like Nanome is too advanced for you. Just like chemistry in general, you can start wherever you are and go from there. You can also watch educational videos on chemistry’s big ideas directly within Nanome.
“Every user has unique needs and workflows, and we aim to provide a tool that can adapt to those needs, rather than forcing users to adapt their workflows to our tool,” said McCloskey. “This is why we continually work to improve and expand our features, to provide an ever-more intuitive, collaborative, and integrative experience for our users.”
Subscription Options
Nanome comes in a free version for personal use, as well as academic, research, and enterprise subscription tiers. Virtually all of the platform’s major functionalities work in the free version, though the academic subscription allows meeting in private rooms and saving workspaces. The benefits of the remaining tiers come largely from hosting and server options.
Insights From the In-App Demo
I met with McCloskey and fellow co-founder Sam Hessenauer within a free trial of the platform’s academic version. Creating an account is fast and easy, and automatically uses your Meta avatar, though you can join with a number of default avatars if you’re using a borrowed or communal headset.
Start Building Molecules – Even Impossible Ones
The virtual space is initially empty, inviting users to start building their own molecules from scratch, using common building blocks already in the app, or bringing in completed structures. The app supports a number of commonly used visualization tools, so work started on conventional software can be brought directly into VR.
Molecules have specific shapes – something about the constituent atoms attracting and repelling each other – I’m pretty sure that my university chemistry textbook has a whole section on figuring out bond angles based on valence electrons. The point is, the app does that for you. And, when you create a molecule that couldn’t possibly exist, the app lets you know.
So, if you want to play comic book super genius and create fantastic chemical structures, you can! And the app will let you know which parts of the molecule break the laws of physics, and which laws they break. You can also view the models in several color-coded visualization methods.
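Nanome’s actual chemistry engine is surely far more sophisticated, but the basic idea behind flagging impossible structures – counting each atom’s bonds against its typical valence – can be sketched in a few lines. This is a toy illustration under my own assumptions (the element table and `impossible_atoms` function here are invented for the sketch), not Nanome’s algorithm or API:

```python
# Toy valence check: flag atoms whose total bond order exceeds the
# element's typical valence. Illustrative only – not Nanome's method.

TYPICAL_VALENCE = {"H": 1, "O": 2, "N": 3, "C": 4}

def impossible_atoms(atoms, bonds):
    """atoms: list of element symbols, e.g. ["C", "H", ...].
    bonds: list of (i, j, order) tuples indexing into atoms.
    Returns indices of atoms bonded beyond their typical valence."""
    used = [0] * len(atoms)
    for i, j, order in bonds:
        used[i] += order
        used[j] += order
    return [idx for idx, sym in enumerate(atoms)
            if used[idx] > TYPICAL_VALENCE.get(sym, 8)]

# Methane (CH4) is fine; a carbon bonded to five hydrogens is not.
methane = (["C", "H", "H", "H", "H"],
           [(0, 1, 1), (0, 2, 1), (0, 3, 1), (0, 4, 1)])
bad = (["C", "H", "H", "H", "H", "H"],
       [(0, k, 1) for k in range(1, 6)])
print(impossible_atoms(*methane))  # → []
print(impossible_atoms(*bad))      # → [0] (the pentavalent carbon)
```

A real tool would also account for formal charges, aromaticity, and bond geometry, but the flagging principle is the same.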
“Because we’re on the VR app store and the basic version of the software can replace Intro-to-Chem ball and stick models, we’re very popular among universities and libraries as the go-to chemistry app in XR,” said McCloskey. “You never run out of chemistry kit parts in XR!”
Building new chemical structures in the app is huge. While you certainly can use Nanome to practice and learn chemistry within its established boundaries, there are people using the platform to design new chemicals, like groundbreaking prescription medications. In fact, early feedback from Novartis went into the first widely available iteration of the platform.
“For other major biopharma companies, we meet scientists and IT folks at various industry conferences,” said McCloskey. “We have landed some deals from scientists who bought a Quest for the holidays and wanted to use Nanome at their workplace.”
Everyone Gather Around the Giant Protein
Visualizing chemicals isn’t only important when designing new ones. One giant model that McCloskey, Hessenauer, and I viewed within the space was a large protein. At that massive scale, something that was just an idea before suddenly seemed tangible and understandable – a solid thing with its own topography waiting to be explored.
Prion proteins in the brain can fold incorrectly, leading to neurological disorders like dementia. I remember my middle school science teacher trying to communicate that by scribbling on the chalkboard, but how exactly a protein could be foldable never really made sense to me. Seeing that giant protein in VR, that fifteen-year-old lesson came back and clicked instantly.
McCloskey and Hessenauer were able to point out caves in the giant protein where part of another chemical – like a medication designed by one of the companies using the app – could fit into the protein and bind to it. I usually do demos like this to learn about XR, but this time I felt like I got a lesson in chemistry with XR in the margins – which is how it’s supposed to feel.
“As a design tool, we aim for Nanome to be as intuitive as possible. This is where XR and the user interface come into play,” said McCloskey. “We want our users to focus more on their scientific explorations and less on learning how to use the tool.”
Suppose I want to go back and watch our demo again. I can – and not just as a flat recording. Nanome allows spatial recordings that viewers can walk through later. If someone pointed at a spot on the model and I missed it live, I can go back and watch in VR, standing in their shoes and reliving the moment from their viewing angle.
At Least Take a Look
If you ever even think about chemistry, there’s literally no reason not to check out the free version of the app. If you’re a student, learning institution, or researcher, the platform can grow with you. The sheer number of things that Nanome can do is honestly overwhelming at first, but helpful explainer videos and easy-to-pick-up controls make it second nature in minutes.
Mindfulness can exist in the virtual world. Mindway is a VR app that promotes mindfulness – both in virtual worlds of unearthly peace and through lessons that you can carry with you when you put the headset down. I gave the app a test run by incorporating it into my own stressful work week.
A Modular Subscription-Based App
A good first place to start an app review is talking about the specs, like the size of the app and how much it costs. That’s not really how Mindway works.
First, the app is free to download from the Quest App Lab, though a number of elements of the app require a monthly or annual subscription – or you can buy the whole package once and for all for $50.
Further, the initial app download is small but individual modules within the app come as independent downloads. That might make things complicated if you’re trying to decide whether you have space for the experience, but it also means that you can really effectively pick and choose which modules you want to keep on your device.
The app doesn’t currently have a comfort rating. That might be because the individual experiences are so different. Each module explains the position in which it works best. Some encourage you to be seated, while others that deal more with mindful movement require you to be standing. Still, none of the modules that I tried made me too uncomfortable.
The app is compatible with the whole Quest product line, from the original Quest to the Quest Pro, but I used my Quest 2. Controllers are required to navigate menus and carry out simple interactions in some of the practices, but there aren’t any complex controls. The thumbsticks can be used for snap turns, but there’s no artificial locomotion – head tracking is sufficient.
Mindway’s Major Components
When you first enter Mindway, you find yourself in a calm virtual environment reminiscent of a Quest Home. In front of you are three main menu items: ASMR, Mindfulness, and Sleep.
If you turn to your right, there’s also a room where you can join public or private sessions. This is used for scheduled group events, but you can also go in alone to sit by a calming VR campfire or use an invite code to share the space with friends.
If you aren’t familiar with mindfulness, it’s an approach to mind-body wellness that promotes active awareness of your physical state and thought-life in the present moment instead of dwelling on the past or being anxious about the future. A text explanation will never really do justice, so consider checking out the introductory journey in the app.
ASMR
I’ve become something of an ASMR aficionado over the years, and let me just say that I’m hoping for more from this selection in the future. It currently consists of soft-spoken stories and a marble-maze mini-game that plays with some audio effects. (Take out your Conquest VR if you’ve got one.)
While Sleep and Mindfulness sessions usually last between eight and 15 minutes, some of the ASMR sections go on until you exit the session, making them ideal if you want to use them as the base of longer meditations.
The marble game is fun and the soft-spoken stories are great, but I didn’t get big ASMR vibes. There’s a whole category of ASMR that uses visual cues but I haven’t really experimented with it because I usually listen to ASMR when I’m trying to sleep, so seeing more visually-based ASMR in this mindfulness VR app has a lot of potential.
Sleep
Speaking of sleep, you might have gotten curious at the idea of sleep modules in a VR app. I know I did. If you’re imagining drifting off with your headset on, that’s not what’s happening here.
These experiences might help make you a little drowsy, but what they’re really doing is stocking a mental toolbox with mindfulness tools that you can take with you to bed. This is actually one of my favorite recurring elements of Mindway as an app overall, so we’ll return to a larger discussion of this later.
Mindfulness
Mindfulness is the heart and soul of Mindway. As such, this is the most populated section with (in my opinion) the best content. This content is split into “Journeys”, “Practice”, and “Build Your Own”.
Practices are shorter sessions that you can do independently of one another, while each journey is a series of sessions on a related topic that build on one another. The Build Your Own section allows you to create a practice session by selecting a topic, a world, and a soundtrack. Whether part of a journey or an individual practice, sessions are between eight and 15 minutes.
On the other hand, if you are familiar with mindfulness as a practice, I hope that you won’t be too skeptical of a VR-based mindfulness application. Mindway uses VR very cleverly to facilitate common mindfulness exercises. You can even select the “Science” button in the home environment to learn about how Mindway develops their sessions.
During body scans, a sparkling mist gradually rises up around you. When focusing your attention on a fire, the fire begins to die down if you get distracted by the environment for too long. During breathing exercises, particles seem to flow into you when it’s time to inhale and flow out when it’s time to exhale. Reach high up to grab an apple from a tree during a stretch.
A Week of Mindfulness
I used Mindway for about a week during the course of writing this article. There were stretches where I used it every day, there were days that I didn’t use it at all, and there were days that I kept going back in for multiple sessions.
I discovered mindfulness in college and it was a big part of my life for a good couple of years but at some point, I really got away from it. The first thing that I noticed in Mindway was how deeply I’m still able to breathe. I can breathe pretty heavily in my headset when I’m boxing in VR, but that’s different from long, slow, deliberate breath – something I didn’t realize I missed.
I liked some sessions more than others, but there was nothing that I encountered in Mindway that I didn’t enjoy. My favorite content is the “Boost Your Energy” Journey. The three-part journey has practices for starting the day with focus without being overwhelmed, for regaining your energy as you go through your day, and for winding down when it’s time to relax.
While I like knowing that I can pop on the headset for a reasonably short session whenever I want throughout my day, the narrations often remind you that you can take things like breathing exercises and meditation models with you wherever you go. The visualizations are nice and might be most helpful for people newer to mindfulness, but Mindway is, above all, educational.
An Unanticipated Promotion
The child in my life doesn’t really understand what I do for a living, but she knows that sometimes we get to play with neat tech, like an AR narrative puzzle. Sometimes, I set up my headset for her to enjoy some supervised offline play. (I lock apps, so I know that she’s playing Bait!, not Peaky Blinders.)
The other day, she saw the new Mindway thumbnail in my apps library and asked about it. I told her that she could check it out if she wanted to, but I warned her that it wasn’t exactly a “game.”
It turns out that she loved it. She was able to navigate the simple menus by herself and tried out a number of experiences. Hearing the audio of the guided meditations through the Quest 2’s native off-ear speakers, I was able to watch – admittedly a little stunned – as the energetic eight-year-old sat through around a half-hour of various mindfulness exercises.
I’m not a doctor, and I’m never going to advocate that any VR headset become “an electronic babysitter”, but it seems to me a curious kid could do a lot worse things in VR than mindful breathing.
Peaceful Periods in VR
Hitting the mat in the third round, assassinating communist informants in the back of a bar, betraying your crewmates in space – VR experiences can be pretty intense. While those experiences can be a lot of fun, it’s nice to know that Mindway provides a corner of the immersive world where you can have a little peace and quiet before getting back to your day.
Metaversed: See Beyond the Hype is the new book by Samantha G. Wolfe and Luis Bravo Martins introducing the metaverse stripped of its over-inflated, pie-in-the-sky expectation cloud built up by marketers. The book presents a practical and balanced approach to using the metaverse as it exists today and preparing for how it might exist tomorrow.
ARPost received a copy of Metaversed and had the opportunity to interview the authors on how it came together and what they hope it will achieve.
Preparing for the Metaverse
Metaversed begins with an important and common question in the industry: how do we prepare for the metaverse when we can’t agree on what it is?
“Taking the internet and bringing another dimension to it and setting it free in the phygital world […] it’s almost impossible to fully understand the extent of this shift.”
– Chapter One: Predictions
Early on, the authors present a working definition of the metaverse. This isn’t for the authors to throw their definition into the war of words already taking place around the metaverse, but rather so that everyone reading Metaversed has a common starting point.
“To the authors, the metaverse is the next stage of the internet and results from the evolution of a wide variety of emerging exponential technologies maturing simultaneously, converging and enabling a new interconnected relationship between physical and digital.”
– Chapter One: Predictions
Metaversed isn’t just about technology, but how technology impacts us as a society and as individuals – and about the societal trends that are helping to usher in the metaverse. These include movements towards remote work and education, decentralization, social media, and the creator economy.
“The challenges we’re about to face will need a multidisciplinary effort. Business professionals from all areas, teachers, lawyers, scientists, historians, and sociologists, everyone can contribute with their experience and knowledge so we can start preparing for this tremendous shift.”
– Chapter One: Predictions
A Book Written for Anybody
Metaversed is written for a reader in any profession to encompass the entire metaverse. Chapter two presents all of the technologies playing into the development of the metaverse. That includes immersive technologies like the spatial web, XR hardware, and digital twins. It also includes Web3 and blockchain, cloud computing, and AI and ML.
“I feel like we went through a hype cycle of ‘the metaverse’ as a term and now we’re kind of past that. People are looking beyond that and asking, ‘What is this, really?’” said Wolfe. “I’m hoping that as people get past all of that hype they can ask ‘What does this mean to me, and what does this mean to my business?’”
Readers of ARPost might be principally interested in immersive technologies. Understanding the role that these technologies will play in larger shifts in the coming years requires an understanding of other technologies even though they may feel removed.
“The main topic is to bring in people that are not in on all of the metaverse discussion,” said Martins. “We need to have those people. We need to have a version of the metaverse that isn’t just created by technologists like us.”
The book also discusses governments and standards organizations furthering the metaverse through protecting users and ensuring interoperability respectively. A lot of the value of the metaverse will be created by users – much as with the current web, but more equitable.
“A true creator economy has been set in motion where communities are not only spawning creators but overall helping them to remain independent and relevant.[…] With several new platforms available in the gaming industry and in the so-called Web3 businesses, new avenues for distributing digital products and content are being envisioned and built.”
– Chapter Four: New Rules
Life and Work in the Metaverse
The largest single chapter in the book, “Metaversed Markets,” is an exhaustive exploration of how different industries are using the current iteration of the metaverse and how they may adapt to its development. While the bulk of Metaversed discusses opportunities in the metaverse and how to realize them, four chapters are dedicated exclusively to its challenges.
“When living in a hybrid reality of digital and physical objects, spaces, and people that we seemingly use and own, will it all be real? The memories of our time immersed in those worlds won’t tell us otherwise. […] We can pick up our lessons learned of the risks involved and plan ahead for a better, positive metaverse. But, to do that, we need to first identify key challenges.”
– Chapter Nine: Understanding Reality
These challenges have some to do with technologies that haven’t yet been realized or optimized, but mainly pertain to the human experience of adapting to and living in the metaverse.
“The whole purpose is exactly that – to try to shed light on not just the potential of the metaverse […] but more than that to try to pass on the challenges of the metaverse,” said Martins. “Presenting the challenges is not negative – it’s facing those challenges […] At the end of the day, what we want is to contribute to a more ethical metaverse.”
Metaversed expresses hope that governments and organizations like the XR Safety Initiative will help to mitigate some risks. It also recognizes that a lot of responsibility will be put on users themselves.
“Even if it’s uncomfortable, we need to discuss how emerging tech can be monitored and regulated. We don’t have to cross our fingers and hope that big tech companies figure it out themselves (again).”
– Chapter Ten: Privacy and Safety in the Metaverse
“Unanswered Questions”
“Because we’re faced with so many unanswered questions and unsolved technical challenges, there should be no shame in saying ‘I don’t know,’ or ‘We don’t know’ when asked about the future […] for better or worse, we’re in this together.”
– Chapter Twelve: The New Humanity
The thing that struck me the most about Metaversed was its honesty. The authors are confident in their predictions but never present those predictions as already being facts. Overall, it feels like a conversation rather than a keynote or a sales pitch.
“At the end of the day, tech runs so quickly and changes so completely unexpectedly […] it’s sort of an exercise,” said Martins. “Hopefully what we can offer is more of the logic of thought.”
How “Metaversed” Came to Be
Wolfe and Martins have a long history, despite having yet to meet in person. The two began talking after Martins read “Marketing New Realities,” which Wolfe co-wrote with Cathy Hackl in 2017. Then, Martins was a guest speaker at Wolfe’s courses at New York University’s Steinhardt School. Martins was invited to write a book and knew who to talk to for a coauthor.
“It started with this opportunity that came about from the publisher. Around that time there was this huge push regarding the metaverse and I was thinking about doing something on the flipside, focusing entirely on the challenges,” said Martins. “I decided that that approach wouldn’t be the best possible way to explain to people who don’t know much or aren’t as involved.”
Wolfe’s coming on board provided the balance that Martins was looking for. It also expanded the vast network of experts that contributed their insights to Metaversed.
“He wanted to write this book about what can go wrong but I tend to be quite positive,” said Wolfe. “I also tend to look at how all of this applies to businesses.”
Despite being based in different countries and working on the book largely asynchronously, the two decided to write Metaversed with one voice, rather than passing chapters back and forth. While the book doesn’t feel divided (at least, not to readers who don’t know the authors well), both of them have chapters that they feel they put more into.
“In the end, I think we were all very involved in doing the writing and – of course – the research,” said Martins. “There were chapters which were being run by one of us or by the other one, and some – particularly the chapters in the beginning – were very consensual.”
A Digestible Book, if Not in One Sitting
Metaversed: See Beyond the Hype is currently available on Amazon. The book, weighing in at over 300 pages, may or may not be a lot to read from cover to cover depending on where you are on your metaverse journey. However, the book was also designed to be incredibly navigable, making it easy to read or reread as you see fit.
Peaky Blinders: The King’s Ransom is a new VR game from Maze Theory inspired by the popular period crime drama. I’m reporting from a virtually reconstructed Garrison pub, so confess – Jon and ARPost are listening.
This review covers major game elements (avoiding juicy spoilers), how VR is implemented, and some ethical considerations. After all, if you’re familiar with Peaky Blinders, you know that some of the content can be pretty challenging – particularly in VR. I almost didn’t want to play it, and ARPost almost didn’t want to cover it.
Welcome to The Peaky Blinders
Peaky Blinders: The King’s Ransom takes place during season five. If you aren’t to that point, there’s at least one spoiler. Fortunately, the events of the game don’t have a lot to do with the events of the show, so if you aren’t a fan of the show, you can still enjoy the game – just not as much.
A lot of the game felt like fan service. I happen to be a fan of the Peaky Blinders TV show, so I appreciated it. From “Red Right Hand” playing as I walked down Garrison Lane to the bottomless pack of cigarettes in your inventory, some more stylized elements of the game might only feel right if you’re familiar with the show.
In the game, you play a war vet working with the Peaky Blinders in hopes that they will clear your criminal record. Your aunt, an NPC in the game, is a family friend of the Shelbys and a good word from Arthur got you in. Tommy tests your loyalty by asking you to shoot a hooded man in the Garrison about 10 minutes into the game, so things move pretty fast.
You’re tasked with finding one of the Peaky Blinders who went missing on the trail of Winston Churchill’s stolen dispatch box. This sends you to Limehouse, a majority-Chinese neighborhood in East London.
There, you find yourself in a serial killer’s crime scene. I was hoping that this would turn into an investigation like the “Blood on the Ice” quest in Skyrim, but you find out pretty quickly that the serial killer happens to be the rival gang leader with the dispatch box.
Gameplay
Peaky Blinders: The King’s Ransom is rated “comfortable” with options to play sitting or roomscale. Analog sticks enable snap turns, though you can also do this by turning your head or your whole body in roomscale.
They also control movement, including crouching, walking, or teleporting. I played most of the game walking, but sometimes you need to teleport to jump. Using teleport to move from cover to cover during a firefight can also be handy. Some items in the environment appear lighter when you can interact with them, for example, crates that you can move out of your way.
I did get a little sick playing the game but if you read my reviews you know that I’m particularly susceptible to VR motion sickness, so I blame my physiology and play style before I blame the developers in most cases.
One of the cooler elements of the game is that your character doesn’t speak. NPCs just accept this as a quirk. You don’t have to hear someone else’s voice coming out of your head, and it also sidesteps dialogue options – you respond with your actions.
Items and Interactions
A guide to the controls looked intimidating but the controls are very intuitive. Your inventory is arranged around your field of view. Reach over your shoulder and grab to get your journal, reach down and grab to get your gun, reach left and grab to get a cigarette, and reach right and grab to get your lighter.
Your lighter is handy for lighting endless cigarettes, but you also use it to do things like light lamps that help you navigate some of the darker scenes in the game. You also defuse bombs, rebuild radios, open a safe, and uncork bottles of gin.
Drinking and smoking don’t impact gameplay. I think it might’ve been cool if smoking slowed down time or drinking made you less susceptible to injury, but they’re just props. You can also find vials of “Tokyo” (that’s “cocaine” in Peaky Blinders lingo) but they’re just collectible easter eggs.
One of the most common item interactions is reloading your 1911 semi-automatic pistol (sorry Peaky Blinders, no Webleys). This involves loading a clip into the bottom of the gun.
There’s no believable way to hold the gun with two hands because of the controllers and because Peaky Blinders are too cool for stable shooting stances, but you can pass the gun from hand to hand to shoot around cover. You can’t carry extra clips, so you have to look for ammunition boxes in the levels. Count rounds if you want, but I just reloaded whenever I could.
One forced story interaction involves your gun being empty no matter how many rounds you should logically have left in the clip when you enter the interaction, so keeping count just kind of frustrated me. Or, maybe the gun jammed because you have a 1911 instead of a Webley.
Navigating Environments
The environments were the biggest draw for me buying this game. I’m a fan of Peaky Blinders largely because of the settings. Being able to explore faithful reconstructions of some of the iconic locations of the show really scratched an itch for me and the game’s original locations feel authentic and well-developed too. Major playable locations include:
Garrison Lane including The Garrison and a garage;
Watery Lane including The Shelby Betting Shop and Polly Shelby’s apartment;
Charlie Strong’s Boatyard;
Limehouse, including a boatyard, a neighborhood, and a rival gang’s operation.
The game never tells you about lighter items being movable, so my first major navigation snag was wandering around an alleyway until I realized I could move a crate blocking my way. One level in Limehouse is also needlessly tricky. I think it was trying to incorporate some puzzle elements, but it didn’t really land for me.
Later in the game, you fight your way out of a burning building while carrying the dispatch box. This level brought all of the game’s control mechanics into play beautifully. You have to teleport to jump over holes in the floor, balance as you walk over beams, and put down the dispatch box to reload.
Finding collectible easter eggs in the game often involves finding tools in the environment to smash open crates. Some of the levels have malfunctioning radios. Finding the parts, plugging them in, and tuning the radio unlocks radio programs that give you additional context about the level.
After beating the game, the levels remain explorable. I found at least one area that either wasn’t available during story play or I didn’t find the first time around. Either way, there’s a lot to explore.
I Killed a Man in VR Because Tommy Shelby Told Me To
I had reservations about this game. Until now, the most violent thing I’d ever done in VR was knock someone out in Thrill of the Fight. There were situations in this game that made me uncomfortable but it wasn’t as bad as I was expecting.
For one thing, the character animation didn’t blow me away (I was playing on Quest 2, not PC). Sets and items look great but people in the game leave a little to be desired. Further, the violence isn’t terribly graphic. A cartoony blood burst lets you know you hit someone but it isn’t gory. And, all of the violence that you perpetrate is done at a distance, which I think helps.
Most of the times that I did feel uncomfortable, it wasn’t because of graphical believability or a feeling of embodiment. It was because the writing of the game successfully made me ask myself questions about what I was doing and why.
In one sequence, your character is tied to a chair so you can look around and see your bound hands, which is a little unnerving (you can’t see your body, which is unnerving for a different reason). I don’t know if it was a predictable point in the game or my deep trust in the Peaky Blinders, but I wasn’t afraid at this point – I knew someone would come just in time.
I still think that we should be careful about how and why violence is used in virtual reality entertainment. As far as this game goes, I think that restraint on the part of the developers helped to balance violence as a plot device without going over the top.
Final Thoughts
I was pleased to see that Peaky Blinders: The King’s Ransom only costs $30, but that also meant that I wasn’t too let down that it’s only about three-and-a-half hours of gameplay. The game has already been updated since it was released, so fans can hope for more to come.
True crime has become a national pastime. From documentaries and dramatized biopics to endless podcasts and YouTube channels, folks can’t get enough of diving into real-life murders and missing person cases – some solved, some apparently never to be solved. And now, you can explore a new file of cases in augmented reality with CrimeTrip.
Taking a CrimeTrip
CrimeTrip is a true crime AR game for iOS and Android from developer studio Prologue. The experience, viewed and navigated entirely through a mobile phone, puts you in the middle of painstakingly reproduced crime scenes from six unsolved cases from the seventies and eighties, including heists, mob hits, and more. Three are available now, with three coming soon.
The app download is free, but after playing through a free “prologue” and tutorial, you have the option to buy an individual story for $3.99 or bundles of stories, up to the complete game for $12.99. According to the app, this pricing allows the platform to remain entirely ad-free.
Hands-On the Game
If you have enough open space, you can navigate within the game by walking to some degree. However, the game worlds are big enough that no matter how big and clean your living room is, you’ll eventually have to use the on-screen controls. (My biggest issue with the game was accidentally holding my phone in a way that covered the camera and broke tracking.)
CrimeTrip is split between crime scenes and a pretty expansive police department office. The office includes resources that you will need to dive into the case, including the cork board where you put it all together.
“[CrimeTrip] revisits the podcast genre, following non-linear routes, constantly shifting between the present and the past,” Prologue founder and creative director Jonathan Rouxel said in a Medium post. “Designers elevate the status of the audience who is no longer perceived as a community of passive listeners but as active participants.”
On-screen controls don’t do all the work. Sometimes the best way to view a scene, or the only way to find an item, is by physically getting down on your hands and knees. A good portion of the game might be “played” in the various online communities as you compare notes with other true crime enthusiasts.
While scenes and clues are accurately created with great detail, the characters and events in the stories appear as luminous point clouds – so there’s no unsettling blood and gore to deal with. The cheeky, fourth-wall-breaking game narration should be amusing to true crime enthusiasts and not too stressful for people new to the genre.
A Careful Handling of a Touchy Subject
True crime is a sensitive subject – and people can be deeply affected by it. Stories can be emotionally challenging to hear and research, and living people are sometimes hurt by true crime commentators jumping on a story that is still developing. CrimeTrip avoids both of these problems in two important ways.
First, the graphical style and the narration keep things from getting too heavy. We saw a similar approach with USA Today’s Accused experience last year. Second, the cases in this experience are old enough that all of the suspects have passed away, so players can enjoy the puzzling stories without stressing too much about the impact on survivors.
The fact that the cases are so old and so cold adds to the game’s allure as well. There are no bad guys left to catch, so it’s okay that even AR-enabled sleuths can’t conclusively agree on whodunit. In an ongoing case, it would matter that the culprit be caught and taken off the streets. But these forty- and fifty-year-old tales can remain unfinished puzzles forever.
Check it Out if You Dare
So, is CrimeTrip worth your money? Check it out. The free app includes previews of all available episodes – and those aren’t just gameplay videos; you actually get to play. Still not sure? You can buy the cases one at a time. So, if you’re even remotely interested in true crime, it can’t hurt to give it a look.
In the past few weeks, we have seen big tech companies, including Meta and Microsoft, announce massive layoffs – many hitting their XR divisions – while at the same time pivoting toward artificial intelligence and generative content.
Despite the news, this year’s DEAL expo was as busy as ever. The halls teemed with an array of VR devices, games, contraptions, and a myriad of other VR-related gizmos; quite frankly, the show surpassed expectations.
It’s clear that there’s an appetite for virtual reality and that the VR industry as a whole has no intention of slowing down. Here’s a short rundown of the most interesting things that caught my eye.
Meta4 Interactive
Meta4 Interactive was on the floor showcasing its arena-scale player-vs-player battler based on the well-known Transformers IP. I had the chance to battle it out with Meta4’s CBDO, Sylvain Croteau, as well as other members of the team.
They were all great at the game, which might seem obvious since it’s their product after all, but you would be surprised how often management is actually disconnected from their games or brands. It was nice to see that in this case the team is not only up to speed with their products, but also plays Transformers: VR Battle Arena for fun.
The hardware consisted of blue HTC Vive Pros connected to gaming PCs mounted overhead. This kept the headsets tethered without my ever feeling the tether, as the cables were suspended from above. The game is also stationary – you teleport from platform to platform (not unlike Tower Tag) – so I dodged bullets and turned in all directions without any issues.
Meta4 Interactive booth
Transformers: VR Battle Arena was originally developed in 2019, but I only had a chance to try it out recently. On its website, Meta4 claims its games run at a 90hz refresh rate, but it felt like less – perhaps more like 60fps, or 45fps with reprojection to 90hz?
Since the HTC Vive Pros were hard-wired, streaming latency wasn’t a factor, but the game did have a peculiar dense, dreamlike quality to it, which often stems from lower refresh rates. I would gladly play a slightly stripped-down version of the game if it meant running at 120hz native.
PvP arenas are very engaging thanks to their competitive aspect, but that also means they work best for groups of friends, gamers, and people who want to compete and see who’s best. All in all, I had fun and can’t wait to see what Meta4 has in store next.
VEX Solutions
VEX Solutions showcased two turnkey solutions side by side. The first one, “VEX Adventure,” offered a more comprehensive LBE-type experience with a motorized floor, wind and heat, onboarding, and full cooperative plot-driven gameplay. The other one, “VEX Arena,” is a lighter, less premium version, aimed at higher throughput.
VEX Arena (front) and VEX Adventure (back)
Both setups used haptic vests, but otherwise the hardware differed significantly. VEX Arena used a Vive Focus 3, while the more premium VEX Adventure opted for some kind of hybrid setup: it looked like a Pico Neo 3 combined with SteamVR tracking, hand tracking, and Pico 3 controllers mounted into the guns – a true patchwork of different technologies.
Pico Neo 3 together with Vive trackers, hand tracking, and Pico 3 controllers
The VEX representative declined to discuss hardware specifics, which I can understand – when it works, it works. However, a multitude of varying components means more potential failure points, which is never desirable. To that end, the less premium VEX Arena seemed a bit more manageable, but again, I didn’t have the chance to ask about any specifics.
WARPOINT
For those looking to get into VR arenas on the cheap, WARPOINT had its own super basic solution. All it requires is 10 Meta Quest 2 headsets and a tablet – this must be the most affordable turnkey solution I saw at DEAL 2023. You could even forgo buying brand-new headsets and opt for second-hand ones to lower the costs even further.
WARPOINT: Meta Quest 2s equipped with power banks are waiting their turn
All the Meta Quests operate in standalone mode using software developed in-house by WARPOINT. All the modes and maps are designed for PvP combat and marketed as a form of e-sport.
WARPOINT booth in action
Moviemex3D
Moving on, I encountered Moviemex3D, a company that specializes in VR movies and VR simulators but also offers an arena experience called VR Labyrinth. It’s a pop-up box that features redirected walking, gaming elements, and even some motorized rumble effects.
Yours truly, stepping onto a VR elevator (with rumble effects)
If you’ve ever tried TraVRsal or Tea For God, you’ll know the deal: even though the pop-up box looks small, the VR play area is much larger thanks to redirected walking, elevators, and so on. Expect traps, action, and shooting. Overall, it’s a fun single-player experience.
From a hardware perspective, Moviemex3D used a Pico Neo 3 headset streaming from a PCVR computer. It’s not a bad solution, but again, just like with Transformers: VR Battle Arena, I felt the game wasn’t running at a high enough refresh rate, making the entire experience feel heavy and dreamlike.
FuninVR
FuninVR had this pretty, eye-catching centerpiece.
UFO-shaped VR simulator (FuninVR)
It’s a massive UFO-shaped VR simulator. Not exactly a VR arena, but I had to try it out. The team was running a variety of experiences; the people before me tried some kind of moon landing, while in my case it was a fantasy-themed roller coaster.
Unfortunately, the VR visuals were out of sync with the roller coaster animation. Sometimes the track would turn left but the UFO didn’t, forcing me to either turn my head 90 degrees or face the walls. This left many participants nauseous.
There were also other questionable elements, like sudden impacts that stopped the roller coaster in place – basically, it’s as if the developers deliberately broke every established rule on what not to do. I have strong VR legs, but even I felt queasy.
Each participant had buttons on either side of the seat, used to shoot enemies, dragons, and other baddies. Aiming was done with head-tracking, and it was a fun interactive element that I enjoyed, even if the shooting was purely theatrical – we were all running separate instances of the roller coaster animation, and there was no way to stop the simulator from progressing.
Perhaps it would have been better if I had tried the moon landing demo because, unfortunately, the roller coaster had too many sync and motion issues to be enjoyable.
Hero Zone VR
One last turnkey VR solution worth mentioning is Hero Zone VR. It’s another fully standalone arena experience, this time running on the Vive Focus 3, which let the developers take advantage of the headset’s higher resolution and higher-clocked XR2 chipset.
Participants getting ready to try out Hero Zone VR
There was a queue of people waiting to try Hero Zone VR, so I didn’t get the chance myself, but it looked like there was a selection of both cooperative and PvP games. I spoke briefly with one of the team members, and he seemed quite proud of what they managed to achieve by going fully standalone.
No Beat Saber?
There were also a lot of the usual suspects: VR cabinets, VR kiosks, and VR arcades. Notably, Beat Saber was nowhere to be found. I wonder if it became too expensive to license, or perhaps it has lost a bit of its novelty value. Instead, Synth Riders came in to fill the void, along with other fast-paced games like Zombieland VR.
Synth Riders. By all accounts a great rhythm game.
One arcade cabinet I really enjoyed was VR Shotgun by VR 360 Action. You step into a minigun cart, and it’s basically laser shooting reinvented: spooks and baddies come at you from left and right, and you just blast away.
VR Shotgun by VR 360 Action
The minigun prop felt heavy and had some nice haptic feedback to it. I was also happy with the decision to use the HP Reverb G2 headset, which still sets a very high bar for clarity and resolution, and the game ran buttery smooth. Of course, the gameplay was pretty unsophisticated and there was no locomotion (some kind of on-rails movement would be nice to break the mold), but VR Shotgun did make me wish all arena and LBE software had this level of visual comfort and fluidity.
Summary and Takeaways
So, what are my main takeaways from this year’s DEAL?
» Even if the early days are behind us, we’re still in a period of rapid innovation and prototyping, which makes it hard for VR arcades and arenas to keep up. I saw almost every kind of headset this year, from the oldest Oculus Rift CV1s through the Vive Pro, Windows Mixed Reality headsets, Quest 2, Pico 4, and Vive Focus 3 – running standalone, streaming, and wired.
» Meta’s presence in the amusement and entertainment sector could be stronger. Despite the company’s egregious spending, most of the money has gone toward the metaverse and mixed reality, neither of which gels very well with the arcade environment. The Quest Pro was nowhere to be seen, though perhaps that’s because it’s such a fresh headset.
» Virtual reality is becoming more and more normalized. At least 30-40% of the booths were VR-oriented. With everyone around wearing and trying headsets, people have stopped feeling insecure about how they look with a headset on and instead simply enjoy their experiences, treating VR like any other tech.
Have fun and keep on rocking in the virtual world!
About the Guest Author(s)
Mat Pawluczuk
Mat Pawluczuk is an XR / VR writer and content creator.