
Space Invaders Celebrates 45th Anniversary With a New AR Game

Space Invaders is a shooting video game created by Tomohiro Nishikado in 1978 and manufactured and sold by TAITO. It was the first fixed shooter video game and is considered one of the most iconic arcade games ever. As Space Invaders turns 45, TAITO teams up with Google and UNIT9 to give its players an elevated AR gaming experience with Google’s ARCore Geospatial API.

TAITO and Google partnered with global production and innovation studio UNIT9 to transform Space Invaders into an immersive AR game in honor of its 45th anniversary. Players can defend their real-world neighborhoods from 3D invaders emerging from nearby buildings and landmarks.

Meet “SPACE INVADERS: World Defense” AR Game

The reimagined classic is SPACE INVADERS: World Defense, a sequel to the original game. It gives players access to enhanced weapons so they can defend their neighborhoods more effectively. New music and sound effects were also added for a more exhilarating and immersive experience.

Space Invaders AR game gameplay

The most remarkable update, however, is the real-time response to location-specific patterns and nearby buildings. It means that the AR game adapts to the player’s real-life surroundings. For example, if it’s raining, the virtual environment may also show rain, and if there’s a tall building at the player’s location, there will also be a tall building in the AR realm from which an Invader may emerge.

SPACE INVADERS: World Defense Gameplay

The original game’s classic characters and high-score mechanics are preserved in the AR game SPACE INVADERS: World Defense. The difference is that players must explore their virtual neighborhoods to find Space Invaders and defeat them. They can unlock special power-ups, compete with their friends within their location, and take an AR selfie to post on social media.

AR game Space Invaders

Players can easily switch between the World Dimension and Invaders Dimension via a portal. The virtual, 3D Invader world changes in sync with the natural environment, allowing players to complete missions in both the virtual world and the natural world’s AR view.

Harnessing the Power of Google’s ARCore and Geospatial API

UNIT9 harnessed the power of Google’s ARCore and Geospatial API to develop the next-level AR gaming experience of SPACE INVADERS: World Defense. ARCore is a software development kit (SDK) developers use to create AR applications across multiple platforms, including iOS, Android, Unity, and the Web. It seamlessly merges the digital and physical worlds, allowing users to interact with virtual objects in the AR adaptation of their natural surroundings.

As one of the top AR SDKs, the other prominent capabilities of ARCore include tracking the orientation and position of the user’s device, matching the lighting of virtual objects with their surroundings, detecting the location and size of various surface types, and integrating with existing tools like Unreal and Unity.

Phone screens Space Invaders - Invader Dimension

Combined with Geospatial API, which remotely attaches content to any area Google Street View covers, ARCore integrates geometric data from Google Maps Street View into SPACE INVADERS: World Defense, displaying accurate terrain and building information within a 100-meter radius of the player’s location.
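To make the idea concrete, here is an illustrative sketch (not the actual ARCore API) of the concept behind geospatially anchored content: each invader spawn point is pinned to a latitude/longitude, and only anchors within the 100-meter radius the article mentions would be considered near the player. The `haversine_m` and `anchors_in_range` names, and the sample coordinates, are hypothetical.

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in meters

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def anchors_in_range(player, anchors, radius_m=100.0):
    """Keep only anchors within radius_m of the player's position."""
    return [a for a in anchors
            if haversine_m(player[0], player[1], a["lat"], a["lon"]) <= radius_m]

# Example: one invader roughly 80 m north of the player, one far away.
player = (35.6595, 139.7005)
anchors = [
    {"name": "invader_1", "lat": 35.6602, "lon": 139.7005},
    {"name": "invader_2", "lat": 35.6700, "lon": 139.7100},
]
print([a["name"] for a in anchors_in_range(player, anchors)])
```

In the real API, of course, the heavy lifting of localizing the device against Street View geometry is done by ARCore itself; this sketch only illustrates the radius-filtering idea.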

The Beginning of an Exciting New Era

According to UNIT9’s Head of Digital, Media Ridha, the launch of Google’s Geospatial API marks the beginning of an exciting new era for digital experiences tied to real-world locations, not limited to games but extending to any brand experience linked to a specific place. “It was an honor to work with Google and TAITO to translate one of the most famous IPs out there into the next wave of AR gaming and create an experience that fans of all ages around the world can enjoy,” Ridha said in a press release shared with ARPost.

Matthieu Lorrain, global head of creative innovation at Google Labs Partnerships, is excited to see more developers leverage their platform to push the boundaries of geolocalized experiences. “[Google’s Geospatial API] allowed us to celebrate the iconic Space Invaders game by turning the world into a global playground,” said Lorrain.

SPACE INVADERS: World Defense officially launched on July 17, 2023, and is available on iOS and Android. Players in key markets, including Europe, Japan, and the USA, can download the AR game on their mobile devices and defeat Invaders in the real world, made more immersive with augmented reality.


“The Future of Business Travel” Report by Booking.com Gives Metaverse Predictions

The metaverse can be summed up as the augmented world. So, naturally, it has implications for travel. How and when people travel may both seriously change as spatial communication and digital twins make some kinds of travel less likely, while AR and automation reimagine the travel that we do engage in. A report by Booking.com for Business, titled “The Future of Business Travel,” explores the next 30 years of travel.

AR and Space Hotels

The report begins with “A Timeline of Future Business Travel Predictions.” To the potential dismay of augmented reality enthusiasts, the report puts AR in 2027 – the same year as “space hotels”. The report acknowledges existing AR use cases including augmenting areas with contextual information. However, the authors are waiting for something better.

“Right now, AR is limited, lacking a wide field-of-view and having resolution, battery, and 3D sensing issues,” reads the report. “It’s thought that by 2027 people will have access to unconstrained, immersive AR experiences and the associated advantages for travel professionals.”

Why 2027? The paper doesn’t explicitly mention powerful AR wearables, but the time frame and their insistence on “unconstrained” experiences suggest that this is what the authors are waiting for. We already have consumer AR glasses, with limited FoV, but these are almost exclusively “viewers” for virtual screens that can’t offer the real-time contextual information people want.

In a recent interview with ARPost, Lumus VP of Optics David Goldman placed a consumer AR device based on Z-Lens around 2025, with 2027 seeing models with 50-degree FoV eventually getting as wide as 70 or 80 degrees. That sounds like it’s getting more in line with people’s expectations for AR travel.

More Interest in VR?

Augmented travel is one thing, but virtual travel is another. Virtual reality has higher immersion due to a heads-up interface, greater graphical fidelity, and wider field of view. Further, VR hardware is becoming increasingly accessible, affordable, and popular with consumers.

The report also included a collection of the most-searched business travel trends, which included virtual travel in the top three. A ranking of the most talked about travel trends in the media also includes “hotel metaverse” at number three and “hotel virtual events” at number eight.

The authors attribute this to virtual travel “reducing the necessary number of business trips and giving corporate travelers the chance to explore the world with VR and metaverse experience.” Specific use cases anticipated in the report include immersive tours prior to booking, virtual conferences and events, virtual site visits to digital twins, and immersive in-flight entertainment.

More to the Metaverse

Immersive technology is first in our minds and hearts here at ARPost, but the metaverse is about more than just display technologies. The report also includes predictions related to other emerging technologies including artificial intelligence and blockchain.

For example, the authors predict blockchain technology becoming standard in hotels the year before they anticipate AR kicking off. And, around the beginning of the next decade, the authors predict “guest comfort and energy efficiency will be managed and optimized by AI in most hotels.”

Other predictions, including hotel-specific crypto-driven rewards programs and robot assistants, can be found in the full report.

A Lot to Look Forward To

All predictions should be taken with a healthy dose of salt – and that’s particularly true of predictions based on when to expect a given development. Disclaimers aside, Booking.com has presented a very interesting look at trends regarding what people want out of the metaverse when it comes to travel.


Official AmazeVR Concerts App Launches With an Exclusive Zara Larsson Concert

Do you remember missing an amazing concert by your favorite artist because you could not travel to another country or continent to attend it? This is no longer a problem. Thanks to AmazeVR, anyone can experience live shows using their newly-launched VR Concerts app.

Drawing on their previous experience working with artists like Megan Thee Stallion and Ceraadi, the company is celebrating the launch of their AmazeVR Concerts app with “Zara Larsson VR Concert”, the one-of-a-kind show by Swedish pop star Zara Larsson. Now, anyone can install the AmazeVR Concerts app and attend any concert available on the platform from the comfort of their home.

Virtual Events – the Future of Entertainment

The global health crisis we experienced made us rethink all types of interactions, from healthcare appointments and business meetings to concerts and theater shows. The VR concerts app developed by AmazeVR is one of the latest additions to immersive and interactive tools for entertainment.

This is a huge step forward both for artists and audiences. For artists, VR shows allow them to interact with more fans and monetize their work in new ways. For music fans, the barriers represented by long distances and finances for traveling suddenly disappear.

Zara Larsson Excited to Collaborate with AmazeVR

Known for hits such as “Lush Life”, “Ain’t My Fault”, and “End of Time”, Swedish pop star Zara Larsson exuded enthusiasm for collaborating with AmazeVR on the launch of the AmazeVR Concerts app.

“I’ve always believed that live music has the power to unite and transcend boundaries. As an artist, finding new ways to connect with my fans and deliver a truly immersive and unforgettable experience is super important to me,” she said in a press release shared with ARPost. “I’m thrilled to be working with AmazeVR to break through the fourth wall, and directly into the homes of fans around the world.”

Bringing Artists and Fans Together in the Virtual World

For AmazeVR, their VR Concerts app, available on Meta Quest 2 (App Lab) and SteamVR, is the crowning achievement of years spent developing and improving immersive solutions for the entertainment industry. Creating the first VR concerts and measuring the public response showed the company that it was on the right path.

“At AmazeVR we are ushering in a new wave of innovation for music experiences, by providing artists with extraordinary and unparalleled avenues to be up close and personal with their fans,” said AmazeVR co-CEO and co-founder Steve Lee. “It is an honor to be launching the AmazeVR app alongside such an incredible artist like Zara. Her creativity has come together to create a showstopping performance and we can’t wait for her fans to enjoy the experience.”

A Busy Schedule for the Newly Launched AmazeVR Concerts App

The virtual reality concert experience app is set to attract fans of all types of music, including pop-rock, hip-hop, K-pop, rap, and more. Right now, the app is downloadable for free and offers one free song per artist. For the exclusive Zara Larsson VR concert, fans can purchase access for one year at an exclusive launch price of $6.99.


A Tech Investor’s Take on the Apple Vision Pro

Apple’s entrance into the XR space via their announcement of the Apple Vision Pro was one of the most anticipated events in recent XR history. After years of hype build-up and frequent delays and rumors, the device itself left a lot of questions to be answered. I mulled over those questions with Neso Brands CTO Paulo Almeida in the 10th episode of the XR Talks podcast.

Years of Watching Apple

Neso Brands is an investment company specializing in tech-augmented eyewear. Naturally, their Chief Technical Officer has been carefully watching the Apple Vision Pro rumors for longer than many companies have been involved in XR at all.

“Somehow I’ve been following it for the last seven years,” said Almeida. “The job descriptions first showed up across the internet and that started the rumors, I’m going to say back in 2014, 2015.”

However, Almeida said that he had been waiting for something like an Apple headset for even longer than that – ever since he was, like so many others, disenchanted by the Google Glass device that launched in 2013. While this device first piqued Almeida’s interest in the future of XR wearables, he doesn’t think that it’s the real spiritual predecessor of consumer XR.

“The real pioneers of this are definitely Magic Leap and their waveguide lens is the future that I was expecting Apple to actually achieve some kind of breakthrough on,” said Almeida. “Magic Leap have been the true pioneers of trying to miniaturize all of these components and to make it into something that we could call smart glasses… not over-the-head displays.” 

A Difference of Displays

Waveguide displays use a light engine and specialized lenses to project virtual content in front of the eye. Depending on the application, the wearer then looks at that content for a virtual screen application or through the lens to see their physical surroundings augmented by the virtual elements. It’s currently the dominant approach in lighter-weight, lower-cost AR devices.

As far as something like smart glasses go, Magic Leap is still pretty big and pretty expensive – and a fairly exclusive enterprise device. Campfire, also an enterprise-focused company, offers a smaller and lightweight headset that can offer VR as well as MR via the addition of a magnetic plate over the normally transparent lenses.

Micro-OLED displays essentially just put a screen in front of the wearer’s eyes. This approach has clear benefits in terms of image quality, brightness, field-of-view, and other considerations, with the trade-off of being more expensive and much heavier than waveguide displays. This is currently the dominant approach in VR devices – including the Apple Vision Pro.

Because the wearer can’t see through an OLED screen, AR and MR experiences are enabled through passthrough, which displays a video feed of the wearer’s surroundings that can then be augmented with virtual content. This is how virtually all AR/VR headsets – including the Apple Vision Pro – offer both of those experiences on the same opaque screen.

An Extra Screen

There is one place where the benefits of a Micro-OLED display just don’t seem to justify the tradeoffs in terms of weight, cost, and power usage. That’s the giant screen on the front of the Apple Vision Pro that shows a virtual reproduction of the wearer’s eyes.

“I think it’s quite a useless feature, per se. Let’s put it this way: I think Apple just needed to put something there that would make the wow effect,” said Almeida. “They’re taking what I would call ‘The Tesla Approach’ of showing what they’re capable of and then saying ‘now wait a few more years, we’re now going to go into mass production, design something simple.’”

The fact that the headset is called “Pro” has led many to the conclusion that Apple is going to release a standard version of the headset, which might well do without some bells and whistles – like the virtual eyes.

“I definitely think that they should offer options with and without the front feature and I can almost guarantee most people would go without because it would probably be a good $500 or $600 cheaper, the battery would probably last a little bit longer,” said Almeida. “I wouldn’t be surprised if there was a version called the ‘Apple Vision Mini’ or ‘Apple Vision S’.”

Controller-Free Design

While the Apple Vision Pro has more screens than most VR headsets, it has fewer controllers. In fact, Apple is adamant about the Pro not having or needing controllers but getting along with eye and hand tracking as its only inputs.

“That’s one of the points where they’re definitely innovating,” said Almeida. “If there are a few hidden gems on the Apple Vision Pro, eye tracking and hand tracking are among them.”

I specifically brought up gaming as one use case where I feel that a controller is still necessary. Almeida had his own perspective, but toward the end of the episode I also invited 3lbXR and 3lb Games founder and CEO Robin Moulder up from the audience, knowing that she has an interesting perspective on game input.

Hardware Integrations and Third-Party Companies

A recurring point in the conversation was the room that the Apple Vision Pro leaves for accessories, whether from Apple or from third-party developers. Almeida sees gaming, and input in general, as one of these opportunities.

“To play a game, having the feeling of a real-life controller in your hand is something that you need,” said Almeida. “I also think that they’re opening a path for haptic gloves.”

Almeida envisions a whole collection of different controllers for different kinds of games and interactions similar to that for headsets like VIVE. Earlier in the conversation, he had also mentioned the Apple Vision Pro’s battery life as one area that could be expanded through partnerships.

“If Apple is smart, they’re going to open the market to third-party companies for the existing hardware to expand over the existing options,” said Almeida. “In order to achieve more market and to grow as a company, Apple needs to invest in breakthrough technology and for that, they need to let third-party partners come to complement the ecosystem.”

Calling on Developers

Moulder had a different perspective on the Apple Vision Pro and seemed eager for the opportunity to develop controller-free applications.

“I am super stoked about Apple and from my perspective, there’s a whole lot to unpack,” said Moulder. “On the input side, the thing that I keep bringing up to people is that video showed the woman with her hand in her lap. (…) I’m looking at that thinking ‘If I could move my hands around in that kind of field-of-view, hand-tracking works a lot better.’”

Here we’re not talking about field-of-view in terms of what the wearer sees in the headset, we’re talking about what the headset sees around the wearer. Headsets have the ability to track the location of controllers no matter where they are but can only track the hands when the hands are in view of the cameras, which puts huge limitations on how effective hand-tracking can be.

“I don’t have to predict the location of where the hands are going to be in a bunch of nonsensical math just to make up for that limitation of the technology now,” said Moulder. “That’s really nice for us from the gaming perspective because that means we can lean into hand-tracking even more than we’re doing right now.”

Moulder said that this would mean “working with the user to train them” on more nuanced hand interactions, but it also sounds very promising for a controller-free headset that doesn’t feel like the 3DoF models of yesteryear. The Apple Vision Pro’s wide tracking range does require a huge number of cameras, so it’s likely that other makers won’t emulate the move any time soon.

Parting Thoughts on the Apple Vision Pro

The Apple Vision Pro remains something of a mystery to me. But I now have another perspective on where Apple might be going with a potential future product line built around it, and that’s exciting. I also have a whole new perspective on controller-free headsets in general, and that’s very exciting.

You can listen to the whole “XR Talks with ARPost: Episode 10 – Another Take on Apple’s Entrance” below, or on Spotify.


How Nex Is Flipping AR Games, and Why That’s a Great Thing

Augmented reality has a lot of promise for social and active gaming applications. An AR game’s use of the individual and their actual surroundings invites a connection to others and to physical space itself that tends to be absent from other kinds of gaming – including VR gaming. However, XR games are typically either social or active. Nex thinks that games should be both.

Meet Nex

Nex is a hardware and software developer making “motion games” – that is, AR games that use motion as the only input. This isn’t entirely new. For example, once a level is started, games like Beat Saber only register motion – that motion is tracked with a controller, but the controller doesn’t provide other forms of input.

Nex AR games

“Our games only require a camera and a device with sufficient processing power,” Nex CEO and co-founder David Lee said in an interview with ARPost. “Today, that processing power is reaching living room entertainment devices.”

That includes connecting compatible televisions to a mobile phone or another connected camera and compute box, but it also increasingly includes televisions with their own built-in cameras. Nex software can recognize multiple people with a single camera for AR games played together and on the same screen.

The two main offerings from Nex are a hardware camera and compute box currently in pre-production, and games created by the company’s four internal game studios and six outside partners using the “Motion Development Kit.”

Is it XR?

Something about Nex feels like it can’t be XR. That’s possibly because there’s no near-to-eye display. There’s no head-worn device – there’s not even an arm’s-length screen. However, if we think about the way that we’ve always defined XR, those aren’t things that we insist on.

We say that AR is virtual elements overlaid over a live view of the physical world. We often think of viewing that through a lens as with head-mounted AR, or through a camera as with mobile-based AR. Nex admittedly flips that standard model – but it still fits the bill. And it has its advantages over “conventional AR.”

“We flip it around so the phone sees you […] and leveraging the biggest screen that most people have,” said Lee. “You can have the effect of a bigger screen by mounting it on your head but that’s not a communal experience.”

Those who have been around the tech world for a while may recognize this approach. Over ten years ago, PlayStation Move used a similar model, as did Xbox Kinect. If the camera-flipped AR game is the future, why is the past littered with these experiences? In part, because AR isn’t the only tech involved. Nex also relies on artificial intelligence that wasn’t around in 2010.

“At the time, there was no AI, so they had to have a more complicated camera system,” said Lee. “What was missing from those previous generations of games was the NPU – the neural processing unit.”

Those games were fun – and ground-breaking at the time – but their reliance on a console limited their success and led to unsustainable upkeep burdens on the companies. Neither of those constraints is true of Nex.

A Look at Nex Games

I haven’t yet had the opportunity to play Nex games myself. I did get to watch Lee and one of his colleagues playing some of the games on a live video call.

Party Fowl is a collection of party mini-games that looks similar to JackBox. The package will be available as an annual subscription and includes a mix of AR games and what Lee called “VR-like experiences.”

In one AR game, rotating your hips flies a helicopter. In another, players, represented on screen as chickens, squat to lay eggs and fill a basket.

Nex AR games Air Racer and Party Fowl

Another game, Air Racer, is a “flight simulator” in which players pilot an airplane through an obstacle course by moving their hands. Controls include direction, speed, and elevation.

While Nex is focused on games at the moment, I might be more interested in a fitness application from the company. Lee doesn’t see them as separate experiences.

“Movement is a natural way to play. As human beings, we’ve been playing for a very long time, and most of our games involve movement,” said Lee. “These games invite you to move more and also deliver those benefits in a gamified way.”

One experience really spoke to me as a potential showcase of a whole genre of experiences. The game was an episode of the children’s show Peppa Pig, in which gamers chose characters from the show and engaged in their favorite activity – jumping up and down in muddy puddles. The game was created with partner Hasbro.

“It’s not just watching – the family can be invited to join in the fun as well,” said Lee, who described the experience as “productive, independent playtime for the kids.”

Lee further described “the highlight of his career” as when his daughter got his mother into Nex games so that they could play together.

Experiencing Nex AR Games

I hope to get the chance to try out Nex AR games myself, and it sounds like I will soon enough – one way or another.

Nex AR games including Party Fowl and Sky Racers are already shipping as pre-installed apps on the Sky Live interactive camera. In fact, most of the motion games available on the camera are by Nex. For Apple users, Nex also works with the Continuity Camera feature.

Nex Playground – a camera box for Nex games compatible with most modern smart TVs – is currently in pre-order with the first orders scheduled to ship before this year’s holiday season. But, one day, external devices won’t be necessary at all as televisions ship with cameras and more computing power onboard.

Nex Playground

“TVs don’t have really good processing yet. The memory is still quite limited but this is the beginning of these use cases,” said Lee. “This will be in a lot of living rooms and it begins with Nex pioneering this technology and showing the world what is possible.”

“The iPhone Moment for TV”

From AR games, to fitness applications, to just using hand gestures to navigate traditional media, Lee and Nex have an exciting vision for the future of television. The whole thing does feel like AI and XR reaching back into history to pull some of entertainment’s near-misses into the future where they belong.


Unveiling the Future of Driving: Mercedes-Benz Vision One-Eleven Concept Car Uses Magic Leap 2

The German luxury automaker Mercedes-Benz recently introduced its latest concept car, the Vision One-Eleven. On top of incorporating sustainability with its electric motor alongside a dynamic redesign, the Vision One-Eleven uses Magic Leap 2 AR glasses for a more immersive in-car experience.

This approach reflects Mercedes-Benz’s commitment to creating better cars that provide the best possible driving experience to consumers while accommodating concerns about sustainable driving and introducing new tech. By partnering with Mercedes-Benz, Magic Leap also takes another step towards making AR experiences a part of everyday life.

Vision One-Eleven: A New Twist on an Old Classic

The Vision One-Eleven is a concept car that revisits a beloved Mercedes-Benz classic, the C 111. The C 111 concept car incorporated iconic gullwing doors for a truly one-of-a-kind design in its day. Combined with its modern interior, it proved to be an appealing concept car that influenced modern luxury vehicles.

Vision One-Eleven concept car

With the Vision One-Eleven, Mercedes-Benz builds on the characteristics that set the C 111 apart, blending a luxury interior with intelligent design for a truly futuristic car. A sports vehicle with a lounge-like interior and a sleek body, the Vision One-Eleven is an exciting peek at what the cars of the future may look like—from its styling all the way to its electric motor.

The Capabilities of AR Glasses on the Road

Aside from visual and engineering overhauls, Vision One-Eleven also incorporates another rapidly growing technology: augmented reality. Since the adoption of full AR experiences has been slow in the larger market, XR companies like Magic Leap pivoted to a slower but steadier approach by bringing tech like the Magic Leap 2 into specific industries.

Drivers often have to manage a large amount of information to navigate and keep safe on the road. With the integration of technology such as built-in navigation or car sensors, drivers can rely on various tools that can help improve their driving efficiency.

This isn’t just progress for the sake of progress either: the introduction of AR technologies to drivers has plenty of benefits, from reducing the cognitive load to helping them navigate hazardous driving conditions.

While these applications have yet to be fully adopted by the market, the partnership between Mercedes-Benz and Magic Leap shows that this is an avenue both AR companies and car manufacturers can benefit from.

An Augmented and Seamless Driving Experience

Specific details about how Magic Leap 2 will integrate with Vision One-Eleven’s driving systems have yet to be released. Still, the goal is to create a configurable, immersive AR interface between the driver and their vehicle. This interface can display information about driving conditions on-demand, from the selected drive mode to information about the driver’s destination and current location.

Vision One-Eleven concept car and Magic Leap AR glasses

With Magic Leap 2, this system transforms the conventional dashboard into a dynamic cockpit where drivers can use their full field of vision to better navigate the roads. This helps dramatically improve both the driving experience and safety for car owners, passengers, and passersby—while also implementing an intelligent driving model that may reinvent the way people drive.

A Partnership Built On Innovation

The Vision One-Eleven isn’t the first collaboration between Magic Leap and Mercedes-Benz: the two companies worked together in 2019 for the Mercedes Immersive Roadshow. While Magic Leap’s role in that collaboration was to enrich the viewing experience by augmenting the visual aesthetic of the exhibit, their new partnership with Vision One-Eleven shows Mercedes-Benz’s confidence in the potential of AR experiences.

Given the increasing entry rate of other competitors into the AR market, Mercedes-Benz and Magic Leap have secured themselves a lead over the competition when introducing AR into the driving experience. Whether they can hold on to this head start is something else altogether—but for now, the Vision One-Eleven holds the spotlight as a blend of technology and good car design.

What’s Next?

The Mercedes-Benz Vision One-Eleven, like most concept cars, is unlikely to be produced in its current form. However, its design, technology, and engineering innovations will undoubtedly be integrated into future Mercedes-Benz production vehicles. And it’s pretty certain that XR technology will find its place in those vehicles.

According to Mercedes-Benz, “The spatial user interface is a beacon for a Mercedes-Benz user experience that is unencumbered by technology. It is part of a wider vision that looks towards extended reality, whereby technology and hardware cease to be the focal point; instead becoming fully integrated and seamless facilitators of user needs and wishes.”

As for Magic Leap 2, the company shows no signs of slowing down with potential partnerships with established brands. Some of its latest potential forays include a partnership with Audi, as well as early talks with tech giant Meta, perhaps looking to expand towards more consumers in the AR space.

As for the future of AR driving? It’s difficult to tell, but one thing’s certain: everyone will be in for an interesting ride.


European Council Publishes Web 4.0 Strategy

The European Commission is already setting out to tackle Web 4.0. There’s quite a bit to unpack here, including the EC’s approach, the four-point plan it recently published, and – of course – what it means by Web 4.0.

What Is Web 4.0?

It’s not a typo and you’re not asleep at the wheel. While most of us haven’t gotten the hang of Web 3.0 yet, Europe is already setting the table for Web 4.0. Don’t worry, this is just a new terminology for something that’s already on your radar.

“Beyond the currently developing third generation of the internet, Web 3.0, whose main features are openness, decentralization, and user empowerment, the next generation, Web 4.0, will allow an integration between digital and real objects and environments and enhanced interactions between humans and machines,” reads the EC’s report.

So, essentially, “Web 4.0” is the metaverse. But, why not just call it that?

Webs and the Metaverse

The metaverse discussion at least started out as being largely a conversation within the world of immersive technology, with discussions of Web3 largely being topics within the blockchain and crypto spaces. (“Web3” and “Web 3.0” aren’t exactly the same concept, but both largely revolve around decentralization, so they’re more-or-less interchangeable for most levels of discussion.)

As voices from the cryptocurrency and blockchain communities promised that these technologies would be the future of a cross-platform, self-owned online future, Web3 and the metaverse were increasingly mentioned in the same breath with both being apparently convergent visions of the future.

The explosion of interest in the metaverse was short-lived largely because – while the pieces are certainly falling into place – one connected metaverse hasn’t fully materialized. While there are more-or-less realized metaverse spaces and use cases, the all-encompassing digital layer of reality isn’t here yet. Web3, while struggling with adoption, is largely functional today.

While some may groan at the introduction of yet another idealistic tech concept, “Web 4.0” does offer some clarity at least with regard to what the EC is talking about. First, it respects that the metaverse is still a thing of the (near?) future. Second, it ties in the themes of openness and decentralization that were lacking in many metaverse discussions.

Finally, it ties in “interactions between humans and machines.” While some technologists have long included this aspect in their discussions of the metaverse, recent developments in AI have brought renewed interest to this area since blockchain and the metaverse had their moments in the media over the last few years.

Bracing for Web 4.0

While it’s easy to feel like much of the world is still catching up with the previous generation of the internet, how is Europe planning to get ahead of the next generation of the internet? A lot of it has to do with knowing where current experts are and creating pathways for future builders.

To make that happen, the report outlines four “Key Strategy Pillars”:

  1. Empowering people and reinforcing skills to foster awareness, access to trustworthy information, and building a talent pool of virtual world specialists.
  2. Supporting a European Web 4.0 industrial ecosystem to scale up excellence and address fragmentation.
  3. Supporting local progress and virtual public services to leverage the opportunities virtual worlds can offer.
  4. Shaping global standards for open and interoperable virtual worlds and Web 4.0, ensuring they will not be dominated by a few big players.

One of the reasons that so much of the strategy has to do with ideas like “empowering people” and “leveraging opportunities” might be that much of the document was distilled from an earlier workshop of 150 randomly selected European citizens. The average person is likely feeling left behind by Web 2.0 and out of the loop on Web 3.0.

The European Perspective

“Ensuring that [virtual worlds] will not be dominated by a few big players” may not be a uniquely European feeling, but it’s interesting to note. Meta, in particular, has gotten into trouble in EU member countries like Germany for the equivalent of antitrust concerns, which has opened the way for Pico to make headway in European markets free from its US political struggles.

At the most recent Augmented World Expo – just before Apple announced their first XR headset – some speakers even expressed concern that Apple will be able to throw its weight around the industry in a way that not even Meta enjoys.

“Apple currently holds so much power that they could say, ‘This is the way we’re going to go,’ and the Metaverse Standards Forum could stand up and say, ‘No,’” XRSI founder and CEO Kavya Pearlman said during a panel discussion at this year’s AWE.

Standards are a concern everywhere, but this is another area where the approach is somewhat different across the Atlantic. A number of standards groups have formed in the US, but all of them are independent groups rather than governmental initiatives – though some groups are calling for regulators to step into the space over concerns like privacy.

Thinking Globally About Web 4.0

“Europe is, in many ways, a first mover on metaverse policy, and it is putting forward a positive vision for the future of immersive technology,” the XRA’s VP of Public Policy Joan O’Hara said in an email to ARPost. “We very much appreciate the [European Commission’s] approach to balancing user protection and wellbeing with the desire to support innovation and adoption.”

The headquarters of Web 3.0 and Web 4.0 companies might be in one country or another, but most of them are offering international services. Unless they want to have different (and potentially incompatible) versions of those services available for different countries, it behooves those companies to have services that fit all national standards.

So, in the absence of officially codified US standards for immersive worlds, the services offered to American audiences will likely fit into the shape described by bodies like the European Commission. Fortunately, most of the organizations already looking at these problems are also international in nature and work with and between national governments.

“This will serve as a model going forward,” said O’Hara. “The XRA has been actively engaged with both European and British colleagues on these issues, and we believe the US interests are largely aligned with those of our friends across the Atlantic.”

Thinking Ahead

US discussions of Web 3.0 have largely centered on the nation’s failure to prepare for or recover from Web 2.0. The fact that Europe is already looking ahead to Web 4.0 is worth taking seriously: in emerging tech, looking backward instead of forward is a dangerous strategy.


“PRIVACY LOST”: New Short Film Shows Metaverse Concerns

Experts have been warning that, as exciting as AI and the metaverse are, these emerging technologies may have negative effects if used improperly. However, it seems like the promise of these technologies may be easier to convey than some of the concerns. A new short film, titled PRIVACY LOST, is a theatrical exploration of some of those concerns.

To learn more, ARPost talked with the writer of PRIVACY LOST – CEO and Chief Scientist of Unanimous AI and a long-time emerging technology engineer and commentator, Dr. Louis Rosenberg.

PRIVACY LOST

Parents and their son sit in a restaurant. The parents are wearing slim AR glasses while the child plays on a tablet.

As the parents argue with one another, their glasses display readouts of the other’s emotional state. The husband is made aware when his wife is getting angry and the wife is made aware when her husband is lying.

privacy lost movie emotions

A waiter approaches, and the child puts down the tablet and puts on a pair of AR glasses. The actual waiter is never shown on screen; instead, he appears to the husband as a pleasant-looking tropical server, to the wife as a fit surf-bro, and to the child as an animated stuffed bear.

privacy lost movie sales

Just as the husband and wife used emotional information about one another to try to navigate their argument, the waiter uses emotional information to try to most effectively sell menu items – aided through 3D visual samples. The waiter takes drink orders and leaves. The couple resumes arguing.

privacy lost movie purchase probability

PRIVACY LOST presents what could be a fairly typical scene in the near future. But, should it be?

“It’s short and clean and simple, which is exactly what we aimed for – a quick way to take the complex concept of AI-powered manipulation and make it easily digestible by anyone,” Rosenberg says of PRIVACY LOST.

Creating the Film

“I’ve been developing VR, AR, and AI for over 30 years because I am convinced they will make computing more natural and human,” said Rosenberg. “I’m also keenly aware that these technologies can be abused in very dangerous ways.”

For as long as Rosenberg has been developing these technologies, he has been warning about their potential societal ramifications. However, for much of that career, people have viewed his concerns as largely theoretical. As first the metaverse and now AI have developed and attained their moments in the media, Rosenberg’s concerns take on a new urgency.

“ChatGPT happened and suddenly these risks no longer seemed theoretical,” said Rosenberg. “Almost immediately, I got flooded by interest from policymakers and regulators who wanted to better understand the potential for AI-powered manipulation in the metaverse.”

Rosenberg reached out to the Responsible Metaverse Alliance. With support from them, the XR Guild, and XRSI, Rosenberg wrote a script for PRIVACY LOST, which was produced with help from Minderoo Pictures and HeadQ Production & Post.

“The goal of the video, first and foremost, is to educate and motivate policymakers and regulators about the manipulative dangers that will emerge as AI technologies are unleashed in immersive environments,” said Rosenberg. “At the same time, the video aims to get the public thinking about these issues because it’s the public that motivates policymakers.”

Finding Middle Ground

While Rosenberg is far from the only person calling for regulation in emerging tech, that concept is still one that many see as problematic.

“Some people think regulation is a dirty word that will hurt the industry. I see it the opposite way,” said Rosenberg. “The one thing that would hurt the industry most of all is if the public loses trust. If regulation makes people feel safe in virtual and augmented worlds, the industry will grow.”

The idea behind PRIVACY LOST isn’t to prevent the development of any of the technologies shown in the video – most of which already exist, even though they don’t work together or to the exact ends displayed in the cautionary vignette. These technologies, like any technology, have the capacity to be useful but could also be used and abused for profit, or worse.

For example, sensors that could be used to determine emotion are already used in fitness apps to allow for more expressive avatars. If this data is communicated to other devices, it could enable the kinds of manipulative behavior shown in PRIVACY LOST. If it is stored and studied over time, it could be used at even greater scales and potentially for more dangerous uses.

“We need to allow for real-time emotional tracking, to make the metaverse more human, but ban the storage and profiling of emotional data, to protect against powerful forms of manipulation,” said Rosenberg. “It’s about finding a smart middle ground and it’s totally doable.”

The Pace of Regulation

Governments around the world respond to emerging technologies in different ways and at different paces, according to Rosenberg. However, across the board, policymakers tend to be “receptive but realistic, which generally means slow.” That’s not for lack of interest or effort – after all, the production of PRIVACY LOST was prompted by policymaker interest in these technologies.

“I’ve been impressed with the momentum in the EU and Australia to push regulation forward, and I am seeing genuine efforts in the US as well,” said Rosenberg. “I believe governments are finally taking these issues very seriously.”

The Fear of (Un)Regulated Tech

Depending on how you view the government, regulation can seem scary. In the case of technology, however, it seems to never be as scary as no regulation. PRIVACY LOST isn’t an exploration of a world where a controlling government prevents technological progress, it’s a view of a world where people are controlled by technology gone bad. And it doesn’t have to be that way.


Scandit Launches AR-Powered Smart Data Capture for Assisted Search

Swiss tech company Scandit recently announced its newest addition to its smart data capture product line – the MatrixScan Find. Using AR overlays, MatrixScan Find enables users to efficiently search for products or packages they need among multiple items. This new feature is the latest solution in Scandit’s MatrixScan line which also includes MatrixScan Count and MatrixScan Augmented Reality.

Available on iOS and select Android devices, the Find feature is designed to scan multiple items simultaneously and identify the target item for faster selection. Smart data capture can help improve the accuracy and efficiency of businesses, particularly those that handle multiple products and packages, such as retail, and transport and logistics (T&L).

It also has the potential to significantly increase productivity while reducing errors. It ensures that the right packages are sent to the right locations, allowing frontline workers to spend less time searching for the correct items in warehouses.

Transforming Industries With Smart Data Capture

According to Scandit co-founder and CTO Christian Floerkemeier, AR can provide both businesses and consumers with innovation that enhances the user experience. Scandit’s smart data capture technology aims to revolutionize the workflow and reduce human error.

The company believes that manual processes stifle productivity and waste valuable time. Through smart data capture, businesses can become more flexible and adaptable, get actionable insights in real time, and operate more cost-efficiently.

MatrixScan Find AR-powered

“Scandit has a deep understanding of the business processes, user experience, and computer vision technology necessary to successfully implement augmented reality into enterprise applications,” said Floerkemeier in a press release shared with ARPost. “With MatrixScan Find we are introducing another out-of-the-box offering to help companies without the necessary technology expertise in-house provide exceptional offerings to support workers and delight customers.”

Smart Data Capture Leads to Improved Workflows

One of the goals of MatrixScan Find is to transform the retail and T&L industries. It aims to make them more efficient and productive by reducing avoidable human errors. Smart data capture, which uses precise data to select the correct item every time, helps minimize mistakes. Simultaneous scanning also cuts down on the time it takes to look for these items.

Frontline workers can appreciate the ease with which they can find customers’ packages, particularly in large staging areas that store identical-looking boxes. In turn, customers will experience shorter wait times and have greater confidence that they are receiving the correct package. This can lead to high customer satisfaction and customer loyalty.

MatrixScan Find - Scandit AR-Powered Smart Data Capture Assisted Search

In retail, the AR smart data capture feature enables employees to quickly search for products that customers request. Using MatrixScan barcode scanning technology, MatrixScan Find can differentiate the same product in a different color or style. This helps employees fulfill customer orders faster, driving work productivity. Moreover, MatrixScan Find ensures that workers do not overlook available products, making inventory management more efficient.

Retail stores, like groceries, can also provide unique and innovative product search experiences for their customers. Companies can also integrate the feature into their own apps, and customers can use their smartphones or other devices to discover more about the products through an engaging and intuitive AR interface.

Customizable Interfaces for Different Needs

Part of the Scandit Smart Data Capture platform, MatrixScan Find has a built-in UI that supports different types of workforces. Customization options, such as color overlays, pause functionality, item carousel, and notification alerts, are available. Users can also integrate this feature into Scandit’s barcode scanner product, SparkScan, for a complete scanning solution.

Smart data capture technology is set to transform the way industries operate. With the introduction of AR-powered features like MatrixScan Find, businesses can streamline their workflows and reduce human error, paving the way for greater efficiency and productivity.

As technology continues to evolve, we can expect to see even more exciting innovations emerge, transforming the workplace and customer experience as we know it.


Unveiling the Spacetop AR Laptop: AWE 2023 First Impressions

This year’s AWE 2023 was a remarkable testament to the accelerating pace of innovation in the field of augmented reality, hosting an unprecedented 6,000 guests and 300 exhibitors.

Amidst the sea of booths, one exhibit captured sustained attention—the Spacetop laptop by Sightful. Throughout the day, from early morning until the closing hours, its stand was constantly buzzing with activity.

Unveiling the Spacetop AR Laptop - AWE 2023 First Impressions
Long lines to try Sightful’s Spacetop AR; Source: AWE

Face-To-Face With The Spacetop

Spacetop’s uniqueness stems from its design—it shuns the traditional physical screen and employs a pair of AR glasses as the display medium. The glasses are not proprietary but are a product of Sightful’s collaboration with XREAL (formerly Nreal), who provided an existing AR solution tailored specifically for Spacetop.

Spacetop AR laptop
Source: Sightful – Spacetop press kit

Field of View

With its sleek and futuristic design, the laptop certainly looks promising at a glance. However, a set of issues quickly surfaced during my hands-on experience. The most significant one is the limited field of view that’s insufficient to accommodate the entire screen.

The glasses’ restricted field of view necessitates constant head tilting, which undermines the entire purpose of having large virtual monitors and results in what is known as “windowing” – a spatial computing term for when virtual objects fail to fully overlay the view and appear cut off.

Attempted solutions, like moving the virtual monitor further away, were undermined by the glasses’ 1080p (1920×1080) resolution: push the screen too far back and the text becomes difficult to read. Users are therefore forced to deal with near-placed screens that, while clear and readable, exceed Spacetop’s field of view.
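This trade-off can be sketched with simple geometry. The 1920×1080 panel is from the article; the 45° horizontal field of view, monitor width, and distances below are illustrative assumptions, not XREAL specifications. The sketch estimates how many horizontal pixels actually render a virtual monitor at a given distance:

```python
import math

def pixels_across_virtual_screen(screen_width_m, distance_m,
                                 fov_horizontal_deg=45.0,   # assumed FOV
                                 display_width_px=1920):
    """Estimate how many horizontal display pixels cover a virtual
    screen of the given physical width at the given distance.
    Assumes pixels are spread uniformly across the field of view."""
    angular_width_deg = math.degrees(
        2 * math.atan(screen_width_m / (2 * distance_m)))
    # The virtual screen cannot span more than the field of view itself.
    angular_width_deg = min(angular_width_deg, fov_horizontal_deg)
    return display_width_px * angular_width_deg / fov_horizontal_deg

# A 0.6 m-wide virtual monitor, up close vs. pushed back:
near = pixels_across_virtual_screen(0.6, 1.0)  # ~1,425 px across the screen
far = pixels_across_virtual_screen(0.6, 3.0)   # ~487 px across the screen
```

Under these assumptions, the nearby monitor gets most of the panel’s pixels, while at 3 m fewer than 500 pixels span the entire virtual screen – which is why pushing the display back makes text hard to read.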

Input Solutions and Design

The laptop also lacks hand tracking, a disappointing omission considering the advancements in the field. Users are left with a trackpad, navigating a vast spatial spectrum with a traditional cursor, a process that can feel slow and inadequate. Monica Chin from The Verge has reported instances of losing the cursor among the screens, then struggling to locate it – a problem no doubt amplified by the limited FOV.

Simple tasks such as moving tabs or resizing windows, which take a fraction of a second with a touchscreen or hand tracking, took far longer here. It made the whole experience of using Spacetop feel frustrating.

There are also other less obvious quibbles. For example, no screen means the webcam must be positioned down on the keyboard. This suboptimal positioning creates an unflattering, spycam-like angle.

Although users can lower their virtual screen to align with the webcam, mitigating gaze-switching between the screen and camera, ultimately the very design of the Spacetop laptop necessitates certain compromises.

Sightful in It for the Long Haul

I asked a Sightful representative about the low field of view and was informed that the company is aware of these display limitations. They assured me that they are prepared to iterate in tandem with the industry.

It seems Sightful is conscious not to portray Spacetop as a purely AR device. More than anything else, Spacetop is a screen-less laptop with a proprietary operating system, Spacetop OS (based on Android), and a unique set of AR-specific features.

In the future, the team may design the laptop to work with any glasses they deem suitable for their purpose. This is their first product and instead of playing catch-up, Sightful is eager to start early and keep perfecting the experience as better, newer glasses come into the market.

However, as things stand today, it’s hard to avoid the obvious question: Why would one choose to splash $2,000 on a Spacetop when one could simply spend $379 on the XREAL glasses (or $488 bundled with the XREAL Beam) and use them to stream from any device? The Spacetop team attempts to answer this by emphasizing their AR-first design and focus.

For instance, executing a three-finger swipe on the touchpad moves screens spatially between closer and further planes. There is also a Reality Mode button that turns the AR off allowing for full pass-through, and a range of shortcuts that enable you to snap screens in place, re-center them, and more. While these improvements and enhancements are handy, they don’t quite seem to justify the substantial premium.

Mat at AWE using Spacetop
The author believes that Spacetop’s form factor makes it socially acceptable.

Potential Is There

Initially, I had planned to log into my Twitter account from within the Spacetop, take a screenshot with its webcam, and do a live tweet, heralding the dawn of a new era in spatial laptop computing.

However, the realization that the Spacetop still has some distance to cover before it can be deemed fully user-friendly made it challenging to compose a strictly positive and genuine tweet (time constraints and burdensome trackpad navigation played a role as well).

The potential is undoubtedly there. Large field-of-view, high-resolution AR displays, along with some ultralight tracking solutions, were already being showcased at this year’s AWE and might be integrated into the next generation of glasses.

During my brief encounter with the Spacetop, I could easily envision it becoming a preferred work tool for many, not just for those working from home, but also in cafes or co-working spaces. Moreover, there’s an inherent benefit of privacy. For stock traders, artists, or anyone who values personal workspace, the ability to work on non-public screens adds a lot of appeal.

Its form factor is among the most socially acceptable options available – there’s something about having AR glasses paired with a clearly visible laptop or tablet that makes the entire setup immediately understandable to onlookers. It doesn’t seem to invite confusion or ridicule; if anything, it might invite desirability.

Spacetop screens
The author thinks that promotional materials feel misleading; Source: Spacetop press kit

For now, however, Spacetop’s primary promise of being a superior alternative to traditional laptops falls short. Its promotional materials, which depict users encircled by screen panels, feel misleading.

The current iteration is hampered by a lack of hand-tracking, a limited field of view, and clunky user interface solutions. Moreover, the price point does not seem to correspond with the value provided. However, with improvements and upgrades coming, it’s worth keeping an eye on Sightful.

Guest Post


About the Guest Author(s)

Mat Pawluczuk

Mat Pawluczuk

Mat Pawluczuk is an XR / VR writer and content creator.


ROSE Partners With Premier League for AR Experience Celebrating Summer Series

Whether you’re an American fan of British football, or a citizen of the British Commonwealth spending time in the States, the Premier League Summer Series might be just what the doctor ordered.

The first-ever Premier League Summer Series will see six football clubs face off in five US cities from July 22 to July 30. Even if you can’t watch the matches live and in person, you can find trophies thanks to an AR experience from ROSE.

“The Hunt Is On.” – Celebrating the Summer Series in AR

Over the years, digital experience company ROSE has worked with industry giants including Mastercard, KHAITE, Patrón, adidas, Bloomingdales, and others. A current partnership with the Premier League and UK strategic consultancy and creative studio Doppelgänger might be their biggest partnership yet – and there are no tickets required.

“Doppelgänger created the idea of an augmented reality-powered trophy hunt experience for fans and in looking for an expert partner in the space, enlisted ROSE to advise on how the experience could be executed and ultimately design and build the experience,” ROSE Associate Creative Director Nicole Riemer told ARPost.

According to Managing Director at Doppelgänger, Max Proctor, “AR and broader metaverse activations are helping the world’s biggest brands to build loyalty and engagement with their audiences by connecting with them in new and exciting ways.”

The Summer Series is a major sporting event and is bringing even more people into the experience than into the stadiums. Organizers turned to ROSE and WebXR authoring and hosting company 8th Wall – a duo that has worked together on multiple large-scale applications.

“Having been a long-time partner with 8th Wall, we are always looking for new ways to use their technology and use cases that push the way augmented can be used as well as made more accessible for brands,” said Riemer. “In this case study we utilized a number of 8th Wall’s newer features including face segmentation and sky segmentation.”

Experiencing the Summer Series

There are different ways to interact with the activation depending on whether you’re in or around any of the cities. That’s right, you can still join in on the fun, even if you aren’t in any of the cities hosting the Summer Series.

The five cities hosting the Summer Series are:

  • Philadelphia, Pennsylvania,
  • Atlanta, Georgia,
  • Orlando, Florida,
  • Harrison, New Jersey, and
  • Landover, Maryland.

Premier League Trophy Hunt AR experience

“Since the Summer Series is only in five cities, not every fan will have the opportunity to come to a game, but that doesn’t mean that they support their favorite teams any less,” said Riemer. “We created the at-home experience as a way for fans anywhere in the United States to show their support for their team and experience the Premier League trophy.”

Exploring Host Cities

If you’re in any of those cities between now and July 19, you can use the Premier League Trophy Hunt mobile AR experience to look for 20 augmented reality trophies (that’s one for each club in the Premier League). Naturally, you need to enter your location to hunt for the AR trophies.

Premier League Trophy Hunt experience in the cities

For each trophy that fans find, they get one entry into sweepstakes for tickets to the games. Fans who find all of the trophies get one entry into another drawing for a signed Premier League jersey.

Supporting Teams From Home

If you’re a fan of the Premier League but won’t be in one of the host cities, you can still engage in the web experience, if differently. Fans anywhere can view the Premier League Trophy in augmented reality, use face filters, and post the results to social media.

Premier League Trophy Hunt AR experience at home

Further, you don’t have to share your location to use the experience from home. Just pick your favorite team – or join in as a “General Premier League Fan.” You can still have all of the fun of viewing and collecting all of the trophies.

There are also sweepstakes entries that you can earn without finding hidden trophies in the host cities. Or skip the entries and just have fun with the filters. The at-home experience is live until July 31.

Supporting Your Team With ROSE

The Premier League is coming stateside. That’s exciting whether it’s coming to a town near you or not. 

“We are honored to have such a passionate Premier League fanbase in the USA, and are very excited to be giving them the chance to experience the Premier League Summer Series on home soil for the very first time,” said Alexandra Willis, Director of Digital Media and Audience Development at the Premier League.

Thanks to ROSE and their partners, fans anywhere can interact with their favorite football clubs in new and amusing ways.


MagiScan App Lets Users Create 3D Models With Their Smartphone

As if our smartphones weren’t already incredible enough, startup company AR-Generation is using them to bridge the gap between the real and virtual worlds. With their new cutting-edge app, you can create 3D models with only your smartphone and use them for any AR or metaverse application.

Introducing MagiScan

Meet MagiScan, an AI-powered 3D scanner app that produces high-quality 3D models for any AR or metaverse application. Developed by AR-Generation, a member of the NVIDIA Inception program, MagiScan is the first and only 3D scanner in NVIDIA Omniverse, a real-time 3D graphics collaboration platform.

The MagiScan app, available on both iOS and Android devices, allows users to capture any object with their smartphone camera and quickly generate a detailed 3D model of it. AR-Generation co-founder and CEO Kiryl Sidarchuk estimates that this process is up to 100 times less expensive and 10 times faster than manual 3D modeling, making it an accessible and user-friendly option for creators. With MagiScan, creators can easily refine their work and increase accessibility to AR technology.

3D scanning objects with MagiScan

While 3D scanning with smartphones is not new technology, it has significantly improved over the years. In 2015, researchers at Carnegie Mellon University developed a tool for measuring real objects in 3D space using “average” cellphone cameras. They developed their technology to perform accurate measurements so it could be helpful in self-driving cars and virtual shopping for eyeglass frames.

Similar technology arrived in 2021 with the PhotoCatch app, which used Apple’s then-new Object Capture photogrammetry technology.

How MagiScan Works

MagiScan is incredibly easy to use. Simply open the app, scan an object from all angles, and wait a few seconds for the app to generate a 3D model. Once done, you can export your 3D model in various formats, including STL, which lets you 3D-print your model.
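MagiScan's export internals aren't public, but the STL format it can produce is simple and well documented. The following sketch (the function name and triangle data are illustrative, not MagiScan's actual output) shows what an ASCII STL file contains:

```python
# Minimal sketch of the ASCII STL format that apps like MagiScan can export.
# Each "facet" is one triangle of the scanned surface.

def write_ascii_stl(name, triangles):
    """Serialize triangles (each a tuple of three (x, y, z) vertices)
    into an ASCII STL string. Normals are written as zero vectors here;
    most slicers recompute them from the vertex winding anyway."""
    lines = [f"solid {name}"]
    for v1, v2, v3 in triangles:
        lines.append("  facet normal 0 0 0")
        lines.append("    outer loop")
        for x, y, z in (v1, v2, v3):
            lines.append(f"      vertex {x} {y} {z}")
        lines.append("    endloop")
        lines.append("  endfacet")
    lines.append(f"endsolid {name}")
    return "\n".join(lines)

# A single triangle in the XY plane:
tri = (((0, 0, 0), (1, 0, 0), (0, 1, 0)),)
stl_text = write_ascii_stl("demo", tri)
print(stl_text.splitlines()[0])  # "solid demo"
```

A real scan exports thousands of such facets; the text format is what makes STL so widely supported by 3D printers and slicing software.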

In addition to personal use, brands can also use MagiScan for their online platforms. Just enable “Connect MagiScan for Business,” then scan your products and add their 3D models to your website.

Exporting 3D Models Directly to Omniverse

AR-Generation also created an extension allowing MagiScan users to export their 3D models directly to the NVIDIA Omniverse. “We customized our app to allow export of 3D models based on real-world objects directly to Omniverse, enabling users to showcase the models in AR and integrate them into any metaverse or game,” Sidarchuk said.

MagiScan to Omniverse

This extension is made possible by OpenUSD, or Universal Scene Description, an open-source framework originally developed by Pixar Animation Studios for simulating, describing, composing, and collaborating in the 3D realm. The OpenUSD compatibility is Sidarchuk's favorite Omniverse feature, and he believes that OpenUSD is the "format of the future."
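To make the OpenUSD connection concrete, here is what a minimal USD file looks like in its human-readable text form (USDA). The prim names and geometry are purely illustrative, not what MagiScan emits:

```usda
#usda 1.0
(
    defaultPrim = "Scan"
)

def Xform "Scan"
{
    def Mesh "Triangle"
    {
        int[] faceVertexCounts = [3]
        int[] faceVertexIndices = [0, 1, 2]
        point3f[] points = [(0, 0, 0), (1, 0, 0), (0, 1, 0)]
    }
}
```

Because USD describes scenes in layers that can reference each other, a scanned object saved this way can be composed into a larger Omniverse scene without modifying the original file.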

The company chose to build an extension for Omniverse because the platform, according to Sidarchuk, "provides a convenient environment that integrates all the tools for working with 3D and generative AI."

MagiScan and Augmented Reality’s Impact on E-Commerce

The impact of 3D models and AR is not limited to the gaming and metaverse realms. E-commerce businesses can also benefit from the rapid advancements in this technology.

About 60% of online shoppers consider high-quality images a critical factor in their purchasing decisions. To keep up with the competition, brands must provide more than just photos with a white background. They can also display 3D models of their products on their websites or online marketplaces to provide a more immersive browsing experience.

Through MagiScan, AR-Generation believes that conversion rates can increase by up to 94%, while returns can drop by up to 58%. Crisp and accurate 3D models allow consumers to visualize a product in real life, helping them make more informed purchasing decisions. A similar motivation likely drove the Carnegie Mellon University researchers to develop their 3D scanner to aid people in buying eyeglass frames online.

The Growing Significance of AR in Daily Life

Sidarchuk believes that AR will become an integral part of everyday life. And it’s not hard to see why. AR has grown in popularity over the years and is now widely used in various industries, from gaming to shopping to employee training. With AR, individuals and corporations can experience immersive virtual environments in a safe and secure way.

Thanks to advancements in technology, high-quality 3D experiences are now possible on smartphones. This means that AR and the Omniverse have the potential to impact even the most mundane activities of our daily lives. With this in mind, it’s clear that AR technology is here to stay.
