
Meta Shows New Progress on Key Tech for Making AR Genuinely Useful

Meta has introduced the Segment Anything Model, which aims to set a new bar for computer-vision-based ‘object segmentation’—the ability for computers to understand the difference between individual objects in an image or video. Segmentation will be key for making AR genuinely useful by enabling a comprehensive understanding of the world around the user.

Object segmentation is the process of identifying and separating objects in an image or video. With the help of AI, this process can be automated, making it possible to identify and isolate objects in real-time. This technology will be critical for creating a more useful AR experience by giving the system an awareness of various objects in the world around the user.

The Challenge

Imagine, for instance, that you’re wearing a pair of AR glasses and you’d like to have two floating virtual monitors on the left and right of your real monitor. Unless you’re going to manually tell the system where your real monitor is, it must be able to understand what a monitor looks like so that when it sees your monitor it can place the virtual monitors accordingly.

But monitors come in all shapes, sizes, and colors, and reflections or occlusions can make them even harder for a computer-vision system to recognize.

Having a fast and reliable segmentation system that can identify each object in the room around you (like your monitor) will be key to unlocking tons of AR use-cases so the tech can be genuinely useful.

Computer-vision-based object segmentation has been an ongoing area of research for many years, but one of the key issues is that in order to help computers understand what they’re looking at, you need to train an AI model on a large set of images.

Such models can be quite effective at identifying the objects they were trained on, but they will struggle with objects they haven’t seen before. That means one of the biggest challenges for object segmentation is simply having a large enough set of images for the systems to learn from, and collecting those images and annotating them in a way that makes them useful for training is no small task.

SAM I Am

Meta recently published work on a new project called the Segment Anything Model (SAM). It’s both a segmentation model and a massive set of training images the company is releasing for others to build upon.

The project aims to reduce the need for task-specific modeling expertise. SAM is a general segmentation model that can identify any object in any image or video, even objects and image types it didn’t see during training.

SAM supports both automatic and interactive segmentation, identifying individual objects in a scene from simple user inputs. It can be ‘prompted’ with clicks, boxes, and other cues, giving users control over what the system is attempting to identify at any given moment.
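To make the idea of point-prompting concrete, here is a toy sketch of the interaction pattern: a point goes in, a mask comes out. This is not Meta’s model or API; SAM uses a learned image encoder and prompt decoder, whereas this illustration simply flood-fills a region of similar pixels around the clicked point.

```python
from collections import deque

def segment_from_click(image, click):
    """Toy point-prompted segmentation: flood-fill the region of
    similar pixels around the clicked point and return a binary mask.
    (Illustrative only -- SAM uses a learned model, not flood fill.)"""
    rows, cols = len(image), len(image[0])
    r0, c0 = click
    target = image[r0][c0]
    mask = [[0] * cols for _ in range(rows)]
    queue = deque([(r0, c0)])
    while queue:
        r, c = queue.popleft()
        if 0 <= r < rows and 0 <= c < cols and not mask[r][c] and image[r][c] == target:
            mask[r][c] = 1  # pixel belongs to the clicked object
            queue.extend([(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)])
    return mask

# A tiny "image": 0 = background, 5 = a monitor-shaped object.
img = [
    [0, 0, 0, 0, 0],
    [0, 5, 5, 5, 0],
    [0, 5, 5, 5, 0],
    [0, 0, 0, 0, 0],
]
mask = segment_from_click(img, (2, 2))  # a click inside the object
```

The payoff of the real system is that the “click” can come from any prompt source, which is what makes the eye-tracking pairing below so natural.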

It’s easy to see how this point-based prompting could work great if coupled with eye-tracking on an AR headset. In fact, that’s exactly one of the use-cases Meta has demonstrated with the system:

Here’s another example of SAM being used on first-person video captured by Meta’s Project Aria glasses:

You can try SAM for yourself in your browser right now.

How SAM Knows So Much

Part of SAM’s impressive abilities comes from its training data, which contains a massive 11 million images and 1.1 billion object masks. It’s far more comprehensive than contemporary datasets, according to Meta, giving SAM much more experience during training and enabling it to segment a broad range of objects.

Image courtesy Meta

Meta calls the SAM dataset SA-1B, and the company is releasing the entire set for other researchers to build upon.

Meta hopes this work on promptable segmentation, and the release of this massive training dataset, will accelerate research into image and video understanding. The company expects the SAM model can be used as a component in larger systems, enabling versatile applications in areas like AR, content creation, scientific domains, and general AI systems.


Top 5 E-Commerce AR and VR Trends To Follow in 2023

AR and VR are two of the most promising technologies of the modern era. Both can potentially revolutionize how we interact with the world around us. However, these technologies have taken a long time to reach their full potential.

While AR and VR have been around for decades, it wasn’t until recently that they offered a quality experience without being too limited by technological constraints or not being portable enough for widespread use.

Nevertheless, they’ve been making waves in many industries. And now, the e-commerce industry is hopping on board. Statista reports that there will be 1.4 billion mobile AR user devices worldwide in 2023, projected to rise to 1.73 billion by 2024.

Number of mobile augmented reality (AR) active user devices worldwide from 2019 to 2024 - Statista
Source: Statista

What should we expect from AR and VR in 2023 and beyond? In this article, we will explore the potential of AR and VR for e-commerce and how they can enhance your shopping experience.

1. Increased Adoption of AR in E-Commerce

According to a recent survey, 38% of marketers reported using AR in 2022, a significant increase from the 23% reported in 2017. And it’s understandable, given the benefits AR technology brings to e-commerce customers.

For example, it allows them to feel like they’re physically interacting with products as they would in a brick-and-mortar store, while shopping online. AR can also help consumers visualize how products will look in their homes or on their bodies, improving the shopping experience and leading to more informed purchasing decisions and fewer returns.

Looking ahead to 2023, there are six exciting trends in the AR shopping space to keep an eye on. They are:

1. Social Media Apps and Camera Filters

Social media apps and camera filters are leading the way, with Snapchat and Instagram at the forefront of incorporating AR into their platforms. Brands can use SnapAR Lens Studio or Meta Spark to create engaging AR filters and lenses that bring products to life. A case in point is Gucci.

Gucci AR instagram filter
Screenshots taken on the official Gucci Instagram account

2. Virtual Try-On Technology

Virtual try-on technology lets shoppers see how products look on them, as on the Sephora Snapchat page.

Sephora Snapchat Lens
Screenshots taken on the official Sephora Snapchat account

3. Virtual Showrooms

Virtual showrooms are similar to try-on but involve the buyer flipping the camera around to face the room. The format is popular among furniture stores like EQ3.

virtual showroom EQ3
Screenshots taken on the official EQ3 website

4. Better AR Hardware Options

Better AR hardware options, driven by innovations in mobile technology such as LiDAR and ToF depth sensors. Companies like Google, Microsoft, Lenovo, and Vuzix are developing smart glasses to enhance the AR experience.

5. AR Mirrors

AR mirrors assist in-store shoppers who either don’t want to try on various alternatives or can’t for whatever reason.

6. Gamifying

Gamifying in-store shopping to connect physical products with apps, creating a fun and interactive shopping experience.

2. VR-Enabled Online Shopping Experiences

VR creates an immersive visual environment, including 360-degree videos, photos, product demos, and complex experiences using devices such as the HTC Vive or Oculus Quest.

Unlike AR, VR is entirely simulated and disconnected from the physical world. VR can benefit businesses in various ways, such as:

  • virtual tours of showrooms and stores;
  • visualization of products;
  • greater user engagement;
  • increased consumer trust;
  • enhanced conversion rates;
  • better retention rates;
  • improved customer service.

However, it’s essential to remember the “shiny toy syndrome” and avoid it. Ensure that VR experiences align with your business goals and customer needs before opting for them. E-commerce stores can use VR for the following purposes:

  • virtual stores with virtual clothing racks, an opportunity to meet with friends and shop together online;
  • “try before you buy”;
  • in-store experiences;
  • live events;
  • interactive education.

3. Introducing AI Into AR and VR Solutions

Artificial intelligence can integrate with AR and VR technologies to revolutionize the shopping experience. AI-powered 3D representations of products in a user’s environment can increase conversions. Here is how AI can enhance virtual experiences:

1. Object Recognition

AR and VR experiences can adjust to the user’s movements and actions thanks to AI algorithms’ ability to detect and track things in real time.

2. Computer Vision

It involves image recognition and tracking, enabling the system to respond to the environment.

3. Natural Language Processing (NLP)

NLP lets people use voice commands to explore and interact with virtual worlds.

4. Predictive Analytics

As AI can predict user behavior, merchants can build personalized and proactive experiences.

5. Usage Analytics

AI can also help analyze usage data and client feedback. You can optimize your AR/VR services and boost buyer satisfaction based on the results.

6. Personalized Experiences

One way to employ customer insights is to tailor offers to shoppers’ tastes, which can boost satisfaction and sales.

4. Creating Digital Twins

The past year has seen an increase in AR and 3D technology use by fashion brands to boost sales and brand recognition in physical and virtual worlds. And in 2023, we can expect more brands to utilize AR innovatively.

This includes the ability to try on digital versions of physical clothing on your avatar. Another example is unlocking special effects for physical apparel. Some brands create digital-only looks that users can capture on camera and share on social media.

This trend is possible thanks to avatar platforms and AR features such as image targets and body tracking, which let brands offer and sell virtual goods. And with NFC (Near Field Communication) tags and QR codes embedded in physical apparel, one item (a T-shirt, for example) can be transformed into infinite designs.

5. Security Concerns Over the Usage of AR and VR

Consumers are increasingly concerned about privacy, security, and safety in computing. The metaverse, new headsets, and more AR and VR content have made safety a greater focus. Devices can now gather more information through eye and hand tracking. AR also relies on spatial data to immerse users. That’s why customers remain skeptical about using such devices daily.

How can people safely enjoy digital realities? We need new frameworks, regulations, and social contracts prioritizing safety. All these require collaboration through working groups, policy and standard discussions, and new software solutions for moderation and cyber threats.

Final Word

To sum up, AR and VR can enhance the e-commerce industry by improving the customer experience, driving more engagement, and cutting costs. But there are many challenges to overcome before these technologies can become mainstream.

For example, some websites are incompatible with VR headsets or AR apps because they were not built with those devices in mind. And not everyone owns a headset or smartphone capable of using these technologies.

That’s why e-commerce merchants should take advantage of these new opportunities so they don’t lose potential clients to incompatibility issues. As these technologies get better, more online stores will use AR and VR to give shoppers immersive shopping experiences. The future of e-commerce is exciting. And augmented and virtual reality are sure to play a significant role in shaping it.

Guest Post


About the Guest Author(s)

Art Malkovich

Art Malkovich is CEO and co-founder of Onilab, an e-commerce development company. He has about 10 years of experience in team management and web development. He is passionate about keeping up with recent technologies and working on innovative projects like headless commerce solutions and PWAs in particular.


Denny’s Celebrates Its 70th Anniversary With AR Food Menu That Enhances Dining Experience

While celebrating its 70th anniversary, Denny’s partnered with QReal to produce AR menus where food items seem to leap off the page. You don’t need to install the restaurant chain’s app for the AR food menu to work. Just activate your phone’s camera and launch 8th Wall‘s web-based AR platform from your browser to watch the images come alive.

AR food menu Denny's

Denny’s AR Food Menu: What to Expect

With the new AR food menu, you’ll see flames surround the classic Moons Over My Hammy egg sandwich and hear the new Mac N’ Brisket Sizzlin’ Skillet sizzle as it emerges from a barbecue smoker. Also making an appearance is a 3D model of the original diner in 1953—then known as Danny’s Donuts—before becoming the beloved establishment it is today.

Denny's AR food menu

Denny’s AR food menu, only accessible when dining at physical outlets across America, is part of Denny’s “It’s Diner Time” brand platform. The campaign also involves the remodeling of its kitchens, the rollout of improved food offerings, and the unveiling of new staff uniforms.

AR Food Menu: Denny’s Latest Foray Into AR

When Denny’s shared its 2022 results in February, CEO Kelli Valade said that one of the company’s strategic priorities is “to lead with technology and innovation.” She also mentioned that “Denny’s is skewing towards younger generations with Millennials and Gen Z currently representing about 45% of our customer base.” So, augmented reality makes perfect sense.

However, this is not the first time the company has tapped into the world of AR. The last time it used this type of computer-generated content was in late 2016, when the diner chain launched its “Shrek the Halls” campaign for the Christmas and New Year holidays. Using the DreamWorks COLOR app, the restaurant’s customers saw characters from Shrek, The Penguins of Madagascar, Puss in Boots, and Turbo Fast spring from the kids’ menus as their phones scanned the pages.

QReal and the Appeal of the AR Food Menu

QReal (formerly Kabaq.io) specializes in creating lifelike 3D and AR content for e-commerce platforms and social media campaigns. It serves various industries, from real estate and automotive to fashion and beauty. The company’s original passion, however, was food: in 2016, it became the first company to make photorealistic AR models of cuisine with its KabaQ AR Food Menu app.

“The traditional way people interact with menus is being transformed utilizing [AR and life-like 3D models], leading to an enhanced experience, strong branding, and potentially higher order throughput,” said Mike Cadoux, QReal’s General Manager.

Researchers from several universities who studied QReal’s AR food models attest that such presentations can improve “decision comfort” and “craveability,” spread positive feedback about products, and increase the desire for “higher-value” types of food. Because QReal’s app uses little post-production, users can see their order in advance from different angles in the most realistic way possible.

How the AR Food Menu Will Transform the Restaurant Industry 

If we are to believe Cadoux’s forecast, “high-fidelity digital cuisine” will only increase in demand due to its strong potential to boost branding and sales.

Businesses predict that AR food menus will enable customers to order more confidently because AR renders an item’s size and quantity more accurately. Another benefit of such transparency is less food waste.

Moreover, establishments can use AR to promote new products and enhance engagement with prospects and loyal clientele through behind-the-scenes tours, which can include how they prepare and cook food.


How Large Retail Brands Are Using Augmented Reality

Over 83 million people in the US alone used augmented reality on a monthly basis in 2020. By the end of 2023, it is projected that the number will grow by over 30%, to over 110 million people.

With the pandemic having accelerated the evolution of digital shopping, retail and e-commerce brands are looking for new ways to engage with their consumers and to bridge the online-offline experience gap that exists today while shopping.

How Big Brands Leverage Augmented Reality

Immersive AR experiences are increasingly being leveraged in stores to create memorable and personalized relationships between the brand and its customers. Through augmented reality, retailers can not only engage otherwise passive customers but also provide the context they need to make a decision, significantly improving the likelihood of a purchase.

Lego, for instance, used an augmented reality digital box in its stores: parents and kids could hold a physical box up in front of the screen and see different scenes being built and brought to life. This helped families find the right set and also proved to be a fun way to engage with consumers.

Other retailers use augmented reality to specifically drive sales for products that typically need the in-person context to make a buying decision. Houzz’s AR-powered app offers consumers the ability to view their rooms from their phone camera and ‘drop in’ true-to-scale 3D furniture items superimposed on their physical reality, for them to make a more informed buying decision.

Converse’s AR app lets consumers try shoes at home by simply pointing the camera at their feet. They can then evaluate multiple models with varying colors within minutes from the comfort of their home. The app is also integrated with their e-commerce platform, creating a seamless flow from discovery to intent to making the final purchase.

The Future of Retail Is 3D

While all these examples use AR in slightly different ways, they all have one thing in common: the buyer is at the center of the experience, and the camera has become the new home page. Replacing 2D images with interactive 3D products gives shoppers the visual context they need to be confident in their decisions.

The experience boosts consumers’ confidence, allowing them to make the right choice because AR provides the level of real-life context missing from a flat, 2D product image online. It’s a win-win for the customers and the retail brands, who experience a big increase in conversion rates and a lower product return rate by leveraging augmented reality.

Consumers are coming to expect this experience. Augmented reality adoption is following a similar pattern to mobile phone adoption of the 2000s. And as the mobile-first Gen Z cohort continually gains more buying power beyond the $360 billion they already have in disposable income, we will see large retailers transforming their traditional online and in-person shopping experiences into more immersive, 3D retail experiences to reshape online browsing and buying behavior as we know it.

Guest Post


About the Guest Author(s)

Aluru Sravanth

A technology enthusiast and a student for life, Sravanth started Avataar in 2014, with a vision to uncover untapped potential from the confluence of self-learning AI and computer vision.


AR and VR Content Creation Platform Fectar Integrates Ultraleap Hand Tracking

For users of the Fectar AR and VR content creation platform, creating XR content with hand tracking has just become simpler and easier.

Launched in 2020, Fectar is “the multi-sided platform that makes the metaverse accessible for everyone, everywhere.” Focused on creating AR and VR spaces for education, training, onboarding, events, and more, and aimed at non-technical users, the company provides a cross-platform, no-code AR/VR building tool.

Last week, Fectar integrated Ultraleap hand tracking into its AR and VR content creation platform, allowing users to build VR training experiences with hand tracking from the start.

AR and VR Content Creation With Integrated Ultraleap Hand Tracking

Ultraleap was founded in 2019 when Leap Motion was acquired by Ultrahaptics, and the two companies were rebranded under the new name. Ultraleap’s hand tracking and mid-air haptic technologies allow XR users to engage with the digital world naturally – with their hands, and without touchscreens, keypads, and controllers.

Thanks to the Ultraleap feature, Fectar’s users will now be able to create and share immersive VR experiences that use hands, rather than VR controllers. According to Ultraleap, this makes the interaction more intuitive, positively impacts the training outcomes, reduces the effort of adoption, and makes the experiences more accessible.

Non-Technical People Can Develop Immersive Experiences 

The new addition to the AR and VR content creation platform is a strategic decision for Fectar. The company’s target clients are non-technical content creators. They don’t need to know how to code to create VR apps and tools, including training programs.

This is, in fact, one of the most frequent use cases of the Fectar AR and VR content creation platform. “We want our customers to be able to create world-class VR training experiences,” said Fectar CTO and founder, Rens Lensvelt, in a press release. “By introducing Ultraleap hand tracking to our platform we’re giving them an opportunity to level up their programs by adding an intuitive interaction method.”

VR Programs and Tools – the Future of Collaborative Work and Training

Virtual reality content has expanded beyond games and entertainment applications. VR is part of education and training, medicine, business, banking, and practically any kind of work.

This is why an AR and VR content creation platform for non-technical users, like Fectar, is so successful. Companies worldwide want to create their own training and collaborative VR tools, without hiring developers.

“The combination of Ultraleap and Fectar provides people with the right tools they need to develop the best education or training programs – and makes it easy to do so. We already know that enterprise VR programs improve productivity by 32%,” said Matt Tullis, Ultraleap VP of XR. “By making that experience even more natural with hand tracking, Fectar customers can expect to see their VR training ROI increase even further.” 


Coach Partners With ZERO10 on AR Try-On Tech for Metaverse Fashion Week

The second edition of Metaverse Fashion Week (MVFW) is set to take place at the end of this month in Decentraland’s Luxury District, where global brands will feature their digital wearables. MVFW is a four-day-long event that combines fashion and AR try-on technology to offer a unique, immersive experience to attendees.

Metaverse Fashion Week 2023 -Arena

Metaverse Fashion Week, which will run from March 28–31 this year, will see the participation of luxury brand Coach for the first time, showcasing its products in the virtual show. The event brings together top designers and brands, making it an exciting opportunity for Coach to showcase its signature leather-made products in the metaverse.

ZERO10’s AR Try-On Tech Highlights Coach’s Iconic Tabby Bag

In collaboration with ZERO10, Coach will introduce its iconic Tabby bag with a unique AR enhancement as part of its upcoming activation during MVFW. The feature will be accessible via the ZERO10 app, allowing users in Decentraland to try on the product virtually, providing a new and engaging way to experience the brand.

COACH - Tabby bag
Source: Coach

The AR enhancement, which makes use of cutting-edge technology, adds a unique touch to the virtual fashion event and gives visitors a dynamic way to interact with Coach’s products. Using AR try-on, shoppers can virtually see how clothes, accessories, and even cosmetics might look on them before making a purchase.

As a global digital fashion platform, ZERO10 offers AR try-on technology to brands and independent creators. Through its iOS app, users can try on digital clothing in real time using their phone camera, collect items in a virtual wardrobe, and create shareable content for social media.

The digital collections are collaborations with both emerging and established fashion brands, designers, musicians, and artists and are released in limited drops within the app. The app’s cloth simulation technology simulates fabric flow, while the body tracking technology lets users try on virtual outfits for unique social media photos and videos.

Blending Tradition and Innovation

This year’s Metaverse Fashion Week theme, “Future Heritage,” encourages both traditional and emerging fashion designers to engage and work together. As part of the upcoming event, brands will conduct interactive virtual experiences both on and off the runway.

Dolce & Gabbana plans to exhibit pieces from its Future Reward digital design competition. Tommy Hilfiger intends to launch new wearables on a daily basis, along with products powered by artificial intelligence. DKNY will have a pop-up art gallery and restaurant called DKNY.3. Adidas, like Coach, will make its MVFW debut this year. For owners of its “Into the Metaverse” non-fungible token (NFT) collection, the sports brand will debut its first set of digital wearables.

Metaverse Fashion Week 2023 brands

Coach will also participate in Brand New Vision (BNV), a Web3 fashion ecosystem that enables attendees to try on wearables from various global brands seamlessly and instantly. BNV has built specially designed stations to showcase the digital clothing collections created in partnership with top brands such as Tommy Hilfiger, Carolina Herrera, Michael Kors, and Vivienne Tam. Moreover, a newly built “Fashion Plaza” will also exhibit emerging digital fashion possibilities.

MVFW Open Metaverses and Web3 Interoperability

Dr. Giovanna Graziosi Casimiro, Decentraland’s head of MVFW, remarked that they are honored to carry on the Metaverse Fashion Week tradition this year. “We are seeing the return of many luxury fashion houses, and also the emergence and elevation of digitally native fashion. We are excited to see the world’s greatest fashion minds engaging in digital fashion and exploring what it can mean for their brands, and for their communities,” she said.

This year’s MVFW will highlight the force of interoperability across open metaverses while expanding the boundaries of what digital fashion can be. MVFW23, organized by Decentraland and UNXD, is an immersive art and culture event, in association with the Spatial and OVER metaverses, that welcomes fashionistas from all over the globe to gather, mingle, and witness the most recent breakthroughs in digital fashion.

Fashion brands experimenting with virtual technologies like AR try-on is a testament to their commitment to staying at the forefront of the latest technology trends and providing their customers with unique and immersive experiences.


Get Ready to Battle in Space With Valo Motion’s Latest MR Game Release Astro Blade

Last year, Valo Motion launched ValoArena, a mixed reality playground, in the US. Now, the company is back with an exciting MR game release – Astro Blade.

The Finnish game company, known for designing and developing cutting-edge interactive mixed reality games, has recently released a new space-themed game which can now be played in ValoArenas across the country. This interstellar adventure in a galaxy far, far away is sure to give you and your friends a thrilling time. So, get ready to take part in an action-packed battle in space and become a virtual superhero.

Blast Into Space With Valo Motion’s MR Game Astro Blade

Step into the world of the company’s MR game Astro Blade, where players become virtual holograms fighting in the hangar of a futuristic spaceship. Players can arm themselves with laser swords or spears and protect themselves with shields.

The game is inspired by classic fighting games and space-themed classics like Star Wars lightsaber battles. But what sets it apart is the technology behind ValoArena that makes it possible to bring these classics to life in an entirely new way.

ValoArena’s mixed reality system allows players to fully immerse themselves in the game’s interstellar world, where they can battle against friends and foes alike. The technology accurately tracks players’ movements, making the experience incredibly lifelike. With stunning graphics, exciting sound effects, and interactive gameplay, Astro Blade is a unique experience.

A Social and Safe Game

Valo Motion’s MR game Astro Blade is designed for up to six people. It is aimed at 8-to-14-year-olds but can be enjoyed by young and old alike. Those who grew up wielding imaginary lightsabers as pretend Jedi will definitely love this game. The company has also paid special attention to making the game safe for younger players, which makes it great for families with kids of different ages.

Valo Motion - ValoArena MR game Astro Blade

The game is very social and interactive: the actions of other players directly affect what you should do next, making it an excellent addition to ValoArena’s game offerings for small groups. Valo Motion wants players to feel like superheroes in a fighting game and to come up with their own special moves without limitations.

Where and How to Play Astro Blade

Astro Blade can be found in various locations worldwide where Valo Motion products are available. You can check out their interactive map to find the nearest ValoArena to your location.

According to the company, “Astro Blade is also a part of Valo Motion’s mission of empowering people to move more and be physically active but also have a lot of fun while doing it. In Astro Blade the players use their entire body to play the game and an intense sword duel among friends is guaranteed to make them sweat!”

Astro Blade is designed to be an interactive and social game, so players can work together in teams or compete against each other. It’s a fun and exciting way to experience the latest in mixed reality gaming.

The Future of Gaming 

Astro Blade is a testament to the power of mixed reality and how it can bring classic gaming experiences to life in new and exciting ways. As the popularity of MR releases like Astro Blade continues to grow, we can expect to see them become an increasingly ubiquitous part of the gaming landscape.

Astro Blade MR game by Valo Motion - ValoArena

For players, MR offers a unique and interactive experience that allows them to socialize and have fun in a way that traditional gaming simply cannot match. And for amusement and entertainment centers, investing in MR technology can provide a competitive advantage by offering a cutting-edge gaming experience that attracts customers and keeps them coming back for more.

Looking to the future, it’s clear that MR is set to play an even bigger role in gaming. With advancements in technology, we can expect to see even more immersive and interactive experiences that blur the line between the real and virtual worlds.

So, whether you’re a gaming enthusiast or an amusement center owner looking to stay ahead of the curve, it’s clear that MR has its place in the future of gaming. It’s an exciting time to be a part of the gaming industry.


lark-optics-is-targeting-your-retinas-for-ar-without-nausea-and-other-sickness

Lark Optics is targeting your retinas for AR without nausea and other sickness

This story is syndicated from the premium edition of PreSeed Now, a newsletter that digs into the product, market, and founder story of UK-founded startups so you can understand how they fit into what’s happening in the wider world and startup ecosystem.

Whether you believe it’s the future of everything, or just a useful tool that will be part of the mix of tech we regularly use a few years from now, augmented reality is a rapidly developing field with one major drawback – like VR, it can leave you feeling sick.

For example, US soldiers who tried Microsoft’s HoloLens goggles last year suffered “‘mission-affecting physical impairments’ including headaches, eyestrain and nausea,” Bloomberg reported.

While the technology could “bring net economic benefits of $1.5 trillion by 2030” according to PwC, this sickness is a massive inhibitor to the growth of AR and VR.

One startup looking to tackle the problem is Cambridge-based Lark Optics, which has developed a way of bypassing the issues that cause these problems.

“In the real world, we perceive depth by our eyes rotating and focusing. Two different cues need to work in harmony. However, in all existing AR glasses, these cues fundamentally mismatch,” explains Lark Optics CEO Pawan Shrestha.

Having to focus on a ‘virtual screen’ in augmented reality glasses means users must switch focus between the real world and the augmented one. This depth mismatch causes physical discomfort such as nausea, dizziness, eyestrain, and headaches.

What Lark Optics does differently, Shrestha says, is it projects the augmented reality image onto the user’s retina. This means the AR is always in focus no matter what your eyes do to adjust to the real world around you.
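Shrestha’s point about mismatched cues can be made concrete with a little trigonometry. The sketch below is illustrative only: the 63 mm interpupillary distance, the 2 m fixed focal plane, and the 0.5 m apparent depth are assumed example values, not Lark Optics figures.

```python
import math

IPD_M = 0.063  # assumed interpupillary distance of 63 mm (illustrative)

def vergence_angle_deg(distance_m):
    """Angle between the two eyes' lines of sight when fixating a point."""
    return math.degrees(2 * math.atan((IPD_M / 2) / distance_m))

def accommodation_diopters(distance_m):
    """Focal demand on the eye's lens, in diopters (1/m)."""
    return 1.0 / distance_m

# Conventional AR glasses put the display at one fixed focal distance
# (2 m assumed here), while virtual content may be rendered to *appear*
# much closer, e.g. 0.5 m away.
fixed_focus_m = 2.0
apparent_depth_m = 0.5

vergence = vergence_angle_deg(apparent_depth_m)        # eyes rotate for 0.5 m
accommodation = accommodation_diopters(fixed_focus_m)  # lens focuses for 2 m
mismatch = accommodation_diopters(apparent_depth_m) - accommodation

print(f"vergence cue: {vergence:.1f} deg (for {apparent_depth_m} m)")
print(f"accommodation cue: {accommodation:.1f} D (for {fixed_focus_m} m)")
print(f"focal mismatch: {mismatch:.1f} D")  # a substantial mismatch
```

The two cues disagree: the eyes converge as if the object were at 0.5 m while the lens must stay focused at 2 m. Projecting the image onto the retina, as Lark Optics does, sidesteps the fixed focal plane entirely, which is why the image stays sharp regardless of where the eyes focus.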

So far the startup has developed a proof of concept and is now iterating to refine its demonstrator model. Shrestha says they conducted two successful user studies with their proof of concept; one in their own lab and another with an external partner he prefers not to name.

When the tech is ready, they want to use a fabless model for producing the components they design, which they will then sell to original equipment manufacturers who make AR headsets.

Given they’re addressing such a fundamental challenge to the mass adoption of AR, it’s unsurprising that other companies are tackling it in other ways (more on that below). But Shrestha says his startup’s approach is the most efficient in terms of processing power and battery power, and doesn’t affect the user’s field of vision.

Shrestha grew up in rural Nepal (“really rural… I was nearly nine years old before I saw electric lights”). He says his parents’ enthusiasm for his education eventually led him to New Zealand where he obtained a masters degree in Electronics Engineering from the University of Waikato.

Keen to develop technology he could commercialise, he says he developed an interferometer. While that venture didn’t work out, his work led him on to a PhD from the University of Cambridge, where he spotted the commercial potential of a new approach to AR displays.

“It was scientifically challenging, but it was also something that could touch the lives of many, many people,” he says.

Shrestha co-founded Lark Optics (which was previously known as AR-X Photonics) with his friend Xin Chang, and Daping Chu, who previously oversaw the PhD work of Shrestha and Chang. The trio have been working together for around a decade but only got started with Lark Optics in earnest last year.

Shrestha says this week they have been joined by a new recruit, Andreas Georgiou, who previously worked at Microsoft as a principal researcher in the field of optical engineering.

The Lark Optics team (L-R): Weijie Wu, Dr Pawan Kumar Shrestha, Professor Daping Chu, Dr Andreas Georgiou, Dr Xin Chang

Perhaps unsurprisingly, Shrestha says being based in Cambridge is a big benefit to them, with a community of experienced advisers around them, and access to relevant investors. He is particularly inspired by the progress made by Micro LED tech startup Porotech, which has raised a total of $26.1 million to date.

And Shrestha has warm words for the Royal Academy of Engineering’s Enterprise Fellowship, of which he is a part. This provides up to £75,000 in equity-free funding to cover salary and business costs, along with mentoring, training and coaching. This was what allowed him to get started on developing Lark Optics as a business.

Lark Optics itself raised a pre-seed round of £210,000 in October last year, Shrestha says, and will be raising a seed round in Q2 this year.

As mentioned above, others are tackling the problem of AR sickness in different ways. LetinAR uses a ‘pin mirror’ method, Kura Technologies has developed a ‘structured geometric waveguide eyepiece’, while VividQ “compute[s] holograms in real-time on low power devices and integrate[s] them with off-the-shelf display hardware.” 

Another company, SeeReal develops holography-based solutions to address depth issues in 3D displays.

But Shrestha says these rival technologies either require a very high level of data throughput, with a related computational and battery power overhead, or require very high resolution displays. And while some techniques decouple the AR display from the real world like Lark Optics does, Shrestha says they are “like looking through a chicken fence.

“We solved the problem without getting a significant penalty on processing power or battery power, or artefacts. So that’s why I think our approach is the best.”

Lark Optics’ ambition is to become established as the best optics for AR, VR, and mixed reality glasses.

“We want to realise the full potential of AR and VR. Now we have AR and VR you can wear for 20 minutes or 30 minutes. We want to make it feel as natural to look at real objects, VR, or AR, and allow people to use it for all-day, everyday use.”

Shrestha sees the biggest challenge to achieving this as recruiting the right people in what is quite a specialised field. But he’s optimistic that attracting just one or two high-level people will end up attracting more, and the endorsement of a good seed round in the coming months won’t hurt either.

AR, VR, and MR have been massively hyped in recent years, but there have been questions over how much of a future the technology has. Investor disquiet over Meta’s huge spending in the ‘metaverse’ space, and Microsoft’s job cuts in its HoloLens division as it struggles to turn it into a viable business, show that there’s no straight line from here to a future where this tech is widely used.

But that said, the public markets’ current jitters over stock prices and tech company spending aren’t the end for AR, VR, and MR at all. Apple’s first headset is on the horizon, which will no doubt spin up another wave of interest in the space (although the latest report says it’s been delayed two months, until June).

If technology like Lark Optics’ can help prepare AR, VR, and MR for the mainstream, the startup could be well positioned to reap the rewards.



the-2023-polys-webxr-awards-recap

The 2023 Polys WebXR Awards Recap

The third Annual Polys WebXR Awards took place this weekend. The show was bigger than ever thanks to the first-ever in-person awards and a special event saying farewell to AltspaceVR. However, despite some new categories, the overall category list was shorter this year as a number of previous awards were combined.

A Very Special Polys

The Polys launched during the height of the pandemic. Fortunately, a remote format does little to hinder an event that’s already dedicated to WebXR.

The event took place in a bespoke AltspaceVR world, with watch parties on YouTube as well as other remote platforms. This time, however, people were able to get together in person, though they did it in a very “metaverse” way.

The Polys 2023 WebXR Awards

In-person hosts, producers, presenters, and an audience gathered at ZeroSpace, an XR stage and motion capture studio in Brooklyn. Their actions on the stage were volumetrically captured and displayed in The Polys’ AltspaceVR environment, similar to the launch of Microsoft Mesh. Polys Director Ginna Lambert said that this was the first award show to use the technology.

Further, while winners and honorees had previously received their Polys Awards as NFTs, the team worked with Looking Glass Factory so that this year’s Polys can be presented in a physical frame. This is as physical as The Polys can get, seeing as Linda Ricci designed the award to defy physics.

A Funeral for AltspaceVR

In lieu of a half-time show, Big Rock Creative CEO, co-founder, and producer Athena Demos held a eulogy for AltspaceVR. Virtual attendees lined the aisle to a pulpit adorned with flowers and candles in a ceremony that was heartfelt and a little macabre. Following mourners down the aisle was a coffin containing one of the iconic robot avatars that AltspaceVR used at launch.

“AltspaceVR will always hold a special place in our hearts,” said Demos. “While we say goodbye to the platform that brought us together, we will always remember the connections that we made here.”

AltspaceVR funeral - The 2023 Polys WebXR Awards

While the WebXR team has used AltspaceVR to host The Polys Awards and numerous other town hall events and summits over the last three years, Demos and her team have been using it to bring Burning Man into virtual spaces. There is also a farewell party scheduled by Big Rock Creative to last until the moment that AltspaceVR servers shut down later this week.

The Polys Awards

Where last year’s Polys saw 15 awards categories (not counting personal honors of Lifetime Achievement, Ombudsperson of the Year, and the Community Award), this year’s show had eight categories. That includes some new categories reflecting the advancement of immersive technology even over the last few months.

“We in this community are ahead of a massive shift that we call the fourth industrial revolution,” said host Julie Smithson. “We’re here to celebrate the progress made in WebXR in the year of 2022.”

Julie Smithson at The Polys WebXR Awards

Entertainment Experience of the Year

When popular culture looks at “the metaverse” they typically equate it with irresponsible escapism – something that people use to avoid the challenges of life. XR producer and director Kiira Benzing pointed out that positive escapism – using XR to take a break from life rather than to neglect it – is one of the medium’s greatest strengths.

“With the immersive medium, you get the opportunity to step into an experience,” Benzing said in presenting the award for Entertainment Experience of the Year.

The award went to Project Flowerbed, an immersive gardening experience by the Meta WebXR team. The same project was nominated for Experience of the Year.

Innovator of the Year

Futurewei Technologies Senior Director for VR, Metaverse, Mobile, Apps, and Services Daniel Ljunggren presented the award for Innovator of the Year – previously “Innovation of the Year.” The award went to Sean Mann, CEO and co-founder of RP1, a “persistent, seamless, real-time platform with limitless scalability.”

“To be amongst this many pioneers and innovators in one space is amazing. I think we’re all winners,” said Mann. “I’m super excited to be a part of this.”

Developer of the Year

“Being on the frontier of the immersive web is a pioneering effort,” Yinch Yeap said in presenting this award. “It still feels like the Wild West.”

And, like in the Wild West, many of the biggest names are pseudonyms. This is certainly the case for this year’s winner, known only as “Jin.” Jin appeared as a similarly anonymized avatar to accept the award.

“I am a huge believer in WebXR,” said Jin. “I stand on the shoulders of giants. I am very humbled and I owe this to everyone building the immersive web.”

Game of the Year Award

“Game of the Year” is a broad category as most WebXR experiences are arguably “games” – and that’s what makes the award so important according to presenter Rik Cabanier, a software engineer at Meta. The award went to the mini golf game Above Par-Adowski by Paradowski Creative.

Above Par-Adowski VR game

Accepting the award was Paradowski Creative Director of Emerging Technology James Kane, who called WebXR “the best expression of the metaverse there is.” Kane was also a nominee for Innovator of the Year.

“I want to thank our team,” said Kane. “And thanks to the Meta team for creating an amazing WebXR platform as well as for directly supporting us.”

AR Passthrough Experience of the Year

“Where, for the past years AR experiences were mainly relegated to phones, now passthrough devices are everywhere,” said presenter Lucas Rizzotto. This allows more passthrough experiences on devices available today, but it also allows more impactful development of experiences for future AR devices.

The award went to Spatial Fusion by PHORIA and Meta, an experience which sees players repairing a damaged spaceship. Ben Ferns, a consulting developer, was one of those accepting the award.

“Huge thanks to the entire team – it was a huge team effort,” said Ferns. “It’s just exciting to see the promise of WebXR and passthrough.”

WebXR Platform of the Year

In presenting the award for WebXR Platform of the Year, Prestidge Group founder and CEO Briar Prestidge pointed out that every WebXR platform has strengths and weaknesses – something that she learned a lot about while famously spending “48 hours in the metaverse” for a documentary.

The award went to Croquet, “the operating system of the metaverse,” which also took home the Startup Pitch Competition Auggie Award last year. The award was accepted by The Polys on behalf of the organization.

Education Experience of the Year

The “digital divide” describes the gap in access that arises when the benefits of technology are available only to those who can afford the required hardware or software. WebXR is vital to the future of education because it lowers the cost of access to immersive experiences, according to Silicon Harlem founder Clayton Banks in presenting this award.

Banks presented the award to Prehistoric Domain, an immersive tour that brings learners up close and personal with virtual representations of dinosaurs and other extinct species. Accepting the award was creator Benjamin Dupuy. Prehistoric Domain was also nominated for Experience of the Year.

“WebXR opens so many possibilities – it’s very exciting,” said Dupuy in accepting the award. “We are all pioneers of the immersive web here and I think we’re at the beginning of an era where the line between illusion and reality is very thin.”

Experience of the Year

Demos returned to the stage – this time in volumetric capture instead of in her AltspaceVR avatar – to present the award for Experience of the Year to Spatial Fusion.

This was the experience’s second win of the night. The experience was also a nominee for Entertainment Experience of the Year. Ferns returned to accept the award and pointed out that the code has been open-sourced.

“I’m really excited to see what other people do with this now that it’s freely accessible,” said Ferns. “It’s an exciting time for trying out all of these new UX opportunities.”

This Year’s Honorees

In addition to the nominated awards categories, there are three honors categories. The honoree in each category is named by the previous year’s recipient rather than by a panel of judges.

Community Honor

Last year’s community honoree Trevor Flowers named Evo Heyning for this year, specifically for her work with the XR Guild, the Open Metaverse Interoperability Group, and [email protected].

“Whether it’s exploring AR, exploring 3D objects and NERFs, exploring interoperability of avatars and [email protected] specifically, being a part of these experiences with [Sophia Moshasha], with Ben [Irwin], with Julie [Smithson], with everyone – it’s meant so much to me,” Heyning said in accepting the honor.

Ombudsperson of the Year

The Ombudsperson of the Year Honor is specifically set up to recognize people working on the social and human aspects of WebXR. Last year’s honoree, Avi Bar-Zeev, said that he was “honored to hand off the title” to Brittan Heller, a lawyer who introduced the term “biometric psychography” to describe mental and emotional profiling through an XR user’s personal data.

Brittan Heller at The Polys WebXR Awards

“I’d like to thank Avi, Kent [Bye], and everyone at the XR Guild and the Virtual World Society, and everyone in the XR community,” said Heller. “I appreciate how everyone here is so involved in making the community so welcoming to everyone.”

Bye, referenced by Heller in her acceptance speech, is a leading XR ethicist, a strong speaker in the nascent field of biometric psychography, and the first-ever recipient of this award.

Lifetime Achievement Honor

Last year’s Lifetime Achievement Honoree Brandon Jones selected Patrick Cozzi for this year’s honor. Cozzi is the CEO of Cesium, but he was selected for this award because of his work co-creating glTF as a contributor to the Khronos Group.

Patrick Cozzi at The Polys WebXR Awards

“I’m really honored for glTF and the community,” said Cozzi. “It was a grassroots effort for years.”

Looking Forward to the Future

This was the last year that The Polys WebXR awards will be hosted in AltspaceVR, but the team is still looking forward to next year’s event. While they haven’t yet said what platform (or platforms) it will take place on, there’s a full year to figure that out. And a year is a long time in this industry. If you missed this year’s ceremony, you can find the recording here.


awe-2023-is-right-around-the-corner

AWE 2023 Is Right Around the Corner

Augmented World Expo, AWE for short, returns to Santa Clara this year from May 31 to June 2, 2023. The agenda is still coming together but there’s already a lot to be excited about. Let’s take a look.

Morning Keynotes

Many XR companies save some of their biggest announcements for the AWE stage. Even when companies aren’t dropping new products, apps, and services, they use the time to inform and inspire listeners about this rapidly developing space.

Day One

The first day of AWE always starts with an opening keynote from event founder Ori Inbar. Inbar’s addresses are always insightful and digestible with good measures of his palpable enthusiasm and humor. During his opening keynote last year, Inbar spoke about how XR can help make both big dreams and small dreams become reality.

Next up is the Qualcomm keynote from Vice President and General Manager of XR Hugo Swart. At his keynote last year, Swart presented Snapdragon Spaces and introduced the first two recipients of funding from Qualcomm Ventures’ metaverse fund.

Then, Nreal CEO Chi Xu takes the stage. Nreal hasn’t been a keynote presenter in the years that ARPost has covered AWE. But, the company is definitely going places. This year saw the commercial launch of Nreal Air (review) and we know that they have at least one more model waiting in the wings for the next big launch.

Day Two

Day two only has one proper keynote scheduled, this time with Magic Leap. Last year, the company’s Head of Product Management, Jade Meskill, took the stage to talk about the Magic Leap 2 and “augmented enterprise.” We don’t yet know what will come of this year’s keynote but it’s being given by the company’s CEO Peggy Johnson.

Following that is a “Fireside Chat” with Unity CEO John Riccitiello. That it’s a “fireside chat” and not a “keynote” arguably suggests that there won’t be any big product announcements, but that doesn’t mean this session shouldn’t be on your schedule.

Days two and three are lighter on heavy-hitting speakers to encourage attendees to check out the expo floor, which we’ll look at next. Don’t worry though, there are sessions to look forward to beyond just keynotes and we’ll look at some of those later.

The Expo Floor

It’s impossible to know exactly what will be going on on the expo floor, which is part of what makes it so exciting. A list of exhibitors (over 130 of them) and a map of the expo floor are posted on the AWE website, but what companies will be showcasing and how is a mystery until the floor opens on day two.

First off, a number of haptics pioneers will be there, including HaptX, bHaptics, and SenseGlove. Any immersive technology is better when you experience it yourself instead of just seeing it on YouTube, but this is doubly true for haptics. Unfortunately, many of these products are still hard for the average person to get their hands on, which makes the expo floor a great introduction.

Mojo Vision will also be on the AWE expo floor. While this company isn’t likely to be putting their AR contact lenses onto the eyeballs of just anybody, they do have rigs that allow you to get a glimpse through what they’re building.

DigiLens, Vuzix, and Lenovo will also be on the AWE Expo floor. These companies make components and enterprise hardware that’s usually a cut above available consumer models. Trying them out can be a glimpse into the future. I got to get my hands on some of their hardware at last year’s expo and left feeling enlightened.

Also, Tilt Five will be returning. Last year, their augmented game board was the life of the expo floor, drawing huge crowds – not just to interact with the product but to watch other people interact with it.

Of course, that’s only a sliver of the total exhibitors. Personally, I’m hoping to reconnect with some of my friends from Avatour, Echo3D, FundamentalVR, Inworld AI, Leia Inc., Mytaverse, OVR Technology, VRdirect, and Zappar.

Expert Talks and Panel Discussions

Day One

On day one, right after the keynotes, many will likely stay in their seats to see Forbes columnist, author, and educator Charlie Fink talk with Magic Leap founder and former CEO Rony Abovitz about “How We Can Invigorate XR.” A few hours later on the same stage, Qualcomm Director of Product Management Steve Lukas will talk about “Building AR for Today.”

A little after that, one might head out of the Mission City Ballroom to Grand Ballroom C’s “Web3” track, where EndeavorXR founder and CEO Amy Peck will be debating “Pros & Cons of Web3” with XR Guild President Avi Bar-Zeev. It’s hard to find an XR organization that Peck isn’t or hasn’t been involved with, and Bar-Zeev co-created Google Earth and HoloLens.

From there, one might head back to the Mission City Ballroom for “Intersection of AI and the Metaverse: What’s Next?” a panel discussion with leading XR ethicist Kent Bye, HTC VIVE China President Alvin Graylin, WXR Fund Managing Partner Amy LaMeyer, and Creative Artist Agency’s Chief Metaverse Officer Joanna Popper.

But wait! Happening at the same time is “How XR Technology Is Changing the Fashion Landscape” with Beyond Creative Technologist David Robustelli, Ready Player Me co-founder Kaspar Tiri, and DressX co-founder Daria Shapovalova.

Depending on which of those last two talks you see, you might have time for “What Problem Does the Metaverse Solve?” with Nokia Head of Ecosystem and Trend Scouting Leslie Shannon.

If you miss the first fashion session, you can always catch “Redefining Fashion and Beauty’s Next Decade – From Virtual Beings and Gaming to Generative AI” with LVMH VP of Digital Innovation Nelly Mensah, 5th Column founder and CEO Akbar Hamid, and Journey founder and Chief Metaverse Officer Cathy Hackl.

Day Two

On the same day that the expo opens up, on the main stage, Paramount Pictures Futurist Ted Schilowitz presents “XR Excellence: Demonstration & Discussion” – billed as a collection of “what he thinks are the best experiences in VR and MR today, and what we can learn from those experiences” followed by Q&A.

But oh no! At the same time in Ballroom D, Khronos Group President Neil Trevett, XRSI founder and CEO Kavya Pearlman, and Moor Insights & Strategy Senior Analyst Anshel Sag are talking about building open standards for the metaverse!


Both of those events conflict with a “Meet the Makers” session featuring Julie Smithson and Karen Alexander of MetaVRse, Sophia Moshasha of the VR/AR Association, and Ben Erwin of The Polys Awards.

Later in the afternoon, Inworld AI’s Chief Creative Officer John Gaeta and Chief Product Officer Kylan Gibbs debut a new concept demo called “Origins” – a new kind of caper in which a human detective must navigate a world of generative AI bots.

The evening of AWE Day Two is also The Auggie Awards. We can’t tell you too much about the Auggie Awards because the finalists aren’t out. In fact, you still have until April 7 to submit nominees. Then, there’s a period of public voting until May 4. You can submit nominees and vote for your favorites here.

Day Three

On day three, in the “AI and Virtual Beings” track, producer, director, and strategist Rebecca Evans, Stanford University Graduate Research Fellow Eugy Han, Odeon Theatrical CEO Stephanie Riggs, and Dulce Dotcom advisor Dulce Baerga will discuss “Avatars, Environments & Self Expression – from Social VR to Cross-Reality Experiences.”

From there, you might head back to the Mission City Ballroom for a Fireside Chat with Tom Furness, the founder and chairman of the Virtual World Society – one of the oldest and noblest organizations in immersive tech.

AWE concludes on the afternoon of day three with Inbar’s closing statements and the Best In Show Awards on the main stage.

How to Attend AWE

Once again, all AWE recordings will become available on AWE.live. If you want to experience AWE in person, you still have time to get tickets. If you’re reading this before February 28, you still have time for Super Early Bird Tickets. You can also get 20% off of your ticket price by using discount code 23ARPOSTD at checkout.

And keep an eye on ARPost as AWE draws nearer. As a media partner of the event, we’ll be giving two free tickets to selected readers as part of an upcoming drawing. Watch our social media channels for details.


qualcomm-partners-with-7-major-telecoms-to-advance-smartphone-tethered-ar-glasses

Qualcomm Partners with 7 Major Telecoms to Advance Smartphone-tethered AR Glasses

Qualcomm announced at Mobile World Congress (MWC) today that it’s partnering with seven global telecommunication companies in preparation for the next generation of AR glasses, which are set to work directly with the user’s smartphone.

Partners include CMCC, Deutsche Telekom, KDDI Corporation, NTT QONOQ, T-Mobile, Telefonica, and Vodafone, which are said to currently be working with Qualcomm on new XR devices, experiences, and developer initiatives, including Qualcomm’s Snapdragon Spaces XR developer platform.

Qualcomm announced Snapdragon Spaces in late 2021, a software toolkit focused on performance and low-power devices that lets developers create head-worn AR experiences from the ground up or add head-worn AR to existing smartphone apps.

Qualcomm and Japan’s KDDI Corporation also announced a multi-year collaboration which it says will focus on the expansion of XR use cases and creation of a developer program in Japan.

Meanwhile, Qualcomm says OEMs are designing “a new wave of devices for operators and beyond” such as the newly unveiled Xiaomi Wireless AR Glass Discovery Edition, OPPO’s new Mixed Reality device and OnePlus 11 5G smartphone.

At least in Xiaomi’s case, its Wireless AR Glass headset streams data from compatible smartphones, effectively offloading computation to the smartphone. The company’s 126 g headset boasts wireless latency as low as 3 ms between the smartphone and the glasses, and a full-link latency as low as 50 ms, which is comparable to wired solutions.


“the-bear-who-touched-the-northern-lights”-is-a-charming-ar-story-puzzle

“The Bear Who Touched the Northern Lights” Is a Charming AR Story Puzzle

When a polar bear sees the northern lights for the first time, he wants to reach out and touch them. How will he get there and who will he meet along the way? That’s up to you with this charming interactive AR story puzzle.

The Bear Who Touched the Northern Lights

“The Bear Who Touched the Northern Lights” is a sort of choose-your-own-adventure AR story for children where the “chapters” are physical puzzle pieces. The artwork and story are by Julie Puech and Karl Kim.

The ways in which these pieces fit together (or don’t) help the AR story keep a logical narrative. However, pieces can be added, removed, or swapped out, resulting in multiple possible tellings of the tale.

Of course, the adorable puzzle doesn’t tell the whole story. The puzzle pieces come to life with the help of a free AR mobile app for Apple and Android devices. The app recognizes the pieces and animates their artwork, as well as cueing an audio narration by Kasey Miracle.

As a weary old XR veteran with a cold little heart, I sometimes find it helpful to recruit fresh eyes for product reviews – like when my younger brother provided his insights for my Nreal Air review. This time I recruited the help of my fiancée’s eight-year-old daughter.

What’s in the Box?

The puzzle comes with 15 AR story cards and an instructional booklet. The instructional booklet has information about the product, links to the app, and some advice for doing the puzzle for the first time – but don’t panic if you lose it. The puzzle information and a QR code to the app are both on the outside of the box and the first puzzle piece triggers an AR guide to using the app.

AR app - The Bear Who Touched the Northern Lights - AR Story Puzzle

The free app, powered by Unity, opens with a quick warning about being aware of your surroundings while using AR and encourages you to supervise children when using the app. From there, the app only has a play button and a settings button. Settings include background dimming to make the animations stand out better, or an option to turn the animations off.

Do be aware that the app is 394 MB and does require a fairly modern device to run. Like any AR app, it requires the use of your camera while the app is running.

Following Directions

Some pieces have special icons on them. Cards with a blue “+” are optional chapters that don’t have to be included in the AR story. Cards with green and orange arrows can be swapped out for one another, changing how the story unfolds.

The play guide recommends that you remove the optional chapters and two of the interchangeable chapters the first time that the puzzle is constructed. This is presumably an introductory version of the puzzle to avoid throwing too much at first-time players.

As with any puzzle, it’s important to find a flat surface large enough for the completed puzzle. The play guide recommends a space of two feet by three-and-a-half feet. The AR story puzzle is long and narrow, particularly with all of the possible pieces in play, but it has some curves in its overall shape, so it isn’t just a straight line.

AR app - The Bear Who Touched The Northern Lights

The AR instructions at the beginning of the puzzle remind you that you also need to have space to sit comfortably with the puzzle in front of you for about 20 minutes (give or take). After all, the play guide also recommends additional activities like asking the child to try to construct the story from the puzzle before watching the narration.

Putting the Pieces Together

The first time putting the puzzle together, we followed the play guide’s advice to remove extra pieces and one set of interchangeable chapters. The shapes of the pieces are similar enough to make it a little challenging for young hands to assemble without it being frustrating. They’re also different enough that the story can’t be constructed in an order that wouldn’t make sense.

It only took a few minutes to assemble the puzzle for the first time, and then we fired up the app. The AR instructions are short, cute, and very informative, telling us everything we needed to know without being boring. It takes the app a second or so to recognize the cards, so moving from one chapter to the next is neither seamlessly fast nor frustratingly slow.

The Bear Who Touched the Northern Lights - AR Story Puzzle

The animations are cute and colorful, and the effects are simply but beautifully done. The default background dimming on the app is 35%, and it certainly worked. Turning it up can make the background disappear completely, which makes for optimal viewing quality but also makes it harder to find the pieces in the camera. Pick whichever setting you like best.

At one point in the story, the bear starts receiving items for his journey. The Child got to choose which items he used when, but only one item was ever needed in the story, and selecting the wrong item isn’t penalized – you just pick again. We were split on this. It’s nice that we couldn’t pick wrong, but picking at all felt kind of unnecessary. (This made more sense later on.)

We reached the end of the AR story. Sort of. Immediately upon finishing the puzzle and the story the first time, The Child asked to do the puzzle again with the extra chapters.

Putting the Pieces Together Again

We added the two optional AR story pieces, swapped out both of the interchangeable pieces, and put the puzzle together again. Suddenly, the choices made a much bigger difference and a lot more sense.

The interchangeable pieces provide the bear with a different item and see him use it in a different way. The additional chapters introduce new characters, whom the bear befriends by using the different items. This gave The Child a new appreciation for the AR story, but it gave me a new appreciation for the AR app.

Doing the puzzle the first time, one would be forgiven for assuming that the chapters are stand-alone pieces that don’t affect one another. Doing the puzzle again makes it clear that the app is telling a new story each time based on the pieces, their placement, and your choices throughout the story.

AR Story Puzzle - The Bear Who Touched the Northern Lights

We’ve only done the puzzle those two times so far. I haven’t done the math to figure out how many different versions of the story are possible with different choices, pieces, and arrangements, but I know that there are a lot of versions of the story that we have yet to hear.
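For the curious, a rough back-of-the-envelope count is easy to sketch. The review doesn’t specify the exact numbers, so the figures below are assumptions for illustration only: two optional chapters that can each be included or left out, and two interchangeable slots that each take one of two cards. That alone gives a surprising number of structural variants, before even counting the item choices made during the story.

```python
# Hypothetical count of structural story variants.
# Assumed (not confirmed by the review): 2 optional chapters,
# each either in or out, and 2 interchangeable slots, each
# filled by one of 2 cards.
optional_chapters = 2      # each chapter: included or excluded -> 2 states
interchangeable_slots = 2  # each slot: one of 2 cards -> 2 states

variants = (2 ** optional_chapters) * (2 ** interchangeable_slots)
print(variants)  # 16 structural variants under these assumptions
```

And that 16 is a floor, not a ceiling: multiplying in the branching item choices The Child made mid-story would push the total of distinct tellings higher still.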

And that’s a good thing. As soon as we finished doing the puzzle the second time, The Child immediately asked if there were any more AR story puzzles like this one.

Where to Find the AR Story Puzzle

So far, The Bear Who Touched the Northern Lights is the only product by Red+Blue Stories (but we’re hopeful for more). The company is based in Canada but also ships to the US. Prices start at around US$34, but you can pay more for different shipping options. As of this writing, the AR story puzzle is not available from other online retailers like Amazon.

The AR instructions say that a child can use the product by themselves after the first go-around. That may be true, but if you’re letting your child construct this AR story puzzle without you, you’re missing out.
