

How XR Fan Engagement Brings Fans Closer to the Game

Over the years, ARPost has covered the physical nature of XR in athletics and sports a number of times – from how athletes use XR to improve their game, to how gamers can use VR sports to stay fit, to how thrilling and active a good AR team game can be for players and spectators alike. XR is also increasingly being used in another capacity: fan engagement.

Is AR the Future of Fan Engagement?

Athletes are usually sports fans, but are sports fans usually athletes? This article isn’t about how XR can make a sports viewer into a finely-tuned machine, or how a sports viewer can become a star in their own right through things like esports. After all, not all sports fans want to do those sorts of things.

However, it’s probably fair to say that all sports fans want to feel closer to the athletes and teams that they follow. That doesn’t mean getting onto the field, but it might mean getting out of the stands. Sports teams and property managers are increasingly using XR for sports fan engagement to let fans get closer to their passion, if not closer to the action.

In-Arena Opportunities for CBJ Fans

In January, the NHL’s Columbus Blue Jackets unveiled “The Fan Zone” at their home venue, Nationwide Arena, in partnership with MVP Interactive. Followers of ARPost might remember that MVP Interactive also appeared in our 2021 article about how and why brand engagement is driving XR development.

“The Blue Jackets are one of the few sports organizations taking the lead to bring fans the latest in cutting-edge technology with first-ever immersive experiences to their arena,” MVP Interactive CEO James Giglio said in a release shared with ARPost. “Our team was honored to work with everyone at CBJ to bring technology forward with multi-generational experiences to their Fan Zone.”

Slapshot Challenge 3 - The Columbus Blue Jackets - fan engagement

The 4,000-square-foot space overlooks the team’s practice area and includes a number of XR experiences, as well as the eSports Lounge for CBJ Gaming, the team’s official esports arm. As exciting a development as esports is in the general gaming world, we’re most interested in the XR fan engagement activations.

“With the upgraded space and technology advancement of our new Fan Zone, we hope to provide a world-class experience for fans of all ages,” Blue Jackets Vice President of Marketing Ryan Chenault said in the release.

XR in the Fan Zone

In the “Slapshot Challenge” fan engagement activation, fans choose between three game modes, including “Shots on Goalie,” which pits their skills against a virtual goaltender. Fans use a real stick and ball while sensors track their movements to replicate an on-ice experience in a space reminiscent of a CAVE VR system.

Slapshot Challenge - AR fan engagement - The Columbus Blue Jackets

The “Goalie Challenge” flips the scenario, both figuratively and physically. In full goalie gear, the fan now faces the screen where a virtual contender appears to launch physical balls their way. While the goalie in the slapshot challenge is entirely automated, the placement of balls fired off in the goalie challenge can be controlled by a friend via a computer interface.

“The Blue Jackets are dedicated to removing barriers to the game of hockey and investment in this space is a meaningful nod to this mission,” said Chenault. “By providing both stick-in-hand and controller-in-hand activations, we can give fans an opportunity to not only watch the game but experience it first-hand.” 

Slapshot Challenge 2 - The Columbus Blue Jackets - fan engagement

There are less intense fan engagement opportunities as well. A “Pose with a Pro MorphingStation” gives fans an opportunity to take a selfie next to a virtual replica of their favorite Blue Jackets. A similar activation allows fans to pose in a virtual Blue Jackets jersey. All of these activations reward the fans with videos and images optimized for social media.

Pose with a pro - The Columbus Blue Jackets - XR fan engagement

Implementation and Stats

On entering the Fan Zone, fans have the opportunity to check in by scanning a QR code and providing an email address to receive their videos and photographs. According to figures provided to ARPost following the launch of the activation, over 1,200 fans entered the Fan Zone on opening night and 375 provided emails to receive their digital mementos.

Further, the “average dwell time across experiences was 24.55 seconds.” This may not seem like a long time, but it is averaged across all of the fan engagement experiences; the challenges likely engaged fans for significantly longer than the AR photo opportunities.
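To see how a single overall average can understate time spent in the challenges, here is a minimal sketch. The per-experience dwell times and visit counts below are hypothetical, invented for illustration; they are not figures from the Blue Jackets or MVP Interactive. A few long challenge sessions get diluted by many quick photo stops.

```python
# Hypothetical numbers only: (average dwell seconds, visit count) per activation.
visits = {
    "Slapshot Challenge": (60.0, 200),
    "Goalie Challenge": (55.0, 150),
    "Pose with a Pro": (8.0, 500),
    "Virtual Jersey": (7.0, 450),
}

# The overall average is weighted by visit count, so the brief but
# popular photo activations pull it well below the challenge dwell times.
total_time = sum(seconds * count for seconds, count in visits.values())
total_visits = sum(count for _, count in visits.values())
overall = total_time / total_visits

print(f"Overall average dwell time: {overall:.2f} seconds")  # → 21.08
```

With these made-up inputs, the overall mean lands around 21 seconds even though both challenges average nearly a minute, which is the effect the reported 24.55-second figure may reflect.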

NIL in AR

The “Pose with a Pro” fan activation presented by the Blue Jackets shows that there is a lot of promise in sports fan engagement with virtual replicas of their favorite athletes. That isn’t just limited to professional sports, however.

College sports are tremendously popular, but their athletes were, to some degree, barred from benefiting from that popularity for most of the history of college sports. That’s because college athletes were largely prevented from profiting from their name, image, and likeness (NIL) by the NCAA, the organization that governs college sports.

However, in 2021, the NCAA began loosening NIL rules, opening up potentially lucrative opportunities for college athletes. AR publishing platform LDP Studio claims to be part of the first “NILAR” (name, image, and likeness in augmented reality) agreement. The signee? University of Tennessee senior tight end Jacob Warren, representing the Craven Wings restaurant chain.

“We believe AR Hero will change the way college football fans experience the game by engaging more people with the players they know and love,” LDP Studio VP of business development Jessee Black said in a release shared with ARPost. “It’s a really cool and futuristic new concept for QR code use which increases engagement for businesses and brings fun to the fans.” 

NILAR Jacob Warren - fan engagement

AR Hero, the tool that runs the experience, invites users to trigger the fan engagement activation via a QR code. From there, fans can take photos and videos with an AR version of Warren that cycles through different poses, giving fans plenty of photo opportunities.

“With AR Hero, fans can feel like they are part of the action and experience the players they know and love in a whole new way,” said Black. “Businesses have the opportunity to create more engagement with fans through ‘NILAR’ as well.”

The First NILAR Agreement?

It’s easy to be skeptical of whether this fan engagement initiative is really the first NILAR agreement. It is very probably the first NILAR agreement in college sports, and it just might be the first of its kind anywhere, as LDP Studio claims.

Digital twins of celebrities aren’t brand new. However, the ownership of these twins has long been problematic. The owner of a digital twin is usually the studio that commissioned it, rather than the individual that the twin is created from.

NILAR agreements with athletes as well as other individuals have huge potential to give individuals more control over their own digital twins. That’s a big win for those individuals from an economic standpoint, but it’s also a good idea from an ethical perspective.

Getting Sports Fans Out of Their Seats

With good AR fan engagement, everybody wins. Fans get more interactive ways to engage with their favorite content and athletes. Athletes can have an AR proxy that’s available to fans while they’re busy training, on the field, or at home. Teams get new ways to bring fans deeper into the sports that they love (and, yes, collect some much-cherished user data).

The good news keeps getting better. XR fan engagement activations are becoming simpler to use, more interactive, and are even being created in ways that are more mindful of the humans that lend their digital duplicates to these activations.



Apple AR Glasses Put on Hold to Make Way for MR Glasses – VR and Metaverse Expert Weighs In

Has Apple bitten off more than it can chew? It appears that the long-awaited AR glasses won’t be hitting the shelves any time soon. A recent Bloomberg article says that the Apple AR glasses are facing technical challenges, so their release has been delayed indefinitely and the project’s scope pared back. The report also revealed that Apple may instead opt to release a more affordable mixed reality headset.

Emma Ridderstad, CEO and Co-founder of Warpin Reality, shares her insights on the delayed release of the Apple AR glasses and the development of its mixed reality headset, probably to be called Reality Pro. She also shares her thoughts on what these developments mean for the industry, the consumers, and the future of AR/VR.

Apple AR Glasses Shelved to Make Way for an MR Headset

For a couple of years now, Apple has been developing AR glasses that resemble real eyeglasses. The design has already gone through several iterations but still, apparently, fails to meet expectations. While it is unclear where the real problem lies, it is clear that we won’t be seeing through the Apple AR glasses this year.

According to Bloomberg, what we may see soon is an MR headset that combines virtual and augmented reality elements. It was reported that Apple is shifting its focus toward developing a bulkier but less complicated MR headset with a projected price tag of $3,000. The company then plans to follow this with a more affordable version priced at around $1,500, closer to the Meta Quest Pro, though still above it.

A Wise Move by Apple

When asked whether the delay of the Apple AR glasses will affect businesses that have already adopted the technology, Ridderstad said she believes it will have little impact. Aside from the limited number of businesses currently using Apple’s AR technology, those that have adopted it are not fully reliant on it.

According to Ridderstad, AR/VR technology is still in its infancy. As immersive as these headsets are, they aren’t very convenient. The use cases are still quite limited, and the high cost of both hardware and software can be restrictive. “VR headsets need to become useful to people. Right now, they solve business-to-business problems but they’re still mostly just fun for the end consumer,” Ridderstad explained. So, Apple’s shift from AR glasses to MR headsets makes sense given the broader need to make immersive technology more accessible and affordable.

Ridderstad also believes that Apple will remain a key player in the industry, despite delays on its AR glasses. Consumers continue to trust Apple to produce well-researched and designed products. Considering the price, design, and content of these headsets, the market needs to see more affordable and functional headsets. “Since most people are just starting to see what these new technologies can do, we have to remind ourselves that this evolution is going to take time,” she said. “The real end consumer adoption will probably happen with Apple this time too.”

The True Value of XR Goes Beyond Gaming and Entertainment

XR technology has long been associated with gaming. But Ridderstad argues that the true value of XR lies in its potential in business, training, and education.

Her company, Warpin Reality, has developed a platform called Xelevate, which allows companies to launch customizable VR training courses for their employees. These courses range from safety drills to customer experience simulations and personality development workshops. Platforms like this have allowed construction companies to train their people on safety and equipment use and taught employees what to do during emergencies.

Ridderstad believes that VR/AR can optimize focus, learning, and training. She cites a PwC study that found that VR learners are more focused, learn more quickly, and are more emotionally engaged than e-learners. The technology could also create remote work opportunities for those who struggle with in-person demands, such as people with disabilities.

Diversity and Accessibility in Tech 

For years, the tech industry has been known to be a boys’ club. This still remains true in the metaverse. A McKinsey report found that in organizations shaping metaverse standards, 90% of leadership roles are held by men. Ridderstad warns, “The metaverse is not going to be an environment that people want to be in unless everyone feels welcome and comfortable. I think it is safe to say that unless women play their part in building the metaverse, and take their place among its architects, it won’t be.”

These technologies have the potential to revolutionize the future, so it’s important that they are designed for both men and women to see a higher level of adoption.



Digital Fashion Week New York: Reimagining the Future of Phygital Fashion

The 2023 Digital Fashion Week New York was a three-day hub for phygital fashion experiences. Merging the physical and digital fashion worlds, the Web3 event provided audiences with immersive phygital fashion experiences as well as informative discussions and networking opportunities, and acted as a digital venue for independent designers around the world to showcase their designs.

In a press release shared with ARPost, the Digital Fashion Week NY team expected to see a variety of attendees, including industry specialists, tech CEOs, fashion industry executives, investors, designers, and artists, among other guests.

Phygital Fashion Took Center Stage

The three-day event kicked off on Thursday, February 9, with a Networking and Speaker Summit, which featured global panel discussions on the role of AI in transforming design protocols. On Friday, February 10, the event hosted the opening of the Metaverse Fashion Experience where attendees could explore virtual worlds through digital avatars. These digital avatars donned custom special drops from digital fashion week design winners.

Aside from the opening of the Metaverse Exhibition, there were also global panel discussions. Some of the members of the panel and speakers during the panel discussions included ZERO10’s Chief Product Officer Maxim Raykhrud, Exclusible’s Chief Commercial Officer Olivier Moingeon, and Sensorium’s Deputy CEO Sasha Tityanko.

During the last day of the Digital Fashion Week NY, Saturday, February 11, attendees could experience an array of immersive digital experiences, which combined physical and digital assets. These phygital fashion experiences included holograms, virtual showrooms, and animation screenings from some of the world’s leading artists and fashion designers who work within the Web3 space.

ZERO10 at the 2023 Digital Fashion Week NY

The AR fashion platform ZERO10 also showcased activations, alongside LODE. Through these phygital fashion experiences, attendees were able to learn more about how modern technology could play a role in transforming digital fashion and how this, in turn, could give them a new channel for self-expression.

Aside from activations, ZERO10 also showcased recreations of five designs from independent phygital fashion designers, in particular pieces from Private Policy’s Fall/Winter ‘23 collection, transformed into augmented reality.

ZERO10 designs on Digital Fashion Week NYC 2023

Private Policy, a New York-based inclusive fashion brand, debuted its F/W ‘23 collection, entitled “We Are All Animals,” a celebration of the interconnectedness of all the earth’s living beings. Pieces from the collection featured graphic designs of endangered and critically endangered species, such as the Amur Leopard and the Yangtze Finless Porpoise.

The collection combines utilitarian features, such as harnesses and tactile pockets, with sustainable materials. For instance, components like slanted checker pieces are made using reclaimed or unwanted denim garments, while statement outerwear designs are crafted using recycled poly faux fur.

Aside from promoting sustainability through the collection’s materials, Private Policy’s newest collection also distills the beauty and vigor of nature, combining it with elements that give it an urban edge.

Of the five designs, two had physical representations that were showcased at the show in New York City on February 11. These physical representations will also be made available for viewing in London on February 18. The Digital Fashion Week London runs from Friday, February 17 to Saturday, February 18.

ZERO10 at Digital Fashion Week NYC 2023 - The Marine Explorer

As for the other three designs, they will remain digital and will be made available for NYFW 2023 guests to try on. Furthermore, Private Policy’s collaboration with ZERO10 will enable fans to virtually try on pieces from the collection via the ZERO10 app.

Through ZERO10’s integration, attendees and fashion enthusiasts have a new way of interacting and learning more about immersive phygital fashion through augmented reality.

Aside from ZERO10 and LODE, some of the other artists, designers, and brands present during the Digital Fashion Week NY included Anastasia Sladkova, Clo B, DOPE GLOBAL, Maya ES, MOS Brand, Schieva x Tokyo White, Tony Murray, and Zoha Khan.

Digital Fashion Week - phygital fashion - Design by Maya ES
Fashion by Maya ES



Ray Tracing Comes to Snap Lens Studio

One of the most powerful recent breakthroughs in graphics and rendering is coming to mobile AR thanks to a recent update to Snap’s Lens Studio. We’re talking about ray tracing.

What Is Ray Tracing?

Ray tracing is a rendering technique that helps to bring digital assets to life in the environment around them – whether that environment is digital or viewed in augmented reality. Recent examples in gaming include convincingly reflective surfaces like water, believable dynamic shadows, and improved light effects.
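At its core, ray tracing works by firing a ray from the camera through each pixel and testing what it hits in the scene. The sketch below is a deliberately minimal illustration of that idea, tracing rays against a single sphere and printing a tiny ASCII "render"; it is not how Snap, or any production renderer, actually implements the feature (real ray tracers add bounces, materials, and lighting on top of this intersection test).

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance along the ray to the nearest sphere hit, or None.

    Solves |origin + t*direction - center|^2 = radius^2 for the smallest t >= 0.
    """
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    a = dx * dx + dy * dy + dz * dz
    b = 2 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2 * a)
    return t if t >= 0 else None

# Trace one ray per pixel of a tiny 3x3 "screen" toward a sphere at z=5.
sphere_center, sphere_radius = (0.0, 0.0, 5.0), 1.0
image = []
for y in (1, 0, -1):
    row = ""
    for x in (-1, 0, 1):
        # Ray from a camera at the origin through the pixel on the z=1 plane.
        d = (x * 0.5, y * 0.5, 1.0)
        hit = ray_sphere_hit((0, 0, 0), d, sphere_center, sphere_radius)
        row += "#" if hit is not None else "."
    image.append(row)

print("\n".join(image))  # only the center ray hits: ".#." in the middle row
```

Reflections and shadows come from repeating this test: a reflected ray is spawned from the hit point, and a shadow ray is traced toward each light. Multiplying that work across millions of pixels per frame is why the technique is so compute-heavy.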

The technique can be fairly compute-heavy, which can be a problem depending on the program and how it is accessed. For example, when some existing games are updated to use ray tracing, users accessing those games on an older or less fully-featured computer or console may have to turn the feature off to avoid problematic latency.

Fortunately, ray tracing is being developed at the same time as new computing and connectivity methods like cloud and edge computing. These advancements allow the heavy lifting of advanced computing techniques to take place off of the device, allowing older or less fully-featured devices to run more high-level experiences smoothly.

While the Snap releases detailing the update didn’t mention Lens Cloud, it’s likely that that service is behind it. Announced at the 2022 Snap Partner Summit, where ray tracing was also announced for the first time, Lens Cloud provides improved off-device storage and compute, among other advancements.

The Road to Lens Studio

If you closely follow Snap, you’ve known for almost a year that this was coming. Snap also discussed ray tracing at the fifth annual Lens Fest in December. There we learned that the update has been in the hands of select developers for a while now, and they’ve been working with Snap partners to create experiences pioneering the software.

The news announced yesterday is that the feature is now in Lens Studio, meaning that any Lens creator can use it. We also have a new demonstration of the technology: a Lens created with Snap partner Tiffany & Co.

Snap ray tracing - Tiffany & Co

The company has likely been so involved in the development and showcasing of Snap’s ray tracing at least in part because the jewelry that the company is known for provides both a great challenge for and an excellent demonstration of the technology. However, Snap is already looking forward to the feature finding other use cases.

“Now, Lenses that feature AR diamond jewelry, clothing and so much more can reach ultra-realistic quality,” Snap said in the announcement.

The principal use case presented by Snap in the announcement is virtual try-on for clothing retail, like the Tiffany & Co. Lens. However, it is likely only a matter of time before the new feature finds its way into other kinds of AR experiences as well.

What’s Next?

Ray tracing is likely to be a topic yet again at the upcoming Snap Partner Summit in April, and ARPost will be there to hear about it. The online event doesn’t have the same energy as Lens Fest but as we saw here, the Partner Summit is often the first look at Snap’s developing software offerings. We always look forward to seeing what they’ll roll out next.



Blippar Expands Blippbuilder Support to AR Glasses Under New CEO

Blippar has long offered its AR creation tool Blippbuilder, which recently moved to a “freemium” pricing model. Naturally, the tool was built around smartphones, which is how most people still experience AR. However, with the increasing prevalence of AR-enabled headsets, the company is expanding the tool’s availability.

To learn more about Blippbuilder on headsets, the company’s long-term strategy, and the effects of other Blippar developments, ARPost met the company’s new CEO, Preet Prasannan.

Meet New CEO Preet Prasannan

Prasannan is Blippar’s new CEO, but he isn’t new to the company. He discovered Blippar almost ten years ago, while working at DreamWorks, when his manager left to join the AR company.

“At the time, I got very excited about what Blippar was doing in AR,” said Prasannan. “To be honest, I didn’t even know what AR was.”

Prasannan worked at Blippar for a time before leaving to found his own startup. He was still working on that project when Blippar ran into problems and ultimately entered administration. Prasannan returned to Blippar and, serving as Chief Technical Officer, was instrumental in its return as “Blippar 2.0.”

“Blippar was like family to me, so I reached out, we started speaking,” said Prasannan. “I realized that there was an opportunity to bring Blippar back to life.”

Prasannan was the CTO throughout the tenure of CEO Faisal Galaria, who recently stepped down. This offered another opportunity for Prasannan to step up.

“In December, when Faisal decided to part ways with us, we decided it would be good if I was up for it,” said Prasannan. “This is my family.”

Blippbuilder Comes to Next-Gen Hardware

The first big move under Prasannan’s leadership is bringing Blippbuilder compatibility to AR glasses. While AR on a head-mounted display and AR on a handheld display might sound similar, there were some initial hurdles.

“To be frank, it was a bit of heavy lifting when we started on headsets. The first one,” said Prasannan. “The first headset that we supported took us six months and the next headset that we supported took us 48 hours.”

The first two headsets were Magic Leap and Meta Quest Pro. While some things are being ironed out before the next selection of compatible AR headsets is announced, Prasannan says that the company can now achieve compatibility with new headsets essentially as fast as they are produced. That’s good, because new headsets are being produced a lot more regularly these days.

Blippbuilder for AR glasses - Meta Quest Pro

“For the next generation of AR, we have to have devices that feel natural,” said Prasannan. “It becomes a natural way of seeing and visualizing AR content.”

This isn’t just a way of future-proofing Blippar. It’s also a way to advance AR as a field worth buying into.

“If you have amazing, exciting content and a tool that creates content easily, why would you not want to buy that headset?” asked Prasannan.

A Growing Ecosystem

The announcement is exciting in another way as well: the sort of experiences being created with Blippbuilder, particularly since it became free to use. The move has also been positive for Blippar, of course.

“We had tens of thousands of users more than usual joining us,” said Prasannan. “It seems like we took the right step when we went in that direction.”

So, who are all of those new users? Naturally, they don’t all fit into one basket, but Prasannan said that there have been a lot of educational experiences created.

“We saw a very interesting solar system being created by one of our users,” said Prasannan. “I was actually showing it to the kids in my family and the feedback was immediate.”

There has been a long-standing chasm in the promising field of educational XR. The sum is that educators don’t typically know how to build experiences and experience builders don’t typically know how to educate. Blippbuilder’s free, no-code, increasingly versatile authoring tool is helping to bridge that gap.

“One of the driving factors of switching to the freemium model was to encourage creativity in all of our users,” said Prasannan. “Right now, Blippbuilder is free so anyone can create an account and publish projects.”

More to Come From Blippar

There are more big things coming from Blippar, as a “new iteration of Blippbuilder” is scheduled to release as a beta toward the end of Q1 of this year. The tool will “make developers and technologists out of anyone who wants to” because “technology should make things more simple, not more complicated.”



Assisted Reality: The Other AR

“AR” stands for “augmented reality,” right? Almost always. However, there is another “AR”: assisted reality. The term is almost exclusively used in industry applications, and it isn’t necessarily mutually exclusive with augmented reality. There are usually some subtle differences.

Isn’t Augmented Reality Tricky Enough?

“AR” can already be confusing, particularly given its proximity to “mixed reality.” When ARPost describes something as “mixed reality” it means that digital elements and physical objects and environments can interact with one another.

This includes hand tracking beyond simple menus. If you’re able to pick something up, for example, that counts as mixed reality. In augmented reality, you might be able to do something like position an object on a table, or see a character in your environment, but you can’t realistically interact with them and they can’t realistically interact with anything else.

So, What Is “Assisted Reality?”

Assisted reality involves having a hands-free, heads-up digital display that doesn’t interact with the environment or the environment’s occupants. It might recognize the environment to do things like generate heatmaps, or incorporate data from a digital twin, but the priority is information rather than interaction.

The camera on the outside of an assisted reality device might show the frontline worker’s view to a remote expert. It might also identify information on packaging like barcodes to instruct the frontline worker how to execute an action or where to bring a package. This kind of use case is sometimes called “data snacking” – it provides just enough information exactly when needed.

Sometimes, assisted reality isn’t even that interactive. It might be used to do things like support remote instruction by enabling video calls or displaying workflows.

Part of the objective of these devices is arguably to avoid interaction with digital elements and with the device itself. As they are used in enterprise, wearers often need their hands for completing tasks rather than for working an AR device or even gesturing with one.

These less technologically ambitious use cases also require a lot less compute power and a significantly smaller display. This means that they can occupy a much smaller form factor than augmented reality or mixed reality glasses. This makes them lighter, more durable, easier to integrate into personal protective equipment, and easier to power for a full shift.

Where It Gets Tricky

One of the most popular uses for augmented reality, both in industry and in current consumer applications, is virtual screens. In consumer applications, these are usually media viewers for doing things like watching videos or even playing games.

However, in enterprise applications, virtual screens might be used for expanding a virtual desktop by displaying email, text documents, and other productivity tools. This is arguably an assisted reality rather than an augmented reality use case because the digital elements are working over the physical environment rather than working with it or in it.

In fact, some people in the augmented reality industry refer to these devices as “viewers” rather than “augmented reality glasses.” This isn’t necessarily fair: while some devices are primarily used as viewers, they also have augmented reality applications and interactions, with Nreal Air (review) being a prime example. Still, virtually all assisted reality devices are largely “viewers.”

Nreal Air - Hands-on Review - Jon
Jon wearing Nreal Air

Words, Words, Words

All of these terms can feel overwhelming, particularly when the lines between one definition and another aren’t always straight and clear. However, emerging technology has emerging use cases and naturally has an emerging vocabulary. Terms like “assisted reality” might not always be with us, but they can help us stay on the same page in these early days.



CES 2023 Highlights Featuring News and Innovations From Canon, MICLEDI, and NVIDIA

CES is considered the world’s tech event, showcasing groundbreaking technologies and innovations from some of the world’s biggest brands, developers, manufacturers, and suppliers of consumer technology. At CES 2023, attendees saw the unveiling of the latest developments from over 3,200 exhibitors, including technology companies Canon, MICLEDI, and NVIDIA.

Canon Immersive Movie Experience and Immersive Calling Experience

Canon USA partnered with filmmaker and director M. Night Shyamalan (The Sixth Sense, The Village, and Signs) to create an immersive movie experience for CES 2023 attendees. Featuring Shyamalan’s upcoming film Knock at the Cabin (in theaters February 3), the experience showcased Kokomo, Canon’s immersive software that leverages VR to give users a lifelike calling experience.

Canon Kokomo - CES 2023
Kokomo

With Kokomo, users can now connect with their friends and family as if they’re there in person by using a compatible VR headset and smartphone. In a 3D call, Kokomo will emulate a photo-real environment and mirror the physical appearance of the user. CES 2023 participants were able to witness Kokomo in action at the Canon booth, where they were able to have a one-on-one Kokomo conversation with select characters from the movie Knock at the Cabin.

Aside from Kokomo, Canon also unveiled its Free Viewpoint Video System, which creates point-cloud-based 3D models for more immersive viewing experiences in larger areas like arenas and stadiums. At CES 2023, attendees were able to experience the Free Viewpoint Video System, which allowed them to watch an action scene from Knock at the Cabin from multiple viewpoints.

CES 2023 attendees also had the opportunity to see Canon’s mixed reality system MREAL in action, by experiencing a scene from Knock at the Cabin as if they were a character in the movie.

Canon MREAL X1 headset
MREAL X1

MICLEDI Demonstrates New Red µLEDs at CES 2023

MICLEDI Microdisplays, a technology company developing microLED displays for the augmented reality market, also showcased its advancements in microLED display tech for AR glasses at CES 2023.

At the event, the company demonstrated its new red microLEDs on AlInGaP starting material. This development is in line with MICLEDI's aim to create high-performance, single-color microLEDs that can be combined in the company's full-color microLED display module.

Through MICLEDI’s innovations in microLED technology, users can begin to experience clearer and more precise digital images via AR glasses that are more portable and lightweight. The red AlInGaP microLEDs, along with MICLEDI’s three-panel full-color microLED display module, are poised to raise the standards of AR glasses in the coming years.

MICLEDI - Red GaN and Red AlInGaP microLED displays - CES 2023

“There is no one-size-fits-all solution for AR glasses,” said MICLEDI CEO, Sean Lord. “This achievement, with our previously announced blue, green, and red GaN µLEDs, opens the door to a broader offering of display module performance parameters which enables MICLEDI to serve customers developing AR glasses from medium to high resolution and medium to high brightness.”

Demonstration units of both Red GaN and Red AlInGaP were shown at the company’s booth at CES 2023.

NVIDIA Announces New Products and Innovations at CES 2023

NVIDIA announced new developments and NVIDIA Omniverse capabilities at CES 2023. The tech company, which is known for designing and building GPUs, unveiled its new GeForce RTX GPUs, which come with a host of new features found in NVIDIA’s new Studio laptops and GeForce RTX 4070 Ti graphics cards. The new series of laptops gives artists, creators, and gamers access to more powerful solutions and AI tools that help them create 2D and 3D content faster.

NVIDIA also shared new developments to its Omniverse, including AI add-ons for Blender, access to new and free USD assets, and an update to NVIDIA Canvas that will be available for download in the future.

Aside from these updates, the company also released a major update to Omniverse Enterprise, which gives users access to enhancements for developing and operating more accurate virtual worlds. The update is also set to expand the Omniverse’s capabilities through features such as new connectors, Omniverse Cloud, and Omniverse DeepSearch. More new partners, including Dentsu International, Zaha Hadid Architects, and Mercedes-Benz, are planning to use NVIDIA Omniverse to streamline their workflows and operations.

NVIDIA Omniverse ACE - CES 2023
NVIDIA Omniverse ACE

Moreover, this January, NVIDIA opened its early-access program for NVIDIA Omniverse Avatar Cloud Engine (ACE), allowing developers and teams to build interactive avatars and virtual assistants at scale.

Demos of VITURE One XR Glasses and Mobile Dock

Aside from these established tech companies, VITURE, a new XR startup that received accolades from CES, TIME, and Fast Company for its flagship product, the VITURE One XR glasses, also prepared something interesting for CES 2023 attendees.

VITURE One XR glasses and Mobile Dock
VITURE One XR glasses and Mobile Dock

The company made both its VITURE One XR glasses (compatible with Steam Deck, laptops, and PCs) and its Mobile Dock (which adds co-op play and Nintendo Switch compatibility) available for testing.


ontop-studios-wants-to-bring-theaters-back-to-life-with-xr-esports

ONTOP Studios Wants to Bring Theaters Back to Life With XR Esports

Have you ever seen an empty theater and thought that the space had more potential? It doesn’t even have to be an abandoned theater, just a theater with more rooms than it regularly uses, or a theater that isn’t showing movies at all hours of the day. What if those theaters could be used for, say, XR esports? That’s the idea that Nuno Folhadela is exploring with ONTOP Studios.

Meet ONTOP Studios

ONTOP Studios makes AR experiences, filters, and games. Its mission is “turning the world into your playground” through augmented reality. The studio makes independent projects but also works with an impressive list of partners including Vodafone, Samsung, and Snapchat.

ONTOP’s most recent venture, ARcade Sports, involves turning empty theaters into XR esports playgrounds through its social AR games. The idea didn’t come about because Folhadela, the studio’s founder, has anything against movies.

“My background is in cinema, but I’ve been working in games,” Folhadela said in a video call with ARPost. “My interest is always to bring stories into the real world… Going to the cinema isn’t just about movies, it’s about an experience.”

So, why change that experience? The answer, as so many answers do these days, has to do with trends that were already underway before the pandemic caused them to explode.

“After the pandemic, movies really got hit hard. We realized that gaming is what the younger audience is going for,” said Folhadela.

From Movies to Games to Esports

Games are more interactive than movies, but they’re also more social, and both of these elements of storytelling are drawing younger people away from conventional forms of linear narratives, according to Folhadela. But, that’s not the end of the story. Games are more interactive and social than movies, and AR is a more immersive medium than 2D games.

“As a player, [AR] brings everything that we view on a screen into the real world,” said Folhadela. “All of these adventures that you have, are confined onto a flat screen. Now, you can bring all of your adventures with you.”

All of this talk about young audiences doesn’t mean that ONTOP is only interested in kids. Like VR arcades, ONTOP’s theater arenas appeal to visitors of all ages, including entire families.

ONTOP Studios - esports - ARcade Sports - game Morgana

“One man said that it felt like the first time playing with his kids – he was used to them sitting and playing Fortnite and him sitting and watching them,” said Folhadela.

Further, AR gaming can involve a lot of movement. This makes things more exciting for the players, but it also opens up a whole new level of attraction for spectators. At a time when streaming video game playthroughs is already popular, making gaming more human brings a lot of promise by making esports a lot more sporty.

“You really see the players running around so when you see a good player it’s seeing a good athlete. It’s bringing those worlds together,” said Folhadela. “It’s taking the ‘e’ out of ‘esports.’”

Buying Tickets and Paying Bills

So, how does ARcade Sports work? Folhadela describes the esports platform as a “b2b2c” (business to business to consumer) model. ONTOP Studios develops the content and maintains the companion app. Content is then licensed to property managers who promote the availability in their area. Content can even be modified to fit different areas or different business licensees.

“ARcade Sports is a platform, it’s not a game. We are always adding new games and new features,” said Folhadela.

A ticket to play in the XR-enabled esports facility includes a QR code. Scanning the QR code with the companion app lets players enter the same session. The app tracks the players’ performance in the game including their activity levels. Games are tiered based on difficulty, so beginners aren’t left out and veteran gamers don’t get bored.
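The ticket flow described above lends itself to a simple illustration. The sketch below is purely hypothetical (the QR payload format, class names, and in-memory registry are assumptions for illustration, not ONTOP's actual API), but it shows the basic idea: every scan of the same ticket code resolves to one shared session that the app then tracks players against.

```python
from dataclasses import dataclass, field

@dataclass
class Session:
    """One shared game session that all ticket holders join."""
    session_id: str
    players: list = field(default_factory=list)

class SessionRegistry:
    """Hypothetical stand-in for a session service behind the companion app."""
    def __init__(self):
        self._sessions = {}

    def join(self, qr_payload: str, player_name: str) -> Session:
        # Assume the ticket's QR code encodes a session identifier,
        # e.g. "arcade://session/ABC123" (format is an assumption).
        session_id = qr_payload.rsplit("/", 1)[-1]
        # First scan creates the session; later scans reuse it.
        session = self._sessions.setdefault(session_id, Session(session_id))
        session.players.append(player_name)
        return session

registry = SessionRegistry()
s1 = registry.join("arcade://session/ABC123", "Ana")
s2 = registry.join("arcade://session/ABC123", "Rui")
assert s1 is s2  # both scans of the same ticket land in one shared session
```

From here, per-player stats such as activity levels would be recorded against the shared session object, which is what allows difficulty tiers and performance tracking per game.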

esports - ARcade Sports - Morgana game

Right now, ARcade Sports is only available at select locations in Portugal. That’s set to change.

“We launched the games locally to understand the mechanics … for many players, this was their first experience with AR,” said Folhadela. “Now that this is at the right moment, we are hoping to expand to the US this year.”

Coming Soon to a Theater Near You

During our call, Folhadela displayed a number of experimental social features that aren’t yet ready to be fully integrated into the platform. However, hopefully, by the time that ONTOP Studios brings its unique brand of XR esports to the US – ideally this summer – there will be even more to keep gamers entertained, whether they’re playing or watching.


another-ces-2023-gem:-next-gen-z-lens-waveguide-technology-by-lumus

Another CES 2023 Gem: Next-Gen Z-Lens Waveguide Technology by Lumus

Lumus has recently launched its Z-Lens AR architecture, which can help with the development of more compact AR glasses in the near future, thanks to efforts that reduced its micro-projector’s size by 50%.

Making its debut at the Consumer Electronics Show (CES) 2023, the new Z-Lens—which builds on the company’s Maximus 2D reflective waveguide technology—can be fitted with prescription lenses.

Lumus’ Waveguide Technology

According to the company, Lumus is currently the only brand that produces waveguides for outdoor use. Its luminance efficiency is 10 times better than that of its competitors, and its design allows for a “true white” background and color uniformity. Moreover, the battery efficiency of its micro-projector is 10 times better than that of other waveguide systems on the market.

The structure of the new Z-Lens gives manufacturers more options for positioning the aperture, the opening where light passes through. Lumus CEO, Ari Grobman, expressed optimism that this flexibility can lead to the creation of less bulky and more “natural-looking” AR eyewear.

“In order for AR glasses to penetrate the consumer market in a meaningful way, they need to be impressive both functionally and aesthetically,” said Grobman in a press release shared with ARPost. “With Z-Lens, we’re aligning form and function, eliminating barriers of entry for the industry, and paving the way for widespread consumer adoption.”

Z-Lens 2D Image Expansion

In AR glasses, the lenses that use Z-Lens reflective waveguides will serve as the “screen” onto which a tiny projector would display the AR image. Lumus’s lenses consist of waveguides or a series of cascading partially reflective mirrors. These mirrors are responsible for 2D expansion, widening the projected image horizontally and vertically.

Lumus Z-Lens new waveguide technology

Maximus’ patented waveguides reflect the light from the projector twice before it reaches the user’s eye. The mini-projector, which is hidden in the temple of the eyeglass frame, has two components: a microdisplay that produces the virtual image, and a collimator that beams the light waves into the waveguide. The mirrors then reflect the light out of the waveguide and into the user’s eyes.

“Our introduction of Maximus 2D reflective waveguide technology two years ago was just the beginning,” said Grobman. “Z-Lens, with all of its improvements unlocks the future of augmented reality that consumers are eagerly waiting for.”

New Z-Lens Standout Features

Lumus’s second-generation Z-Lens boasts a lightweight projector with 2K-by-2K full-color resolution and 3K-nit/watt brightness. The latter allows users to enjoy AR viewing in daylight or outdoors. Other AR lenses on the market rely on sunglass-type tinting to keep virtual images visible; the absence of dark tints on Z-Lens lets others see the user’s eyes as if they were wearing regular eyeglasses.

The first prototypes of Z-Lens have a 50-degree field of view (FOV). However, the company’s goal is to reach at least 80 degrees FOV in the future.

Z-Lens waveguide technology - Lumus

Here are the other qualities of the Maximus successor:

  • Eliminates ambient light artifacts, the small light glares on the optical display that typically occur in AR eyewear.
  • Offers dynamic focal lens integration, which eases vergence-accommodation conflict (VAC). VAC can make images blurry because virtual objects appear closer to the eyes than their actual distance.
  • Allows direct bonding of optical elements for prescription glasses.
  • Provides more privacy through light leakage control. Third parties can’t view the displays seen by the wearer, and users don’t draw attention because Z-Lens doesn’t produce any “eye glow.”

“The Future Is Looking Up”

Waveguides already have practical applications in the military and medical professions, particularly among air force pilots and spinal surgeons. Lumus believes these wearable displays can someday overtake mobile phone screens and laptop monitors as hands-free communication tools.

“AR glasses are poised to transform our society,” Grobman said. “They feature better ergonomics than smartphones, novel interaction opportunities with various environments and businesses, and a much more seamless experience than handheld devices. The future, quite literally, is looking up.”


digilens-announces-argo-–-its-first-mass-market-product

DigiLens Announces ARGO – Its First Mass Market Product

DigiLens has been making groundbreaking components for a while now. And, last spring, the company released a developers kit – the Design v1. The company has now announced its first made-to-ship product, the ARGO.

A Look at the ARGO

DigiLens is calling ARGO “the future of wearable computing” and “the first purpose-built stand-alone AR/XR device designed for enterprise and industrial-lite workers.” That is to say that the device features a 3D-compatible binocular display, inside-out tracking, and numerous other features that have not widely made their way into the enterprise world in a usable form factor.

ARGO AR glasses by DigiLens

“ARGO will open up the next generation of mobile computing and voice and be the first true AR device to be deployed at mass scale,” DigiLens CEO, Chris Pickett, said in a release shared with ARPost. “By helping people connect and collaborate in the real – not merely virtual – world, ARGO will deliver productivity gains across sectors and improve people’s lives.”

Naturally, ARGO is built around DigiLens crystal waveguide technology, resulting in an outdoor-bright display with minimal eye glow and a compact footprint. The glasses also run on a Qualcomm Snapdragon XR2 chip.

Dual tracking cameras enable the device’s spatial computing, while a 48 MP camera captures records of the real world through photography and live or recorded video. An antenna on each temple of the glasses ensures uninterrupted connectivity through Wi-Fi and Bluetooth.

Voice commands can be picked up even in loud environments thanks to five microphones. The glasses also work via gaze control and a simple but durable wheel and push-button input in the frames themselves.

The DigiLens Operating System

The glasses aren’t just a hardware offering. They also come with “DigiOS” – a collection of optimized APIs built around open-source Android 12.

“You can have the best hardware in the world, hardware is still an adoption barrier, but software is where the magic happens,” DigiLens VP and GM of Product, Nima Shams, said in a phone interview with ARPost. “We almost wanted the system to be smarter than the user and present them with information.”

While not all of those aspirations made it into the current iteration of DigiOS, the operating system custom-tailored to a hands-free interface does have some tricks. These include adjusting the brightness of the display so that it can be visible to the user without entirely washing out their surroundings when they need situational awareness.

“This is a big milestone for DigiLens at a very high level. We have always been a component manufacturer,” said Shams. “At the same time, we want to push the market and meet the market and it seems like the market is kind of open and waiting.”

A Brief Look Back

ARPost readers have been getting to know DigiLens for the last four years as a component manufacturer, specifically of display components. Last spring, the company released Design v1. The heavily modular developers kit was not widely available but, according to Shams, it strongly influenced the ARGO.

“What we learned from Design v1 was that there wasn’t a projector module that we could use,” said Shams. “We designed our own light LED projector. … It was direct feedback from the Design v1.”

A lot of software cues in the ARGO also came from lessons learned with Design v1. The headset helped pave the way for DigiOS.

DigiLens ARGO AR glasses

“Design v1 was the first time that we built a Qualcomm XR2 system, and ARGO uses the same system,” said Shams.

Of course, the Design v1 was largely a technology showcase and a lot of its highly experimental features were never intended to make it into a mass-market product. For example, the ARGO is not the highly individualized modular device that the Design v1 is.

The Future of DigiLens

DigiLens still is, and will continue to be, a components company first and foremost. Its relationship with enterprise customers has led the company to believe that it is singularly situated to deliver a product that industries need and haven’t yet had an answer for.

“I’ve seen some things from CES coming out of our peers that are very slim and very sexy but they’re viewers,” said Shams. “They don’t have inside-out tracking or binocular outdoor-bright displays.”

With all of this talk about mass adoption and the excitement of the company’s first marketed product, I had to ask Shams whether the company had aspirations for an eventual consumer model.

“Our official answer is ‘no,’” said Shams. “Companies like the Samsungs and the Apples of the world all believe that glasses will replace the smartphone and we want to make sure that DigiLens components are in those glasses.”

In fact, in the first week of January, DigiLens announced a partnership with OMNIVISION to “collaborate on developing new consumer AR/VR/XR product solutions.”

“Since XR involves multiple senses such as touch, vision, hearing, and smell, it has potential use cases in a huge variety of fields, such as healthcare, education, engineering, and more,” Devang Patel, OMNIVISION Marketing Director for the IoT and Emerging Segment said in a release. “That’s why our partnership with DigiLens is so exciting and important.” 

Something We Look Forward to Looking Through

The price and shipping date for ARGO aren’t yet public, but interested companies can reach out to DigiLens directly. We look forward to seeing use cases come out of the industry once the glasses have had time to find their way to the workers of the world.


new-waveguide-tech-from-vividq-and-dispelix-promises-new-era-in-ar

New Waveguide Tech From VividQ and Dispelix Promises New Era in AR

Holograms have been largely deemed impossible. However, “possible” and “impossible” are constantly shifting landscapes in immersive technology. Dispelix and VividQ have reportedly achieved holographic displays through a new waveguide device. And the companies are bringing these displays to consumers.

A Little Background

“Hologram” is a term often used in technology because it’s one that people are familiar with from science fiction. However, science fiction is almost exclusively the realm in which holograms reside. Holograms are three-dimensional images. Not an image that appears three-dimensional, but an image that actually has height, width, and depth.

These days, people are increasingly familiar with augmented reality through “passthrough.” In this method, a VR headset records your surroundings and you view a live feed of that recording augmented with digital effects. The image is still flat. Through techno-wizardry, they may appear to occupy different spaces or have different depths but they don’t.

AR glasses typically use a combination of waveguide lenses and a tiny projector called a light engine. The light engine projects digital effects onto the waveguide, which the wearer looks through. This means lighter displays that don’t rely on camera resolution for a good user experience.

Most waveguide AR products still reproduce a flat image. These devices, typically used for virtual screens or screen mirroring from a paired device, often include spatial controls like ray casting but are arguably not “true” augmented reality and are sometimes referred to as “viewers” rather than “AR glasses.”

Some high-end waveguide headsets – almost exclusively used in enterprise and defense – achieve more immersive AR, but the virtual elements are still on a single focal plane. This limits immersion and can contribute to the feelings of sickness felt by some XR users. These devices also have a much larger form factor.

These are the issues addressed by the new technology from Dispelix and VividQ. And their material specifically mentions addressing these issues for consumer use cases like gaming.

Bringing Variable-Depth 3D Content to AR

Working together, VividQ and Dispelix have developed a “waveguide combiner” that is able to “accurately display simultaneous variable-depth 3D content within a user’s environment” in a usable form factor. This reportedly increases user comfort as well as immersion.

“Variable-depth 3D content” means that users can place virtual objects in their environment and interact with them naturally. That is opposed to needing to work around the virtual object rather than with it because the virtual object is displayed on a fixed focal plane.

VividQ 3D waveguide

“A fundamental issue has always been the complexity of displaying 3D images placed in the real world with a decent field of view and with an eyebox that is large enough to accommodate a wide range of IPDs [interpupillary distances], all encased within a lightweight lens,” VividQ CEO, Darran Milne, said in a release shared with ARPost. “We’ve solved that problem.”

VividQ and Dispelix have not only developed this technology but have also formed a commercial partnership to bring it to market and bring it to mass production. The physical device is designed to work with VividQ’s software, compatible with major game engines including Unity and Unreal Engine.

“Wearable AR devices have huge potential all around the world. For applications such as gaming and professional use, where the user needs to be immersed for long periods of time, it is vital that content is true 3D and placed within the user’s environment,” Dispelix CEO and co-founder, Antti Sunnari, said in the release. “We are thrilled to be working with VividQ.”

When Waveguides Feel Like a Mirage

Both companies have been building toward this breakthrough for a long time. Virtually every time that ARPost has covered Dispelix, it has at least touched on a partnership with another company, which is typical for a components manufacturer. New product announcements are comparatively rare and are always the result of lots of hard work.

“The ability to display 3D images through a waveguide is a widely known barrier to [a compelling AR wearable device],” VividQ Head of Research, Alfred Newman, said in an email. “To realize the full capability, we needed to work with a partner capable of developing something that worked with our exact specifications.”

Of course, those who have been following immersive tech for a while will understand that a long time working hard to achieve a breakthrough means that that breakthrough reaching the public will require working hard for a long time. Devices using this groundbreaking technology might not reach shelves for a few more calendar pages. Again, Newman explains:

“We license the technology stack to device manufacturers and support them as they develop their products so the timeframe for launching devices is dependent on their product development. …Typically, new products take about two to three years to develop, manufacture, and launch, so we expect a similar time frame until consumers can pick a device off the shelf.”

Don’t Let the Perfect Be the Enemy of the Good

Waiting for the hardware to improve is a classic mass adoption trope, particularly in the consumer space. If you’re reading that you have to wait two to three years for impactful AR, you may have missed the message.

There are a lot of quality hardware and experience options in the AR space already – many of those already enabled by Dispelix and VividQ. If you want natural, immersive, real 3D waveguides, wait two or three years. If you want to experience AR today, you have options in already-available waveguide AR glasses or via passthrough on VR headsets.
