Author name: Shannon Garcia

Northvolt to build gigafactory in Germany after state aid pledge

Story by Ioanna Lykiardopoulou

Ioanna is a writer at TNW. She covers the full spectrum of the European tech ecosystem, with a particular interest in startups, sustainability, green tech, AI, and EU policy. With a background in the humanities, she has a soft spot for social impact-enabling technologies.

Northvolt, Europe’s biggest battery maker, has confirmed that it will build its next gigafactory in Heide, Germany, following the federal government’s pledge to provide state aid.

The announcement comes after several months of uncertainty. In March 2022, the Swedish manufacturer and the German state of Schleswig-Holstein signed a memorandum of understanding to construct a factory in the region. But in October 2022, Northvolt said it might postpone the plan and prioritise a US expansion instead — unless the EU was willing to match the IRA’s loftier subsidies for green technologies.

In response, the German government has now confirmed it’ll fund the gigafactory under the Temporary Crisis and Transition Framework (TCTF) — a new EU state-aid plan, designed to support the development of green projects in view of the US’ respective subsidies and Russia’s energy monopoly.

The funding needs to be approved by Brussels first, but the federal government said it’s already “in the first constructive discussions” with the European Commission.

“Backed by this commitment of the federal government, Northvolt has decided to take the next steps towards our expansion in Heide,” said Peter Carlsson, founder & CEO of the company.

The gigafactory will have a 60 GWh annual production volume of battery cells, aiming to supply approximately 1 million EVs. It’s expected to unlock a multi-billion euro private investment, and create 3,000 direct jobs with thousands more estimated in the surrounding industry and service sector.
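As a quick sanity check on those figures (a back-of-the-envelope calculation, not one from Northvolt), dividing the plant's annual output by the targeted vehicle count implies a pack size of roughly 60 kWh per car, in line with a typical mid-size EV battery:

```python
annual_output_wh = 60e9        # 60 GWh of battery cells per year, in Wh
vehicles_per_year = 1_000_000  # approximately 1 million EVs, per the announcement

pack_size_kwh = annual_output_wh / vehicles_per_year / 1_000
print(f"Implied pack size: {pack_size_kwh:.0f} kWh per vehicle")  # -> 60 kWh
```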

“With the next steps regarding Northvolt, Germany can look forward to one of the most significant lighthouse projects of the energy and transport transition,” said Robert Habeck, Germany’s deputy chancellor and economy minister.

In addition to the Commission’s approval, the gigafactory still requires preparatory on-site construction work and the final building permission. Deliveries of the first battery cells are expected in 2026.


Fairphone unveils user-repairable wireless headphones

Story by Ioanna Lykiardopoulou

At a time when the electronics industry is constantly luring consumers into buying the latest and most advanced devices, Amsterdam-based Fairphone has made a name for itself by doing the exact opposite.

Best known for its sustainably-made, modular, and repairable (DIY style) smartphones, the startup is now applying the same ethos to another product segment: headphones.

The newly-launched Fairbuds XL are a pair of over-ear wireless headphones, priced at €249. Much like the company’s smartphones, they’re sold mainly in Europe, although some authorised resellers ship to other parts of the world as well.

The Fairbuds XL come with a 30-hour battery life, 40mm dynamic drivers for sound quality, and active noise cancellation. They also feature a USB-C connector for charging, a 10m Bluetooth range, and smart assistant capabilities.

But their most impressive element is undoubtedly the design. The modular headphones consist of nine components that double as potential spare parts: battery, speaker-to-speaker cable, earcap covers, headband, ear cushion, headband base, speakers, and headband cover.

The components of the Fairbuds XL. Credit: Fairphone

Customers can order any of them on the company’s website or the Fairbuds app, and easily replace or repair parts that are broken or worn over time. The headphones come with a two-year warranty, meaning that within this period the cost of replacement components will most likely be covered by the startup.

To further boost their positive environmental and societal impact, the Fairbuds XL are made with 100% recycled plastics, aluminium, and tin solder paste to the maximum extent possible, and the startup says it will pay $0.55 per headphone to close the living wage gap for production line workers.

Fairphone’s overall ethos aligns with the EU’s goal to drastically reduce e-waste and move towards a circular economy by 2050. Upcoming policies such as the Right to Repair and the Ecodesign for Sustainable Products could give a significant regional boost to the startup’s approach.


80% of our fraud scams come from Meta’s platforms, leading UK bank warns

Story by Ioanna Lykiardopoulou

TSB is urging consumers to remain wary of financial fraud on Facebook, Instagram and WhatsApp, as scams through Meta’s platforms are increasing at a worrying pace.

The UK bank analysed its internal customer fraud data between 2021 and 2022. It found that the Meta-owned sites and apps account for a whopping 80% of all scam cases within its three biggest fraud categories: purchase, impersonation, and investment fraud.

Facebook Marketplace is responsible for 60% of TSB’s purchase fraud cases, seeing a 97% year-on-year increase. Remember that vintage table you had to pre-buy because the seller was conveniently in the Bahamas, but swore to deliver upon their return? Yes, that was a scam.

The bank attributes Facebook’s high numbers to two main factors: minimal vetting of adverts and seller profiles, and the lack of an integrated payment platform that would support secure transactions.

Meanwhile, impersonation scams — where ‘friends’ or ‘family’ in need ask for money — are soaring on WhatsApp, which has seen a 300% increase in 2022 and accounts for 65% of all cases. This is followed by Facebook and text messages at 13% each.

Meta’s platforms are also responsible for 87% of all investment fraud cases at TSB. The majority occurred on Instagram, which accounted for 67%. Facebook came in second at 22%, followed by non-Meta-owned Snapchat at 9%. The bank advises investors to be wary of social media “get rich quick” schemes, and stick to recognised investment platforms.

TSB’s findings follow the announcement of the UK’s new fraud strategy earlier this week, as the government is trying to fight back against the growing number of web- and phone-based scams. Fraud is now the most common crime in the country, costing nearly £7bn per year with 1 in 15 people falling victim.

Some of the measures include the ban of cold calls on financial products, new tech to tackle number “spoofing,” and reviewing the use of mass texting services.

The government is also requiring social media platforms to provide systems that will enable users to find a “report” button with a single click, and then a “report fraud or scam” button. Non-Meta-owned TikTok and Snapchat already offer this option for adverts.

“Social media companies must urgently clean up their platforms to protect the countless innocent people who use their services every day,” said Paul Davis, Director of Fraud Prevention at TSB. “In the meantime, we are urging the public to remain cautious to potential scam content — and to spread the word to help protect those around you.”


Northvolt targets zero-emission aviation with ‘superior’ lithium metal battery

Story by Linnea Ahlgren

Swedish low-carbon battery startup Northvolt is on a bit of a roll lately. Recently, the company revealed a new collaboration with Scania to produce the longest-lasting EV batteries on the market. Now, its wholly-owned subsidiary Cuberg has unveiled a program to develop high-performance batteries to achieve “safe and sustainable” electric flight.

One of the biggest stumbling blocks to zero-emission electric aviation is, apart from access to renewable energy, battery technology. Today’s batteries are, simply put, too inefficient and too heavy. 

However, Cuberg says it has already achieved significant milestones in its next-generation lithium metal cell battery technology. This involves a lithium metal anode and proprietary liquid electrolyte, which the company says simultaneously solves the interlocking challenges of battery performance and manufacturability. 

Furthermore, Cuberg says it will have “superior power and energy capabilities to today’s conventional lithium-ion batteries.” The aim is to develop a breakthrough lithium metal cell boasting an energy density of 1,000 Wh/L by 2025.

Significant achievements thus far include building and shipping a 20 Ah commercial-format lithium metal pouch cell with specific energy of 405 Wh/kg. Furthermore, the company has engineered and produced an aviation module based around the 20 Ah cells, with specific energy of 280 Wh/kg and energy density of 320 Wh/L. 

Cuberg’s aviation module has up to 40% higher specific energy than comparable lithium-ion technology. Credit: Northvolt/Cuberg
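The “up to 40% higher” figure is easy to reproduce if we assume a comparable lithium-ion aviation module sits at roughly 200 Wh/kg (an assumed baseline, not a number from the article):

```python
cuberg_module_wh_per_kg = 280    # from the article
assumed_li_ion_wh_per_kg = 200   # assumed baseline for a comparable lithium-ion module

improvement = cuberg_module_wh_per_kg / assumed_li_ion_wh_per_kg - 1
print(f"Specific energy advantage: {improvement:.0%}")  # -> 40%
```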

Importantly, the module platform has achieved what is called passive propagation resistance during a verification test campaign, which means it can resist the spread of a thermal runaway event from one cell to another.

Thermal runaway is one of the biggest safety concerns with lithium metal cells, as it may cause the battery to catch fire or explode. As such, the verification is considered a key step when certifying batteries for aviation.

Expanding lithium metal cell cycles

Lithium metal batteries, as opposed to lithium-ion, can typically only be recharged a few times before they become unusable. This may be cause for other sustainability concerns, given the environmental costs of lithium extraction.

However, in a third-party validation in July last year, Cuberg’s cell cycle life was confirmed to have been extended to 672 cycles, with a specific energy of 380 Wh/kg, making it the world’s highest-performing and longest-lived lithium metal cell in a commercially representative cell size.


Northvolt ranks first on the list of most-funded startups in Europe, with a total of €5.5bn raised to date. Furthermore, it has secured more than €50bn worth of contracts from customers including BMW, Fluence, Scania, Volkswagen, Volvo and Polestar.

Northvolt acquired Cuberg, founded in 2015 and based in San Leandro, California, in 2021 to help bring the startup’s next-generation lithium metal cell technology to scale. 

Playing the long sustainable aviation game

Proportionally, aviation, as an industry, is not that big a polluter; it is responsible for “only” approximately 2.5% of global greenhouse gas emissions. This can be compared to transport as a whole (14%) and other industries such as agriculture, forestry and land use (24%). 

However, as other industries begin to decarbonise, the difficult-to-abate aviation sector’s share of emissions will expand. What is even more alarming is that global passenger traffic is predicted to reach 19.3 billion by 2041, up from a forecasted 8.4 billion in 2023.

It is true that the lion’s share of emissions come from long-haul air travel, and aerospace engineers may be a long way yet from coming up with a zero-emission high-capacity propulsion system. 

However, innovation must start somewhere. Technology being developed today for the lower-capacity regional air travel segment will serve as the foundation for more sustainable narrowbody and dual-aisle aircraft further down the road.

As such, the immediate effect of replacing short-haul aircraft with electric or hydrogen-electric planes may not be globally significant. However, the extrapolated implications of developments in areas such as battery and fuel-cell technology coupled with energy storage may just be one of the avenues to solving aviation’s fossil-fuel dependency. 

Countries such as the UK, Norway and Sweden have already set deadlines to entirely decarbonise domestic aviation within the next couple of decades. Swedish electric aircraft startup Heart Aerospace has received firm orders for 230 of its 30-seat ES-30, along with options for another 100 and letters of intent for an additional 108 units. The plane is scheduled to enter service in 2028, with a scalable upgrade path as “future battery technology matures.”

Furthermore, the global electric vertical take-off and landing vehicle (eVTOL) market has around 500 developers. Specifically, Cuberg has already received orders from established companies such as Boeing and urban air mobility (UAM) startups including BETA Technologies, Ampaire and Volt Aero. The company says it will deliver modules to select aviation customers throughout 2023. 



Holo Interactive: Leading the Way in Shaping the Future of Mixed Reality Copresence

While still in its early stages, mixed reality copresence has shown vast potential for applications beyond gaming and entertainment. Advancements in virtual, augmented, and mixed reality technology pave the way for seamless connections between real and virtual worlds. We see the lines between real-world interactions and digital-world experiences blurring into virtual oblivion.

However, mixed reality copresence still has a long way to go before it becomes available for mainstream use. Barriers to adoption, such as affordability, availability, and tech limitations, among many others, must be addressed for this technology to truly impact how we live our lives. Holo Interactive, a reality computing lab and content studio, is at the forefront of finding solutions to overcome these barriers.

Botao Amber Hu, the founder and CEO of Holo Interactive, shares his insights on the state of the mixed reality industry and his company’s role in shaping its future.

From Small Steps to Giant Leaps in Making Big Realities

After inventing the award-winning affordable stereoscopic AR headset HoloKit X, Hu gained popularity and esteem in the industry. Developing MOFA, an immersive multiplayer AR live-action RPG, further set Hu’s name as a trailblazer in mixed reality.


With a deep belief that mixed reality copresence holds the key to unlocking the true potential of head-worn AR, Hu established Holo Interactive to bridge the gaps that hinder the accessibility of mixed reality copresence in the mass consumer market.

Now working with a globally distributed team, Hu is rallying developers, engineers, and other industry professionals to embrace the motto “A dream you dream alone is only a dream. A dream you dream together is reality.”

The Holo Interactive team is leading the way as the premier lab for mixed reality copresence experiences that are accessible to all. Over the years, Holo Interactive has been developing applications and innovative products that could widen the adoption of head-worn AR. Recently, the company has also released the HoloKit Unity SDK to empower developers in creating copresence experiences.

Leading the Way in Mixed Reality Copresence

HoloKit X, an immersive stereoscopic headset designed as an iPhone accessory, enhances AR by creating a more realistic and engaging visual experience that allows users to interact with their environment and digital content more naturally.

It harnesses the powerful capabilities of iPhone and ARKit to deliver exceptional AR experiences to iPhone users. With its multi-modal control inputs and copresence functionality, it can provide face-to-face shared experiences with other users in real time, fostering a sense of presence and social interaction in AR environments.


Aside from creating AR hardware, Holo Interactive is shaping the future of mixed reality by giving developers access to tools that would enable them to create MR solutions. “The current market situation and our unique position within the ecosystem make now an ideal time to release an open-sourced SDK for HoloKit X,” Hu told ARPost in a written interview.

He explained that this strategic move enables them to establish their presence in the MR ecosystem, tap into the growing interest in AR/MR technologies, and empower developers to create copresence experiences.

By opening the HoloKit SDK to third-party users, Holo Interactive hopes to become the “Arduino for head-worn AR.” “We want to encourage more people to experiment with their work in mixed reality copresence and to open-source their creations, inspiring others within the community,” said Hu.

In addition, he hopes that lowering the barriers to entry for mixed reality copresence projects and embracing open-source practices will accelerate progress in the field of MR.

Breaking the Barriers to Widespread Adoption

Immersive technologies are already transforming our lives. VR is gaining widespread use across industries. AR has also come a long way since Pokémon Go first went viral. However, head-worn AR still faces challenges to widespread adoption.

Botao Amber Hu

According to Hu, “Head-worn AR has the potential to turn our world into a ‘software-defined reality’, allowing us to interact with the real world and others in novel ways, a concept known as co-presence. This exciting future, however, is not without hurdles.”

Asked about the barriers AR faces, Hu enumerates four obstacles: affordability of high-quality AR devices, efficiency of input methods, development of killer applications that drive adoption, and psychological barriers to social acceptance.

Holo Interactive is on a mission to break these barriers to adoption. Hu believes that addressing these challenges can help ensure that AR technology reaches its full potential, positively impacting our lives and the way we interact with the world around us.



Digital Artist Behind Iconic PS5 Campaign Launches Evolving VR Art Gallery

You might not recognize the name Maxim Zhestkov, but if you paid any attention to the launch of PlayStation 5, you’ll almost certainly recognize his iconic digital art which accompanied the reveal of the console. Now Zhestkov has launched a virtual gallery that he says will feature an ever-growing collection of his digital works.

Maxim Zhestkov is the artist behind the satisfying swarm of particles that accompanied the reveal of PS5 back in 2020.

Much of Zhestkov’s work similarly employs space, motion, shapes, and sound, which makes virtual reality the perfect medium for others to experience it.

To that end Zhestkov has released a new VR experience called Modules, a virtual gallery where he’s shared 11 different works which users can explore at their own pace and from any angle, complete with artist commentary on each piece.

Modules is rendered in real-time and available on both Quest headsets and PC VR (as well as non-VR via Steam). Ironically, despite Zhestkov’s work on the PS5 reveal, the project isn’t available on PSVR 2.

Zhestkov says that Modules will “expand to contain [my] entire body of work.”

One of the scenes in ‘Modules’ | Image courtesy Maxim Zhestkov

“Over the course of years, the project will grow as the artist grows, expanding into new territories and blurring the boundaries between art, games, and reality,” he says.

The project’s website contains a roadmap of future expansions, with an ‘Interactive’ segment coming in Fall 2023, followed by ‘Collaborative’ and ‘Creative’ segments next year.



“Metaversed”: A Book Review and Author Interview

Metaversed: See Beyond the Hype is the new book by Samantha G. Wolfe and Luis Bravo Martins introducing the metaverse stripped of its over-inflated, pie-in-the-sky expectation cloud built up by marketers. The book presents a practical and balanced approach to using the metaverse as it exists today and preparing for how it might exist tomorrow.

ARPost received a copy of Metaversed and had the opportunity to interview the authors on how it came together and what they hope it will achieve.

Preparing for the Metaverse

Metaversed begins with an important and common question in the industry: how do we prepare for the metaverse when we can’t agree on what it is?

“Taking the internet and bringing another dimension to it and setting it free in the phygital world […] it’s almost impossible to fully understand the extent of this shift.”

– Chapter One: Predictions

Early on, the authors present a working definition of the metaverse. This isn’t for the authors to throw their definition into the war of words already taking place around the metaverse, but rather so that everyone reading Metaversed has a common starting point.

“To the authors, the metaverse is the next stage of the internet and results from the evolution of a wide variety of emerging exponential technologies maturing simultaneously, converging and enabling a new interconnected relationship between physical and digital.”

– Chapter One: Predictions

Metaversed isn’t just about technology, but how technology impacts us as a society and as individuals – and about the societal trends that are helping to usher in the metaverse. These include movements towards remote work and education, decentralization, social media, and the creator economy.

“The challenges we’re about to face will need a multidisciplinary effort. Business professionals from all areas, teachers, lawyers, scientists, historians, and sociologists, everyone can contribute with their experience and knowledge so we can start preparing for this tremendous shift.”

– Chapter One: Predictions

A Book Written for Anybody

Metaversed is written for a reader in any profession to encompass the entire metaverse. Chapter two presents all of the technologies playing into the development of the metaverse. That includes immersive technologies like the spatial web, XR hardware, and digital twins. It also includes Web3 and blockchain, cloud computing, and AI and ML.

“I feel like we went through a hype cycle of ‘the metaverse’ as a term and now we’re kind of past that. People are looking beyond that and asking, ‘What is this, really?’” said Wolfe. “I’m hoping that as people get past all of that hype they can ask ‘What does this mean to me, and what does this mean to my business?’”

Readers of ARPost might be principally interested in immersive technologies. Understanding the role that these technologies will play in larger shifts in the coming years requires an understanding of other technologies even though they may feel removed.

“The main topic is to bring in people that are not in on all of the metaverse discussion,” said Martins. “We need to have those people. We need to have a version of the metaverse that isn’t just created by technologists like us.”

The book also discusses governments and standards organizations furthering the metaverse through protecting users and ensuring interoperability respectively. A lot of the value of the metaverse will be created by users – much as with the current web, but more equitable.

“A true creator economy has been set in motion where communities are not only spawning creators but overall helping them to remain independent and relevant.[…] With several new platforms available in the gaming industry and in the so-called Web3 businesses, new avenues for distributing digital products and content are being envisioned and built.”

– Chapter Four: New Rules

Life and Work in the Metaverse

The largest single chapter in the book, “Metaversed Markets,” is an exhaustive exploration of how different industries are using the extant iteration of the metaverse and how they may adapt to its development. While the bulk of Metaversed discusses opportunities in the metaverse and how to realize them, four chapters are dedicated exclusively to challenges in the metaverse.

“When living in a hybrid reality of digital and physical objects, spaces, and people that we seemingly use and own, will it all be real? The memories of our time immersed in those worlds won’t tell us otherwise. […] We can pick up our lessons learned of the risks involved and plan ahead for a better, positive metaverse. But, to do that, we need to first identify key challenges.”

– Chapter Nine: Understanding Reality

These challenges partly concern technologies that haven’t yet been realized or optimized, but mainly pertain to the human experience of adapting to and living in the metaverse.

“The whole purpose is exactly that – to try to shed light on not just the potential of the metaverse […] but more than that to try to pass on the challenges of the metaverse,” said Martins. “Presenting the challenges is not negative – it’s facing those challenges […] At the end of the day, what we want is to contribute to a more ethical metaverse.”

Metaversed expresses hope that governments and organizations like the XR Safety Initiative will help to mitigate some risks. It also recognizes that a lot of responsibility will be put on users themselves.

“Even if it’s uncomfortable, we need to discuss how emerging tech can be monitored and regulated. We don’t have to cross our fingers and hope that big tech companies figure it out themselves (again).”

– Chapter Ten: Privacy and Safety in the Metaverse

“Unanswered Questions”

“Because we’re faced with so many unanswered questions and unsolved technical challenges, there should be no shame in saying ‘I don’t know,’ or ‘We don’t know’ when asked about the future […] for better or worse, we’re in this together.”

– Chapter Twelve: The New Humanity

The thing that struck me the most about Metaversed was its honesty. The authors are confident in their predictions but never present those predictions as already being facts. Overall, it feels like a conversation rather than a keynote or a sales pitch.

“At the end of the day, tech runs so quickly and changes so completely unexpectedly […] it’s sort of an exercise,” said Martins. “Hopefully what we can offer is more of the logic of thought.”

How “Metaversed” Came to Be

Wolfe and Martins have a long history, despite having yet to meet in person. The two began talking after Martins read “Marketing New Realities,” which Wolfe co-wrote with Cathy Hackl in 2017. Then, Martins was a guest speaker at Wolfe’s courses at New York University’s Steinhardt School. Martins was invited to write a book and knew who to talk to for a coauthor.

“It started with this opportunity that came about from the publisher. Around that time there was this huge push regarding the metaverse and I was thinking about doing something on the flipside, focusing entirely on the challenges,” said Martins. “I decided that that approach wouldn’t be the best possible way to explain to people who don’t know much or aren’t as involved.”

Wolfe’s coming on board provided the balance that Martins was looking for. It also expanded the vast network of experts that contributed their insights to Metaversed.

“He wanted to write this book about what can go wrong but I tend to be quite positive,” said Wolfe. “I also tend to look at how all of this applies to businesses.”

Despite being based in different countries and working on the book largely asynchronously, the two decided to write Metaversed with one voice, rather than passing chapters back and forth. While the book doesn’t feel divided (at least, not to people who don’t know the authors very closely) both of them have chapters that they feel they put more into.

“In the end, I think we were all very involved in doing the writing and – of course – the research,” said Martins. “There were chapters which were being run by one of us or by the other one, and some – particularly the chapters in the beginning – were very consensual.”

A Digestible Book, if Not in One Sitting

Metaversed: See Beyond the Hype is currently available on Amazon. The book, weighing in at over 300 pages, may or may not be a lot to read from cover to cover depending on where you are on your metaverse journey. However, the book was also designed to be incredibly navigable, making it easy to read or reread as you see fit.



Eye-tracking is a Game Changer for XR That Goes Far Beyond Foveated Rendering

Eye-tracking—the ability to quickly and precisely measure the direction a user is looking while inside of a VR headset—is often talked about within the context of foveated rendering, and how it could reduce the performance requirements of XR headsets. And while foveated rendering is an exciting use-case for eye-tracking in AR and VR headsets, eye-tracking stands to bring much more to the table.

Updated – May 2nd, 2023

Eye-tracking has been talked about with regards to XR as a distant technology for many years, but the hardware is finally becoming increasingly available to developers and customers. PSVR 2 and Quest Pro are the most visible examples of headsets with built-in eye-tracking, along with the likes of Varjo Aero, Vive Pro Eye and more.

With this momentum, in just a few years we could see eye-tracking become a standard part of consumer XR headsets. When that happens, there’s a wide range of features the tech can enable to drastically improve the experience.

Foveated Rendering

Let’s first start with the one that many people are already familiar with. Foveated rendering aims to reduce the computational power required for displaying demanding AR and VR scenes. The name comes from the ‘fovea’—a small pit at the center of the human retina which is densely packed with photoreceptors. It’s the fovea which gives us high resolution vision at the center of our field of view; meanwhile our peripheral vision is actually very poor at picking up detail and color, and is better tuned for spotting motion and contrast than seeing detail. You can think of it like a camera which has a large sensor with just a few megapixels, and another smaller sensor in the middle with lots of megapixels.

The region of your vision in which you can see in high detail is actually much smaller than most think—just a few degrees across the center of your view. The difference in resolving power between the fovea and the rest of the retina is so drastic, that without your fovea, you couldn’t make out the text on this page. You can see this easily for yourself: if you keep your eyes focused on this word and try to read just two sentences below, you’ll find it’s almost impossible to make out what the words say, even though you can see something resembling words. The reason that people overestimate the foveal region of their vision seems to be because the brain does a lot of unconscious interpretation and prediction to build a model of how we believe the world to be.

Foveated rendering aims to exploit this quirk of our vision by rendering the virtual scene in high resolution only in the region that the fovea sees, and then drastically cut down the complexity of the scene in our peripheral vision where the detail can’t be resolved anyway. Doing so allows us to focus most of the processing power where it contributes most to detail, while saving processing resources elsewhere. That may not sound like a huge deal, but as the display resolution of XR headsets and field-of-view increases, the power needed to render complex scenes grows quickly.
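As a rough illustration of the idea (not any particular engine's API, and with made-up eccentricity thresholds), a renderer with eye-tracking data might choose a shading rate per screen tile based on how far that tile is from the gaze point:

```python
import math

def shading_rate(tile_center, gaze_point, px_per_degree=20):
    """Return the fraction of full resolution at which to shade a screen tile,
    based on its angular distance from the user's gaze point."""
    dx = tile_center[0] - gaze_point[0]
    dy = tile_center[1] - gaze_point[1]
    eccentricity_deg = math.hypot(dx, dy) / px_per_degree

    if eccentricity_deg < 5:     # foveal region: full detail
        return 1.0
    if eccentricity_deg < 15:    # near periphery: half resolution per axis (1/4 of the pixels)
        return 0.5
    return 0.25                  # far periphery: ~1/16 of the pixels

# Example: a tile 600 px from the gaze point on a 20 px/degree display sits ~30 degrees out
print(shading_rate((1800, 900), (1200, 900)))  # -> 0.25
```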

Eye-tracking of course comes into play because we need to know where the center of the user’s gaze is at all times, quickly and with high precision, in order to pull off foveated rendering. While it’s difficult to pull this off without the user noticing, it’s possible and has been demonstrated quite effectively on recent headsets like Quest Pro and PSVR 2.

Automatic User Detection & Adjustment

In addition to detecting movement, eye-tracking can also be used as a biometric identifier. That makes eye-tracking a great candidate for multiple user profiles across a single headset—when I put on the headset, the system can instantly identify me as a unique user and call up my customized environment, content library, game progress, and settings. When a friend puts on the headset, the system can load their preferences and saved data.

Eye-tracking can also be used to precisely measure IPD (the distance between one’s eyes). Knowing your IPD is important in XR because it’s required to move the lenses and displays into the optimal position for both comfort and visual quality. Unfortunately, many people understandably don’t know their IPD off the top of their head.

With eye-tracking, it would be easy to instantly measure each user’s IPD and then have the headset’s software assist the user in adjusting the headset’s IPD to match, or warn users that their IPD is outside the range supported by the headset.

In more advanced headsets, this process can be invisible and automatic—IPD can be measured invisibly, and the headset can have a motorized IPD adjustment that automatically moves the lenses into the correct position without the user needing to be aware of any of it, like on the Varjo Aero, for example.
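A minimal sketch of the measurement itself, assuming the eye tracker reports each pupil's 3D position in the headset's coordinate frame (the function, example values, and supported range below are illustrative, not any vendor's API):

```python
import math

def measure_ipd_mm(left_pupil, right_pupil):
    """Distance between the two pupil centres, in millimetres.
    Positions are (x, y, z) tuples in headset coordinates, in mm."""
    return math.dist(left_pupil, right_pupil)

ipd = measure_ipd_mm((-31.5, 0.0, 12.0), (32.0, 0.4, 11.8))
print(f"Measured IPD: {ipd:.1f} mm")  # ~63.5 mm

# A motorised headset could then drive the lens separation toward this value,
# or warn the user if it falls outside the supported adjustment range.
SUPPORTED_RANGE_MM = (58, 72)         # illustrative range, varies per headset
if not SUPPORTED_RANGE_MM[0] <= ipd <= SUPPORTED_RANGE_MM[1]:
    print("Warning: IPD outside the supported range")
```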

Varifocal Displays

A prototype varifocal headset | Image courtesy NVIDIA

The optical systems used in today’s VR headsets work pretty well but they’re actually rather simple and don’t support an important function of human vision: dynamic focus. This is because the display in XR headsets is always the same distance from our eyes, even when the stereoscopic depth suggests otherwise. This leads to an issue called vergence-accommodation conflict. If you want to learn a bit more in depth, check out our primer below:

Accommodation

Accommodation is the bending of the eye’s lens to focus light from objects at different distances. | Photo courtesy Pearson Scott Foresman

In the real world, to focus on a near object the lens of your eye bends to make the light from the object hit the right spot on your retina, giving you a sharp view of the object. For an object that’s further away, the light is traveling at different angles into your eye and the lens again must bend to ensure the light is focused onto your retina. This is why, if you close one eye and focus on your finger a few inches from your face, the world behind your finger is blurry. Conversely, if you focus on the world behind your finger, your finger becomes blurry. This is called accommodation.

Vergence

Vergence is the inward rotation of each eye to overlap each eye’s view into one aligned image. | Photo courtesy Fred Hsu (CC BY-SA 3.0)

Then there’s vergence, which is when each of your eyes rotates inward to ‘converge’ the separate views from each eye into one overlapping image. For very distant objects, your eyes are nearly parallel, because the distance between them is so small in comparison to the distance of the object (meaning each eye sees a nearly identical portion of the object). For very near objects, your eyes must rotate inward to bring each eye’s perspective into alignment. You can see this too with our little finger trick as above: this time, using both eyes, hold your finger a few inches from your face and look at it. Notice that you see double-images of objects far behind your finger. When you then focus on those objects behind your finger, now you see a double finger image.
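The geometry behind vergence is straightforward: for an object straight ahead at distance d, each eye rotates inward by roughly atan((IPD/2)/d). A quick calculation with an assumed 63 mm IPD shows why distant objects leave the eyes nearly parallel while near ones force a large rotation:

```python
import math

IPD_MM = 63  # assumed interpupillary distance

def convergence_angle_deg(distance_mm):
    """Inward rotation of each eye, in degrees, for an object straight ahead."""
    return math.degrees(math.atan((IPD_MM / 2) / distance_mm))

for label, d in [("finger at 10 cm", 100), ("laptop at 60 cm", 600), ("mountain at 5 km", 5_000_000)]:
    print(f"{label}: {convergence_angle_deg(d):.3f} degrees per eye")
# finger: ~17.5 degrees; laptop: ~3 degrees; mountain: ~0.0004 degrees
```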

The Conflict

With precise enough instruments, you could use either vergence or accommodation to know how far away an object is that a person is looking at. But the thing is, both accommodation and vergence happen in your eye together, automatically. And they don’t just happen at the same time—there’s a direct correlation between vergence and accommodation, such that for any given measurement of vergence, there’s a directly corresponding level of accommodation (and vice versa). Since you were a little baby, your brain and eyes have formed muscle memory to make these two things happen together, without thinking, anytime you look at anything.

But when it comes to most of today’s AR and VR headsets, vergence and accommodation are out of sync due to inherent limitations of the optical design.

In a basic AR or VR headset, there’s a display (which is, let’s say, 3″ away from your eye) which shows the virtual scene, and a lens which focuses the light from the display onto your eye (just like the lens in your eye would normally focus the light from the world onto your retina). But since the display is a static distance from your eye, and the lens’ shape is static, the light coming from all objects shown on that display is coming from the same distance. So even if there’s a virtual mountain five miles away and a coffee cup on a table five inches away, the light from both objects enters the eye at the same angle (which means your accommodation—the bending of the lens in your eye—never changes).

That comes in conflict with vergence in such headsets which—because we can show a different image to each eye—is variable. Being able to adjust the image independently for each eye, such that our eyes need to converge on objects at different depths, is essentially what gives today’s AR and VR headsets stereoscopy.

But the most realistic (and arguably, most comfortable) display we could create would eliminate the vergence-accommodation issue and let the two work in sync, just like we’re used to in the real world.

Varifocal displays—those which can dynamically alter their focal depth—are proposed as a solution to this problem. There’s a number of approaches to varifocal displays, perhaps the most simple of which is an optical system where the display is physically moved back and forth from the lens in order to change focal depth on the fly.

Achieving such an actuated varifocal display requires eye-tracking because the system needs to know precisely where in the scene the user is looking. By tracing a path into the virtual scene from each of the user’s eyes, the system can find the point that those paths intersect, establishing the proper focal plane that the user is looking at. This information is then sent to the display to adjust accordingly, setting the focal depth to match the virtual distance from the user’s eye to the object.
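In code terms, that gaze-ray intersection might look something like the sketch below (a minimal illustration assuming the eye tracker provides each eye's position and normalised gaze direction in headset coordinates, and that the two rays are not parallel):

```python
import numpy as np

def focal_distance(eye_l, dir_l, eye_r, dir_r):
    """Approximate distance to the point the user is looking at, given each eye's
    position and normalised gaze direction in headset coordinates (metres)."""
    # Find the parameters along each ray that minimise the distance between them.
    w0 = eye_l - eye_r
    a, b, c = dir_l @ dir_l, dir_l @ dir_r, dir_r @ dir_r
    d, e = dir_l @ w0, dir_r @ w0
    denom = a * c - b * b
    t_l = (b * e - c * d) / denom
    t_r = (a * e - b * d) / denom
    convergence_point = (eye_l + t_l * dir_l + eye_r + t_r * dir_r) / 2
    # The display's focal depth would then be set to this distance.
    return np.linalg.norm(convergence_point - (eye_l + eye_r) / 2)

# Both eyes converging on a point 0.5 m straight ahead:
eye_l, eye_r = np.array([-0.032, 0.0, 0.0]), np.array([0.032, 0.0, 0.0])
target = np.array([0.0, 0.0, 0.5])
dir_l = (target - eye_l) / np.linalg.norm(target - eye_l)
dir_r = (target - eye_r) / np.linalg.norm(target - eye_r)
print(f"Focal depth: {focal_distance(eye_l, dir_l, eye_r, dir_r):.3f} m")  # -> 0.500
```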

A well implemented varifocal display could not only eliminate the vergence-accommodation conflict, but also allow users to focus on virtual objects much nearer to them than in existing headsets.

And well before we’re putting varifocal displays into XR headsets, eye-tracking could be used for simulated depth-of-field, which could approximate the blurring of objects outside of the focal plane of the user’s eyes.

As of now, there’s no major headset on the market with varifocal capabilities, but there’s a growing body of research and development trying to figure out how to make the capability compact, reliable, and affordable.

Foveated Displays

While foveated rendering aims to better distribute rendering power between the part of our vision where we can see sharply and our low-detail peripheral vision, something similar can be achieved for the actual pixel count.

Rather than just changing the detail of the rendering on certain parts of the display vs. others, foveated displays are those which are physically moved (or in some cases “steered”) to stay in front of the user’s gaze no matter where they look.

Foveated displays open the door to achieving much higher resolution in AR and VR headsets without brute-forcing the problem by trying to cram pixels at higher resolution across our entire field-of-view. Doing so would not only be costly, but would also run into challenging power and size constraints as the number of pixels approaches retinal resolution. Instead, foveated displays would move a smaller, pixel-dense display to wherever the user is looking based on eye-tracking data. This approach could even lead to higher fields-of-view than could otherwise be achieved with a single flat display.
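Some rough numbers make the motivation concrete. Assuming "retinal" sharpness of about 60 pixels per degree and that roughly 15 pixels per degree suffices in the periphery (both common rules of thumb, not figures from the article), covering a 110° × 100° field of view uniformly takes an order of magnitude more pixels than a small sharp inset over a coarse background:

```python
RETINAL_PPD = 60    # assumed pixels per degree for "retinal" sharpness
PERIPHERY_PPD = 15  # assumed density acceptable outside the fovea

full_fov = (110 * RETINAL_PPD) * (100 * RETINAL_PPD)       # brute force, uniform density
foveated = (20 * RETINAL_PPD) * (20 * RETINAL_PPD) \
         + (110 * PERIPHERY_PPD) * (100 * PERIPHERY_PPD)   # sharp 20-degree inset + coarse background

print(f"{full_fov/1e6:.0f} MP vs {foveated/1e6:.0f} MP "
      f"({full_fov/foveated:.0f}x fewer pixels to drive)")  # -> 40 MP vs 4 MP (10x)
```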

A rough approximation of how a pixel-dense foveated display looks against a larger, much less pixel-dense display in Varjo’s prototype headset. | Photo by Road to VR, based on images courtesy Varjo

Varjo is one company working on a foveated display system. They use a typical display that covers a wide field of view (but isn’t very pixel dense), and then superimpose a microdisplay that’s much more pixel dense on top of it. The combination of the two means the user gets both a wide field of view for their peripheral vision, and a region of very high resolution for their foveal vision.

Granted, this foveated display is still static (the high resolution area stays in the middle of the display) rather than dynamic, but the company has considered a number of methods for moving the display to ensure the high resolution area is always at the center of your gaze.


Transatlantic chip wars? UK needs to up its policy game, leading startup says

Story by Linnea Ahlgren

While the UK is being labelled as “closed for business” and Rishi Sunak is playing Unicorn Kingdom in Silicon Valley, the British chip industry risks losing some of its strongest players due to a lack of supportive policies. 

Based in Cambridge, UK, Pragmatic Semiconductor, funded in part by the CIA’s investment branch In-Q-Tel, has created an ultra-thin, ultra-low-cost, flexible integrated circuit (FlexIC). Instead of relying on silicon, it is made from indium gallium zinc oxide at a fraction of the cost.

The application of the technology spans a wide range of sectors, including healthcare, pharmaceuticals, packaging and games. In the words of Pragmatic, it offers “digital traceability and interactivity to everyday objects.”

Scott White is the Founder and Executive Director, Strategic Initiatives, of Pragmatic. According to White, the company could end up leaving British shores if the UK government’s semiconductor strategy fails to meet expectations. 

So what would British politicians need to offer to provide adequate support to rival the allure of the US $52.7 billion CHIPS Act? White tells TNW that Pragmatic wants to see the government support innovative new companies through public procurement. 

“By creating home-grown revenue opportunities, and becoming a major customer for new semiconductor technologies addressing key national priorities such as net zero and affordable healthcare, the government can provide the reassurance and certainty that investors need to support startups and scaleups,” White said. 

Following the lead of Arm?

The current inability to effectively raise funding for the business in the UK means that Pragmatic could move its operations overseas. Furthermore, it could potentially list outside of the UK in the future, following in the footsteps of Cambridge compatriot Arm. Earlier this year, in a significant blow to London, the chip architecture giant and crown jewel of the UK tech industry chose to list only in New York.

What would a sufficient strategy look like in more detail? White believes that annual public sector procurement targets, commitments for public institutions to ‘buy British’, and encouraging public bodies, like NHS Trusts, to explore uses of the technology, would provide the required opportunities.

Furthermore, such a strategy would address both supply and demand, ultimately making “the UK a more attractive place from which innovative semiconductor companies can build and maintain a global base.” 

Funding from the government, the CIA and… China

After a $125 million Series C round (oversubscribed by more than 50%) late in 2022, the CIA’s investment branch In-Q-Tel, also referred to as IQT, owns part of Pragmatic. British Patient Capital, a subsidiary of the UK government’s economic development bank, also participated in the funding.

The company has now raised over $190 million to date and employs over 200 people. Puhua Capital, a Hangzhou-based VC focused on health and technology, has also invested an undisclosed amount. However, Pragmatic has intentionally kept Chinese investment low, due to the sensitive geopolitical situation.

The geopolitics of chip-making capabilities

According to Chris Miller, the author of Chip War: The Fight for the World’s Most Critical Technology, the process of designing and manufacturing chips is the most complex technological process that humans have ever undertaken. In Miller’s words, the supply chain needed to produce an advanced chip “stretches across multiple continents, involves some of the most purified materials, and the most precise machine tools ever made.” 

In 2022, the global semiconductor market size was over $573 billion, and it is predicted to grow to $1,380.79 billion by 2029. Miller further believes that it is not only a matter of business, economics or technology, but also a question of political relevance as to which countries have these capabilities and which don’t.
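For context, that forecast implies a compound annual growth rate of roughly 13% a year between 2022 and 2029 (a back-of-the-envelope figure derived from the numbers above):

```python
market_2022_bn = 573
market_2029_bn = 1380.79
years = 2029 - 2022

cagr = (market_2029_bn / market_2022_bn) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # ~13.4% per year
```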

As such, successful startups like Pragmatic could find themselves caught in strategic tug-of-wars, stretching well beyond the scope of applied technological excellence. 



Meta Reaffirms Commitment to Metaverse Vision, Has No Plans to Slow Billions in Reality Labs Investments

Meta announced its latest quarterly results, revealing that the company’s Reality Labs metaverse division is again reporting a loss of nearly $4 billion. The bright side? Meta’s still investing billions into XR, and it’s not showing any signs of stopping.

Meta revealed in its Q1 2023 financial results that its family of apps is now being used by over 3 billion people, an increase of 5% year-over-year, but its metaverse investments are still operating at heavy losses.

Reality Labs is responsible for R&D for its most forward-looking projects, including the Quest virtual reality headset platform, and its work in augmented reality and artificial intelligence. Meta CEO Mark Zuckerberg has warned shareholders in the past that Meta’s XR investments may not flourish until 2030.

Here’s a look at the related income losses and revenue for Reality Labs since it was formed as a distinct entity in Q4 2020:

Image created by Road to VR using data courtesy Meta

Meta reports Reality Labs generated $339 million in revenue during the first quarter of the year, a small fraction of the company’s $28.65 billion quarterly revenue. The bulk of that was generated from its family of apps—Facebook, Messenger, Instagram, and WhatsApp.

While the $3.99 billion loss may show the company is tightening its belt in contrast to Q4 2022, which was at an eye-watering $4.28 billion, Meta says we should still expect those losses to continue to increase year-over-year in 2023.

This follows the company’s second big round of layoffs, the most recent of which, this month, affected VR teams at Reality Labs, Downpour Interactive (Onward) and Ready at Dawn (Lone Echo, Echo VR). The company says a third round is due in May, which will affect its business groups.

Zuckerberg has dubbed 2023 the company’s “year of efficiency,” and the Meta founder and chief said this during the earnings call regarding the company’s layoffs:

“This has been a difficult process. But after this is done, I think we’re going to have a much more stable environment for our employees. For the rest of the year, I expect us to focus on improving our distributed work model, delivering AI tools to improve productivity, and removing unnecessary processes across the company.”

Beyond its investment in AI, Zuckerberg says the recent characterization claiming the company has somehow moved away from focusing on the metaverse is “not accurate.”

“We’ve been focusing on both AI and the metaverse for years now, and we will continue to focus on both,” Zuckerberg says, noting that breakthroughs in both areas are essentially shared, such as computer vision, procedurally generated virtual worlds, and its work on AR glasses.

Notably, Zuckerberg says the number of titles in the Quest store with at least $25 million in revenue has doubled since last year, with more than half of Quest daily actives now spending more than an hour using their device.

The company previously confirmed a Quest 3 headset is set to release this year, which is said to be slightly pricier than the $400 Quest 2 headset with features “designed to appeal to VR enthusiasts.”



One of VR’s Smartest Room-scale Games Finally Comes to Quest 2

Room-scale puzzler Eye of the Temple (2021) is available on Quest 2 starting today, bringing one of VR’s most clever room-scale experiences to a platform where it probably makes the most sense.

Update (April 27th, 2023): Eye of the Temple is now live on the Quest Store for Quest 2, bringing its innovative room-scale puzzling to the standalone headset.

Ported to Quest with the help of Salmi Games, Eye of the Temple lets you explore a vast and treacherous temple and uncover the ancient legend of the Eye. Just make sure to have plenty of space in your room for plenty of walking, whipping, and hopefully no tripping.

Check out the new launch trailer, linked below:

Original Article (April 13th, 2023): Released on SteamVR headsets in 2021 by indie developer Rune Skovbo Johansen, Eye of the Temple is a unique puzzle game the likes of which we haven’t seen before or since.

The game’s innovative locomotion style lets you explore a massive temple complex with your own two feet, ushering you to jump onto moving platforms of all shapes and sizes, which importantly takes place within a 2×2m physical space.

What results is a mechanically pleasing and immersive experience that teleportation or even joystick-controlled smooth locomotion simply can’t provide. We liked it so much at the time, we even gave it Road to VR’s 2021 Excellence in Locomotion award.

Skovbo Johansen says the secret to the unique locomotion style is keeping the player in the center of the play area, which he says is “all about how the platforms are positioned relative to each other.”
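One loose way to picture that constraint (purely illustrative, not the developer's actual code): because every virtual step maps 1:1 onto a physical step, each platform can only be offset from the previous one by a step the player can take without leaving the 2×2m play area.

```python
PLAY_AREA_HALF_M = 1.0  # a 2 m x 2 m tracked space, measured from the centre

def keeps_player_in_bounds(player_pos, step_offset):
    """player_pos: (x, z) physical position in metres relative to the play-area centre.
    step_offset: (dx, dz) physical offset the next platform asks the player to step."""
    new_x = player_pos[0] + step_offset[0]
    new_z = player_pos[1] + step_offset[1]
    return abs(new_x) <= PLAY_AREA_HALF_M and abs(new_z) <= PLAY_AREA_HALF_M

print(keeps_player_in_bounds((0.8, 0.0), (0.5, 0.0)))   # False: would leave the play area
print(keeps_player_in_bounds((0.8, 0.0), (-0.5, 0.0)))  # True: pulls the player back toward the centre
```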

Take a look at how it works in the explainer video below:

While most PC VR tethers provide enough slack to get around the required 2×2m play area, the amount of turning and jumping you’ll do in the physical space really pushes the user’s ability to ‘tune out’ the cable to the limit, as you have to unwind yourself and hop over the tether constantly—something you might not notice as much in less physical games.

There’s no word on when we can expect Eye of the Temple to release on Quest 2, which critically removes any cable faffing woes you may have.

In the meanwhile, catch the trailer below, and follow along with Skovbo Johansen on Twitter where he regularly posts updates on the game’s development.



‘Propagation VR’ Sequel Coming to Quest & SteamVR Next Week, Gameplay Trailer Here

Propagation VR (2020), the VR survival horror game for PC VR headsets, is getting a sequel called Propagation: Paradise Hotel, and it’s coming next week.

Update (April 27th, 2023): WanadevStudio announced Propagation: Paradise Hotel is coming on May 4th to Quest 2 and SteamVR headsets. You can now wishlist it on the Quest Store and Steam.

In Propagation: Paradise Hotel, you take on the role of Emily Diaz, a solo adventurer who must explore the Paradise Hotel’s dark surroundings to find her lost twin sister Ashley. Use items, weapons, and tools as you progress through the story, which is filled with savage creatures thanks to a strange illness.

Check out the final gameplay trailer below:

Original Article (December 3rd, 2021): During Upload VR’s showcase, developer WanadevStudio unveiled the upcoming sequel, which promises to be an “intense VR survival horror adventure with thrilling storytelling, in which you will explore dark environments, make terrifying encounters and get your adrenaline pumping.”

WanadevStudio says the sequel will be a single-player adventure taking place in the Propagation universe, which will serve up a story that focuses on exploration, stealth, and action. And plenty of zombies and mutants.

Propagation VR launched for free on Steam back in September 2020, garnering it an ‘Overwhelmingly Positive’ user rating on the platform for its visceral zombie-shooting experience.

Wanadev estimates a late 2022 release on SteamVR headsets for Paradise Hotel (see update). The studio hasn’t mentioned whether the game is coming to other platforms besides SteamVR, however it has done so with its previous title Ragnarock (2021), a Viking-themed rhythm game launched for both SteamVR and Oculus Quest.
