“Fascists”: Elon Musk responds to proposed fines for disinformation on X

Being responsible is so hard —

“Elon Musk’s had more positions on free speech than the Kama Sutra,” says lawmaker.

A smartphone displays Elon Musk's profile on X, the app formerly known as Twitter.

Getty Images | Dan Kitwood

Elon Musk has lambasted Australia’s government as “fascists” over proposed laws that could levy substantial fines on social media companies if they fail to comply with rules to combat the spread of disinformation and online scams.

The billionaire owner of social media site X posted the word “fascists” on Friday in response to the bill, which would strengthen the Australian media regulator’s ability to hold companies responsible for the content on their platforms and levy potential fines of up to 5 percent of global revenue. The bill, which was proposed this week, has yet to be passed.

Musk’s comments drew rebukes from senior Australian politicians, with Stephen Jones, Australia’s minister for financial services, telling national broadcaster ABC that it was “crackpot stuff” and the legislation was a matter of sovereignty.

Bill Shorten, the former leader of the Labor Party and a cabinet minister, accused the billionaire of only championing free speech when it was in his commercial interests. “Elon Musk’s had more positions on free speech than the Kama Sutra,” Shorten said in an interview with Australian radio.

The exchange marks the second time that Musk has confronted Australia over technology regulation.

In May, he accused the country’s eSafety Commissioner of censorship after the government agency took X to court in an effort to force it to remove graphic videos of a stabbing attack in Sydney. A court later denied the eSafety Commissioner’s application.

Musk has also been embroiled in a bitter dispute with authorities in Brazil, where the Supreme Court ruled last month that X should be blocked over its failure to remove or suspend certain accounts accused of spreading misinformation and hateful content.

Australia has been at the forefront of efforts to regulate the technology sector, pitting it against some of the world’s largest social media companies.

This week, the government pledged to introduce a minimum age limit for social media use to tackle “screen addiction” among young people.

In March, Canberra threatened to take action against Meta after the owner of Facebook and Instagram said it would withdraw from a world-first deal to pay media companies to link to news stories.

The government also introduced new data privacy measures to parliament on Thursday that would impose hefty fines and potential jail terms of up to seven years for people found guilty of “doxxing” individuals or groups.

Prime Minister Anthony Albanese’s government had pledged to outlaw doxxing—the publication of personal details online for malicious purposes—this year after the details of a private WhatsApp group containing hundreds of Jewish Australians were published online.

Australia is one of the first countries to pursue laws outlawing doxxing. It is also expected to introduce a tranche of laws in the coming months to regulate how personal data can be used by artificial intelligence.

“These reforms give more teeth to the regulation,” said Monique Azzopardi at law firm Clayton Utz.

© 2024 The Financial Times Ltd. All rights reserved. Not to be redistributed, copied, or modified in any way.

Proposed underwater data center surprises regulators who hadn’t heard about it

BalticServers.com

Data centers powering the generative AI boom are gulping water and exhausting electricity at what some researchers view as an unsustainable pace. Two entrepreneurs who met in high school a few years ago want to overcome that crunch with a fresh experiment: sinking the cloud into the sea.

Sam Mendel and Eric Kim launched their company, NetworkOcean, out of startup accelerator Y Combinator on August 15 by announcing plans to dunk a small capsule filled with GPU servers into San Francisco Bay within a month. “There’s this vital opportunity to build more efficient computer infrastructure that we’re gonna rely on for decades to come,” Mendel says.

The founders contend that moving data centers off land would slow ocean temperature rise by drawing less power and letting seawater cool the capsule’s shell, supplementing its internal cooling system. NetworkOcean’s founders have said a location in the bay would deliver fast processing speeds for the region’s buzzing AI economy.  

But scientists who study the hundreds of square miles of brackish water say even the slightest heat or disturbance from NetworkOcean’s submersible could trigger toxic algae blooms and harm wildlife. And WIRED inquiries to several California and US agencies that oversee the bay found that NetworkOcean has been pursuing its initial test of an underwater data center without having sought, much less received, any permits from key regulators.

The outreach by WIRED prompted at least two agencies—the Bay Conservation and Development Commission and the San Francisco Regional Water Quality Control Board—to email NetworkOcean that testing without permits could run afoul of laws, according to public records and spokespeople for the agencies. Fines from the BCDC can run up to hundreds of thousands of dollars.

The nascent technology has already been in hot water in California. In 2016, the state’s coastal commission issued a previously unreported notice to Microsoft saying that the tech giant had violated the law the year before by plunging an unpermitted server vessel into San Luis Obispo Bay, about 250 miles south of San Francisco. The months-long test, part of what was known as Project Natick, had ended without apparent environmental harm by the time the agency learned of it, so officials decided not to fine Microsoft, according to the notice seen by WIRED.

The renewed scrutiny of underwater data centers has surfaced an increasingly common tension between innovative efforts to combat global climate change and long-standing environmental laws. Permitting takes months, if not years, and can cost millions of dollars, potentially impeding progress. Advocates of the laws argue that the process allows for time and input to better weigh trade-offs.

“Things are overregulated because people often don’t do the right thing,” says Thomas Mumley, recently retired assistant executive officer of the bay water board. “You give an inch, they take a mile. We have to be cautious.”

Over the last two weeks, including during an interview at the WIRED office, NetworkOcean’s founders have provided driblets of details about their evolving plans. Their current intention is to test their underwater vessel for about an hour, just below the surface of what Mendel would only describe as a privately owned and operated portion of the bay that he says is not subject to regulatory oversight. He insists that a permit is not required based on the location, design, and minimal impact. “We have been told by our potential testing site that our setup is environmentally benign,” Mendel says.

Mumley, the retired regulator, calls the assertion about not needing a permit “absurd.” Both Bella Castrodale, the BCDC’s lead enforcement attorney, and Keith Lichten, a water board division manager, say private sites and a quick dip in the bay aren’t exempt from permitting. Several other experts in bay rules tell WIRED that even if some quirk does preclude oversight, they believe NetworkOcean is sending a poor message to the public by not coordinating with regulators.

“Just because these centers would be out of sight does not mean they are not a major disturbance,” says Jon Rosenfield, science director at San Francisco Baykeeper, a nonprofit that investigates industrial polluters.

School project

Mendel and Kim say they tried to develop an underwater renewable energy device together during high school in Southern California before moving on to non-nautical pursuits. Mendel, 23, dropped out of college in 2022 and founded a platform for social media influencers.

About a year ago, he built a small web server using the DIY system Raspberry Pi to host another personal project, and temporarily floated the equipment in San Francisco Bay by attaching it to a buoy from a private boat in the Sausalito area. (Mendel declined to answer questions about permits.) After talking with Kim, also 23, about this experiment, the two decided to move in together and start NetworkOcean.

Their pitch is that underwater data centers are more affordable to develop and maintain, especially as electricity shortages limit sites on land. Surrounding a tank of hot servers with water naturally helps cool them, avoiding the massive resource drain of air-conditioning and improving on the similar benefits of floating data centers. Developers of offshore wind farms are eager to electrify NetworkOcean vessels, Mendel says.

These household brands want to redefine what counts as “recyclable”

Olga Pankova/Moment via Getty Images

This story was originally published by ProPublica, a Pulitzer Prize-winning investigative newsroom. Sign up for The Big Story newsletter to receive stories like this one in your inbox.

Most of the products in the typical kitchen use plastics that are virtually impossible to recycle.

The film that acts as a lid on Dole Sunshine fruit bowls, the rings securing jars of McCormick dried herbs, the straws attached to Juicy Juice boxes, the bags that hold Cheez-Its and Cheerios—they’re all destined for the dumpster.

Now a trade group representing those brands and hundreds more is pressuring regulators to make plastic appear more environmentally friendly, a proposal experts say could worsen a crisis that is flooding the planet and our bodies with the toxic material.

The Consumer Brands Association believes companies should be able to stamp “recyclable” on products that are technically “capable” of being recycled, even if they’re all but guaranteed to end up in a landfill. As ProPublica previously reported, the group argued for a looser definition of “recyclable” in written comments to the Federal Trade Commission as the agency revises the Green Guides—guidelines for advertising products with sustainable attributes.

The association’s board of directors includes officials from some of the world’s richest companies, such as PepsiCo, Procter & Gamble, Coca-Cola, Land O’Lakes, Keurig Dr Pepper, Hormel Foods Corporation, Molson Coors Beverage Company, Campbell Soup, Kellanova, Mondelez International, Conagra Brands, J.M. Smucker, and Clorox.

Some of the companies own brands that project health, wellness, and sustainability. That includes General Mills, owner of Annie’s macaroni and cheese; The Honest Co., whose soaps and baby wipes line the shelves at Whole Foods; and Colgate-Palmolive, which owns the natural deodorant Tom’s of Maine.

ProPublica contacted the 51 companies on the association’s board of directors to ask if they agreed with the trade group’s definition of “recyclable.” Most did not respond. None said they disagreed with the definition. Nine companies referred ProPublica back to the association.

“The makers of America’s household brands are committed to creating a more circular economy which is why the industry has set sustainability goals and invested in consumer education tools” with “detailed recycling instructions,” Joseph Aquilina, the association’s vice president and deputy general counsel, wrote in an email.

The Green Guides are meant to increase consumer trust in sustainable products. Though these guidelines are not laws, they serve as a national reference for companies and other government agencies for how to define terms like “compostable,” “nontoxic” and “recyclable.” The Federal Trade Commission is revising the guides for the first time since 2012.

Most of the plastic we encounter is functionally not recyclable. It’s too expensive or technically difficult to deal with the health risks posed by the dyes and flame retardants found in many products. Collecting, sorting, storing and shipping the plastic for reprocessing often costs much more than plowing it into a landfill. Though some newer technologies have pushed the boundaries of what’s possible, these plastic-recycling techniques are inefficient and exist in such limited quantities that experts say they can’t be relied upon. The reality is: Only 5 percent of Americans’ discarded plastic gets recycled. And while soda bottles and milk jugs can be turned into new products, other common forms of plastic, like flimsy candy wrappers and chip bags, are destined for trash heaps and oceans, where they can linger for centuries without breaking down.

The current Green Guides allow companies to label products and packaging as “recyclable” if at least 60 percent of Americans have access to facilities that will take the material. As written, the guidelines don’t specify whether it’s enough for the facilities to simply collect and sort the items or if there needs to be a reasonable expectation that the material will be made into something new.

The Golden Age of offbeat Arctic research

cold war dreamers —

The Cold War spawned some odd military projects that were doomed to fail.

At the US Army’s Camp Century on the Greenland ice sheet, an Army truck equipped with a railroad wheel conversion rides on 1,300 feet of track under the snow.

In recent years, the Arctic has become a magnet for climate change anxiety, with scientists nervously monitoring the Greenland ice sheet for signs of melting and fretting over rampant environmental degradation. It wasn’t always that way.

At the height of the Cold War in the 1950s, as the fear of nuclear Armageddon hung over American and Soviet citizens, ­idealistic scientists and engineers saw the vast Arctic region as a place of unlimited potential for creating a bold new future. Greenland emerged as the most tantalizing proving ground for their research.

Scientists and engineers working for and with the US military cooked up a rash of audacious cold-region projects—some innovative, many spit-balled, and most quickly abandoned. They were the stuff of science fiction: disposing of nuclear waste by letting it melt through the ice; moving people, supplies, and missiles below the ice using subways, some perhaps atomic powered; testing hovercraft to zip over impassable crevasses; making furniture from a frozen mix of ice and soil; and even building a nuclear-powered city under the ice sheet.

Today, many of their ideas, and the fever dreams that spawned them, survive only in the yellowed pages and covers of magazines like “REAL: the exciting magazine FOR MEN” and dozens of obscure Army technical reports.

Karl and Bernhard Philberth, both physicists and ordained priests, thought Greenland’s ice sheet the perfect repository for nuclear waste. Not all the waste—first they’d reprocess spent reactor fuel so that the long-lived nuclides would be recycled. The remaining, mostly short-lived radionuclides would be fused into glass or ceramic and surrounded by a few inches of lead for transport. They imagined several million radioactive medicine balls about 16 inches in diameter scattered over a small area of the ice sheet (about 300 square miles) far from the coast.

Because the balls were so radioactive, and thus warm, they would melt their way into the ice, each with the energy of a bit less than two dozen 100-watt incandescent light bulbs—a reasonable leap from Karl Philberth’s expertise designing heated ice drills that worked by melting their way through glaciers. The hope was that by the time the ice carrying the balls emerged at the coast thousands or tens of thousands of years later, the radioactivity would have decayed away. One of the physicists later reported that the idea was shown to him, by God, in a vision.

US Army test of the Snowblast in Greenland in the 1950s, a machine designed to smooth snow runways.

Of course, the plan had plenty of unknowns and led to heated discussion at scientific meetings when it was presented. What, for example, would happen if the balls got crushed or caught up in flows of meltwater near the base of the ice sheet? And would the radioactive balls warm the ice so much that the ice flowed faster at the base, speeding the balls’ trip to the coast?

Logistical challenges, scientific doubt, and politics sunk the project. Producing millions of radioactive glass balls wasn’t yet practical, and the Danes, who at the time controlled Greenland, were never keen on allowing nuclear waste disposal on what they saw as their island. Some skeptics even worried about climate change melting the ice. Nonetheless, the Philberths made visits to the ice sheet and published peer-reviewed scientific papers about their waste dream.

Americans misunderstand their contribution to deteriorating environment

Power lines are cast in silhouette as the Creek Fire creeps up on the Shaver Springs community off of Tollhouse Road on Tuesday, Sept. 8, 2020, in Auberry, California.

This article originally appeared on Inside Climate News, a nonprofit, independent news organization that covers climate, energy and the environment. It is republished with permission. Sign up for their newsletter here.

Most people are “very” or “extremely” concerned about the state of the natural world, a new global public opinion survey shows.

Roughly 70 percent of 22,000 people polled online earlier this year agreed that human activities were pushing the Earth past “tipping points,” thresholds beyond which nature cannot recover, like loss of the Amazon rainforest or collapse of the Atlantic Ocean’s currents. The same number of respondents said the world needs to reduce carbon emissions within the next decade.

Just under 40 percent of respondents said technological advances can solve environmental challenges.

The Global Commons survey, conducted for two collectives of “economic thinkers” and scientists known as Earth4All and the Global Commons Alliance, polled people across 22 countries, including low-, middle- and high-income nations. The survey’s stated aim was to assess public opinion about “societal transformations” and “planetary stewardship.”

The results, released Thursday, highlight that people living under diverse circumstances seem to share worries about the health of ecosystems and the environmental problems future generations will inherit.

But there were some regional differences. People living in emerging economies, including Kenya and India, perceived themselves to be more exposed to environmental and climate shocks, like drought, flooding, and extreme weather. That group expressed higher levels of concern about the environment, though 59 percent of all respondents said they are “very” or “extremely” worried about “the state of nature today,” and another 29 percent are at least somewhat concerned.

Americans are included in the global majority, but a more complex picture emerged in the details of the survey, conducted by Ipsos.

Roughly one in two Americans said they are not very or not at all exposed to environmental and climate change risks. Those perceptions contrast sharply with empirical evidence showing that climate change is having an impact in nearly every corner of the United States. A warming planet has intensified hurricanes battering coasts, droughts striking middle American farms, and wildfires threatening homes and air quality across the country. And climate shocks are driving up prices of some food, like chocolate and olive oil, and consumer goods.

Americans also largely believe they do not bear responsibility for global environmental problems. Only about 15 percent of US respondents said that high- and middle-income Americans share responsibility for climate change and natural destruction. Instead, they attribute the most blame to businesses and governments of wealthy countries.

Those survey responses suggest that at least half of Americans may not feel they have any skin in the game when it comes to addressing global environmental problems, according to Geoff Dabelko, a professor at Ohio University and expert in environmental policy and security.

Translating concern about the environment to actual change requires people to believe they have something at stake, Dabelko said. “It’s troubling that Americans aren’t making that connection.”

While fossil fuel companies have long campaigned to shape public perception in a way that absolves their industry of fault for ecosystem destruction and climate change, individual behavior does play a role. Americans have some of the highest per-capita consumption rates in the world.

The world’s wealthiest 10 percent are responsible for nearly half the world’s carbon emissions, along with ecosystem destruction and related social impacts. For instance, American consumption of gold, tropical hardwoods like mahogany and cedar and other commodities has been linked to the destruction of the Amazon rainforest and attacks on Indigenous people defending their territories from extractive activities.

The United States is one of the world’s wealthiest countries and home to 38 percent of the world’s millionaires (the largest share). But a person doesn’t need to be a millionaire to fit within the cohort of the world’s wealthiest. Americans without children earning more than $60,000 a year after tax, and families of three with an after-tax household income above $130,000, are in the richest 1 percent of the world’s population.

United Nations emissions gap reports have said that to reach global climate goals, the world’s wealthiest people must cut their personal emissions by at least a factor of 30. High-income Americans’ emissions footprint is largely a consequence of lifestyle choices like living in large homes, flying often, opting for personal vehicles over public transportation, and conspicuous consumption of fast fashion and other consumer goods.

Nvidia’s AI chips are cheaper to rent in China than US

secondhand channels —

Supply of processors helps Chinese startups advance AI technology despite US restrictions.

VGG | Getty Images

The cost of renting cloud services using Nvidia’s leading artificial intelligence chips is lower in China than in the US, a sign that the advanced processors are easily reaching the Chinese market despite Washington’s export restrictions.

Four small-scale Chinese cloud providers charge local tech groups roughly $6 an hour to use a server with eight Nvidia A100 processors in a base configuration, companies and customers told the Financial Times. Small cloud vendors in the US charge about $10 an hour for the same setup.

The low prices, according to people in the AI and cloud industry, are an indication of plentiful supply of Nvidia chips in China and the circumvention of US measures designed to prevent access to cutting-edge technologies.

The A100 and H100, which is also readily available, are among Nvidia’s most powerful AI accelerators and are used to train the large language models that power AI applications. The Silicon Valley company has been banned from shipping the A100 to China since autumn 2022 and has never been allowed to sell the H100 in the country.

Chip resellers and tech startups said the products were relatively easy to procure. Inventories of the A100 and H100 are openly advertised for sale on Chinese social media and ecommerce sites such as Xiaohongshu and Alibaba’s Taobao, as well as in electronics markets, at slight markups to pricing abroad.

China’s larger cloud operators such as Alibaba and ByteDance, known for their reliability and security, charge double to quadruple the price of smaller local vendors for similar Nvidia A100 servers, according to pricing from the two operators and customers.

After discounts, both Chinese tech giants offer packages for prices comparable to Amazon Web Services, which charges $15 to $32 an hour. Alibaba and ByteDance did not respond to requests for comment.

“The big players have to think about compliance, so they are at a disadvantage. They don’t want to use smuggled chips,” said a Chinese startup founder. “Smaller vendors are less concerned.”

He estimated there were more than 100,000 Nvidia H100 processors in the country based on their widespread availability in the market. The Nvidia chips are each roughly the size of a book, making them relatively easy for smugglers to ferry across borders, undermining Washington’s efforts to limit China’s AI progress.

“We bought our H100s from a company that smuggled them in from Japan,” said a startup founder in the automation field who paid about 500,000 yuan ($70,000) for two cards this year. “They etched off the serial numbers.”

Nvidia said it sold its processors “primarily to well-known partners … who work with us to ensure that all sales comply with US export control rules”.

“Our pre-owned products are available through many second-hand channels,” the company added. “Although we cannot track products after they are sold, if we determine that any customer is violating US export controls, we will take appropriate action.”

The head of a small Chinese cloud vendor said low domestic costs helped offset the higher prices that providers paid for smuggled Nvidia processors. “Engineers are cheap, power is cheap, and competition is fierce,” he said.

In Shenzhen’s Huaqiangbei electronics market, salespeople speaking to the FT quoted the equivalent of $23,000–$30,000 for Nvidia’s H100 plug-in cards. Online sellers quote the equivalent of $31,000–$33,000.

Nvidia charges customers $20,000–$23,000 for H100 chips after recently cutting prices, according to Dylan Patel of SemiAnalysis.

One data center vendor in China said servers made by Silicon Valley’s Supermicro and fitted with eight H100 chips hit a peak selling price of 3.2 million yuan after the Biden administration tightened export restrictions in October. He said prices had since fallen to 2.5 million yuan as supply constraints eased.

Several people involved in the trade said merchants in Malaysia, Japan, and Indonesia often shipped Supermicro servers or Nvidia processors to Hong Kong before bringing them across the border to Shenzhen.

The black market trade depends on difficult-to-counter workarounds to Washington’s export regulations, experts said.

For example, while subsidiaries of Chinese companies are banned from buying advanced AI chips outside the country, their executives could establish new companies in countries such as Japan or Malaysia to make the purchases.

“It’s hard to completely enforce export controls beyond the US border,” said an American sanctions expert. “That’s why the regulations create obligations for the shipper to look into end users and [the] commerce [department] adds companies believed to be flouting the rules to the [banned] entity list.”

Additional reporting by Michael Acton in San Francisco.

© 2024 The Financial Times Ltd. All rights reserved. Please do not copy and paste FT articles and redistribute by email or post to the web.

Harmful “nudify” websites used Google, Apple, and Discord sign-on systems

Major technology companies, including Google, Apple, and Discord, have been enabling people to quickly sign up to harmful “undress” websites, which use AI to remove clothes from real photos to make victims appear to be “nude” without their consent. More than a dozen of these deepfake websites have been using login buttons from the tech companies for months.

A WIRED analysis found 16 of the biggest so-called undress and “nudify” websites using the sign-in infrastructure from Google, Apple, Discord, Twitter, Patreon, and Line. This approach allows people to easily create accounts on the deepfake websites—offering them a veneer of credibility—before they pay for credits and generate images.

While bots and websites that create nonconsensual intimate images of women and girls have existed for years, the number has increased with the introduction of generative AI. This kind of “undress” abuse is alarmingly widespread, with teenage boys allegedly creating images of their classmates. Tech companies have been slow to deal with the scale of the issues, critics say, with the websites appearing high in search results, paid advertisements promoting them on social media, and apps showing up in app stores.

“This is a continuation of a trend that normalizes sexual violence against women and girls by Big Tech,” says Adam Dodge, a lawyer and founder of EndTAB (Ending Technology-Enabled Abuse). “Sign-in APIs are tools of convenience. We should never be making sexual violence an act of convenience,” he says. “We should be putting up walls around the access to these apps, and instead we’re giving people a drawbridge.”

The sign-in tools analyzed by WIRED, which are deployed through APIs and common authentication methods, allow people to use existing accounts to join the deepfake websites. Google’s login system appeared on 16 websites, Discord’s appeared on 13, and Apple’s on six. X’s button was on three websites, with Patreon and messaging service Line’s both appearing on the same two websites.

WIRED is not naming the websites, since they enable abuse. Several are part of wider networks and owned by the same individuals or companies. The login systems have been used despite the tech companies broadly having rules that state developers cannot use their services in ways that would enable harm, harassment, or invade people’s privacy.

After being contacted by WIRED, spokespeople for Discord and Apple said they have removed the developer accounts connected to their websites. Google said it will take action against developers when it finds its terms have been violated. Patreon said it prohibits accounts that allow explicit imagery to be created, and Line confirmed it is investigating but said it could not comment on specific websites. X did not reply to a request for comment about the way its systems are being used.

In the hours after Jud Hoffman, Discord vice president of trust and safety, told WIRED it had terminated the websites’ access to its APIs for violating its developer policy, one of the undress websites posted in a Telegram channel that authorization via Discord was “temporarily unavailable” and claimed it was trying to restore access. That undress service did not respond to WIRED’s request for comment about its operations.

Harmful “nudify” websites used Google, Apple, and Discord sign-on systems


Electric vehicle battery fires—what to know and how to react

sick burns —

It’s very rare, but lithium-ion batteries in electric vehicles can catch fire.


The battery pack of a Volkswagen ID. Buzz electric microbus on the assembly line during a media tour of the Volkswagen AG multipurpose and commercial vehicle plant in Hannover, Germany, on Thursday, June 16, 2022.

Lithium-ion battery fires can be intense and frightening. As someone who used to repair second-hand smartphones, I’ve extinguished my fair share of flaming iPhones with punctured lithium-ion batteries. The type of battery in your pocket right now is similar to what’s inside an electric vehicle, except that an EV battery stores far more energy. It stores so much, in fact, that some firefighters are receiving special training to extinguish the extra-intense flames a burning EV battery can produce after a road accident.

If you’ve been reading the news about EVs, you’ve likely encountered plenty of scary articles about battery fires on the rise. Recently, the US National Transportation Safety Board and the California Highway Patrol announced they are investigating a Tesla semi truck fire that ignited after the vehicle struck a tree. The lithium-ion battery burned for around four hours.

Does this mean that you should worry about your personal electric vehicle as a potential fire hazard? Not really. It makes more sense to worry about a gas-powered vehicle going up in flames than an electric vehicle, since EVs are less likely to catch fire than their more traditional transportation counterparts.

“Fires because of battery manufacturing defects are really very rare,” says Matthew McDowell, a codirector of Georgia Tech’s Advanced Battery Center. “Especially in electric vehicles, because they also have battery management systems.” The software keeps tabs on the different cells that comprise an EV’s battery and can help prevent the battery from being pushed beyond its limits.

How do electric vehicle fires happen?

During a crash that damages the EV battery, a fire may start with what’s called thermal runaway. EV batteries aren’t one solid brick. Rather, think of them as a collection of many smaller batteries, called cells, pressed up against each other. In thermal runaway, a chemical reaction in one of the cells ignites an initial fire, and the heat spreads to each adjacent cell until the entire battery is burning.
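The cell-to-cell chain reaction described above can be sketched as a toy simulation. This is illustrative only: the temperature threshold and heat values below are made up, not real battery physics, and real packs propagate heat in more complex geometries.

```python
# Toy model of thermal runaway: heat from one failed cell pushes its
# neighbors past an ignition threshold, which in turn heats their neighbors.
# All numbers are hypothetical, chosen only to show the cascade.

IGNITION_TEMP_C = 200          # hypothetical temperature at which a cell ignites
HEAT_FROM_BURNING_CELL = 250   # hypothetical heat a burning cell passes on

def simulate_runaway(cell_temps, failed_cell):
    """Return the order in which cells ignite, starting from one failure."""
    temps = list(cell_temps)
    burning = [failed_cell]    # cells currently spreading heat
    ignited = [failed_cell]    # ignition order
    while burning:
        current = burning.pop(0)
        # Heat spreads to the two adjacent cells in this 1-D pack.
        for neighbor in (current - 1, current + 1):
            if 0 <= neighbor < len(temps) and neighbor not in ignited:
                temps[neighbor] += HEAT_FROM_BURNING_CELL
                if temps[neighbor] >= IGNITION_TEMP_C:
                    burning.append(neighbor)
                    ignited.append(neighbor)
    return ignited

# A five-cell pack at room temperature; cell 2 shorts and ignites first,
# then the fire spreads outward in both directions.
print(simulate_runaway([25, 25, 25, 25, 25], 2))
```

The point of the sketch is the feedback loop: each newly burning cell adds enough heat to push the next one over its threshold, which is why a single defective cell can consume an entire pack.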

Greg Less, director of the University of Michigan’s Battery Lab, breaks down EV battery fires into two distinct categories: accidents and manufacturing defects. He considers accidents to be everything from a collision that punctures the battery to a charging mishap. “Let’s take those off the table,” says Less. “Because, I think people understand that, regardless of the vehicle type, if you’re in an accident, there could be a fire.”

While all EV battery fires are hard to put out, fires from manufacturing defects are likely more concerning to consumers, due to their seeming randomness. (Think back to when all those Samsung phones had to be recalled because battery issues made them fire hazards.) How do these rare issues with EV battery manufacturing cause fires at what may feel like random moments?

It all comes down to how the batteries are engineered. “There’s some level of the engineering that has gone wrong and caused the cell to short, which then starts generating heat,” says Less. “Heat causes the liquid electrolyte to evaporate, creating a gas inside the cell. When the heat gets high enough, it catches fire, explodes, and then propagates to other cells.” These kinds of defects are likely what caused the highly publicized recent EV fires in South Korea, one of which damaged over a hundred vehicles in a parking lot.

How to react if your EV catches fire

According to the National Fire Protection Association, if an EV ever catches fire while you’re behind the wheel, immediately find a safe way to pull over and get the car away from the main road. Then turn off the engine and make sure everyone leaves the vehicle immediately. Don’t delay by grabbing personal belongings; just get out. Remain more than 100 feet away from the burning car as you call 911 and request the fire department.

Also, you shouldn’t attempt to put out the flames yourself. This is a chemical fire, so a couple of buckets of water won’t smother it. EV battery fires can take first responders around 10 times as much water to extinguish as a fire in a gas-powered vehicle. Sometimes firefighters may decide to let the battery burn itself out rather than dousing it with water.

Once an EV battery catches fire, it’s possible for the chemical fire to reignite after the initial burn dies down. It’s even possible for the battery to go up in flames again days later. “Both firefighters and secondary responders, such as vehicle recovery or tow companies, also need to be aware of the potential for stranded energy that may remain in the undamaged portions of the battery,” says Thomas Barth, an investigator and biomechanics engineer for the NTSB, in an emailed statement. “This energy can pose risks for electric shock or cause the vehicle to reignite.”

Although it may be tempting to go back into the car and grab your wallet or other important items if the flame grows smaller or goes out for a second, resist the urge. Wait until your local fire department arrives to assess the overall situation and give you the all clear. Staying far away from the car also helps minimize your potential for breathing in unhealthy fumes emitted from the battery fire.

How could EV batteries be safer?

In addition to quick recalls and replacements of potentially faulty lithium-ion batteries, both researchers I spoke with were excited about future possibilities for a different kind of battery, called solid-state, to make EVs even more reliable. “These batteries could potentially show greater thermal stability than lithium-ion batteries,” says McDowell. “When it heats up a lot, it may just remain pretty stable.” With a solid-state battery, the liquid electrolyte is no longer part of battery cells, removing the most flammable aspect of battery design.

These solid-state batteries are already available in some smaller electronics, but producing large versions of the batteries at vast scale continues to be a hurdle that EV manufacturers are working to overcome.

This story originally appeared on wired.com.



From recycling to food: Can we eat plastic-munching microbes?

breaking it down —

Researchers are trying to turn plastic-eating bacteria into a food source for humans.


Olga Pankova/Moment via Getty Images

In 2019, an agency within the US Department of Defense released a call for research projects to help the military deal with the copious amount of plastic waste generated when troops are sent to work in remote locations or disaster zones. The agency wanted a system that could convert food wrappers and water bottles, among other things, into usable products, such as fuel and rations. The system needed to be small enough to fit in a Humvee and capable of running on little energy. It also needed to harness the power of plastic-eating microbes.

“When we started this project four years ago, the ideas were there. And in theory, it made sense,” said Stephen Techtmann, a microbiologist at Michigan Technological University, who leads one of the three research groups receiving funding. Nevertheless, he said, in the beginning, the effort “felt a lot more science-fiction than really something that would work.”

That uncertainty was key. The Defense Advanced Research Projects Agency, or DARPA, supports high-risk, high-reward projects. This means there’s a good chance that any individual effort will end in failure. But when a project does succeed, it has the potential to be a true scientific breakthrough. “Our goal is to go from disbelief, like, ‘You’re kidding me. You want to do what?’ to ‘You know, that might be actually feasible,’” said Leonard Tender, a program manager at DARPA who is overseeing the plastic waste projects.

The problems with plastic production and disposal are well-known. According to the United Nations Environment Program, the world creates about 440 million tons of plastic waste per year. Much of it ends up in landfills or in the ocean, where microplastics, plastic pellets, and plastic bags pose a threat to wildlife. Many governments and experts agree that solving the problem will require reducing production, and some countries and US states have additionally introduced policies to encourage recycling.

For years, scientists have also been experimenting with various species of plastic-eating bacteria. But DARPA is taking a slightly different approach in seeking a compact and mobile solution that uses plastic to create something else entirely: food for humans.

The goal, Techtmann hastens to add, is not to feed people plastic. Rather, the hope is that the plastic-devouring microbes in his system will themselves prove fit for human consumption. While Techtmann believes most of the project will be ready in a year or two, it’s this food step that could take longer. His team is currently doing toxicity testing, and then they will submit their results to the Food and Drug Administration for review. Even if all that goes smoothly, an additional challenge awaits. There’s an ick factor, said Techtmann, “that I think would have to be overcome.”

The military isn’t the only entity working to turn microbes into nutrition. From Korea to Finland, a small number of researchers, as well as some companies, are exploring whether microorganisms might one day help feed the world’s growing population.

Two birds, one stone

According to Tender, DARPA’s call for proposals was aimed at solving two problems at once. First, the agency hoped to reduce what he called supply-chain vulnerability: During war, the military needs to transport supplies to troops in remote locations, which creates a safety risk for people in the vehicle. Additionally, the agency wanted to stop using hazardous burn pits as a means of dealing with plastic waste. “Getting those waste products off of those sites responsibly is a huge lift,” Tender said.

The Michigan Tech system begins with a mechanical shredder, which reduces the plastic to small shards that then move into a reactor, where they soak in ammonium hydroxide under high heat. Some plastics, such as PET, which is commonly used to make disposable water bottles, break down at this point. Other plastics used in military food packaging—namely polyethylene and polypropylene—are passed along to another reactor, where they are subject to much higher heat and an absence of oxygen.

Under these conditions, the polyethylene and polypropylene are converted into compounds that can be upcycled into fuels and lubricants. David Shonnard, a chemical engineer at Michigan Tech who oversaw this component of the project, has developed a startup company called Resurgent Innovation to commercialize some of the technology. (Other members of the research team, said Shonnard, are pursuing additional patents related to other parts of the system.)



Microsoft to host security summit after CrowdStrike disaster

Bugging out —

Redmond wants to improve the resilience of Windows to buggy software.

Photo of a Windows BSOD

Microsoft is stepping up its plans to make Windows more resilient to buggy software after a botched CrowdStrike update took down millions of PCs and servers in a global IT outage.

The tech giant has in the past month intensified talks with partners about adapting the security procedures around its operating system to better withstand the kind of software error that crashed 8.5 million Windows devices on July 19.

Critics say that any changes by Microsoft would amount to a concession of shortcomings in Windows’ handling of third-party security software that could have been addressed sooner.

Yet they would also prove controversial among security vendors that would have to make radical changes to their products, and force many Microsoft customers to adapt their software.

Last month’s outages—which are estimated to have caused billions of dollars in damages after grounding thousands of flights and disrupting hospital appointments worldwide—heightened scrutiny from regulators and business leaders over the extent of access that third-party software vendors have to the core, or kernel, of Windows operating systems.

Microsoft will host a summit next month for government representatives and cyber security companies, including CrowdStrike, to “discuss concrete steps we will all take to improve security and resiliency for our joint customers,” Microsoft said on Friday.

The gathering will take place on September 10 at Microsoft’s headquarters near Seattle, it said in a blog post.

Bugs in the kernel can quickly crash an entire operating system, triggering the millions of “blue screens of death” that appeared around the globe after CrowdStrike’s faulty software update was sent out to clients’ devices.

Microsoft told the Financial Times it was considering several options to make its systems more stable and had not ruled out completely blocking access to the Windows kernel—an option some rivals fear would put their software at a disadvantage to the company’s internal security product, Microsoft Defender.

“All of the competitors are concerned that [Microsoft] will use this to prefer their own products over third-party alternatives,” said Ryan Kalember, head of cyber security strategy at Proofpoint.

Microsoft may also demand new testing procedures from cyber security vendors rather than adapting the Windows system itself.

Apple, which was not hit by the outages, blocks all third-party providers from accessing the kernel of its macOS operating system, forcing them to operate in the more limited “user mode.”

Microsoft has previously said it could not do the same, after coming to an understanding with the European Commission in 2009 that it would give third parties the same access to its systems as that for Microsoft Defender.

Some experts said, however, that this voluntary commitment to the EU had not tied Microsoft’s hands in the way it claimed, arguing that the company had always been free to make the changes now under consideration.

“These are technical decisions of Microsoft that were not part of [the arrangement],” said Thomas Graf, a partner at Cleary Gottlieb in Brussels who was involved in the case.

“The text [of the understanding] does not require them to give access to the kernel,” added AJ Grotto, a former senior director for cyber security policy at the White House.

Grotto said Microsoft shared some of the blame for the July disruption since the outages would not have been possible without its decision to allow access to the kernel.

Nevertheless, while it might boost a system’s resilience, blocking kernel access could also bring “real trade-offs” for the compatibility with other software that had made Windows so popular among business customers, Forrester analyst Allie Mellen said.

“That would be a fundamental shift for Microsoft’s philosophy and business model,” she added.

Operating exclusively outside the kernel may lower the risk of triggering mass outages but it was also “very limiting” for security vendors and could make their products “less effective” against hackers, Mellen added.

Operating within the kernel gave security companies more information about potential threats and enabled their defensive tools to activate before malware could take hold, she added.

An alternative option could be to replicate the model used by the open-source operating system Linux, which uses a filtering mechanism that creates a segregated environment within the kernel in which software, including cyber defense tools, can run.

But the complexity of overhauling how other security software works with Windows means that any changes will be hard for regulators to police and Microsoft will have strong incentives to favor its own products, rivals said.

It “sounds good on paper, but the devil is in the details,” said Matthew Prince, chief executive of digital services group Cloudflare.

© 2024 The Financial Times Ltd. All rights reserved Not to be redistributed, copied, or modified in any way.



How accurate are wearable fitness trackers? Less than you might think

some misleading metrics —

Wide variance underscores need for a standardized approach to validation of devices.


Corey Gaskin

Back in 2010, Gary Wolf, then the editor of Wired magazine, delivered a TED talk in Cannes called “the quantified self.” It was about what he termed a “new fad” among tech enthusiasts. These early adopters were using gadgets to monitor everything from their physiological data to their mood and even the number of nappies their children used.

Wolf acknowledged that these people were outliers—tech geeks fascinated by data—but their behavior has since permeated mainstream culture.

From the smartwatches that track our steps and heart rate, to the fitness bands that log sleep patterns and calories burned, these gadgets are now ubiquitous. Their popularity is emblematic of a modern obsession with quantification—the idea that if something isn’t logged, it doesn’t count.

At least half the people in any given room are likely wearing a device, such as a fitness tracker, that quantifies some aspect of their lives. Wearables are being adopted at a pace reminiscent of the mobile phone boom of the late 2000s.

However, the quantified self movement still grapples with an important question: Can wearable devices truly measure what they claim to?

Along with my colleagues Maximus Baldwin, Alison Keogh, Brian Caulfield, and Rob Argent, I recently published an umbrella review (a systematic review of systematic reviews) examining the scientific literature on whether consumer wearable devices can accurately measure metrics like heart rate, aerobic capacity, energy expenditure, sleep, and step count.

At a surface level, our results were quite positive. Accepting some error, wearable devices can measure heart rate with an error rate of plus or minus 3 percent, depending on factors like skin tone, exercise intensity, and activity type. They can also accurately measure heart rate variability and show good sensitivity and specificity for detecting arrhythmia, a problem with the rate of a person’s heartbeat.

Additionally, they can accurately estimate what’s known as cardiorespiratory fitness, which is how the circulatory and respiratory systems supply oxygen to the muscles during physical activity. This can be quantified by something called VO2Max, which is a measure of how much oxygen your body uses while exercising.

The ability of wearables to accurately measure this is better when those predictions are generated during exercise (rather than at rest). In the realm of physical activity, wearables generally underestimate step counts by about 9 percent.
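To make the VO2Max metric concrete, here is one published rule-of-thumb, the heart-rate-ratio method of Uth and colleagues, which estimates VO2Max from maximum and resting heart rate. Wearables use their own proprietary models, so this formula is shown only as an illustration of the quantity being estimated, not of how any particular device computes it.

```python
# Heart-rate-ratio estimate of VO2Max (Uth et al., 2004):
# VO2Max ~= 15.3 * (maximum heart rate / resting heart rate), in ml/kg/min.
# Shown for illustration; consumer wearables use proprietary models.

def vo2max_hr_ratio(hr_max_bpm, hr_rest_bpm):
    """Rough VO2Max estimate in ml/kg/min from max and resting heart rate."""
    return 15.3 * hr_max_bpm / hr_rest_bpm

# Example: a maximum heart rate of 190 bpm and a resting rate of 60 bpm.
estimate = vo2max_hr_ratio(190, 60)
print(f"estimated VO2Max: {estimate:.1f} ml/kg/min")
```

Note how sensitive the estimate is to resting heart rate, which is one reason the article's finding holds that predictions generated during exercise tend to be more reliable than those made at rest.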

Challenging endeavor

However, discrepancies were larger for energy expenditure (the number of calories you burn when exercising), with error margins ranging from minus 21.27 percent to plus 14.76 percent, depending on the device used and the activity undertaken.

Results weren’t much better for sleep. Wearables tend to overestimate total sleep time and sleep efficiency, typically by more than 10 percent. They also tend to underestimate sleep onset latency (a lag in getting to sleep) and wakefulness after sleep onset. Errors ranged from 12 percent to 180 percent, compared to the gold standard measurements used in sleep studies, known as polysomnography.
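The percentage margins reported above can be translated into concrete ranges around a reading. A minimal sketch, in which the 150 bpm heart-rate reading and the 10,000-step count are hypothetical examples, not figures from the review:

```python
def error_band(measured, pct_low, pct_high):
    """Range of plausible true values implied by a percentage error band."""
    return (measured * (1 + pct_low / 100), measured * (1 + pct_high / 100))

# A heart rate read as 150 bpm, with the plus-or-minus 3 percent error
# reported above, implies a true rate of roughly 145.5 to 154.5 bpm.
low, high = error_band(150, -3, 3)
print(f"heart rate: {low:.1f} to {high:.1f} bpm")

# A device that underestimates steps by about 9 percent: 10,000 recorded
# steps correspond to roughly 10,000 / 0.91, or about 10,989 actual steps.
print(round(10000 / 0.91))
```

The asymmetric energy-expenditure band (minus 21.27 to plus 14.76 percent) works the same way, which is why a 500-calorie workout reading could plausibly reflect anywhere from roughly 390 to 570 calories actually burned.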

The upshot is that, despite the promising capabilities of wearables, we found conducting and synthesizing research in this field to be very challenging. One hurdle we encountered was the inconsistent methodologies employed by different research groups when validating a given device.

This lack of standardization leads to conflicting results and makes it difficult to draw definitive conclusions about a device’s accuracy. A classic example from our research: one study might assess heart rate accuracy during high-intensity interval training, while another focuses on sedentary activities, leading to discrepancies that can’t be easily reconciled.

Other issues include varying sample sizes, participant demographics, and experimental conditions—all of which add layers of complexity to the interpretation of our findings.

What does it mean for me?

Perhaps most importantly, the rapid pace at which new wearable devices are released exacerbates these issues. With most companies following a yearly release cycle, we and other researchers find it challenging to keep up. The timeline for planning a study, obtaining ethical approval, recruiting and testing participants, analyzing results, and publishing can often exceed 12 months.

By the time a study is published, the device under investigation is likely to already be obsolete, replaced by a newer model with potentially different specifications and performance characteristics. This is demonstrated by our finding that less than 5 percent of the consumer wearables that have been released to date have been validated for the range of physiological signals they purport to measure.

What do our results mean for you? As wearable technologies continue to permeate various facets of health and lifestyle, it is important to approach manufacturers’ claims with a healthy dose of skepticism. Gaps in research, inconsistent methodologies, and the rapid pace of new device releases underscore the need for a more formalized and standardized approach to the validation of devices.

The goal here would be to foster collaborative synergies between formal certification bodies, academic research consortia, popular media influencers, and the industry so that we can augment the depth and reach of wearable technology evaluation.

Efforts are already underway to establish a collaborative network that can foster a richer, multifaceted dialogue that resonates with a broad spectrum of stakeholders—ensuring that wearables are not just innovative gadgets but reliable tools for health and wellness.

Cailbhe Doherty, assistant professor in the School of Public Health, Physiotherapy and Sports Science, University College Dublin. This article is republished from The Conversation under a Creative Commons license. Read the original article.



AMD signs $4.9 billion deal to challenge Nvidia’s AI infrastructure lead

chip wars —

Company hopes acquisition of ZT Systems will accelerate adoption of its data center chips.

Visitors walk past the AMD booth at the 2024 Mobile World Congress

AMD has agreed to buy artificial intelligence infrastructure group ZT Systems in a $4.9 billion cash and stock transaction, extending a run of AI investments by the chip company as it seeks to challenge market leader Nvidia.

The California-based group said the acquisition would help accelerate the adoption of its Instinct line of AI data center chips, which compete with Nvidia’s popular graphics processing units (GPUs).

ZT Systems, a private company founded three decades ago, builds custom computing infrastructure for the biggest AI “hyperscalers.” While the company does not disclose its customers, the hyperscalers include the likes of Microsoft, Meta, and Amazon.

The deal marks AMD’s biggest acquisition since it bought Xilinx for $35 billion in 2022.

“It brings a thousand world-class design engineers into our team, it allows us to develop silicon and systems in parallel and, most importantly, get the newest AI infrastructure up and running in data centers as fast as possible,” AMD’s chief executive Lisa Su told the Financial Times.

“It really helps us deploy our technology much faster because this is what our customers are telling us [they need],” Su added.

The transaction is expected to close in the first half of 2025, subject to regulatory approval, after which New Jersey-based ZT Systems will be folded into AMD’s data center business group. The $4.9 billion valuation includes up to $400 million contingent on “certain post-closing milestones.”

Citi and Latham & Watkins are advising AMD, while ZT Systems has retained Goldman Sachs and Paul, Weiss.

The move comes as AMD seeks to break Nvidia’s stranglehold on the AI data center chip market, which earlier this year saw Nvidia temporarily become the world’s most valuable company as big tech companies pour billions of dollars into its chips to train and deploy powerful new AI models.

Part of Nvidia’s success stems from its “systems” approach to the AI chip market, offering end-to-end computing infrastructure that includes pre-packaged server racks, networking equipment, and software tools to make it easier for developers to build AI applications on its chips.

AMD’s acquisition shows the chipmaker building out its own “systems” offering. The company rolled out its MI300 line of AI chips last year, and says it will launch its next-generation MI350 chip in 2025 to compete with Nvidia’s new Blackwell line of GPUs.

In May, Microsoft was one of the first AI hyperscalers to adopt the MI300, building it into its Azure cloud platform to run AI models such as OpenAI’s GPT-4. AMD’s quarterly revenue for the chips surpassed $1 billion for the first time in the three months to June 30.

But while AMD has feted the MI300 as its fastest-ever product ramp, its data center revenue still represented a fraction of the $22.6 billion that Nvidia’s data center business raked in for the quarter to the end of April.

In March, ZT Systems announced a partnership with Nvidia to build custom AI infrastructure using its Blackwell chips. “I think we certainly believe ZT as part of AMD will significantly accelerate the adoption of AMD AI solutions,” Su said, but “we have customer commitments and we are certainly going to honour those.”

Su added that she expected regulators’ review of the deal to focus on the US and Europe.

In addition to increasing its research and development spending, AMD says it has invested more than $1 billion over the past year to expand its AI hardware and software ecosystem.

In July the company announced it was acquiring Finnish AI start-up Silo AI for $665 million, the largest acquisition of a privately held AI startup in Europe in a decade.

© 2024 The Financial Times Ltd. All rights reserved. Please do not copy and paste FT articles and redistribute by email or post to the web.
