“Decisions around where data centers get built have shifted dramatically over the last six months, with access to power now playing the most significant role in location scouting,” Joshi said. “The grid can’t keep pace with AI demands, so the industry is taking control with onsite power generation.”
Soluna, like other data center developers looking to rely on renewable energy, buys excess power that wind, hydro, and solar plants can’t sell to the grid. By the end of the year, Soluna will have three facilities totaling 123 megawatts of capacity in Kentucky and Texas, and seven projects in the works with upwards of 800 total megawatts.
Belizaire and I talked about how in Texas, where I report from, there’s plenty of curtailed energy from wind and solar farms because of the region’s limited transmission capacity. In West Texas, far from major load centers like Dallas and Houston, other data center developers are also taking advantage of the unused wind energy by co-locating their giant warehouses full of advanced computers and high-powered cooling systems with the excess supply.
One data center developer using curtailed renewable power in Texas is IREN. The firm owns and operates facilities optimized for Bitcoin mining and AI. It developed a 750-megawatt facility in Childress and broke ground on a 1.4-gigawatt data center in Sweetwater.
IREN purchases power through the state grid’s wholesale market during periods of oversupply, said Kent Draper, the company’s chief commercial officer, and reduces its consumption when prices are high. It’s able to do that by turning off its computers and minimizing power demand from its data centers.
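The buy-low, curtail-high behavior Draper describes can be sketched as a simple price-threshold rule. This is an illustrative model only, not IREN's actual control logic; the load figure, prices, and threshold below are invented for the example.

```python
# Hypothetical sketch of price-responsive data center load: run flat out
# when wholesale power is cheap, shed load when the price spikes.

def plan_consumption(prices_per_mwh, full_load_mw, price_ceiling):
    """Return planned load (MW) for each hourly price: full load when the
    price is at or below the ceiling, zero (computers off) when it isn't."""
    return [full_load_mw if p <= price_ceiling else 0.0 for p in prices_per_mwh]

# Made-up day with two scarcity-pricing spikes ($/MWh)
hourly_prices = [22, 18, 35, 410, 1200, 40]
plan = plan_consumption(hourly_prices, full_load_mw=150, price_ceiling=100)
print(plan)  # full load in the cheap hours, zero during the two spikes
```

The real decision would weigh ramp times, contract terms, and ancillary-service commitments, but the core trade-off is this one-line comparison.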
But curtailment is an issue all over the world, Belizaire said, from Oklahoma, North Dakota, South Dakota, California, and Arizona in the US, to Northern Ireland, Germany, Portugal, and Australia.
“Anywhere where you have large utility-scale renewable development that’s been built out, you’re going to find it,” Belizaire said.
In a March analysis, the US Energy Information Administration reported that solar and wind power curtailments are increasing in California. In 2024, the grid operator for most of California curtailed 3.4 million megawatt hours of utility-scale wind and solar output, a 29 percent increase from the amount of electricity curtailed in 2023.
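Working backward from the two EIA figures quoted above (3.4 million MWh curtailed in 2024, a 29 percent increase over 2023) gives the implied 2023 baseline:

```python
# Back out the 2023 curtailment volume from the 2024 total and growth rate.
curtailed_2024_mwh = 3_400_000
increase = 0.29  # 29 percent year-over-year

curtailed_2023_mwh = curtailed_2024_mwh / (1 + increase)
print(round(curtailed_2023_mwh / 1e6, 2))  # ≈ 2.64 (million MWh in 2023)
```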
Data center operators are turning away from the grid to build their own power plants.
Sisters Abigail and Jennifer Lindsey stand on their rural property on May 27 outside New Braunfels, Texas, where they posted a sign in opposition to a large data center and power plant planned across the street. Credit: Dylan Baddour/Inside Climate News
NEW BRAUNFELS, Texas—Abigail Lindsey worries the days of peace and quiet might be nearing an end at the rural, wooded property where she lives with her son. On the old ranch across the street, developers want to build an expansive complex of supercomputers for artificial intelligence, plus a large, private power plant to run it.
The plant would be big enough to power a major city, with 1,200 megawatts of planned generation capacity fueled by West Texas shale gas. It would supply only the new data center, and possibly other large data centers recently proposed down the road.
“It just sucks,” Lindsey said, sitting on her deck in the shade of tall oak trees, outside the city of New Braunfels. “They’ve come in and will completely destroy our way of life: dark skies, quiet and peaceful.”
The project is one of many like it proposed in Texas, where a frantic race to boot up energy-hungry data centers has led many developers to plan their own gas-fired power plants rather than wait for connection to the state’s public grid. Egged on by supportive government policies, this buildout promises to lock in strong gas demand for a generation to come.
The data center and power plant planned across from Lindsey’s home is a partnership between an AI startup called CloudBurst and the natural gas pipeline giant Energy Transfer. It was Energy Transfer’s first-ever contract to supply gas for a data center, but it is unlikely to be its last. In a press release, the company said it was “in discussions with a number of data center developers and expects this to be the first of many agreements.”
Conventional wisdom once held that this new generation of digital infrastructure would be powered by emissions-free sources like wind, solar, and batteries, which have lately seen explosive growth. So far, that vision isn’t panning out, as the desire to build quickly overrides concerns about sustainability.
“There is such a shortage of data center capacity and power,” said Kent Draper, chief commercial officer at Australian data center developer IREN, which has projects in West Texas. “Even the large hyperscalers are willing to turn a blind eye to their renewable goals for some period of time in order to get access.”
The Hays Energy Project is a 990 MW gas-fired power plant near San Marcos, Texas.
Credit: Dylan Baddour/Inside Climate News
IREN prioritizes renewable energy for its data centers—giant warehouses full of advanced computers and high-powered cooling systems that can be configured to mine cryptocurrency or run artificial intelligence. In Texas, that’s only possible because the company began work here years ago, early enough to secure a timely connection to the state’s grid, Draper said.
There were more than 2,000 active generation interconnection requests as of April 30, totaling 411,600 MW of capacity, according to grid operator ERCOT. A bill awaiting signature on Gov. Greg Abbott’s desk, S.B. 6, looks to filter out unserious large-load projects bloating the queue by imposing a $100,000 fee for interconnection studies.
Wind and solar farms require vast acreage and generate energy intermittently, so they work best as part of a diversified electrical grid that collectively provides power day and night. But as the AI gold rush gathered momentum, a surge of new project proposals has created years-long wait times to connect to the grid, prompting many developers to bypass it and build their own power supply.
Operating alone, a wind or solar farm can’t run a data center. Battery technologies still can’t store enough energy, for long enough, to supply the steady, uninterrupted power data centers need around the clock. Small nuclear reactors have been touted as a means to meet data center demand, but the first new units remain a decade from commercial deployment, while the AI boom is here today.
Now, Draper said, gas companies approach IREN all the time, offering to quickly provide additional power generation.
Gas provides almost half of all power generation capacity in Texas, far more than any other source. But the amount of gas power in Texas has remained flat for 20 years, while wind and solar have grown sharply, according to records from the US Energy Information Administration. Facing a tidal wave of proposed AI projects, state lawmakers have taken steps to try to slow the expansion of renewable energy and position gas as the predominant supply for a new era of demand.
This buildout promises strong demand and high gas prices for a generation to come, a boon to Texas’ fossil fuel industry, the largest in the nation. It also means more air pollution and emissions of planet-warming greenhouse gases, even as the world continues to barrel past temperature records.
Texas, with 9 percent of the US population, accounted for about 15 percent of current gas-powered generation capacity in the country but 26 percent of planned future generation at the end of 2024, according to data from Global Energy Monitor. Both the current and planned shares are far more than any other state.
GEM identified 42 new gas turbine projects under construction, in development, or announced in Texas before the start of this year. None of those projects are sited at data centers. However, other projects announced since then, like CloudBurst and Energy Transfer outside New Braunfels, will include dedicated gas power plants on site at data centers.
For gas companies, the boom in artificial intelligence has quickly become an unexpected gold mine. US gas production has risen steadily over 20 years since the fracking boom began, but gas prices have tumbled since 2024, dragged down by surging supply and weak demand.
“The sudden emergence of data center demand further brightens the outlook for the renaissance in gas pricing,” said a 2025 oil and gas outlook report by East Daley Analytics, a Colorado-based energy intelligence firm. “The obvious benefit to producers is increased drilling opportunities.”
It forecast up to a 20 percent increase in US gas production by 2030, driven primarily by a growing gas export sector on the Gulf Coast. Several large export projects will finish construction in the coming years, with demand for up to 12 billion cubic feet of gas per day, the report said, while new power generation for data centers would account for 7 billion cubic feet per day of additional demand. That means profits for power providers, but also higher costs for consumers.
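The East Daley numbers can be sanity-checked with quick arithmetic. The baseline below is our assumption, not a figure from the report: US dry gas production was roughly 103 billion cubic feet per day (Bcf/d) in 2024, per EIA data.

```python
# Rough consistency check of the forecast: do the two new demand sources
# line up with "up to a 20 percent increase" in production by 2030?
baseline_bcfd = 103           # assumed current US dry gas production (Bcf/d)
export_demand_bcfd = 12       # new Gulf Coast LNG export demand, per the report
datacenter_demand_bcfd = 7    # new data center power generation demand

new_demand = export_demand_bcfd + datacenter_demand_bcfd   # 19 Bcf/d
implied_growth = new_demand / baseline_bcfd                # ~18 percent
print(new_demand, round(implied_growth * 100))
```

Under that assumed baseline, the 19 Bcf/d of new demand implies roughly 18 percent growth, consistent with the report's "up to 20 percent" figure.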
Natural gas, a mixture primarily composed of methane, burns much cleaner than coal but still creates air pollution, including soot, some hazardous chemicals, and greenhouse gases. Unburned methane released into the atmosphere has more than 80 times the near-term warming effect of carbon dioxide, leading some studies to conclude that ubiquitous leaks in gas supply infrastructure make it as impactful as coal to the global climate.
It’s a power source that’s heralded for its ability to get online fast, said Ed Hirs, an energy economics lecturer at the University of Houston. But the years-long wait times for turbines have quickly become the industry’s largest constraint in an otherwise positive outlook.
“If you’re looking at a five-year lead time, that’s not going to help Alexa or Siri today,” Hirs said.
The reliance on gas power for data centers is a departure from previous thought, said Larry Fink, founder of global investment firm BlackRock, speaking to a crowd of industry executives at an oil and gas conference in Houston in March.
About four years ago, if someone said they were building a data center, they said it must be powered by renewables, he recounted. Two years ago, it was a preference.
“Today?” Fink said. “They care about power.”
Gas plants for data centers
Since the start of this year, developers have announced a flurry of gas power deals for data centers. In the small city of Abilene, the builders of Stargate, one of the world’s largest data center projects, applied for permits in January to build 360 MW of gas power generation, authorized to emit 1.6 million tons of greenhouse gases and 14 tons of hazardous air pollutants per year. Later, the company announced the acquisition of an additional 4,500 MW of gas power generation capacity.
Also in January, a startup called Sailfish announced ambitious plans for a 2,600-acre, 5,000 MW cluster of data centers in the tiny North Texas town of Tolar, population 940.
“Traditional grid interconnections simply can’t keep pace with hyperscalers’ power demands, especially as AI accelerates energy requirements,” Sailfish founder Ryan Hughes told the website Data Center Dynamics at the time. “Our on-site natural gas power islands will let customers scale quickly.”
CloudBurst and Energy Transfer announced their data center and power plant outside New Braunfels in February, and another partnership announced plans for a 250 MW gas plant and data center near Odessa in West Texas. In May, a developer called Tract announced a 1,500-acre, 2,000 MW data center campus near the small Central Texas town of Lockhart, with some generation on-site and some gas power purchased.
Not all new data centers need gas plants. A 120 MW South Texas data center project announced in April would run entirely on wind power, while an enormous 5,000 MW megaproject outside Laredo, announced in March, hopes to eventually run entirely on private wind, solar, and hydrogen power (though it will use gas at first). Another collection of six data centers planned in North Texas hopes to draw 1,400 MW from the grid.
Altogether, Texas’ grid operator predicts statewide power demand will nearly double within five years, driven largely by data centers for artificial intelligence. It mirrors a similar situation unfolding across the country, according to analysis by S&P Global.
“There is huge concern about the carbon footprint of this stuff,” said Dan Stanzione, executive director of the Texas Advanced Computing Center at the University of Texas at Austin. “If we could decarbonize the power grid, then there is no carbon footprint for this.”
However, despite massive recent expansions of renewable power generation, the boom in artificial intelligence appears to be moving the country farther from, not closer to, its decarbonization goals.
Restrictions on renewable energy
Looking forward to a buildout of power supply, state lawmakers have proposed or passed new rules to support the deployment of more gas generation and slow the surging expansion of wind and solar power projects. Supporters of these bills say they aim to utilize Texas’ position as the nation’s top gas producer.
Some energy experts say the rules proposed throughout the legislative session could dismantle the state’s leadership in renewables, along with its ability to provide cheap and reliable power.
“It absolutely would [slow] if not completely stop renewable energy,” said Doug Lewin, a Texas energy consultant, about one of the proposed rules in March. “That would really be extremely harmful to the Texas economy.”
While the bills deemed “industry killers” for renewables missed key deadlines and never reached Abbott’s desk, they illustrate some lawmakers’ aspirations for the state’s energy industry.
One failed bill, S.B. 388, would have required every watt of new solar brought online to be accompanied by a watt of new gas. Another set of twin bills, H.B. 3356 and S.B. 715, would have forced existing wind and solar companies to buy fossil-fuel based power or connect to a battery storage resource to cover the hours the energy plants are not operating.
When the Legislature last met in 2023, it created a $5 billion public “energy fund” to finance new gas plants but not wind or solar farms. It also created a new tax abatement program that excluded wind and solar. This year’s budget added another $5 billion to double the fund.
Bluebonnet Electric Cooperative is currently completing construction on a 190 MW gas-fired peaker plant near the town of Maxwell in Caldwell County.
Credit: Dylan Baddour/Inside Climate News
Among the lawmakers leading the effort to scale back the state’s deployment of renewables is state Sen. Lois Kolkhorst, a Republican from Brenham. One bill she co-sponsored, S.B. 819, aimed to create new siting rules for utility-scale renewable projects and would have required them to get permits from the Public Utility Commission that no other energy source—coal, gas or nuclear—needs. “It’s just something that is clearly meant to kneecap an industry,” Lewin said about the bill, which failed to pass.
Kolkhorst said the bill sought to balance the state’s need for power while respecting landowners across the state.
Former state Rep. John Davis, now a board member at Conservative Texans for Energy Innovation, said the session shows how renewables have become a red meat issue.
More than 20 years ago, Davis and Kolkhorst worked together in the Capitol as Texas deregulated its energy market, which encouraged renewables to enter the grid’s mix, he said. Now Davis herds sheep and goats on his family’s West Texas ranch, where seven wind turbines provide roughly 40 percent of their income.
He never could have dreamed how significant renewable energy would become for the state grid, he said. That’s why he’s disappointed with the direction the legislature is headed with renewables.
“I can’t think of anything more conservative, as a conservative, than wind and solar,” Davis said. “These are things God gave us—use them and harness them.”
A report published in April found that targeted limits on solar and wind development in Texas could raise electricity costs for consumers and businesses. The report, prepared by Aurora Energy Research for the Texas Association of Business, said restricting further deployment of renewables would drive power prices up 14 percent by 2035.
“Texas is at a crossroads in its energy future,” said Olivier Beaufils, a top executive at Aurora Energy Research. “We need policies that support an all-of-the-above approach to meet the expected surge in power demand.”
Likewise, the commercial intelligence firm Wood Mackenzie expects the power demand from data centers to drive up prices of gas and wholesale consumer electricity.
Pollution from gas plants
Even when new power plants aren’t built on the site of data centers, they might still be developed because of demand from the server farms.
For example, in 2023, developer Marathon Digital started up a Bitcoin mine in the small town of Granbury on the site of the 1,100 MW Wolf Hollow II gas power plant. It held contracts to purchase 300 MW from the plant.
One year later, the power plant operator sought permits to install eight additional “peaker” gas turbines able to produce up to 352 MW of electricity. These small units, designed to turn on intermittently during hours of peak demand, release more pollution than typical gas turbines.
Those additional units would be approved to release 796,000 tons per year of greenhouse gases, 251 tons per year of nitrogen oxides and 56 tons per year of soot, according to permitting documents. That application is currently facing challenges from neighboring residents in state administrative courts.
About 150 miles away, neighbors are challenging another gas plant permit application in the tiny town of Blue. At 1,200 MW, the $1.2 billion plant proposed by Sandow Lakes Energy Co. would be among the largest in the state and would almost entirely serve private customers, likely including the large data centers that operate about 20 miles away.
Travis Brown and Hugh Brown, no relation, stand by a sign marking the site of a proposed 1,200 MW gas-fired power plant in their town of Blue on May 7.
Credit: Dylan Baddour/Inside Climate News
This plan bothers Hugh Brown, who moved out to these green, rolling hills of rural Lee County in 1975, searching for solitude. Now he lives on 153 wooded acres that he’s turned into a sanctuary for wildlife.
“What I’ve had here is a quiet, thoughtful life,” said Brown, skinny with a long grey beard. “I like not hearing what anyone else is doing.”
He worries about the constant roar of giant cooling fans, the bright lights overnight and the air pollution. According to permitting documents, the power plant would be authorized to emit 462 tons per year of ammonia gas, 254 tons per year of nitrogen oxides, 153 tons per year of particulate matter, or soot, and almost 18 tons per year of “hazardous air pollutants,” a collection of chemicals that are known to cause cancer or other serious health impacts.
It would also be authorized to emit 3.9 million tons of greenhouse gases per year, about as much as 72,000 standard passenger vehicles.
“It would be horrendous,” Brown said. “There will be a constant roaring of gigantic fans.”
In a statement, Sandow Lakes Energy denied that the power plant will be loud. “The sound level at the nearest property line will be similar to a quiet library,” the statement said.
Sandow Lakes Energy said the plant will support the local tax base and provide hundreds of temporary construction jobs and dozens of permanent jobs. Sandow also provided several letters signed by area residents who support the plant.
“We recognize the critical need for reliable, efficient, and environmentally responsible energy production to support our region’s growth and economic development,” wrote Nathan Bland, president of the municipal development district in Rockdale, about 20 miles from the project site.
Brown stands next to a pond on his property ringed with cypress trees he planted 30 years ago.
Credit: Dylan Baddour/Inside Climate News
Sandow says the plant will be connected to Texas’ public grid, and many supporting letters for the project cited a need for grid reliability. But according to permitting documents, the 1,200 MW plant will supply only 80 MW to the grid and only temporarily, with the rest going to private customers.
“Electricity will continue to be sold to the public until all of the private customers have completed projects slated to accept the power being generated,” said a permit review by the Texas Commission on Environmental Quality.
Sandow has declined to name those customers. However, the plant is part of Sandow’s massive, master-planned mixed-use development in rural Lee and Milam counties, where several energy-hungry tenants are already operating, including Riot Platforms, the largest cryptocurrency mine on the continent. Riot’s seven-building complex in Rockdale is built to use up to 700 MW, and in April, the company announced the acquisition of a neighboring 125 MW cryptocurrency mine previously operated by Rhodium. Another mine, run by Bitmain, one of the world’s largest Bitcoin companies, has 560 MW of operating capacity, with plans to add 180 MW more in 2026.
In April, residents of Blue gathered at the volunteer fire department building for a public meeting with Texas regulators and Sandow to discuss questions and concerns over the project. Brown, owner of the wildlife sanctuary, spoke into a microphone and noted that the power plant was placed at the far edge of Sandow’s 33,000-acre development, 20 miles from the industrial complex in Rockdale but near many homes in Blue.
“You don’t want to put it up into the middle of your property where you could deal with the negative consequences,” Brown said, speaking to the developers. “So it looks to me like you are wanting to make money, in the process of which you want to strew grief in your path and make us bear the environmental costs of your profit.”
Inside Climate News’ Peter Aldhous contributed to this report.
Inside the laptop/desktop examination bay at SK TES’s Fredericksburg, Va. site.
Credit: SK TES
The details of each unit—CPU, memory, HDD size—are taken down and added to the asset tag, and the device is sent on to be physically examined. This step is important because “many a concealed drive finds its way into this line,” Kent Green, manager of this site, told me. Inside the machines coming from big firms, there are sometimes little USB, SD, SATA, or M.2 drives hiding out. Some were make-do solutions installed by IT and not documented, and others were put there by employees tired of waiting for more storage. “Some managers have been pretty surprised when they learn what we found,” Green said.
With everything wiped and its components cataloged, each device gets a rating. It’s a three-character system, like “A-3-6,” based on function, cosmetic condition, and component value. Based on needs, trends, and other data, each device then goes to wholesale, retail, component harvesting, or scrap.
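The three-character grade and the routing decision can be sketched as a small data structure. The scales and routing thresholds below are hypothetical — the article gives the format ("A-3-6") but not the rubric:

```python
# Hypothetical model of the three-character device grade described above:
# function, cosmetic condition, and component value.
from dataclasses import dataclass

@dataclass
class DeviceGrade:
    function: str        # e.g. "A" (fully working) through "D" (parts only) — assumed scale
    cosmetic: int        # e.g. 1 (like new) through 9 (heavily worn) — assumed scale
    component_value: int # relative value of harvestable parts — assumed scale

    def code(self) -> str:
        return f"{self.function}-{self.cosmetic}-{self.component_value}"

    def route(self) -> str:
        # Invented routing rules: working, presentable units get resold;
        # everything else is wholesaled or broken down for parts.
        if self.function == "A" and self.cosmetic <= 4:
            return "retail"
        if self.function in ("A", "B"):
            return "wholesale"
        return "harvest-or-scrap"

g = DeviceGrade("A", 3, 6)
print(g.code(), g.route())  # A-3-6 retail
```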
Full-body laptop skins
Wiping down and prepping a laptop, potentially for a full-cover adhesive skin.
Credit: SK TES
If a device has retail value, it heads into a section of this giant facility where workers do further checks. Automated software plays sounds on the speakers, checks that every keyboard key is sending signals, and checks that laptop batteries are at 80 percent capacity or better. At the end of the line is my favorite discovery: full-body laptop skins.
Some laptops—certain Lenovo, Dell, and HP models—are so ubiquitous in corporate fleets that it’s worth buying an adhesive laminating sticker in their exact shape. They’re an uncanny match for the matte black, silver, and slightly less silver finishes of the laptops, covering up any blemishes and scratches. Watching one of the workers apply this made me jealous of their ability to essentially reset a laptop’s condition (so one could apply whole new layers of swag stickers, of course). Once rated, tested, and stickered, laptops go into a clever “cradle” box, get the UN 3481 “battery inside” sticker, and can be sold through retail.
Although big participants in the technology industry may be able to lobby the administration to “loosen up” restrictions on new power sources, small to medium-sized players were in a “holding pattern” as they waited to see if permitting obstacles and tariffs on renewables equipment were lifted, said Ninan.
“On average, [operators] are most likely going to try to find ways of absorbing additional costs and going to dirtier sources,” he said.
Amazon, which is the largest corporate purchaser of renewable energy globally, said carbon-free energy must remain an important part of the energy mix to meet surging demand for power, keep costs down, and hit climate goals.
“Renewable energy can often be less expensive than alternatives because there’s no fuel to purchase. Some of the purchasing agreements we have signed historically were ‘no brainers’ because they reduced our power costs,” said Kevin Miller, vice-president of Global Data Centers at Amazon Web Services.
Efforts by state and local governments to stymie renewables could also hit the sector. In Texas—the third-largest US data center market after Virginia, according to S&P Global Market Intelligence—bills are being debated that increase regulation on solar and wind projects.
“We have a huge opportunity in front of us with these data centers,” said Doug Lewin, president of Stoic Energy. “Virginia can only take so many, and you can build faster here, but any of these bills passing would kill that in the crib.”
The renewables crackdown will make it harder for “hyperscale” data centers run by companies such as Equinix, Microsoft, Google, and Meta to offset their emissions and invest in renewable energy sources.
“Demand [for renewables] has reached an all-time high,” said Christopher Wellise, sustainability vice-president at Equinix. “So when you couple that with the additional constraints, there could be some near to midterm challenges.”
Data centers powering the generative AI boom are gulping water and exhausting electricity at what some researchers view as an unsustainable pace. Two entrepreneurs who met in high school a few years ago want to overcome that crunch with a fresh experiment: sinking the cloud into the sea.
Sam Mendel and Eric Kim launched their company, NetworkOcean, out of startup accelerator Y Combinator on August 15 by announcing plans to dunk a small capsule filled with GPU servers into San Francisco Bay within a month. “There’s this vital opportunity to build more efficient computer infrastructure that we’re gonna rely on for decades to come,” Mendel says.
The founders contend that moving data centers off land would slow ocean temperature rise by drawing less power and letting seawater cool the capsule’s shell, supplementing its internal cooling system. NetworkOcean’s founders have said a location in the bay would deliver fast processing speeds for the region’s buzzing AI economy.
But scientists who study the hundreds of square miles of brackish water say even the slightest heat or disturbance from NetworkOcean’s submersible could trigger toxic algae blooms and harm wildlife. And WIRED inquiries to several California and US agencies that oversee the bay found that NetworkOcean has been pursuing its initial test of an underwater data center without having sought, much less received, any permits from key regulators.
The outreach by WIRED prompted at least two agencies—the Bay Conservation and Development Commission and the San Francisco Regional Water Quality Control Board—to email NetworkOcean that testing without permits could run afoul of laws, according to public records and spokespeople for the agencies. Fines from the BCDC can run up to hundreds of thousands of dollars.
The nascent technology has already been in hot water in California. In 2016, the state’s coastal commission issued a previously unreported notice to Microsoft saying that the tech giant had violated the law the year before by plunging an unpermitted server vessel into San Luis Obispo Bay, about 250 miles south of San Francisco. The months-long test, part of what was known as Project Natick, had ended without apparent environmental harm by the time the agency learned of it, so officials decided not to fine Microsoft, according to the notice seen by WIRED.
The renewed scrutiny of underwater data centers has surfaced an increasingly common tension between innovative efforts to combat global climate change and long-standing environmental laws. Permitting takes months, if not years, and can cost millions of dollars, potentially impeding progress. Advocates of the laws argue that the process allows for time and input to better weigh trade-offs.
“Things are overregulated because people often don’t do the right thing,” says Thomas Mumley, recently retired assistant executive officer of the bay water board. “You give an inch, they take a mile. We have to be cautious.”
Over the last two weeks, including during an interview at the WIRED office, NetworkOcean’s founders have provided driblets of details about their evolving plans. Their current intention is to test their underwater vessel for about an hour, just below the surface of what Mendel would only describe as a privately owned and operated portion of the bay that he says is not subject to regulatory oversight. He insists that a permit is not required based on the location, design, and minimal impact. “We have been told by our potential testing site that our setup is environmentally benign,” Mendel says.
Mumley, the retired regulator, calls the assertion about not needing a permit “absurd.” Both Bella Castrodale, the BCDC’s lead enforcement attorney, and Keith Lichten, a water board division manager, say private sites and a quick dip in the bay aren’t exempt from permitting. Several other experts in bay rules tell WIRED that even if some quirk does preclude oversight, they believe NetworkOcean is sending a poor message to the public by not coordinating with regulators.
“Just because these centers would be out of sight does not mean they are not a major disturbance,” says Jon Rosenfield, science director at San Francisco Baykeeper, a nonprofit that investigates industrial polluters.
School project
Mendel and Kim say they tried to develop an underwater renewable energy device together during high school in Southern California before moving on to non-nautical pursuits. Mendel, 23, dropped out of college in 2022 and founded a platform for social media influencers.
About a year ago, he built a small web server using the DIY system Raspberry Pi to host another personal project, and temporarily floated the equipment in San Francisco Bay by attaching it to a buoy from a private boat in the Sausalito area. (Mendel declined to answer questions about permits.) After talking with Kim, also 23, about this experiment, the two decided to move in together and start NetworkOcean.
Their pitch is that underwater data centers are more affordable to develop and maintain, especially as electricity shortages limit sites on land. Surrounding a tank of hot servers with water naturally helps cool them, avoiding the massive resource drain of air-conditioning while improving on the similar benefits of floating data centers. Developers of offshore wind farms are eager to electrify NetworkOcean vessels, Mendel says.
AMD has agreed to buy artificial intelligence infrastructure group ZT Systems in a $4.9 billion cash and stock transaction, extending a run of AI investments by the chip company as it seeks to challenge market leader Nvidia.
The California-based group said the acquisition would help accelerate the adoption of its Instinct line of AI data center chips, which compete with Nvidia’s popular graphics processing units (GPUs).
ZT Systems, a private company founded three decades ago, builds custom computing infrastructure for the biggest AI “hyperscalers.” While the company does not disclose its customers, the hyperscalers include the likes of Microsoft, Meta, and Amazon.
The deal marks AMD’s biggest acquisition since it bought Xilinx for $35 billion in 2022.
“It brings a thousand world-class design engineers into our team, it allows us to develop silicon and systems in parallel and, most importantly, get the newest AI infrastructure up and running in data centers as fast as possible,” AMD’s chief executive Lisa Su told the Financial Times.
“It really helps us deploy our technology much faster because this is what our customers are telling us [they need],” Su added.
The transaction is expected to close in the first half of 2025, subject to regulatory approval, after which New Jersey-based ZT Systems will be folded into AMD’s data center business group. The $4.9 billion valuation includes up to $400 million contingent on “certain post-closing milestones.”
Citi and Latham & Watkins are advising AMD, while ZT Systems has retained Goldman Sachs and Paul, Weiss.
The move comes as AMD seeks to break Nvidia’s stranglehold on the AI data center chip market. Earlier this year, Nvidia temporarily became the world’s most valuable company as big tech companies poured billions of dollars into its chips to train and deploy powerful new AI models.
Part of Nvidia’s success stems from its “systems” approach to the AI chip market, offering end-to-end computing infrastructure that includes pre-packaged server racks, networking equipment, and software tools to make it easier for developers to build AI applications on its chips.
AMD’s acquisition shows the chipmaker building out its own “systems” offering. The company rolled out its MI300 line of AI chips last year, and says it will launch its next-generation MI350 chip in 2025 to compete with Nvidia’s new Blackwell line of GPUs.
In May, Microsoft was one of the first AI hyperscalers to adopt the MI300, building it into its Azure cloud platform to run AI models such as OpenAI’s GPT-4. AMD’s quarterly revenue for the chips surpassed $1 billion for the first time in the three months to June 30.
But while AMD has feted the MI300 as its fastest-ever product ramp, its data center revenue still represented a fraction of the $22.6 billion that Nvidia’s data center business raked in for the quarter to the end of April.
In March, ZT Systems announced a partnership with Nvidia to build custom AI infrastructure using its Blackwell chips. “I think we certainly believe ZT as part of AMD will significantly accelerate the adoption of AMD AI solutions,” Su said, but “we have customer commitments and we are certainly going to honour those”.
Su added that she expected regulators’ review of the deal to focus on the US and Europe.
In addition to increasing its research and development spending, AMD says it has invested more than $1 billion over the past year to expand its AI hardware and software ecosystem.
In July the company announced it was acquiring Finnish AI startup Silo AI for $665 million, the largest acquisition of a privately held AI startup in Europe in a decade.
This article was produced for ProPublica’s Local Reporting Network in partnership with The Seattle Times.
When lawmakers in Washington set out to expand a lucrative tax break for the state’s data center industry in 2022, they included what some considered an essential provision: a study of the energy-hungry industry’s impact on the state’s electrical grid.
Gov. Jay Inslee vetoed that provision but let the tax break expansion go forward. As The Seattle Times and ProPublica recently reported, the industry has continued to grow and now threatens Washington’s effort to eliminate carbon emissions from electricity generation.
Washington’s experience with addressing the power demand of data centers parallels the struggles playing out in other states around the country where the industry has rapidly grown and tax breaks are a factor.
Virginia, home to the nation’s largest data center market, once debated running data centers on carbon-emitting diesel generators during power shortages to keep the lights on in the area. (That plan faced significant public pushback from environmental groups, and an area utility is exploring other options.)
Dominion Energy, the utility that serves most of Virginia’s data centers, has said that it intends to meet state requirements to decarbonize the grid by 2045, but that the task would be more challenging with rising demands driven largely by data centers, Inside Climate News reported. The utility also has indicated that new natural gas plants will be needed.
Some Virginia lawmakers and the state’s Republican governor have proposed reversing or dramatically altering the clean energy goals.
A northern Virginia lawmaker instead proposed attaching strings to the state’s data center tax break. This year, he introduced legislation saying data centers would only qualify if they maximized energy efficiency and found renewable resources. The bill died in Virginia’s General Assembly. But the state authorized a study of the industry and how tax breaks impact the grid.
“If we’re going to have data centers, which we all know to be huge consumers of electricity, let’s require them to be as efficient as possible,” said state Delegate Richard “Rip” Sullivan Jr., the Democrat who sponsored the original bill. “Let’s require them to use as little energy as possible to do their job.”
Inslee’s 2022 veto of a study similar to Virginia’s cited the fact that Northwest power planners already include data centers in their estimates of regional demand. But supporters of the legislation said their goal was to obtain more precise answers about Washington-specific electricity needs.
Georgia lawmakers this year passed a bill to halt the state’s data center tax break until data center power use could be analyzed. In the meantime, according to media reports, the state’s largest utility said it would use fossil fuels to make up an energy shortfall caused in part by data centers. Georgia Gov. Brian Kemp then vetoed the tax break pause in May.
Lawmakers in Connecticut and South Carolina have also debated policies to tackle data center power usage in the past year.
“Maybe we want to entice more of them to come. I just want to make sure that we understand the pros and the cons of that before we do it,” South Carolina’s Senate Majority Leader Shane Massey said in May, according to the South Carolina Daily Gazette.
Countries such as Ireland, Singapore, and the Netherlands have at times forced data centers to halt construction to limit strains on the power grid, according to a report by the nonprofit Tony Blair Institute for Global Change. The report’s recommendations for addressing data center power usage include encouraging the private sector to invest directly in renewables.
Sajjad Moazeni, a University of Washington professor who studies artificial intelligence and data center power consumption, said states should consider electricity impacts when formulating data center legislation. Moazeni’s recent research found that in just one day, ChatGPT, a popular artificial intelligence tool, used roughly as much power as 33,000 U.S. households use in a year.
“A policy can help both push companies to make these data centers more efficient and preserve a cleaner, better environment for us,” Moazeni said. “Policymakers need to consider a larger set of metrics on power usage and efficiency.”
Eli Sanders contributed research while a student with the Technology, Law and Public Policy Clinic at the University of Washington School of Law.
Cooling pipes at a Google data center in Douglas County, Georgia.
Google’s greenhouse gas emissions have surged 48 percent in the past five years due to the expansion of its data centers that underpin artificial intelligence systems, leaving its commitment to get to “net zero” by 2030 in doubt.
The Silicon Valley company’s pollution amounted to 14.3 million tonnes of carbon dioxide equivalent in 2023, a 48 percent increase from its 2019 baseline and a 13 percent rise since last year, Google said in its annual environmental report on Tuesday.
Google said the jump highlighted “the challenge of reducing emissions” at the same time as it invests in the build-out of large language models and their associated applications and infrastructure, admitting that “the future environmental impact of AI” was “complex and difficult to predict.”
Chief Sustainability Officer Kate Brandt said the company remained committed to the 2030 target but stressed the “extremely ambitious” nature of the goal.
“We do still expect our emissions to continue to rise before dropping towards our goal,” said Brandt.
She added that Google was “working very hard” on reducing its emissions, including by signing deals for clean energy. There was also a “tremendous opportunity for climate solutions that are enabled by AI,” said Brandt.
As Big Tech giants including Google, Amazon, and Microsoft have outlined plans to invest tens of billions of dollars into AI, climate experts have raised concerns about the environmental impacts of the power-intensive tools and systems.
In May, Microsoft admitted that its emissions had risen by almost a third since 2020, in large part due to the construction of data centers. However, Microsoft co-founder Bill Gates last week also argued that AI would help propel climate solutions.
Meanwhile, energy generation and transmission constraints are already posing a challenge for the companies seeking to build out the new technology. Analysts at Bernstein said in June that AI would “double the rate of US electricity demand growth and total consumption could outstrip current supply in the next two years.”
In Tuesday’s report, Google said its 2023 energy-related emissions—which come primarily from data center electricity consumption—rose 37 percent year on year and overall represented a quarter of its total greenhouse gas emissions.
Google’s supply chain emissions—its largest chunk, representing 75 percent of its total emissions—also rose 8 percent. Google said they would “continue to rise in the near term,” in part as a result of the build-out of the infrastructure needed to run AI systems.
Google has pledged to achieve net zero across its direct and indirect greenhouse gas emissions by 2030 and to run on carbon-free energy during every hour of every day within each grid it operates by the same date.
However, the company warned in Tuesday’s report that the “termination” of some clean energy projects during 2023 had pushed down the amount of renewables it had access to.
Meanwhile, the company’s data center electricity consumption had “outpaced” Google’s ability to bring more clean power projects online in the US and Asia-Pacific regions.
Google’s data center electricity consumption increased 17 percent in 2023, and amounted to approximately 7-10 percent of global data center electricity consumption, the company estimated. Its data centers also consumed 17 percent more water in 2023 than during the previous year, Google said.
A Google sign stands in front of the building on the sidelines of the opening of the new Google Cloud data center in Hanau, Hesse, opened in October 2023.
On Wednesday, authorities arrested former Google software engineer Linwei Ding in Newark, California, on charges of stealing AI trade secrets from the company. The US Department of Justice alleges that Ding, a Chinese national, committed the theft while secretly working with two China-based companies.
According to the indictment, Ding, who was hired by Google in 2019 and had access to confidential information about the company’s data centers, began uploading hundreds of files into a personal Google Cloud account two years ago.
The trade secrets Ding allegedly copied contained “detailed information about the architecture and functionality of GPU and TPU chips and systems, the software that allows the chips to communicate and execute tasks, and the software that orchestrates thousands of chips into a supercomputer capable of executing at the cutting edge of machine learning and AI technology,” according to the indictment.
Shortly after the alleged theft began, Ding was offered the position of chief technology officer at an early-stage technology company in China that touted its use of AI technology. The company offered him a monthly salary of about $14,800, plus an annual bonus and company stock. Ding reportedly traveled to China, participated in investor meetings, and sought to raise capital for the company.
Investigators reviewed surveillance camera footage showing another employee scanning Ding’s name badge at the entrance of the Google building where he worked, making it appear that Ding was in the office when he was actually traveling.
Ding also founded and served as the chief executive of a separate China-based startup company that aspired to train “large AI models powered by supercomputing chips,” according to the indictment. Prosecutors say Ding did not disclose either affiliation to Google, which described him as a junior employee. He resigned from Google on December 26 of last year.
The FBI served a search warrant at Ding’s home in January, seizing his electronic devices and later executing an additional warrant for the contents of his personal accounts. Authorities found more than 500 unique files of confidential information that Ding allegedly stole from Google. The indictment says that Ding copied the files into the Apple Notes application on his Google-issued Apple MacBook, then converted the Apple Notes into PDF files and uploaded them to an external account to evade detection.
“We have strict safeguards to prevent the theft of our confidential commercial information and trade secrets,” Google spokesperson José Castañeda told Ars Technica. “After an investigation, we found that this employee stole numerous documents, and we quickly referred the case to law enforcement. We are grateful to the FBI for helping protect our information and will continue cooperating with them closely.”
Attorney General Merrick Garland announced the case against the 38-year-old at an American Bar Association conference in San Francisco. Ding faces four counts of federal trade secret theft, each carrying a potential sentence of up to 10 years in prison.