Author name: DJ Henderson


“So aggravating”: Outdated ads start appearing on PS5 home screen

Ad station —

Players are annoyed as the new home screen feature needs work.

PlayStation 5. Credit: Getty

PlayStation 5 owners are reporting advertisements on the device’s home screen. Frustratingly, the ads seem to be rather difficult to disable, and some display outdated or otherwise confusing content.

The ads, visible on users’ home screens when they hover over a game title, can only be removed if you disconnect from the Internet, IGN reported today. However, that would block a lot of the console’s functionality. The PS5 dashboard previously had ads but not on the home screen.

Before this recent development, people would see game art if they hovered over a game icon on the PS5’s home screen. Now, doing so reportedly brings up dated advertisements. For example, IGN reported seeing an ad for Spider-Man: Across the Spider-Verse “coming soon exclusively in cinemas” when hovering over the Marvel’s Spider-Man: Miles Morales game. Webheads will of course recall that the Spider-Verse movie came out in June 2023.

Similarly, going to NBA 2K25 reportedly shows an ad for gaining early access. But the game came out early this month.

Per IGN, it seems that the console is “pulling in the latest news for each game, whether it be a YouTube video, patch notes, or even the announcement of a different game entirely.” That means that not all games are showing advertisements. Instead, some show an image for a YouTube video about the game or a note about patch notes or updates for the game.

There also seem to be some mix-ups, with MP1st reporting seeing an ad for the LEGO Horizon Adventures game when hovering over the icon for Horizon Zero Dawn. The publication wrote: “The ad also make[s it] confusing a bit, as… it looks like you’re playing LEGO Horizon Adventures and not the actual Horizon game we’re on.”

Some games, like Astro Bot, however, don’t seem to be affected by the changes, per IGN.

Annoyed and confused

Gamers noticing the change have taken to the web to share their annoyance, disappointment, and, at times, confusion about the content suddenly forced into the PS5’s home screen.

“As someone playing through the Spiderman series now, this confused the hell out of me,” Crack_an_ag said via Reddit.

Some are urging Sony to either remove the feature or fix it so that it can be helpful, while others argue that the feature couldn’t be helpful regardless.

“Forcing every single game to make its latest news story its dashboard art is SO stupid as no one game uses the news feature consistently,” Reddit user jackcos wrote.

Sam88FPS, meanwhile, noted that ads drove them from Xbox to PlayStation:

One of the main reasons I moved away from Xbox was the fact they started to build the Xbox UI around ads and pushing [Game Pass]. Hopefully Sony listens more because Xbox absolutely refused to, in fact, they even added full screen startup ads lmao.

It’s unclear what exactly prompted this change. Some suspect it’s related to firmware update 24.06-10.00.00, but that update came out on September 12, and, as IGN noted, its patch notes don’t say anything about this. Considering the obvious problems and the mix of content being populated, it’s possible that Sony is working out some kinks and that the content shown on users’ home screens will eventually become more relevant and consistent. The change also comes a few days after a developer claimed that Sony lost $400 million on Concord, the online game it pulled just two weeks after launch, prompting digs at Sony and unconfirmed theories that the company is trying to make up for financial losses with ads.

Ars Technica has reached out to Sony about why it decided to add non-removable ads to the PS5 home screen and about the outdated and otherwise perplexing content being displayed. We’ll let you know if we hear back.



Verizon customers face mass-scale outage across the US

5Gpocalypse —

More than 100,000 reports appeared on Downdetector.

A Downdetector map showing where Verizon outages are reported.

Wireless customers of Verizon and AT&T have found that they cannot make calls, send or receive text messages, or use mobile data. As of this article’s publication, the problem appears to remain unresolved.

Users took to social media throughout the morning to complain that their phones were showing “SOS” mode, which allows emergency calls but nothing else. This is what phones sometimes offer when the user has no SIM registered on the device. Resetting the device and other common solutions do not resolve the issue. For much of the morning, Verizon offered no response to the reports.

Within hours, more than 100,000 users reported problems on the website Downdetector. The problem does not appear isolated to any particular part of the country; users in California reported problems, and so did users on the East Coast and in Chicago, among other places.

By 10 am, some AT&T users also began reporting problems. Outage maps based on user-reported data found that the outages were especially common in parts of the country otherwise affected by Hurricane Helene.

After a period of silence, Verizon acknowledged the problem in a public statement. “We are aware of an issue impacting service for some customers,” a spokesperson told NBC News and others. “Our engineers are engaged and we are working quickly to identify and solve the issue.”

However, the spokesperson did not specify why the outage was occurring. It’s not the first major online service outage this year, though. AT&T experienced an outage previously, and the CrowdStrike-related outage of Microsoft services caused chaos and made headlines in July.

Update, 5:37 pm ET: Some users are reporting they have regained service, and Verizon confirmed this in another statement: “Verizon engineers are making progress on our network issue and service has started to be restored. We know how much people rely on Verizon and apologize for any inconvenience some of our customers experienced today. We continue to work around the clock to fully resolve this issue.”



For the first time since 1882, UK will have no coal-fired power plants

Into the black —

A combination of government policy and economics spells the end of UK’s coal use.

The Ratcliffe-on-Soar plant is set to shut down for good today.

On Monday, the UK will see the closure of its last operational coal power plant, Ratcliffe-on-Soar, which has been operating since 1968. The closure of the plant, which had a capacity of 2,000 megawatts, ends the country’s history of coal-generated power, which began with the opening of the first coal-fired power station in 1882. Coal played a central part in the UK’s power system in the interim, in some years providing over 90 percent of its total electricity.

But a number of factors combined to place coal in a long-term decline: the growth of natural gas-powered plants and renewables, pollution controls, carbon pricing, and a government goal to hit net-zero greenhouse gas emissions by 2050.

From boom to bust

It’s difficult to overstate the importance of coal to the UK grid. It was providing over 90 percent of the UK’s electricity as recently as 1956. The total amount of power generated continued to climb well after that, reaching a peak of 212 terawatt hours of production by 1980. And the construction of new coal plants was under consideration as recently as the late 2000s. According to the organization Carbon Brief’s excellent timeline of coal use in the UK, continuing the use of coal with carbon capture was given consideration.

But several factors slowed the use of the fuel well ahead of any climate goals set out by the UK, some of which have parallels to the US’s situation. The European Union, which included the UK at the time, instituted new rules to address acid rain, which raised the cost of coal plants. In addition, the exploitation of oil and gas deposits in the North Sea provided access to an alternative fuel. Meanwhile, major gains in efficiency and the shift of some heavy industry overseas cut demand in the UK significantly.

Through their effect on coal use, these changes also lowered employment in coal mining. The mining sector has sometimes been a significant force in UK politics, but the decline of coal cut the number of people employed in the sector, eroding its political influence.

These factors had all reduced the use of coal even before governments started taking aggressive steps to limit climate change. But by 2005, the EU had implemented a carbon trading system that put a cost on emissions. By 2008, the UK government adopted national emissions targets, which have been maintained and strengthened since then by both Labour and Conservative governments, up until Rishi Sunak, who was voted out of office before he could alter the UK’s trajectory. What started as a pledge for a 60 percent reduction in greenhouse gas emissions by 2050 now requires the UK to hit net zero by that date.

Renewables, natural gas, and efficiency have all squeezed coal off the UK grid.

The UK’s policies have included a floor on the price of carbon that ensures fossil-powered plants pay a cost for emissions that’s significant enough to promote the transition to renewables, even when prices in the EU’s carbon trading scheme are too low to do so. And that transition has been rapid, with total generation by renewables nearly tripling in the decade since 2013, heavily aided by the growth of offshore wind.

How to clean up the power sector

The trends were significant enough that, in 2015, the UK announced that it would target the end of coal in 2025, despite the fact that the grid’s first coal-free day wouldn’t come until two years later. Two years after that landmark, the UK was seeing entire weeks in which no coal-fired plants were active.

To limit the worst impacts of climate change, it will be critical for other countries to follow the UK’s lead. So it’s worthwhile to consider how a country that was committed to coal relatively recently could manage such a rapid transition. There are a few UK-specific factors that won’t be possible to replicate everywhere. The first is that most of its coal infrastructure was quite old—Ratcliffe-on-Soar dates from the 1960s—and so it required replacement in any case. Part of the reason for its aging coal fleet was the local availability of relatively cheap natural gas, something that might not be true elsewhere, which put economic pressure on coal generation.

Another key factor is that the ever-shrinking number of people employed by coal power didn’t exert significant pressure on government policies. Despite the existence of a vocal group of climate contrarians in the UK, the issue never became heavily politicized. Both Labour and Conservative governments maintained a fact-based approach to climate change and set policies accordingly. That’s notably not the case in countries like the US and Australia.

But other factors will be applicable to a wide variety of countries. As the UK was moving away from coal, renewables became the cheapest way to generate power in much of the world. Coal is also the most polluting source of electrical power, providing ample reasons for regulation that have little to do with climate. Forcing coal users to pay even a fraction of coal’s externalized costs to human health and the environment serves to make it even less economical compared to alternatives.

If these latter factors can drive a move away from coal despite government inertia, it could pay significant dividends in the fight to limit climate change. Inspired in part by the success in moving its grid off coal, the new Labour government in the UK has moved up the timeline for decarbonizing its power sector to 2030 (from the previous Conservative government’s target of 2035).



Opinion: How to design a US data privacy law

robust privacy protection —

Op-ed: Why you should care about the GDPR, and how the US could develop a better version.


Nick Dedeke is an associate teaching professor at Northeastern University, Boston. His research interests include digital transformation strategies, ethics, and privacy. His research has been published in IEEE Management Review, IEEE Spectrum, and the Journal of Business Ethics. He holds a PhD in Industrial Engineering from the University of Kaiserslautern-Landau, Germany. The opinions in this piece do not necessarily reflect the views of Ars Technica.

In an earlier article, I discussed a few of the flaws in Europe’s flagship data privacy law, the General Data Protection Regulation (GDPR). Building on that critique, I would now like to go further, proposing specifications for developing a robust privacy protection regime in the US.

Writers must overcome several hurdles to have a chance at persuading readers about possible flaws in the GDPR. First, some readers are skeptical of any piece criticizing the GDPR because they believe the law is still too young to evaluate. Second, some are suspicious of any piece criticizing the GDPR because they suspect that the authors might be covert supporters of Big Tech’s anti-GDPR agenda. (I can assure readers that I have never worked to support any agenda of Big Tech companies.)

In this piece, I will highlight the price of ignoring the GDPR. Then, I will present several conceptual flaws of the GDPR that have been acknowledged by one of the lead architects of the law. Next, I will propose certain characteristics and design requirements that countries like the United States should consider when developing a privacy protection law. Lastly, I will provide a few reasons why everyone should care about this project.

The high price of ignoring the GDPR

People sometimes assume that the GDPR is mostly a “bureaucratic headache”—but this perspective is no longer valid. Consider the following actions by administrators of the GDPR in different countries.

  • In May 2023, the Irish authorities hit Meta with a fine of $1.3 billion for unlawfully transferring personal data from the European Union to the US.
  • On July 16, 2021, the Luxembourg National Commission for Data Protection (CNDP) issued a fine of 746 million euros ($888 million) to Amazon Inc. The fine was issued due to a complaint from 10,000 people against Amazon in May 2018 orchestrated by a French privacy rights group.
  • On September 5, 2022, Ireland’s Data Protection Commission (DPC) issued a 405 million-euro GDPR fine to Meta Ireland as a penalty for violating GDPR’s stipulation regarding the lawfulness of children’s data (see other fines here).

In other words, the GDPR is not merely a bureaucratic matter; it can trigger hefty, unexpected fines. The notion that the GDPR can be ignored is a fatal error.

9 conceptual flaws of the GDPR: Perspective of the GDPR’s lead architect

Axel Voss is one of the lead architects of the GDPR. He is a member of the European Parliament and authored the 2011 initiative report titled “Comprehensive Approach to Personal Data Protection in the EU” when he was the European Parliament’s rapporteur. His call for action resulted in the development of the GDPR legislation. After observing the unfulfilled promises of the GDPR, Voss wrote a position paper highlighting the law’s weaknesses. I want to mention nine of the flaws that Voss described.

First, while the GDPR was excellent in theory and pointed a path toward the improvement of standards for data protection, it is an overly bureaucratic law created largely using a top-down approach by EU bureaucrats.

Second, the law is based on the premise that data protection should be a fundamental right of EU persons. Hence, its stipulations are absolute and one-sided, laser-focused only on protecting the “fundamental rights and freedoms” of natural persons. In making this change, the GDPR’s architects took the relationship between the state and the citizen and applied it to the relationship between citizens and companies, and to the relationship between companies and their peers. This construction is one reason why the obligations imposed on data controllers and processors are rigid.

Third, the GDPR aims to empower data subjects by enshrining their rights into law. Specifically, the law enshrines nine data subject rights: the right to be informed, the right of access, the right to rectification, the right to be forgotten (erasure), the right to data portability, the right to restrict processing, the right to object to the processing of personal data, the right to object to automated processing, and the right to withdraw consent. As with any list, there is always a concern that some rights may be missing; if critical rights are omitted from the GDPR, its effectiveness in protecting privacy and data protection is hindered. And in the case of the GDPR, the protected data subject rights are indeed not exhaustive.

Fourth, the GDPR is grounded in a prohibition-and-limitation approach to data protection. For example, the principle of purpose limitation excludes chance discoveries in science. This ignores the reality that current technologies, e.g., machine learning and artificial intelligence applications, function differently. Hence, old data protection mindsets, such as data minimization and storage limitation, are no longer workable.

Fifth, the GDPR, on principle, posits that every processing of personal data restricts the data subject’s right to data protection. It therefore requires that each of these processes be justified under the law. The GDPR deems any processing of personal data a potential risk and forbids it in principle, allowing processing only if a legal ground is met. Such an anti-processing and anti-sharing approach may not make sense in a data-driven economy.

Sixth, the law does not distinguish between low-risk and high-risk applications, imposing the same obligations for each type of data processing application, with a few exceptions requiring consultation of the data protection authority for high-risk applications.

Seventh, the GDPR also lacks exemptions for low-risk processing scenarios, or for cases in which SMEs, startups, non-commercial entities, or private citizens are the data controllers. Further, there are no exemptions or provisions protecting the rights of the controller and of third parties in scenarios where the data controller has a legitimate interest in protecting business and trade secrets, a duty to fulfill confidentiality obligations, or an economic interest in avoiding huge and disproportionate efforts to meet GDPR obligations.

Eighth, the GDPR lacks a mechanism that allows SMEs and startups to shift the compliance burden onto third parties, which then store and process data.

Ninth, the GDPR relies heavily on government-based bureaucratic monitoring and administration of privacy compliance. This means an extensive bureaucratic system is needed to manage the compliance regime.

There are other issues with GDPR enforcement (see pieces by Matt Burgess and Anda Bologa) and its negative impacts on the EU’s digital economy and on Irish technology companies. This piece will focus only on the nine flaws described above. These nine flaws are some of the reasons why the US authorities should not simply copy the GDPR.

The good news is that many of these flaws can be resolved.



IBM opens its quantum-computing stack to third parties

The small quantum processor (center) surrounded by cables that carry microwave signals to it, and the refrigeration hardware.

As we described earlier this year, operating a quantum computer will require a significant investment in classical computing resources, given the number of measurement and control operations that need to be executed and interpreted. That means that operating a quantum computer will also require a software stack to control and interpret the flow of information from the quantum side.

But software also gets involved well before anything gets executed. While it’s possible to execute algorithms on quantum hardware by defining the full set of commands sent to the hardware, most users are going to want to focus on algorithm development, rather than the details of controlling any single piece of quantum hardware. “If everyone’s got to get down and know what the noise is, [use] performance management tools, they’ve got to know how to compile a quantum circuit through hardware, you’ve got to become an expert in too much to be able to do the algorithm discovery,” said IBM’s Jay Gambetta. So, part of the software stack that companies are developing to control their quantum hardware includes software that converts abstract representations of quantum algorithms into the series of commands needed to execute them.

IBM’s version of this software is called Qiskit (although it was made open source and has since been adopted by other companies). Recently, IBM made a couple of announcements regarding Qiskit, both benchmarking it in comparison to other software stacks and opening it up to third-party modules. We’ll take a look at what software stacks do before getting into the details of what’s new.

What’s the software stack do?

It’s tempting to view IBM’s Qiskit as the equivalent of a compiler. And at the most basic level, that’s a reasonable analogy, in that it takes algorithms defined by humans and converts them to things that can be executed by hardware. But there are significant differences in the details. A compiler for a classical computer produces code that the computer’s processor converts to internal instructions that are used to configure the processor hardware and execute operations.

Even when using what’s termed “machine language,” programmers don’t directly control the hardware; they have no control over where on the hardware things are executed (i.e., which processor or execution unit within that processor), or even the order in which instructions are executed.

Things are very different for quantum computers, at least at present. For starters, everything that happens on the processor is controlled by external hardware, which typically acts by generating a series of laser or microwave pulses. So software like IBM’s Qiskit or Microsoft’s Q# acts by converting the code it’s given into commands that are sent to hardware that’s external to the processor.

These “compilers” must also keep track of exactly which part of the processor things are happening on. Quantum computers act by performing specific operations (called gates) on individual or pairs of qubits; to do that, you have to know exactly which qubit you’re addressing. And, for things like superconducting qubits, where there can be device-to-device variations, which hardware qubits you end up using can have a significant effect on the outcome of the calculations.

As a result, most tools like Qiskit provide the option of directly addressing the hardware. If a programmer chooses not to, however, the software can transform generic instructions into a precise series of actions that will execute whatever algorithm has been encoded. That involves the software stack making choices about which physical qubits to use, what gates and measurements to execute, and what order to execute them in.

The role of the software stack, however, is likely to expand considerably over the next few years. A number of companies are experimenting with hardware qubit designs that can flag when one type of common error occurs, and there has been progress with developing logical qubits that enable error correction. Ultimately, any company providing access to quantum computers will want to modify its software stack so that these features are enabled without requiring effort on the part of the people designing the algorithms.



Report: Apple changes film strategy, will rarely do wide theatrical releases

Small screen focus —

Apple TV+ has made more waves with TV shows than movies so far.

A still from Wolfs, an Apple-produced film starring George Clooney and Brad Pitt. Credit: Apple

For the past few years, Apple has been making big-budget movies meant to compete with the best traditional Hollywood studios have to offer, and it has been releasing them in theaters to drive ticket sales and awards buzz.

Much of that is about to change, according to a report from Bloomberg. The article claims that Apple is “rethinking its movie strategy” after several box office misfires, like Argylle and Napoleon.

It has already canceled the wide theatrical release of one of its tentpole movies, the George Clooney- and Brad Pitt-led Wolfs. Most other upcoming big-budget movies from Apple will be released in just a few theaters, suggesting the plan is simply to ensure continued awards eligibility, not to put butts in seats.

Further, Apple plans to move away from super-budget films and to focus its portfolio on a dozen lower-budget films a year. Just one major big-budget film is planned for a wide theatrical release: F1. How that one performs could inform future changes to Apple’s strategy.

The report notes that Apple is not the only streamer changing its strategy. Netflix is reducing costs and bringing more movie production in-house, while Amazon is trying (so far unsuccessfully) to produce a higher volume of movies annually, but with a mixture of online-only and in-theater releases. It also points out that movie theater chains are feeling ever more financial pressure, as overall ticket sales haven’t matched their pre-pandemic levels despite occasional hits like Inside Out 2 and Deadpool & Wolverine.

Cinemas have been counting on streamers like Netflix and Apple to crank out films, but those hopes may be dashed if the media companies continue to pull back. For the most part, tech companies like Apple and Amazon have had better luck gaining buzz with television series than with feature films.



Google and Meta update their AI models amid the rise of “AlphaChip”

Running the AI News Gauntlet —

News about Gemini updates, Llama 3.2, and Google’s new AI-powered chip designer.

There’s been a lot of AI news this week, and covering it sometimes feels like running through a hall full of dangling CRTs, just like this Getty Images illustration.

It’s been a wildly busy week in AI news thanks to OpenAI, including a controversial blog post from CEO Sam Altman, the wide rollout of Advanced Voice Mode, 5GW data center rumors, major staff shake-ups, and dramatic restructuring plans.

But the rest of the AI world doesn’t march to the same beat, doing its own thing and churning out new AI models and research by the minute. Here’s a roundup of some other notable AI news from the past week.

Google Gemini updates

On Tuesday, Google announced updates to its Gemini model lineup, including the release of two new production-ready models that iterate on past releases: Gemini-1.5-Pro-002 and Gemini-1.5-Flash-002. The company reported improvements in overall quality, with notable gains in math, long-context handling, and vision tasks. Google claims a 7 percent increase in performance on the MMLU-Pro benchmark and a 20 percent improvement in math-related tasks. But as you’ll know if you’ve been reading Ars Technica for a while, AI benchmarks typically aren’t as useful as we would like them to be.

Along with model upgrades, Google introduced substantial price reductions for Gemini 1.5 Pro, cutting input token costs by 64 percent and output token costs by 52 percent for prompts under 128,000 tokens. As AI researcher Simon Willison noted on his blog, “For comparison, GPT-4o is currently $5/[million tokens] input and $15/m output and Claude 3.5 Sonnet is $3/m input and $15/m output. Gemini 1.5 Pro was already the cheapest of the frontier models and now it’s even cheaper.”

Google also increased rate limits, with Gemini 1.5 Flash now supporting 2,000 requests per minute and Gemini 1.5 Pro handling 1,000 requests per minute. Google reports that the latest models offer twice the output speed and three times lower latency compared to previous versions. These changes may make it easier and more cost-effective for developers to build applications with Gemini than before.

Meta launches Llama 3.2

On Wednesday, Meta announced the release of Llama 3.2, a significant update to its open-weights AI model lineup that we have covered extensively in the past. The new release includes vision-capable large language models (LLMs) in 11B and 90B parameter sizes, as well as lightweight text-only models of 1B and 3B parameters designed for edge and mobile devices. Meta claims the vision models are competitive with leading closed-source models on image recognition and visual understanding tasks, while the smaller models reportedly outperform similar-sized competitors on various text-based tasks.

Willison experimented with some of the smaller 3.2 models and reported impressive results given the models’ size. AI researcher Ethan Mollick showed off running Llama 3.2 on his iPhone using an app called PocketPal.

Meta also introduced the first official “Llama Stack” distributions, created to simplify development and deployment across different environments. As with previous releases, Meta is making the models available for free download, with license restrictions. The new models support long context windows of up to 128,000 tokens.

Google’s AlphaChip AI speeds up chip design

On Thursday, Google DeepMind announced what appears to be a significant advancement in AI-driven electronic chip design, AlphaChip. It began as a research project in 2020 and is now a reinforcement learning method for designing chip layouts. Google has reportedly used AlphaChip to create “superhuman chip layouts” in the last three generations of its Tensor Processing Units (TPUs), which are chips similar to GPUs designed to accelerate AI operations. Google claims AlphaChip can generate high-quality chip layouts in hours, compared to weeks or months of human effort. (Reportedly, Nvidia has also been using AI to help design its chips.)

Notably, Google also released a pre-trained checkpoint of AlphaChip on GitHub, sharing the model weights with the public. The company reported that AlphaChip’s impact has already extended beyond Google, with chip design companies like MediaTek adopting and building on the technology for their chips. According to Google, AlphaChip has sparked a new line of research in AI for chip design, potentially optimizing every stage of the chip design cycle from computer architecture to manufacturing.

That wasn’t everything that happened, but those are some major highlights. With the AI industry showing no signs of slowing down at the moment, we’ll see how next week goes.

Google and Meta update their AI models amid the rise of “AlphaChip”

Black hole jet appears to boost rate of nova explosions

Enlarge / One of the jets emitted by galaxy M87’s central black hole.

The intense electromagnetic environment near a black hole can accelerate particles to a large fraction of the speed of light and send them along jets that extend from each of the object’s poles. In the case of the supermassive black holes found at the center of galaxies, these jets are truly colossal, blasting material not just out of the galaxy, but possibly out of the galaxy’s entire neighborhood.

But this week, scientists described how the jets may be doing some strange things inside a galaxy, as well. A study of the galaxy M87 showed that nova explosions appear to be occurring at an unusually high frequency in the neighborhood of one of the jets from the galaxy’s central black hole. But there’s no obvious mechanism to explain why this might happen, and there’s no sign that it’s happening at the jet traveling in the opposite direction.

Whether this effect is real, and whether we can come up with an explanation for it, may take some further observations.

Novas and wedges

M87 is one of the larger galaxies in our local patch of the Universe, and its central black hole has active jets. During an earlier period of regular observations, the Hubble Space Telescope had found that stellar explosions called novas appeared to be clustered around the jet.

This makes very little sense. Novas occur in binary systems where a white dwarf orbits a large, hydrogen-rich companion star. Over time, the white dwarf draws hydrogen off the surface of its companion until the accumulated material reaches a critical mass. At that point, a thermonuclear explosion blasts the remaining material off the white dwarf, and the cycle resets. Since the rate of material transfer tends to be fairly stable, novas in a given stellar system will often repeat at regular intervals. And it’s not at all clear why a black hole’s jet would alter that regularity.
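The regular cadence follows from simple arithmetic: with a steady transfer rate, the time between outbursts is just the hydrogen mass needed to trigger ignition divided by that rate. A minimal sketch, using illustrative order-of-magnitude numbers that are our own assumptions, not values from the study:

```python
def recurrence_time_yr(ignition_mass_msun, accretion_rate_msun_per_yr):
    """Years between nova outbursts, assuming a steady accretion rate:
    the white dwarf must accumulate ignition_mass_msun of hydrogen
    (in solar masses) before the next thermonuclear runaway."""
    return ignition_mass_msun / accretion_rate_msun_per_yr

# Illustrative values: ~1e-5 solar masses of hydrogen needed for ignition,
# accreted at ~1e-9 solar masses per year.
print(recurrence_time_yr(1e-5, 1e-9))  # roughly 10,000 years between outbursts
```

Any process that changed either number, the ignition threshold or the transfer rate, would shift that interval, which is part of why a jet-driven boost in nova frequency is so puzzling.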

So, some of the people involved in the first study got time on the Hubble to go back and have another look. And for a big chunk of a year, every five days, Hubble was pointed at M87, allowing it to capture novas before they faded back out. All told, this picked up 94 novas that occurred near the center of the galaxy. Combined with 41 that had been identified during earlier work, this left a collection of 135 novas in this galaxy. The researchers then plotted these relative to the black hole and its jets.

The area containing the jet (upper right) experiences significantly more novas than the rest of the galaxy’s core.

Lessing et al.

Dividing the area around the center of the galaxy into 10 equal segments, the researchers counted the novas that occurred in each. In the nine segments that didn’t include the jet on the side of the galaxy facing Earth, the average number of novas was 12. In the segment that included the jet, the count was 25. Another way to look at this is that the highest count in a non-jet segment was only 16—and that was in a segment immediately next to the one with the jet in it. The researchers calculate the odds of this arrangement occurring at random as being about one in 1,310 (meaning less than 0.1 percent).

To get a separate measure of how unusual this is, the researchers placed 8 million simulated novas around the center of the galaxy, distributed randomly but biased to match the galaxy’s brightness, under the assumption that novas will be more frequent in areas with more stars. This was then used to estimate how often novas should be expected in each of these segments. The researchers also repeated the analysis with wedges of varying sizes: “In order to reduce noise and avoid p-hacking when choosing the size of the wedge, we average the results for wedges between 30 and 45 degrees wide.”

Overall, the enhancement near the jet was low for either very narrow or very wide wedges, as you might expect: narrow wedges crop out too much of the area affected by the jet, while wide ones include a lot of space with only the normal background rate. The enhancement peaks for wedges about 25 degrees wide, where the nova rate near the jet is about 2.6 times the background. So, this appears to be real.
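The wedge-counting test itself is easy to reproduce in outline. The sketch below is a simplified version of the approach described above, assuming novas are scattered uniformly in angle around the galaxy’s center (the actual study weighted its simulated positions by the galaxy’s brightness); the function name and structure are our own, not the researchers’ code:

```python
import math
import random

def wedge_enhancement(nova_angles, jet_angle, wedge_deg, n_trials=2000, seed=0):
    """Compare the nova count in a wedge of width wedge_deg (degrees)
    centered on jet_angle (radians) against a Monte Carlo expectation.

    Returns (enhancement, p_value): enhancement is the observed count
    divided by the uniform-in-angle expectation, and p_value is the
    fraction of random skies with at least as many novas in the wedge.
    """
    rng = random.Random(seed)
    half = math.radians(wedge_deg) / 2.0
    n = len(nova_angles)

    def in_wedge(theta):
        # Smallest angular separation between theta and the jet direction.
        sep = abs((theta - jet_angle + math.pi) % (2.0 * math.pi) - math.pi)
        return sep <= half

    observed = sum(in_wedge(t) for t in nova_angles)
    expected = n * wedge_deg / 360.0

    # Monte Carlo: how often does a random, uniform sky match or beat
    # the observed count in the same wedge?
    hits = sum(
        sum(in_wedge(rng.uniform(0.0, 2.0 * math.pi)) for _ in range(n)) >= observed
        for _ in range(n_trials)
    )
    return observed / expected, hits / n_trials
```

Averaging the result over a range of wedge widths, as the researchers did for 30- to 45-degree wedges, guards against tuning the wedge size to get the answer you want.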

Steam doesn’t want to pay arbitration fees, tells gamers to sue instead

Mandatory litigation —

Valve previously sued a law firm in an attempt to stop mass arbitration claims.

A pen and book resting atop a paper copy of a lawsuit.

Valve Corporation, tired of paying arbitration fees, has removed a mandatory arbitration clause from Steam’s subscriber agreement. Valve told gamers in yesterday’s update that they must sue the company in order to resolve disputes.

The subscriber agreement includes “changes to how disputes and claims between you and Valve are resolved,” Steam wrote in an email to users. “The updated dispute resolution provisions are in Section 10 and require all claims and disputes to proceed in court and not in arbitration. We’ve also removed the class action waiver and cost and fee-shifting provisions.”

The Steam agreement previously said that “you and Valve agree to resolve all disputes and claims between us in individual binding arbitration.” Now, it says that any claims “shall be commenced and maintained exclusively in any state or federal court located in King County, Washington, having subject matter jurisdiction.”

Steam’s email to users said the updated terms “will become effective immediately when you agree to it, including when you make most purchases, fund your Steam wallet, or otherwise accept it. Otherwise, the updated Steam Subscriber Agreement will become effective on November 1, 2024, unless you delete or discontinue use of your Steam account before then.” Steam also pushed a pop-up message to gamers asking them to agree to the new terms.

One likely factor in Valve’s decision to abandon arbitration is mentioned in a pending class-action lawsuit over game prices that was filed last month in US District Court for the Western District of Washington. The Steam users who filed the suit previously “mounted a sustained and ultimately successful challenge to the enforceability of Valve’s arbitration provision,” their lawsuit said. “Specifically, the named Plaintiffs won binding decisions from arbitrators rendering Valve’s arbitration provision unenforceable for both lack of notice and because it impermissibly seeks to bar public injunctive relief.”

Mandatory arbitration clauses are generally seen as bad for consumers, who are deprived of the ability to seek compensation through individual or class-action lawsuits. But many Steam users were able to easily get money from Valve through arbitration, according to law firms that filed the arbitration cases over allegedly inflated game prices.

Valve sued lawyers behind arbitration claims

Valve used to prefer arbitration because few consumers brought claims and the process kept the company’s legal costs low. But in October 2023, Valve sued a law firm in an attempt to stop it from submitting loads of arbitration claims on behalf of gamers.

Valve’s suit complained that “unscrupulous lawyers” at law firm Zaiger, LLC presented a plan to a potential funder “to recruit 75,000 clients and threaten Valve with arbitration on behalf of those clients, thus exposing Valve to potentially millions of dollars of arbitration fees alone: 75,000 potential arbitrations times $3,000 in fees per arbitration is two hundred and twenty-five million dollars.”

Valve said that Zaiger’s “extortive plan” was to “offer a settlement slightly less than the [arbitration] charge—$2,900 per claim or so—attempting to induce a quick resolution.”

“Zaiger targeted Valve and Steam users for its scheme precisely because the arbitration clause in the SSA [Steam Subscriber Agreement] is ‘favorable’ to Steam users in that Valve agrees to pay the fees and costs associated with arbitration,” Valve said.

Zaiger has a “Steam Claims” website that says, “Tens of thousands of Steam users have engaged Zaiger LLC to hold Steam’s owner, Valve, accountable for inflated prices of PC games.” The website said that through arbitration, “many consumers get compensation offers without doing anything beyond completing the initial form.” Another law firm called Mason LLP used a similar strategy to help gamers bring arbitration claims against Steam.

There hadn’t previously been many arbitration cases against Steam, Valve’s lawsuit against Zaiger said. “In the five years before Zaiger began threatening Valve, 2017 to 2022, there were only two instances where Valve and a Steam user could not resolve that user’s issue before proceeding to arbitration. Both of those arbitrations were resolved in Valve’s favor, and Valve paid all of the arbitrator fees and costs for both Valve and the impacted Steam user,” Valve said.

Valve’s lawsuit against Zaiger was dismissed without prejudice on August 20, 2024. The ruling in US District Court for the Western District of Washington said the case was dismissed because the court lacks jurisdiction over Zaiger.

RTS classics StarCraft, StarCraft II make their way to PC Game Pass

YOU MUST CONSTRUCT ADDITIONAL PYLONS —

The collection includes the 2017 remaster of the original StarCraft.

Phil Spencer’s Tokyo Game Show update.

Beloved real-time strategy classics StarCraft and StarCraft II will soon be available in Microsoft’s Game Pass subscription for PC, the company announced during the Tokyo Game Show.

Both StarCraft’s and StarCraft II’s multiplayer modes are already free to play on PC. This move to Game Pass, though, will make the equally excellent single-player campaigns available to anyone with a subscription. Game Pass will also offer all the expansions for both games.

The subscription will provide access to StarCraft Remastered, a revamped version of the original 1998 game that came out in 2017, as well as the StarCraft II Campaign Collection, which includes all 70-plus single-player missions from StarCraft II‘s Wings of Liberty, Heart of the Swarm, Legacy of the Void, and Nova Covert Ops.

The announcement was the lone bit of new information in a brief video by Xbox boss Phil Spencer. He appeared in the video wearing a StarCraft T-shirt, which might have gotten StarCraft fans’ hopes up that the franchise would be getting a new game for the first time in over a decade.

That didn’t happen, of course, but the games’ addition to Game Pass will likely expose them to many new players who may have been too young to play the influential strategy titles when they debuted in 1998 and 2010.

These aren’t the first Blizzard games to be added to Game Pass since Microsoft acquired Activision-Blizzard. First came Diablo IV, then Overwatch 2—the latter was free-to-play already by the time it came to Game Pass, but Microsoft included it in the Game Pass distribution platform and offered cosmetics and goodies to Game Pass subscribers.

The StarCraft games will launch for PC Game Pass and Game Pass Ultimate subscribers on November 5.

Listing image by Microsoft

The war of words between SpaceX and the FAA keeps escalating

Enlarge / Elon Musk, SpaceX’s founder and CEO, has called for the resignation of the FAA administrator.

The clash between SpaceX and the Federal Aviation Administration escalated this week, with Elon Musk calling for the head of the federal regulator to resign after he defended the FAA’s oversight and fines levied against the commercial launch company.

The FAA has said it doesn’t expect to determine whether to approve a launch license for SpaceX’s next Starship test flight until late November, two months later than the agency previously communicated to Musk’s launch company. Federal regulators are reviewing changes to the rocket’s trajectory necessary for SpaceX to bring Starship’s giant reusable Super Heavy booster back to the launch pad in South Texas. This will be the fifth full-scale test flight of Starship but the first time SpaceX attempts such a maneuver on the program.

This week, SpaceX assembled the full Starship rocket on its launch pad at the company’s Starbase facility near Brownsville, Texas. “Starship stacked for Flight 5 and ready for launch, pending regulatory approval,” SpaceX posted on X.

Apart from the Starship regulatory reviews, the FAA last week announced it is proposing more than $633,000 in fines on SpaceX due to alleged violations of the company’s launch license associated with two flights of the company’s Falcon 9 rocket from Florida. It is rare for the FAA’s commercial spaceflight division to fine launch companies.

Michael Whitaker, the FAA’s administrator, discussed the agency’s ongoing environmental and safety reviews of SpaceX’s Starship rocket in a hearing before a congressional subcommittee in Washington on Tuesday. During the hearing, which primarily focused on the FAA’s oversight of Boeing’s commercial airplane business, one lawmaker asked Whitaker about the FAA’s relationship with SpaceX.

Public interest

“I think safety is in the public interest and that’s our primary focus,” Whitaker said in response to questions from Rep. Kevin Kiley, a California Republican. “It’s the only tool we have to get compliance on safety matters,” he said, referring to the FAA’s fines.

The stainless-steel Super Heavy booster is larger than a Boeing 747 jumbo jet. SpaceX says the flight path to return the first stage of the rocket to land will mean a “slightly larger area could experience a sonic boom,” and a stainless-steel ring that jettisons from the top of the booster, called the hot-staging ring, will fall in a different location in the Gulf of Mexico just offshore from the rocket’s launch and landing site.

The FAA, which is primarily charged with ensuring rocket launches don’t endanger the public, is consulting with other agencies on these matters, along with issues involving SpaceX’s discharge of water into the environment around the Starship launch pad in Texas. The pad uses water to cool a steel flame deflector that sits under the 33 main engines of Starship’s Super Heavy booster.

SpaceX says fines levied against it this year by the Texas Commission on Environmental Quality (TCEQ) and the Environmental Protection Agency (EPA) related to the launch pad’s water system were “entirely tied to disagreements over paperwork” and not any dumping of pollutants into the environment around the Starship launch site.

SpaceX installed the water-cooled flame deflector under the Starship launch mount after engine exhaust excavated a large hole in the ground during the rocket’s first test flight. Gwynne Shotwell, SpaceX’s president and chief operating officer, summed up her view of the issue in a hearing with Texas legislators in Austin on Tuesday.

“To protect that from happening again, we built this kind of upside-down shower head to basically cool the flame as the rocket was lifting off,” she said. “That was licensed and permitted by TCEQ. The EPA came in afterwards and didn’t like the license or the permit that we had for that, and wanted to turn it into a federal permit, which we are working on now.”

“We work very closely with organizations such as TCEQ,” Shotwell said. “You may have read a little bit of nonsense in the papers recently about that, but we’re working quite well with them.”

Spread of deadly EEE virus explodes 5-fold in New York; one death reported

Viral spread —

Normally only 2 or 3 counties have EEE-positive mosquitoes; there are 15 this year.

Enlarge / An entomologist for the Louisville Metro Department of Public Health and Wellness in a swampland area on August 25, 2021 in Louisville, Kentucky, collecting various mosquito species and testing the samples for mosquito-borne diseases, such as EEE.

New York is facing an unusual boom in mosquitoes toting the deadly eastern equine encephalitis (EEE) virus, which has already led to one rare death in the state and a declaration of an “imminent threat” by officials.

While the state’s surveillance system typically picks up EEE-positive mosquitoes in two or three counties each year, this year there have been 15 affected counties, which are scattered all across New York, State Health Commissioner James McDonald said this week.

“Eastern equine encephalitis is different this year,” McDonald said, noting the deadly nature of the infection, which has a mortality rate of between 30 and 50 percent. “Mosquitoes, once a nuisance, are now a threat,” McDonald added. “I urge all New Yorkers to prevent mosquito bites by using insect repellents, wearing long-sleeved clothing, and removing free-standing water near their homes. Fall is officially here, but mosquitoes will be around until we see multiple nights of below-freezing temperatures.”

On Monday, McDonald issued a Declaration of an Imminent Threat to Public Health for EEE, and Governor Kathy Hochul announced statewide actions to prevent infections. At the same time as the declaration, the officials reported the death of a New Yorker who developed EEE. The case, which was confirmed in Ulster County on September 20, is the state’s first EEE case since 2015.

The disease is very rare in New York. Between 1971 and 2024, there were only 12 cases of EEE reported in the state; seven cases were fatal.

Rare but deadly

EEE is generally rare in the US, with an average of only 11 cases reported per year, according to the Centers for Disease Control and Prevention. The virus lurks in wild birds and spreads to people and other animals via mosquitoes. The virus is particularly deadly in horses—as its name suggests—with mortality rates up to 90 percent. In people, most bites from a mosquito carrying the EEE virus do not lead to EEE. In fact, the CDC estimates that only about 4–5 percent of infected people develop the disease; most remain asymptomatic.

For those who develop EEE, the virus travels from the mosquito bite into the lymph system and spreads from there to cause a systemic infection. Initial symptoms are nonspecific, including fever, headache, malaise, chills, joint pain, nausea, and vomiting. The disease can progress to inflammation of the brain and neurological symptoms, including an altered mental state and seizures. Children under the age of 15 and adults over the age of 50 are most at risk.

The CDC estimates that about 30 percent of people who develop severe EEE die of the disease. But, with small numbers of cases over time, the reported mortality rates can vary. In Massachusetts, for instance, about 50 percent of the cases have been fatal. Among those who survive neuro-invasive disease, many are left severely disabled, and some die within a few years due to complications. There is no vaccine for EEE and no specific treatments.

Overall numbers

While New York seems to be experiencing an unusual surge of EEE-positive mosquitoes, the country as a whole is not necessarily seeing an uptick in cases. Only 10 cases from six states have been reported to the CDC this year. That count does not include the New York case, which would bring the total to 11, around the country’s average number of cases per year.

In addition to New York, the states that have reported cases are Massachusetts, Vermont, New Jersey, Rhode Island, Wisconsin, and New Hampshire. Most cases have been in the Northeast, where cases are typically reported between mid-June and early October before freezing temperatures kill off mosquito populations.

The death in New York is at least the second EEE death this year. In August, New Hampshire’s health department reported the death of a resident with EEE, and local media reports identified the person as a previously healthy 41-year-old man from Hampstead.

EEE gained attention last month when a small town in Massachusetts urged residents to follow an evening curfew to avoid mosquito bites. The move came after the state announced its first EEE case this year (the state’s case count is now at four) and declared a “critical risk level” in four communities.

Between 2003 and 2023, the highest tally of cases in a year was in 2019, when states reported 38 EEE cases.
