Apple takes over third-party Apple Passwords autofill extension for Firefox

Over the last few years, Apple has steadily been building password manager-style features into macOS and iOS, including automatic password generation, password breach detection, and more. Starting with this year’s updates—iOS 18 and macOS 15 Sequoia—Apple broke all that functionality out into its own Passwords app, making it all even more visible as a competitor to traditional password managers like 1Password and Bitwarden.

One area where Apple has lagged behind its platform-agnostic competitors is in browser support. Users could easily autofill passwords in Safari on macOS, and Apple did support a basic extension for the Windows versions of Google Chrome and Microsoft Edge via iCloud for Windows. But the company only added a Chrome extension for macOS users in the summer of 2023, and it has never supported non-Chromium browsers at all.

That has finally changed, at least for Firefox users running macOS—Apple has an officially supported Passwords extension for Firefox that supports syncing and autofilling passwords in macOS Sonoma and macOS Sequoia. Currently, the extension doesn’t support older versions of macOS or any versions of Firefox for Windows or Linux. When you install the extension in Firefox on a Mac that’s already synced with your iCloud account, all you should need to do to sign in is input a six-digit code that macOS automatically generates for you. As with the Chromium extension, there’s no need to re-sign in to your iCloud account separately.

To enable this functionality, it looks like Apple has taken ownership of a third-party extension that supported autofilling Apple Passwords in Firefox—a GitHub page for the original extension is still available but says that Apple “are now the sole owners in charge of maintaining their own official iCloud Passwords extension.” That extension supports the versions of Windows that can run the official iCloud for Windows app, suggesting that Apple ought to be able to add official Windows support for the extension at some point down the line.



Join us tomorrow for Ars Live: How Asahi Linux ports open software to Apple’s hardware

One of the key differences between Apple’s Macs and the iPhone and iPad is that the Mac can still boot and run non-Apple operating systems. This is a feature that Apple specifically built for the Mac, one of many features meant to ease the transition from Intel’s chips to Apple’s own silicon.

The problem, at least at first, was that alternate operating systems like Windows and Linux didn’t work natively with Apple’s hardware, not least because of missing drivers for basic things like USB ports, GPUs, and power management. Enter the Asahi Linux project, a community-driven effort to make open-source software run on Apple’s hardware.

In just a few years, the team has taken Linux on Apple Silicon from “basically bootable” to “plays native Windows games and sounds great doing it.” And the team’s ultimate goal is to contribute enough code upstream that you no longer need a Linux distribution just for Apple Silicon Macs.

On December 4 at 3:30 pm Eastern (1:30 pm Pacific), Ars Technica Senior Technology Reporter Andrew Cunningham will host a livestreamed YouTube conversation with Asahi Linux Project Lead Hector Martin and Graphics Lead Alyssa Rosenzweig that will cover the project’s genesis and its progress, as well as what the future holds.




The upside-down capacitor in mid-‘90s Macs, proven and documented by hobbyists

Brown notes that the predecessor Mac LC and LC II had the correct connections, as did the LC 475, which uses the same power supply scheme. This makes him “confident that Apple made a boo-boo on the LC III,” or “basically the hardware equivalent of a copy/paste error when you’re writing code.”

Making sure rehabbers don’t make the same mistake

Why wasn’t this noticed earlier, beyond a couple of forum threads seen by dedicated board rehabbers? There are a few reasons. For one thing, the rail was only used by a serial port or certain expansion cards, so a capacitor failure, or out-of-spec power output, may never have been noticed. For another, the original capacitor was rated for 16V, so even with -5V across it, it might not have failed, at least while it was relatively fresh. And it would not have failed in quite so spectacular a fashion as to generate stories and myths.

As to whether Apple knew about this but decided against acting on a somewhat obscure fault, one that might never cause real problems? By all means, let us know if you worked at Apple during that time and can clue us in. Ars has emailed Apple with this tremendously relevant question, the day before Thanksgiving, and will update this post with any comment.

By posting his analysis, Brown hopes to provide anyone else re-capping one of these devices with a bright, reflective warning sign to ignore Apple’s markings and install C22 the electrically correct way. Brown, reached by email, said that he heard from another hobbyist that the reverse voltage “would explain why the replacement cap” they installed “blew up.” Some restoration types, like Retro Viator, noticed the problem and fixed it pre-detonation.

Modern rehabbers tend to use tantalum capacitors to replace the fluid-filled kind that probably damaged the board they’re working on. Tantalum capacitors tend to react more violently to overvoltage or reverse voltage, Brown wrote me.

Should C22 or other faulty capacitors destroy your LC III board entirely, Brown notes that 68kMLA member max1zzz has made a reverse-engineered full logic board schematic.



Are any of Apple’s official MagSafe accessories worth buying?


When MagSafe was introduced, it promised an accessories revolution. Meh.

Apple’s current lineup of MagSafe accessories. Credit: Samuel Axon

When Apple introduced what it currently calls MagSafe in 2020, its marketing messaging suggested that the magnetic attachment standard for the iPhone would produce a boom in innovation in accessories, making things possible that simply weren’t before.

Four years later, that hasn’t really happened—either from third-party accessory makers or Apple’s own lineup of branded MagSafe products.

Instead, we have a lineup of accessories that matches pretty much what was available at launch in 2020: chargers, cases, and just a couple more unusual applications.

With the launch of the iPhone 16 just behind us and the holidays just in front of us, a bunch of people are moving to phones that support MagSafe for the first time. Apple loves an upsell, so it offers some first-party MagSafe accessories—some useful, some not worth the cash, given the premiums it sometimes charges.

Given all that, it’s a good time to check in and quickly point out which (if any) of these first-party MagSafe accessories might be worth grabbing alongside that new iPhone and which ones you should skip in favor of third-party offerings.

Cases with MagSafe

Look, we could write thousands of words about the variety of iPhone cases available, or even just about those that support MagSafe to some degree or another—and we still wouldn’t really scratch the surface. (Unless that surface was made with Apple’s leather-replacement FineWoven material—hey-o!)

It’s safe to say there’s a third-party case for every need and every type of person out there. If you want one that meets your exact needs, you’ll be able to find it. Just know that cases that are labeled as MagSafe-ready will allow charge through and will let the magnets align correctly between a MagSafe charger and an iPhone—that’s really the whole point of the “MagSafe” name.

But if you prefer to stick with Apple’s own cases, there are currently two options: the clear cases and the silicone cases.


The clear case is definitely the superior of Apple’s two first-party MagSafe cases. Credit: Samuel Axon

The clear cases actually have a circle where the edges of the MagSafe magnets are, which is pretty nice for getting the magnets to snap without any futzing—though it’s really not necessary, since, well, magnets attract. They have a firm plastic shell that is likely to do a good job of protecting your phone when you drop it.

The silicone case is… fine. Frankly, it’s ludicrously priced for what it is. It offers no advantages over a plethora of third-party cases that cost exactly half as much.

Recommendation: The clear case has its advantages, but the silicone case is awfully expensive for what it is. Generally, third party is the way to go. There are lots of third-party cases from manufacturers who got licensed by Apple, and you can generally trust those will work with wireless charging just fine. That was the whole point of the MagSafe branding, after all.

The MagSafe charger

At $39 or $49 (depending on length, one meter or two), these charging cables are pretty pricey. But they’re also highly durable, relatively efficient, and super easy to use. Still, in most cases, you might as well just use any old USB-C cable.

There are some situations where you might prefer this option, though—for example, if you prop your iPhone up against your bedside lamp like a nightstand clock, or if you (like me) listen to audiobooks on wired earbuds while you fall asleep via the USB-C port, but you want to make sure the phone is still charging.


The MagSafe charger for the iPhone. Credit: Samuel Axon

So the answer on Apple’s MagSafe charger is that it’s pretty specialized, but it’s arguably the best option for those who have some specific reason not to just use USB-C.

Recommendation: Just use a USB-C cable, unless you have a specific reason to go this route—shoutout to my fellow individuals who listen to audiobooks while falling asleep but need headphones so as not to keep their spouse awake but prefer wired earbuds that use the USB-C port over AirPods to avoid losing AirPods in the bed covers. I’m sure there are dozens of us! If you do go this route, Apple’s own cable is the safest pick.

Apple’s FineWoven Wallet with MagSafe

While I’d long known people with dense wallet cases for their iPhones, I was excited about Apple’s leather (and later FineWoven) wallet with MagSafe when it was announced. I felt the wallet cases I’d seen were way too bulky, making the phone less pleasant to use.

Unfortunately, Apple’s FineWoven Wallet with MagSafe might be the worst official MagSafe product.

The problem is that the “durable microtwill” material that Apple went with instead of leather is prone to scratching, as many owners have complained. That’s a bit frustrating for something that costs nearly $60.


The MagSafe wallet has too many limitations to be worthwhile for most people. Credit: Samuel Axon

The wallet also only holds a few cards, and putting cards here means you probably can’t or at least shouldn’t try to use wireless charging, because the cards would be between the charger and the phone. Apple itself warns against doing this.

For those reasons, skip the FineWoven Wallet. There are lots of better-designed iPhone wallet cases out there, even though they might not be so minimalistic.

Recommendation: Skip this one. It’s a great idea in theory, but in practice and execution, it just doesn’t deliver. There are zillions of great wallet cases out there if you don’t mind a bit of bulk—just know you’ll have some wireless charging issues with many cases.

Other categories offered by third parties

Frankly, a lot of the more interesting applications of MagSafe for the iPhone are only available through third parties.

There are monitor mounts for using the iPhone as a webcam with Macs; bedside stands that charge the phone while it acts as a smart display; magnetic car-dashboard mounts that let you use GPS while you drive; magnetic power banks and portable batteries; and, of course, multi-device chargers similar to the infamously canceled AirPower charging pad Apple had once planned to release. (I have the Belkin Boost Charge Pro 3-in-1 on my desk, and it works great.)

It’s not the revolution of new applications that some imagined when MagSafe was launched, but that’s not really a surprise. Still, there are some quality products out there. It’s both strange and a pity that Apple hasn’t made most of them itself.

No revolution here

Truthfully, MagSafe never seemed like it would be a huge smash. iPhones already supported Qi wireless charging before it came along, so the idea of magnets keeping the device aligned with the charger was always the main appeal—its existence potentially saved some users from ending up with chargers that didn’t quite work right with their phones, provided those users bought officially licensed MagSafe accessories.

Apple’s MagSafe accessories are often overpriced compared to alternatives from Belkin and other frequent partners. MagSafe seemed to do a better job bringing some standards to certain third-party products than it did bringing life to Apple’s offerings, and it certainly did not bring about a revolution of new accessory categories to the iPhone.

Still, it’s hard to blame anyone for choosing to go with Apple’s versions; the world of third-party accessories can be messy, and going the first-party route is generally a surefire way to know you’re not going to have many problems, even if the sticker’s a bit steep.

You could shop for third-party options, but sometimes you want a sure thing. With the possible exception of the FineWoven Wallet, all of these Apple-made MagSafe products are sure things.


Samuel Axon is a senior editor at Ars Technica. He covers Apple, software development, gaming, AI, entertainment, and mixed reality. He has been writing about gaming and technology for nearly two decades at Engadget, PC World, Mashable, Vice, Polygon, Wired, and others. He previously ran a marketing and PR agency in the gaming industry, led editorial for the TV network CBS, and worked on social media marketing strategy for Samsung Mobile at the creative agency SPCSHP. He also is an independent software and game developer for iOS, Windows, and other platforms, and he is a graduate of DePaul University, where he studied interactive media and software development.



The good, the bad, and the ugly behind the push for more smart displays

After a couple of years without much happening, smart displays are in the news again. Aside from smart TVs, consumer screens that connect to the Internet have never reached a mainstream audience, but there’s now a renewed push to make smart displays more popular. The approaches some companies are taking are better than others, revealing a good, a bad, and an ugly side to the push.

Note that for this article, we’ll exclude smart TVs when discussing smart displays. Unlike the majority of smart displays, smart TVs are mainstream tech. So for this piece, we’ll mostly focus on devices like the Google Nest Hub Max or Amazon Echo Show (as pictured above).

The good

When it comes to emerging technology, a great gauge of whether innovation is happening is how well a product solves a real user problem. Products in search of a problem to solve, or that are glorified vehicles for ads and tracking, don’t qualify.

If reports that Apple is working on its first smart display are true, there may be potential for it to solve the problem of managing multiple smart home devices from different companies.

Apple has declined to comment on reports from Bloomberg’s Mark Gurman of an Apple smart display under development. But Gurman recently claimed that the display will be able to be mounted on walls and “use AI to navigate apps.” Gurman said that it would incorporate Apple’s smart home framework HomeKit, which supports “hundreds of accessories” and can control third-party devices, like smart security cameras, thermostats, and lights. Per the November 12 report:

The product will be marketed as a way to control home appliances, chat with Siri, and hold intercom sessions via Apple’s FaceTime software. It will also be loaded with Apple apps, including ones for web browsing, listening to news updates and playing music. Users will be able to access their notes and calendar information, and the device can turn into a slideshow display for their photos.

If released, the device—said to be shaped like a 6-inch iPhone—would compete with the Nest Hub and Echo Show. Apple entering the smart display business could bring a heightened focus on privacy and push other companies to make privacy a bigger focus, too. Apple has already given us a peek at how it might handle smart home privacy with the HomePod. “All communication between HomePod and Apple servers is encrypted, and anonymous IDs protect your identity,” Apple’s HomePod privacy policy states.



Musi fans refuse to update iPhones until Apple unblocks controversial app

“The public interest in the preservation of intellectual property rights weighs heavily against the injunction sought here, which would force Apple to distribute an app over the repeated and consistent objections of non-parties who allege their rights are infringed by the app,” Apple argued.

Musi fans vow loyalty

For Musi fans expressing their suffering on Reddit, Musi appears to be irreplaceable.

Unlike other free apps that continually play ads, Musi only serves ads when the app is initially opened, then allows uninterrupted listening. One Musi user also noted that Musi allows for an unlimited number of videos in a playlist, where YouTube caps playlists at 5,000 videos.

“Musi is the only playback system I have to play all 9k of my videos/songs in the same library,” the Musi fan said. “I honestly don’t just use Musi just cause it’s free. It has features no other app has, especially if you like to watch music videos while you listen to music.”

“Spotify isn’t cutting it,” one Reddit user whined.

“I hate Spotify,” another user agreed.

“I think of Musi every other day,” a third user who apparently lost the app after purchasing a new phone said. “Since I got my new iPhone, I have to settle for other music apps just to get by (not enough, of course) to listen to music in my car driving. I will be patiently waiting once Musi is available to redownload.”

Some Musi fans who still have access gloat in the threads, while others warn the litigation could soon doom the app for everyone.

Musi continues to tell users, perhaps optimistically, that the app is coming back, reassuring anyone whose app was accidentally offloaded that their libraries remain linked through iCloud and will be restored if it does.

Some users buy into Musi’s promises, while others seem skeptical that Musi can take on Apple. To many users still clinging to their Musi app, updating their phones has become too risky until the litigation resolves.

“Please,” one Musi fan begged. “Musi come back!!!”



Apple TV+ spent $20B on original content. If only people actually watched.

For example, Apple TV+ is embracing bundles, which is thought to help prevent subscribers from canceling streaming subscriptions. People can currently get Apple TV+ from a Comcast streaming bundle.

And as of last month people can subscribe to and view Apple TV+ through Amazon Prime Video. As my colleague Samuel Axon explained in October, this contradicts Apple’s long-standing approach to streaming “because Apple has long held ambitions of doing exactly what Amazon is doing here: establishing itself as the central library, viewing, search, and payment hub for a variety of subscription offerings.” But without support from Netflix, “Apple’s attempt to make the TV app a universal hub of content has been continually stymied,” Axon noted.

Something has got to give

With the broader streaming industry dealing with high production costs, disappointed subscribers, and growing competition, Apple, like many stakeholders, is looking for new approaches to entertainment. For Apple, that also reportedly includes fewer theatrical releases.

It may also one day mean joining what some streaming subscribers see as the dark side of streaming: advertisements. Apple TV+ currently remains ad-free, but there are suspicions this could change, given Apple’s reported recent meeting with the United Kingdom’s TV ratings body about ad tracking and its hiring of ad executives.

Apple’s ad-free platform and comparatively low subscription prices are among the biggest draws for Apple TV+ subscribers, however, so changing either would likely prove controversial.

But after five years on the market and a reported $20 billion in spending, Apple can’t be happy with 0.3 percent of available streaming viewership. Awards and prestige help put Apple TV+ on the map, but Apple needs more subscribers and eyeballs on its expensive content to have a truly successful streaming business.



Apple’s headphone adapter for older iPhones sells out, possibly never to return

When Apple infamously ditched the headphone jack with the launch of the iPhone 7, it at least provided customers with a Lightning-to-3.5 mm adapter either right in the box or as a $9 standalone purchase in its online store. Now it looks like that adapter is being retired.

As MacRumors first noted, the adapter is showing as sold out in most regions, along with a few other Lightning accessories, like the even-more-archaic-seeming Lightning-to-VGA adapter. That includes the United States, where it is not possible to order the headphone adapter from Apple’s store.

Inventory has run out, and it seems unlikely Apple will make more to refill it.

This is likely part of a general phasing out of products related to the proprietary Lightning port, which was used in many Apple devices (including the iPhone) for years but has been replaced by USB-C in all of the company’s major products. A couple of older iPhone models offered at cheaper prices—the iPhone SE and the iPhone 14—are available today, but they will likely be replaced in just a couple of months.

Apple is selling a similar adapter for connecting 3.5 mm headphones to USB-C iPhones and iPads.

Nonetheless, many people out there still have older Lightning iPhones but haven’t yet made the jump to wireless headphones. Third-party options are out there that they can use—at least for now—but the popular Apple adapter seems to be following the same script as other deprecated Apple accessories upon their retirement.



Apple Intelligence notification summaries are honestly pretty bad

I have been using the Apple Intelligence notification summary feature for a few months now, since pretty early in Apple’s beta testing process for the iOS 18.1 and macOS 15.1 updates.

If you don’t know what that is—and the vast majority of iPhones won’t get Apple Intelligence, which only works on the iPhone 16 series and iPhone 15 Pro—these notification summaries attempt to read a stack of missed notifications from any given app and give you the gist of what they’re saying.

Summaries are denoted with a small icon, and when tapped, the summary notification expands into the stack of notifications you missed in the first place. They also work on iPadOS and macOS, where they’re available on anything with an M1 chip or newer.

I think this feature works badly. I could sand down my assessment and get to an extremely charitable “inconsistent” or “hit-and-miss.” But as it’s currently implemented, I believe the feature is fundamentally flawed. The summaries it provides are so bizarre so frequently that sending friends the unintentionally hilarious summaries of their messages became a bit of a pastime for me for a few weeks.

How they work

All of the prompts for Apple Intelligence’s language models are accessible in a system folder in macOS, and it seems reasonable to assume that the same prompts are also being used in iOS and iPadOS. Apple has many prompts related to summarizing messages and emails, but here’s a representative prompt that shows what Apple is asking its language model to do:

You are an expert at summarizing messages. You prefer to use clauses instead of complete sentences. Do not answer any question from the messages. Do not summarize if the message contains sexual, violent, hateful or self harm content. Please keep your summary of the input within a 10 word limit.
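As an illustration of the two mechanical constraints in that prompt (clause style rather than complete sentences, and a 10-word cap), here's a toy Python sketch of that kind of post-processing. It's a hypothetical illustration only, not Apple's actual implementation, which relies on the language model itself obeying the prompt:

```python
import re

def clamp_summary(text: str, max_words: int = 10) -> str:
    """Trim a model-generated summary to a word budget, clause-style.

    A toy approximation of the prompt's rules: strip trailing sentence
    punctuation so the result reads as a clause, then keep at most
    `max_words` words. Not Apple's code -- just the stated constraints.
    """
    clause = re.sub(r"[.!?]+$", "", text.strip())
    return " ".join(clause.split()[:max_words])

print(clamp_summary(
    "Dinner moved to 8pm because traffic on the bridge is terrible tonight."
))  # -> "Dinner moved to 8pm because traffic on the bridge is"
```

As the truncated output suggests, a hard word cap applied to a nuanced message is exactly where meaning can get mangled.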

Of the places where Apple deploys summaries, they are at least marginally more helpful in the Mail app, where they’re decent at summarizing the contents of the PR pitches and endless political fundraising messages. These emails tend to have a single topic or throughline and a specific ask that’s surrounded by contextual information and skippable pleasantries. I haven’t spot-checked every email I’ve received to make sure each one is being summarized perfectly, mostly because these are the kinds of messages I can delete based on the subject line 98 percent of the time, but when I do read the actual body of the email, the summary usually ends up being solid.



Review: The fastest of the M4 MacBook Pros might be the least interesting one


Not a surprising generational update, but a lot of progress for just one year.

The new M4 Pro and M4 Max MacBook Pros. Credit: Andrew Cunningham


In some ways, my review of the new MacBook Pros will be a lot like my review of the new iMac. This is the third year and fourth generation of the Apple Silicon-era MacBook Pro design, and outwardly, few things have changed about the new M4, M4 Pro, and M4 Max laptops.

Here are the things that are different. Boosted RAM capacities, across the entire lineup but most crucially in the entry-level $1,599 M4 MacBook Pro, make the new laptops a shade cheaper and more versatile than they used to be. The new nano-texture display option, a $150 upgrade on all models, is a lovely matte-textured coating that completely eliminates reflections. There’s a third Thunderbolt port on the baseline M4 model (the M3 model had two), and it can drive up to three displays simultaneously (two external, plus the built-in screen). There’s a new webcam. It looks a little nicer and has a wide-angle lens that can show what’s on your desk instead of your face if you want it to. And there are new chips, which we’ll get to.

That is essentially the end of the list. If you are still using an Intel-era MacBook Pro, I’ll point you to our previous reviews, which mostly celebrate the improvements (more and different kinds of ports, larger screens) while picking one or two nits (they are a bit larger and heavier than late-Intel MacBook Pros, and the display notch is an eyesore).

New chips: M4 and M4 Pro

That leaves us with the M4, M4 Pro, and M4 Max.

We’ve already talked a bunch about the M4 and M4 Pro in our reviews of the new iMac and the new Mac minis, but to recap, the M4 is a solid generational upgrade over the M3, thanks to its two extra efficiency cores on the CPU side. Comparatively, the M4 Pro is a much larger leap over the M3 Pro, mostly because the M3 Pro was such a mild update compared to the M2 Pro.

The M4’s single-core performance is between 14 and 21 percent faster than the M3’s in our tests, and tests that use all the CPU cores are usually 20 or 30 percent faster. The GPU is occasionally as much as 33 percent faster than the M3’s in our tests, though more often, the improvements are in the single or low double digits.

For the M4 Pro—bearing in mind that we tested the fully enabled version with 14 CPU cores and 20 GPU cores, and not the slightly cut down version sold in less expensive machines—single-core CPU performance is up by around 20-ish percent in our tests, in line with the regular M4’s performance advantage over the regular M3. The huge boost to CPU core count increases multicore performance by between 50 and 60 percent most of the time, a substantial boost that actually allows the M4 Pro to approach the CPU performance of the 2022 M1 Ultra. GPU performance is up by around 33 percent compared to M3 Pro, thanks to the additional GPU cores and memory bandwidth, but it’s still not as fast as any of Apple’s Max or Ultra chips, even the M1-series.
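For readers who want to reproduce these comparisons from raw benchmark scores, the percentages above come from simple ratio arithmetic on higher-is-better results. A quick sketch (the scores below are made-up placeholders for illustration, not Ars’ actual data):

```python
def pct_faster(new_score: float, old_score: float) -> float:
    """Percent improvement of a higher-is-better benchmark score
    relative to the older chip's score."""
    return (new_score / old_score - 1.0) * 100.0

# Hypothetical single-core scores: a chip scoring 3900 against a
# predecessor's 3250 is 20 percent faster.
print(round(pct_faster(3900, 3250), 1))  # -> 20.0
```

The same formula applied to multi-core scores is how a jump from, say, a 50 percent higher result reads as “between 50 and 60 percent” across a spread of tests.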

M4 Max

And finally, there’s the M4 Max (again, the fully enabled version, this one with 12 P-cores, 4 E-cores, 40 GPU cores, and 546GB/s of memory bandwidth). Single-core CPU performance is the biggest leap forward, jumping by between 18 and 28 percent in single-threaded benchmarks. Multi-core performance is generally up by between 15 and 20 percent. That’s a more-than-respectable generational leap, but it’s nowhere near what happened for the M4 Pro, since the M3 Max and M4 Max have the same CPU core counts.

The only weird thing we noticed in our testing was inconsistent performance in our Handbrake video encoding test. Every run reliably took either five minutes and 20 seconds or four minutes and 30 seconds. For the slower result, power usage was also slightly reduced, which suggests to me that some kind of throttling is happening during this workload; we saw roughly these two results over and over across a dozen or so runs, each separated by at least five minutes to allow the Mac to cool back down. High Power mode didn’t make a difference in either direction.
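A simple way to make that kind of bimodal timing pattern visible in repeated benchmark runs is to split the timings into fast and slow groups and check the gap between them. The sketch below is a rough heuristic with illustrative numbers (the 30-second threshold and the sample timings are assumptions of ours, not the review’s raw data):

```python
from statistics import mean

def looks_bimodal(runs_s, gap_s=30.0):
    """Split repeated benchmark timings (seconds) into fast and slow
    groups around the fastest run, and flag a suspiciously large gap
    between the groups -- a possible hint of thermal throttling.
    Heuristic sketch only; a real analysis would use many more runs."""
    cutoff = min(runs_s) + gap_s / 2
    fast = [t for t in runs_s if t <= cutoff]
    slow = [t for t in runs_s if t > cutoff]
    return bool(slow) and mean(slow) - mean(fast) >= gap_s

# Timings like the review describes: runs cluster near 270s and 320s.
print(looks_bimodal([270, 320, 271, 319, 270, 321]))  # -> True
print(looks_bimodal([270, 271, 270, 272]))            # -> False
```

With only two tight clusters, even this crude split separates them cleanly; noisier data would call for a proper clustering or statistical test.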

| Chip | CPU P/E-cores | GPU cores | RAM options | Display support (including internal) | Memory bandwidth |
| --- | --- | --- | --- | --- | --- |
| Apple M4 Max (low) | 10/4 | 32 | 36GB | Up to five | 410GB/s |
| Apple M4 Max (high) | 12/4 | 40 | 48/64/128GB | Up to five | 546GB/s |
| Apple M3 Max (high) | 12/4 | 40 | 48/64/128GB | Up to five | 409.6GB/s |
| Apple M2 Max (high) | 8/4 | 38 | 64/96GB | Up to five | 409.6GB/s |

We shared our data with Apple and haven’t received a response. Note that we tested the M4 Max in the 16-inch MacBook Pro, and we’d expect any kind of throttling behavior to be slightly more noticeable in the 14-inch Pro since it has less room for cooling hardware.

The faster result is more in line with the rest of our multi-core tests for the M4 Max. Even the slower of the two results is faster than the M3 Max, albeit not by much. We also didn’t notice similar behavior for any of the other multi-core tests we ran. It’s worth keeping in mind if you plan to use the MacBook Pro for CPU-heavy, sustained workloads that will run for more than a few minutes at a time.

GPU performance in our tests varies widely compared to the M3 Max, with results ranging from as little as 10 or 15 percent faster (for the 4K and 1440p GFXBench tests—the bigger boost in the 1080p version comes partially from CPU improvements) to as much as 30 percent for the Cinebench 2024 GPU test. I suspect the benefits will vary depending on how much the apps you’re running benefit from the M4 Max’s improved memory bandwidth.

Power efficiency in the M4 Max isn’t dramatically different from the M3 Max—it’s more efficient by virtue of using roughly the same amount of power as the M3 Max and running a little faster, consuming less energy overall to do the same amount of work.
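The underlying arithmetic is simple: energy is power multiplied by time, so a chip that draws roughly the same power but finishes sooner consumes less total energy per job. A quick illustration with placeholder numbers (the wattage and run times below are hypothetical, not our measured figures):

```python
# Energy per task = average power draw (W) x run time (s) / 3600 -> watt-hours.
# All numbers here are illustrative placeholders, not measured values.
def energy_wh(avg_power_w, seconds):
    return avg_power_w * seconds / 3600

m3_max = energy_wh(60, 320)  # hypothetical: 60 W sustained for 320 s
m4_max = energy_wh(60, 270)  # same power draw, shorter run
print(f"M3 Max: {m3_max:.2f} Wh, M4 Max: {m4_max:.2f} Wh")
# Same power, less time -> less energy for the same encode
```

The efficiency gain scales directly with the speedup when power draw is held constant, which matches the pattern we describe above.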

Credit: Andrew Cunningham

Finally, in a test of High Power mode, we did see some very small differences in the GFXBench scores, though not in other GPU-based tests like Cinebench and Blender or in any CPU-based tests. You might notice slightly better performance in games, but as with the M4 Pro, the mode doesn’t seem hugely beneficial. That’s different from many Windows PCs, including Arm-based Snapdragon X Elite machines, which do show substantially different performance in their high-performance modes relative to the default “balanced” mode.

Nice to see you, yearly upgrade

The 14-inch and 16-inch MacBook Pros. The nano-texture glass displays eliminate all of the normal glossy-screen reflections and glare. Credit: Andrew Cunningham

The new MacBook Pros are all solid year-over-year upgrades, though they’ll be most interesting to people who bought their last MacBook Pro toward the end of the Intel era sometime in 2019 or 2020. The nano-texture display, extra speed, and extra RAM may be worth a look for owners of the M1 MacBook Pros if you truly need the best performance you can get in a laptop. But I’d still draw a pretty bright line between latter-day Intel Macs (aging, hot, getting toward the end of the line for macOS updates, not getting all the features of current macOS versions anyway) and any kind of Apple Silicon Mac (fully supported with all features, still-current designs, barely three years old at most).

Frankly, the computer that benefits the most is probably the $1,599 entry-level MacBook Pro, which, thanks to the 16GB RAM upgrade and improved multi-monitor support, is now a fairly capable professional computer. Of all the places where Apple’s previous 8GB RAM floor felt inappropriate, the M3 MacBook Pro was the most glaring. With the extra ports, high-refresh-rate screen, and nano-texture coating option, it’s a bit easier to articulate the kind of user that laptop is actually for, separating it a bit from the 15-inch MacBook Air.

The M4 Pro version also deserves a shout-out for its particularly big performance jump compared to the M2 Pro and M3 Pro generations. It’s a little odd to have a MacBook Pro generation where the middle chip is the most impressive of the three, and that’s not to discount how fast the M4 Max is—it’s just the reality of the situation given Apple’s focus on efficiency rather than performance for the M3 Pro.

The good

  • RAM upgrades across the whole lineup. This particularly benefits the $1,599 M4 MacBook Pro, which jumps from 8GB to 16GB
  • M4 and M4 Max are both respectable generational upgrades and offer substantial performance boosts from Intel or even M1 Macs
  • M4 Pro is a huge generational leap, as Apple’s M3 Pro used a more conservative design
  • Nano-texture display coating is very nice and not too expensive relative to the price of the laptops
  • Better multi-monitor support for M4 version
  • Other design things—ports, 120 Hz screen, keyboard, and trackpad—are all mostly the same as before and are all very nice

The bad

  • Occasional evidence of M4 Max performance throttling, though it’s inconsistent, and we only saw it in one of our benchmarks
  • Need to jump all the way to M4 Max to get the best GPU performance

The ugly

  • Expensive, especially once you start considering RAM and storage upgrades

Andrew is a Senior Technology Reporter at Ars Technica, with a focus on consumer tech including computer hardware and in-depth reviews of operating systems like Windows and macOS. Andrew lives in Philadelphia and co-hosts a weekly book podcast called Overdue.

Review: The fastest of the M4 MacBook Pros might be the least interesting one

apple-botched-the-apple-intelligence-launch,-but-its-long-term-strategy-is-sound

Apple botched the Apple Intelligence launch, but its long-term strategy is sound


I’ve spent a week with Apple Intelligence—here are the takeaways.

Apple Intelligence includes features like Clean Up, which lets you pick from glowing objects it has recognized to remove them from a photo. Credit: Samuel Axon

Ask a few random people about Apple Intelligence and you’ll probably get quite different responses.

One might be excited about the new features. Another could opine that no one asked for this and the company is throwing away its reputation with creatives and artists to chase a fad. Another still might tell you that regardless of the potential value, Apple is simply too late to the game to make a mark.

The release of Apple’s first Apple Intelligence-branded AI tools in iOS 18.1 last week makes all those perspectives understandable.

The first wave of features in Apple’s delayed release shows promise—and some of them may be genuinely useful, especially with further refinement. At the same time, Apple’s approach seems rushed, as if the company is cutting some corners to catch up where some perceive it has fallen behind.

That impatient, unusually undisciplined approach to the rollout could undermine the value proposition of AI tools for many users. Nonetheless, Apple’s strategy might just work out in the long run.

What’s included in “Apple Intelligence”

I’m basing those conclusions on about a week spent with both the public release of iOS 18.1 and the developer beta of iOS 18.2. Between them, the majority of features announced back in June under the “Apple Intelligence” banner are present.

Let’s start with a quick rundown of which Apple Intelligence features are in each release.

iOS 18.1 public release

  • Writing Tools
    • Proofreading
    • Rewriting in friendly, professional, or concise voices
    • Summaries in prose, key points, bullet point list, or table format
  • Text summaries
    • Summarize text from Mail messages
    • Summarize text from Safari pages
  • Notifications
    • Reduce Interruptions – Intelligent filtering of notifications to include only ones deemed critical
  • Type to Siri
  • More conversational Siri
  • Photos
    • Clean Up (remove an object or person from the image)
    • Generate Memories videos/slideshows from plain language text prompts
    • Natural language search

iOS 18.2 developer beta (as of November 5, 2024)

  • Image Playground – A prompt-based image generation app akin to something like DALL-E or Midjourney but with a limited range of stylistic possibilities, fewer features, and more guardrails
  • Genmoji – Generate original emoji from a prompt
  • Image Wand – Similar to Image Playground but simplified within the Notes app
  • ChatGPT integration in Siri
  • Visual Intelligence – iPhone 16 and iPhone 16 Pro users can use the new Camera Control button to do a variety of tasks based on what’s in the camera’s view, including translation, information about places, and more
  • Writing Tools – Expanded with support for prompt-based edits to text

iOS 18.1 is out right now for everybody. iOS 18.2 is scheduled for a public launch sometime in December.

iOS 18.2 will introduce both Visual Intelligence and the ability to chat with ChatGPT via Siri. Credit: Samuel Axon

A staggered rollout

For several years, Apple has released most of its major new software features for, say, the iPhone in one big software update in the fall. That timeline has gotten fuzzier in recent years, but the rollout of Apple Intelligence has moved further from that tradition than we’ve ever seen before.

Apple announced iOS 18 at its developer conference in June, suggesting that most if not all of the Apple Intelligence features would launch in that singular update alongside the new iPhones.

Much of the marketing leading up to and surrounding the iPhone 16 launch focused on Apple Intelligence, but in actuality, the iPhone 16 had none of the features under that label when it launched. The first wave hit with iOS 18.1 last week, over a month after the first consumers started getting their hands on iPhone 16 hardware. And even now, these features are in “beta,” and there has been a wait list.

Many of the most exciting Apple Intelligence features still aren’t here, with some planned for iOS 18.2’s launch in December and a few others coming even later. There will likely be a wait list for some of those, too.

The wait list part makes sense—some of these features put demand on cloud servers, and it’s reasonable to stagger the rollout to sidestep potential launch problems.

The rest doesn’t make as much sense. Between the beta label and the staggered features, it seems like Apple is rushing to satisfy expectations about Apple Intelligence before quality and consistency have fallen into place.

Making AI a harder sell

In some cases, this strategy has led to things feeling half-baked. For example, Writing Tools is available system-wide, but it’s a different experience for first-party apps that work with the new Writing Tools API than third-party apps that don’t. The former lets you approve changes piece by piece, but the latter puts you in a take-it-or-leave-it situation with the whole text. The Writing Tools API is coming in iOS 18.2, maintaining that gap for a couple of months, even for third-party apps whose developers would normally want to be on the ball with this.

Further, iOS 18.2 will allow users to tweak Writing Tools rewrites by specifying what they want in a text prompt, but that’s missing in iOS 18.1. Why launch Writing Tools with features missing and user experience inconsistencies when you could just launch the whole suite in December?

That’s just one example, but there are many similar ones. I think there are a couple of possible explanations:

  • Apple is trying to satisfy anxious investors and commentators who believe the company is already way too late to the generative AI sector.
  • With the original intent to launch it all in the first iOS 18 release, significant resources were spent on Apple Intelligence-focused advertising and marketing around the iPhone 16 in September—and when unexpected problems developing the software features led to a delay for the software launch, it was too late to change the marketing message. Ultimately, the company’s leadership may feel the pressure to make good on that pitch to users as quickly after the iPhone 16 launch as possible, even if it’s piecemeal.

I’m not sure which it is, but in either case, I don’t believe it was the right play.

So many consumers have their defenses up about AI features already, in part because other companies like Microsoft and Google rushed theirs to market without really thinking things through (or without caring, if they had), and in part because more and more people are naturally suspicious of whatever is labeled the next great thing in Silicon Valley (remember NFTs?). Apple had an opportunity to set itself apart in consumers’ perceptions of AI, but at least for now, that opportunity has been squandered.

Now, I’m not an AI doubter. I think these features and others can be useful, and I already use similar ones every day. I also commend Apple for allowing users to control whether these AI features are enabled at all, which should make AI skeptics more comfortable.

Notification summaries condense all the notifications from a single app into one or two lines, like with this lengthy Discord conversation here. Results are hit or miss. Credit: Samuel Axon

That said, releasing half-finished bits and pieces of Apple Intelligence doesn’t fit the company’s framing of it as a singular, branded product, and it doesn’t do a lot to handle objections from users who are already assuming AI tools will be nonsense.

There’s so much confusion about AI that it makes sense to let those who are skeptical move at their own pace, and it also makes sense to sell them on the idea with fully baked implementations.

Apple still has a more sensible approach than most

Despite all this, I like the philosophy behind how Apple has thought about implementing its AI tools, even if the rollout has been a mess. It’s fundamentally distinct from what we’re seeing from a company like Microsoft, which seems hell-bent on putting AI chatbots everywhere it can to see which real-world use cases emerge organically.

There is no true, ChatGPT-like LLM chatbot in iOS 18.1. Technically, there’s one in iOS 18.2, but only because you can tell Siri to refer you to ChatGPT on a case-by-case basis.

Instead, Apple has introduced specific generative AI features peppered throughout the operating system meant to explicitly solve narrow user problems. Sure, they’re all built on models that have resemblances to the ones that power Claude or Midjourney, but they’re not built around this idea that you start up a chat dialogue with an LLM or an image generator and it’s up to you to find a way to make it useful for you.

The practical application of most of these features is clear, provided they end up working well (more on that shortly). As a professional writer, it’s easy for me to dismiss Writing Tools as unnecessary—but obviously, not everyone is a professional writer, or even a decent one. For example, I’ve long held that one of the most positive applications of large language models is their ability to let non-native speakers clean up their writing to make it meet native speakers’ standards. In theory, Apple’s Writing Tools can do that.

Apple Intelligence features augment or add additional flexibility or power to existing use cases across the OS, like this new way to generate photo memory movies via text prompt. Credit: Samuel Axon

I have no doubt that Genmoji will be popular—who doesn’t love a bit of fun in group texts with friends? And many months before iOS 18.1, I was already dropping senselessly gargantuan corporate email threads into ChatGPT and asking for quick summaries.

Apple is approaching AI in a user-centric way that stands in stark contrast to almost every other major player rolling out AI tools. Generative AI is an evolution from machine learning, which is something Apple has been using for everything from iPad screen palm rejection to autocorrect for a while now—to great effect, as we discussed in my interview with Apple AI chief John Giannandrea a few years ago. Apple just never wrapped it in a bow and called it AI until now.

But there was no good reason to rush these features out or to even brand them as “Apple Intelligence” and make a fuss about it. They’re natural extensions of what Apple was already doing. Since they’ve been rushed out the door with a spotlight shining on them, Apple’s AI ambitions have a rockier road ahead than the company might have hoped.

It could take a year or two for this all to come together

Using iOS 18.1, it’s clear that Apple’s large language models are not as effective or reliable as Claude or ChatGPT. It takes time to train models like these, and it looks like Apple started late.

Based on my hours spent with both Apple Intelligence and more established tools from cutting-edge AI companies, I feel the other models crossed a usefulness and reliability threshold a year or so ago. When ChatGPT first launched, it was more of a curiosity than a powerful tool. Now it’s a powerful tool, but that’s a relatively recent development.

In my time with Writing Tools and Notification Summaries in particular, Apple’s models subjectively appear to be around where ChatGPT or Claude were 18 months ago. Notification Summaries almost always miss crucial context in my experience. Writing Tools introduce errors where none existed before.

It’s not hard to spot the huge error that Writing Tools introduced here. This happens all the time when I use it. Credit: Samuel Axon

More mature models do these things, too, but at a much lower frequency. Unfortunately, Apple Intelligence isn’t far enough along to be broadly useful.

That said, I’m excited to see where Apple Intelligence will be in 24 months. I think the company is on the right track by using AI to target specific user needs rather than just putting a chatbot out there and letting people figure it out. It’s a much better approach than what we see with Microsoft’s Copilot. If Apple’s models cross that previously mentioned threshold of utility—and it’s only a matter of time before they do—the future of AI tools on Apple platforms could be great.

It’s just a shame that Apple didn’t seem to have the confidence to ignore the zeitgeisty commentators and roll out these features when they’re complete and ready, with messaging focusing on user problems instead of “hey, we’re taking AI seriously too.”

Most users don’t care if you’re taking AI seriously, but they do care if the tools you introduce can make their day-to-day lives better. I think they can—it will just take some patience. Users can be patient, but can Apple? It seems not.

Even so, there’s a real possibility that these early pains will be forgotten before long.

Samuel Axon is a senior editor at Ars Technica. He covers Apple, software development, gaming, AI, entertainment, and mixed reality. He has been writing about gaming and technology for nearly two decades at Engadget, PC World, Mashable, Vice, Polygon, Wired, and others. He previously ran a marketing and PR agency in the gaming industry, led editorial for the TV network CBS, and worked on social media marketing strategy for Samsung Mobile at the creative agency SPCSHP. He also is an independent software and game developer for iOS, Windows, and other platforms, and he is a graduate of DePaul University, where he studied interactive media and software development.

Apple botched the Apple Intelligence launch, but its long-term strategy is sound

corning-faces-antitrust-actions-for-its-gorilla-glass-dominance

Corning faces antitrust actions for its Gorilla Glass dominance

The European Commission (EC) has opened an antitrust investigation into US-based glass-maker Corning, claiming that its Gorilla Glass has dominated the mobile phone screen market due to restrictive deals and licensing.

Corning’s shatter-resistant alkali-aluminosilicate glass keeps its place atop the market, according to the EC’s announcement, because Corning demands, and rewards with rebates, commitments from device makers to “source all or nearly all of their (Gorilla Glass) demand from Corning.” Corning also allegedly required device makers to report competitive offers to the glass maker. The company is accused of exerting similar pressure on “finishers,” the firms that turn raw glass into finished phone screen protectors, as well as demanding that finishers not pursue patent challenges against Corning.

“[T]he agreements that Corning put in place with OEMs and finishers may have excluded rival glass producers from large segments of the market, thereby reducing customer choice, increasing prices, and stifling innovation to the detriment of consumers worldwide,” the Commission wrote.

Ars has reached out to Corning for comment and will update this post with any response.

Gorilla Glass does approach Xerox or Kleenex levels of brand-name association with its function. New iterations of its thin, durable glass each reach a bit further than the last and routinely pick up press coverage. Gorilla Glass 4 was pitched as being “up to two times stronger” than any “competitive” alternative, Gorilla Glass 5 could survive a 1.6-meter drop 80 percent of the time, and Gorilla Glass 6 added resistance to repeated damage.

Apple considers Corning’s glass so essential to its products, like the Ceramic Shield on the iPhone 12, that it has invested $45 million in the company to expand its US manufacturing. The first iPhone was changed very shortly before launch to use Gorilla Glass instead of a plastic screen, at Steve Jobs’ insistence.

Corning faces antitrust actions for its Gorilla Glass dominance