Acting on a demand from the Trump administration, Apple has removed apps that let iPhone users report the locations of Immigration and Customs Enforcement (ICE) officers.
“We reached out to Apple today demanding they remove the ICEBlock app from their App Store—and Apple did so,” Attorney General Pam Bondi said in a statement to Fox News yesterday. “ICEBlock is designed to put ICE agents at risk just for doing their jobs, and violence against law enforcement is an intolerable red line that cannot be crossed.”
Apple confirmed it removed multiple apps after hearing from law enforcement. “We created the App Store to be a safe and trusted place to discover apps,” an Apple statement to news organizations said. “Based on information we’ve received from law enforcement about the safety risks associated with ICEBlock, we have removed it and similar apps from the App Store.”
The app removals follow a September 24 shooting at a Dallas ICE facility that resulted in the deaths of two immigrants in federal custody and the shooter. The shooter, identified as Joshua Jahn, “searched apps that tracked the presence of ICE agents,” according to FBI Director Kash Patel.
ICEBlock creator Joshua Aaron disputed claims that his app could have contributed to the shooting. He pointed out that an app isn’t needed to find the locations of ICE facilities.
“You don’t need to use an app to tell you where an ICE agent is when you’re aiming at an ICE detention facility,” Aaron told the BBC. “Everybody knows that’s where ICE agents are.”
Apple cited “objectionable content”
Aaron said he was disappointed by Apple’s decision to remove the app. “ICEBlock is no different from crowd-sourcing speed traps, which every notable mapping application including Apple’s own Maps app [does],” he was quoted as saying. “This is protected speech under the First Amendment of the United States Constitution.”
If your iPhone is your main or only camera, the iPhone 17 Pro is for you.
The iPhone 17 Pro’s excellent camera is the best reason to buy it instead of the regular iPhone 17. Credit: Andrew Cunningham
Apple’s “Pro” iPhones usually look and feel a lot like the regular ones, just with some added features stacked on top. They’ve historically had better screens and more flexible cameras, and there has always been a Max option for people who really wanted to blur the lines between a big phone and a small tablet (Apple’s commitment to the cheaper “iPhone Plus” idea has been less steadfast). But the qualitative experience of holding and using one wasn’t all that different compared to the basic aluminum iPhone.
This year’s iPhone 17 Pro looks and feels like more of a departure from the basic iPhone, thanks to a new design that prioritizes function over form. It’s as though Apple anticipated the main complaints about the iPhone Air—why would I want a phone with worse battery and fewer cameras, why don’t they just make the phone thicker so they can fit in more things—and made a version of the iPhone that they could point to and say, “We already make that phone—it’s that one over there.”
Because the regular iPhone 17 is so good, and because it uses the same 6.3-inch OLED ProMotion screen, I think the iPhone 17 Pro is playing to a narrower audience than usual this year. But Apple’s changes and additions are also tailor-made to serve that audience. In other words, fewer people even need to consider the iPhone Pro this time around, but there’s a lot to like here for actual “pros” and people who demand a lot from their phones.
Design
The iPhone 17 Pro drops the titanium frame of the iPhone 15 Pro and 16 Pro in favor of a return to aluminum. But it’s no longer the aluminum-framed glass-sandwich design that the iPhone 17 still uses; it’s a reformulated “aluminum unibody” design that also covers a substantial portion of the phone’s back. It’s the most metal we’ve seen on the back of an iPhone since 2016’s iPhone 7.
But remember that part of the reason the 2017 iPhone 8 and iPhone X switched to the glass sandwich design was wireless charging. The aluminum iPhones always featured cutouts or gaps in the metal to allow Wi-Fi, Bluetooth, and cellular signals through. But the addition of wireless charging to the iPhone meant that a substantial portion of the phone’s back now needed to be permeable to wireless signals, and the solution to that problem was simply to embrace it with a full sheet of glass.
The iPhone 17 Pro returns to the cutout approach, and while it might be functional, it leaves me pretty cold, aesthetically. Small stripes on the sides of the phone and running all the way around the “camera plateau” provide gaps between the metal parts so that you can’t mess with your cellular reception by holding the phone wrong; on US versions of the phone with support for mmWave 5G, there’s another long oval cutout on the top of the phone to allow those signals to pass through.
But the largest and most obvious cutout is the sheet of glass on the back that Apple needed to add to make wireless charging work. The aluminum, the cell signal cutouts, and this sheet of glass are all slightly different shades of the phone’s base color (it’s least noticeable on the Deep Blue phone and most noticeable on the orange one).
The result is something that looks sort of unfinished and prototype-y. There are definitely people who will like or even prefer this aesthetic, which makes it clearer that this piece of technology is a piece of technology rather than trying to hide it—the enduring popularity of clear plastic electronics is a testament to this. But it does feel like a collection of design decisions that Apple was forced into by physics rather than choices it wanted to make.
That also extends to the camera plateau area, a reimagining of the old iPhone camera bump that stretches all the way across the top of the phone. It’s a bit less slick-looking than the one on the iPhone Air because of the multiple lenses. And because the camera bumps are still additional protrusions on top of the plateau, the phone wobbles when it’s resting flat on a table instead of sitting stably on the plateau.
Finally, there’s the weight of the phone, which isn’t breaking records but is a step back from a substantial weight reduction that Apple was using as a first-sentence-of-the-press-release selling point just two years ago. The iPhone 17 Pro weighs the same amount as the iPhone 14 Pro, and it has a noticeable heft to it that the iPhone Air (say) does not have. You’ll definitely notice if (like me) your current phone is an iPhone 15 Pro.
Apple sent me one of its $59 “TechWoven” cases with the iPhone 17 Pro, and it solved a lot of what I didn’t like about the design—the inconsistent materials and colors everywhere, and the bump-on-a-bump camera. There’s still a bump on the top, but at least the case evens it out so that your phone isn’t tilted by the plateau and wobbling because of the bump.
I liked Apple’s TechWoven case for the iPhone Pro, partly because it papered over some of the things I don’t love about the design. Credit: Andrew Cunningham
The original FineWoven cases were (rightly) panned for how quickly and easily they scratched, but the TechWoven case might be my favorite Apple-designed phone case of the ones I’ve used. It doesn’t have the weird soft lint-magnet feel of some of the silicone cases, FineWoven’s worst problems seem solved, and the texture on the sides of the case provides a reassuring grippiness. My main issue is that the opening for the USB-C port on the bottom is relatively narrow. Apple’s cables will fit fine, but I had a few older or thicker USB-C connectors that didn’t.
This isn’t a case review, but I bring it up mainly to say that I stand by my initial assessment of the Pro’s function-over-form design: I am happy I put it in a case, and I think you will be, too, whichever case you choose (when buying for myself or family members, I have defaulted to Smartish cases for years, but your mileage may vary).
On “Scratchgate”
Early reports from Apple’s retail stores indicated that the iPhone 17 Pro’s design was more susceptible to scratches than past iPhones and that some seemed to be showing marks from as simple and routine an activity as connecting and disconnecting a MagSafe charging pad.
Apple says the marks left by its in-store MagSafe chargers weren’t permanent scratches and could be cleaned off. But independent testing from the likes of iFixit has found that the anodization process Apple uses to add color to the iPhone’s aluminum frame is more susceptible to scratching and flaking on non-flat surfaces like the edges of the camera bump.
As with “antennagate” and “bendgate” before it, many factors will determine whether “scratchgate” is actually something you’ll notice. Independent testing shows there is something to the complaints, but it doesn’t show how often this kind of damage will appear in actual day-to-day use over the course of months or years. Do keep it in mind when deciding which iPhone and accessories you want—it’s just one more reason to keep the iPhone 17 Pro in a case, if you ask me—but I wouldn’t say it should keep you from buying this phone if you like everything else about it.
Camera
I have front-loaded my complaints about the iPhone 17 Pro to get them out of the way, but the fun thing about an iPhone in which form follows function is that you get a lot of function.
When I made the jump from the regular iPhone to the Pro (I went from an 11 to a 13 Pro and then to a 15 Pro), I did it mainly for the telephoto lens in the camera. For both kid photos and casual product photography, it was game-changing to be able to access the functional equivalent of optical zoom on my phone.
The iPhone 17 Pro’s telephoto lens in 4x mode. Andrew Cunningham
The iPhone 16 Pro changed the telephoto lens’s zoom level from 3x to 5x, which was useful if you wanted maximum zoom but which left a gap between it and the Fusion Camera-enabled 2x mode. The 17 Pro switches to a 4x zoom by default, closing that gap, and it further maximizes its zooming capabilities by switching to a 48 MP sensor.
Like the main and ultrawide cameras, which had already switched to 48 MP sensors in previous models, the telephoto camera saves 24 MP images when shooting in 4x mode. But it can also crop a 12 MP image out of the center of that sensor to provide a native-resolution 12 MP image at an 8x zoom level, albeit without the image quality improvements from the “pixel binning” process that 4x images get.
You can debate how accurate it is to market this as “optical-quality zoom” as Apple does, but it’s hard to argue with the results. The level of detail you can capture from a distance in 8x mode is consistently impressive, and Apple’s hardware and software image stabilization help keep these details reasonably free of the shake and blur you might see if you were shooting at this zoom level with an actual hardware lens.
It’s my favorite feature of the iPhone 17 Pro, and it’s the thing about the phone that comes closest to being worth the $300 premium over the regular iPhone 17.
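If you want to sanity-check the zoom arithmetic described above, here is a minimal sketch of it (my own illustration, not Apple’s actual imaging pipeline): because linear field of view scales with the square root of the pixel count, cropping a 12 MP image from the center of the 48 MP, 4x sensor doubles the effective zoom to 8x.

```swift
// Rough sketch of the sensor-crop arithmetic, not Apple's actual pipeline.
// Cropping the central quarter of a sensor halves the linear field of view,
// which doubles the effective zoom factor.
struct TelephotoSensor {
    let megapixels: Double
    let opticalZoom: Double
}

func effectiveZoom(of sensor: TelephotoSensor, croppedToMegapixels crop: Double) -> Double {
    // Linear dimensions scale with sqrt(pixel count), so zoom scales the same way.
    sensor.opticalZoom * (sensor.megapixels / crop).squareRoot()
}

let telephoto = TelephotoSensor(megapixels: 48, opticalZoom: 4)
print(effectiveZoom(of: telephoto, croppedToMegapixels: 12)) // 8.0
```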
The iPhone 17 Pro, main lens, 1x mode. Andrew Cunningham
Apple continues to gate several other camera-related features to the Pro iPhones. All phones can shoot RAW photos in third-party camera apps that support it, but only the Pro iPhones can shoot Apple’s ProRAW format in the first-party camera app (ProRAW performs Apple’s typical image processing for RAW images but retains all the extra information needed for more flexible post-processing).
I don’t spend as much time shooting video on my phone as I do photos, but for the content creator and influencer set (and the “we used phones and also professional lighting and sound equipment to shoot this movie” set) Apple still reserves several video features for the Pro iPhones. That list includes 120 fps 4K Dolby Vision video recording and a four-mic array (both also supported by the iPhone 16 Pro), plus ProRes RAW recording and Genlock support for synchronizing video from multiple sources (both new to the 17 Pro).
The iPhone Pro also remains the only iPhone to support 10 Gbps USB transfer speeds over the USB-C port, making it faster to transfer large video files from the phone to an external drive or a PC or Mac for additional processing and editing. It’s likely that Apple built this capability into the A19 Pro’s USB controller, but both the iPhone Air and the regular iPhone 17 are restricted to 25-year-old 480 Mbps USB 2.0 data transfer speeds.
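As a rough illustration of why that interface difference matters for video workflows, here is some back-of-the-envelope math; the 50 GB file size is an arbitrary example of mine, and real transfers will be slower than these ideal-case figures because of protocol overhead.

```swift
// Idealized transfer times for a large video file over each interface.
// Assumes a hypothetical 50 GB file and ignores protocol overhead.
let fileSizeGigabits = 50.0 * 8                // 50 GB expressed in gigabits

let usb3Seconds = fileSizeGigabits / 10.0      // 10 Gbps USB 3 (iPhone 17 Pro)
let usb2Seconds = fileSizeGigabits / 0.48      // 480 Mbps USB 2.0 (iPhone 17 and Air)

print("USB 3:   \(usb3Seconds) seconds")       // 40.0 seconds
print("USB 2.0: \(usb2Seconds) seconds")       // ~833 seconds, nearly 14 minutes
```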
The iPhone 17 Pro gets the same front camera treatment as the iPhone 17 and the Air: a new square “Center Stage” sensor that crops a 24 MP square image into an 18 MP image, allowing users to capture approximately the same aspect ratios and fields-of-view with the front camera regardless of whether they’re holding the phone in portrait or landscape mode. It’s definitely an image-quality improvement, but it’s the same as what you get with the other new iPhones.
Specs, speeds, and battery
You still need to buy a Pro phone to get a USB-C port with 10 Gbps USB 3 transfer speeds instead of 480 Mbps USB 2.0 speeds. Credit: Andrew Cunningham
The iPhone 17 Pro uses, by a slim margin, the fastest and most capable version of the A19 Pro chip, partly because it has all of the A19 Pro’s features fully enabled and partly because its thermal management is better than the iPhone Air’s.
The A19 Pro in the iPhone 17 Pro uses two high-performance CPU cores and four smaller high-efficiency CPU cores, plus a fully enabled six-core GPU. Like the iPhone Air, the iPhone Pro also includes 12GB of RAM, up from 8GB in the iPhone 16 Pro and the regular iPhone 17. Apple has added a vapor chamber to the iPhone 17 Pro to help keep it cool rather than relying on metal alone to conduct heat away from the chips—a tiny amount of water sealed in a small metal pocket continually evaporates and condenses inside the closed copper-lined chamber. This spreads the heat evenly over a larger area than metal conduction alone would, which in turn allows that heat to be dissipated more quickly.
All phones were tested with Adaptive Power turned off.
We saw in our iPhone 17 review how that phone’s superior thermals helped it outrun the iPhone Air’s version of the A19 Pro in many of our graphics tests; the iPhone Pro’s A19 Pro beats both by a decent margin, thanks to both thermals and the extra hardware.
The performance line graph that 3DMark generates when you run its benchmarks actually gives us a pretty clear look at the difference between how the iPhones act. The graphs for the iPhone 15 Pro, the iPhone 17, and the iPhone 17 Pro all look pretty similar, suggesting that they’re cooled well enough to let the benchmark run for a couple of minutes without significant throttling. The iPhone Air follows a similar performance curve for the first half of the test or so but then drops noticeably lower for the second half—the ups and downs of the line actually look pretty similar to the other phones, but the performance is just a bit lower because the A19 Pro in the iPhone Air is already slowing down to keep itself cool.
The CPU performance of the iPhone 17 Pro is also marginally better than this year’s other phones, but not by enough that it will be user-noticeable.
As for battery, Apple’s own product pages say the iPhone 17 Pro lasts about 10 percent longer than the regular iPhone 17 and between 22 and 36 percent longer than the iPhone Air, depending on what you’re doing.
I found the iPhone Air’s battery life to be tolerable with a little bit of babying and well-timed use of the Low Power Mode feature, and the iPhone 17’s battery was good enough that I didn’t worry about making it through an 18-hour day. But the iPhone 17 Pro’s battery really is a noticeable step up.
One day, I forgot to plug it in overnight and awoke to a phone that still had a 30 percent charge, enough that I could make it through the morning school drop-off routine and plug it in when I got back home. Not only did I not have to think about the iPhone 17 Pro’s battery, but it’s good enough that even a battery with 85-ish percent capacity (where most of my iPhone batteries end up after two years of regular use) should still feel pretty comfortable. After the telephoto camera lens, it’s definitely the second-best thing about the iPhone 17 Pro, and the Pro Max should last for even longer.
Pros only
Apple’s iPhone 17 Pro. Credit: Andrew Cunningham
I’m taken with a lot of things about the iPhone 17 Pro, but the conclusion of our iPhone 17 review still holds: If you’re not tempted by the lightness of the iPhone Air, then the iPhone 17 is the one most people should get.
Even more than most Pro iPhones, the iPhone 17 Pro and Pro Max will make the most sense for people who actually use their phones professionally, whether that’s for product or event photography, content creation, or some other camera-centric field where extra flexibility and added shooting modes can make a real difference. The same goes for people who want a bigger screen, since there’s no iPhone 17 Plus.
Sure, the 17 Pro also performs a little better than the regular 17, and the battery lasts longer. But the screen was always the most immediately noticeable upgrade for regular people, and the exact same display panel is now available in a phone that costs $300 less.
The benefit of the iPhone Pro becoming a bit more niche is that it’s easier to describe who each of these iPhones is for. The Air is the most pleasant to hold and use, and it’s the one you’ll probably buy if you want people to ask you, “Oh, is that one of the new iPhones?” The Pro is for people whose phones are their most important camera (or for people who want the biggest phone they can get). And the iPhone 17 is for people who just want a good phone but don’t want to think about it all that much.
The good
Excellent performance and great battery life
It has the most flexible camera in any iPhone, and the telephoto lens in particular is a noticeable step up from a 2-year-old iPhone 15 Pro
12GB of RAM provides extra future-proofing compared to the standard iPhone
Not counting the old iPhone 16, it’s Apple’s only iPhone to be available in two screen sizes
Extra photography and video features for people who use those features in their everyday lives or even professionally
The bad
Clunky, unfinished-looking design
More limited color options compared to the regular iPhone
Expensive
Landscape layouts for apps only work on the Max model
The ugly
Increased weight compared to previous models, which actually used their lighter weight as a selling point
Andrew is a Senior Technology Reporter at Ars Technica, with a focus on consumer tech including computer hardware and in-depth reviews of operating systems like Windows and macOS. Andrew lives in Philadelphia and co-hosts a weekly book podcast called Overdue.
xAI’s claim that Apple gave ChatGPT a monopoly on prompts is “baseless,” OpenAI says.
OpenAI and Apple have moved to dismiss a lawsuit from Elon Musk’s xAI that alleges ChatGPT’s integration into a “handful” of iPhone features violated antitrust laws by giving OpenAI a monopoly on prompts and Apple a new path to block rivals in the smartphone industry.
The lawsuit was filed in August after Musk raged on X about Apple never listing Grok on its editorially curated “Must Have” apps list, which ChatGPT frequently appeared on.
According to Musk, Apple linking ChatGPT to Siri and other native iPhone features gave OpenAI exclusive access to billions of prompts that only OpenAI can use as valuable training data to maintain its dominance in the chatbot market. However, OpenAI and Apple are now mocking Musk’s math in court filings, urging the court to agree that xAI’s lawsuit is doomed.
As OpenAI argued, the estimates in xAI’s complaint seemed “baseless,” with Musk hesitant to even “hazard a guess” at what portion of the chatbot market is being foreclosed by the OpenAI/Apple deal.
xAI suggested that the ChatGPT integration may give OpenAI “up to 55 percent” of the potential chatbot prompts in the market, which could mean anywhere from 0 to 55 percent, OpenAI and Apple noted.
Musk’s company apparently arrived at this vague estimate by doing “back-of-the-envelope math,” and the court should reject his complaint, OpenAI argued. That math “was evidently calculated by assuming that Siri fields ‘1.5 billion user requests per day globally,’ then dividing that quantity by the ‘total prompts for generative AI chatbots in 2024’”—“apparently 2.7 billion per day,” OpenAI explained.
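Spelled out, the back-of-the-envelope division the filings describe looks something like the sketch below; the figures are the complaint’s alleged estimates as characterized in the court papers, not verified usage data.

```swift
// The division described in the filings: alleged daily Siri requests divided
// by an alleged 2024 total of daily generative-AI chatbot prompts.
let allegedSiriRequestsPerDay = 1.5e9     // xAI's cited Siri figure
let allegedChatbotPromptsPerDay = 2.7e9   // xAI's cited 2024 industry-wide figure

let impliedShare = allegedSiriRequestsPerDay / allegedChatbotPromptsPerDay
print(impliedShare * 100) // ≈ 55.6, the basis for the complaint's "up to 55 percent"
```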
These estimates “ignore the facts” that “ChatGPT integration is only available on the latest models of iPhones, which allow users to opt into the integration,” OpenAI argued. And any user who opts in must also link their ChatGPT account before OpenAI can train on their data, OpenAI said, further restricting the potential prompt pool.
By Musk’s own logic, OpenAI alleged, “the relevant set of Siri prompts thus cannot plausibly be 1.5 billion per day, but is instead an unknown, unpleaded fraction of a fraction of a fraction of that number.”
Additionally, OpenAI mocked Musk for using 2024 statistics, writing that xAI failed to explain “the logic of using a year-old estimate of the number of prompts when the pleadings elsewhere acknowledge that the industry is experiencing ‘exponential growth.'”
Apple’s filing agreed that Musk’s calculations “stretch logic,” appearing “to rest on speculative and implausible assumptions that the agreement gives ChatGPT exclusive access to all Siri requests from all Apple devices (including older models), and that OpenAI may use all such requests to train ChatGPT and achieve scale.”
“Not all Siri requests” result in ChatGPT prompts that OpenAI can train on, Apple noted, “even by users who have enabled devices and opt in.”
OpenAI reminds court of Grok’s MechaHitler scandal
OpenAI argued that Musk’s lawsuit is part of a pattern of harassment that OpenAI previously described as “unrelenting” since ChatGPT’s successful debut, alleging it was “the latest effort by the world’s wealthiest man to stifle competition in the world’s most innovative industry.”
As OpenAI sees it, “Musk’s pretext for litigation this time is that Apple chose to offer ChatGPT as an optional add-on for several built-in applications on its latest iPhones,” without giving Grok the same deal. But OpenAI noted that the integration was rolled out around the same time that Musk removed “woke filters” that caused Grok to declare itself “MechaHitler.” For Apple, it was a business decision to avoid Grok, OpenAI argued.
Apple did not reference the Grok scandal in its filing but in a footnote confirmed that “vetting of partners is particularly important given some of the concerns about generative AI chatbots, including on child safety issues, nonconsensual intimate imagery, and ‘jailbreaking’—feeding input to a chatbot so it ignores its own safety guardrails.”
A similar logic applied to Apple’s decision not to highlight Grok as a “Must Have” app, its filing said. After Musk’s public rant on X about Grok’s exclusion, “Apple employees explained the objective reasons why Grok was not included on certain lists, and identified app improvements,” Apple noted, but instead of making changes, xAI filed the lawsuit.
Also taking time to point out the obvious, Apple argued that Musk was fixated on the fact that his charting apps never make the “Must Have Apps” list, suggesting that Apple’s picks should always mirror “Top Charts,” which tracks popular downloads.
“That assumes that the Apple-curated Must-Have Apps List must be distorted if it does not strictly parrot App Store Top Charts,” Apple argued. “But that assumption is illogical: there would be little point in maintaining a Must-Have Apps List if all it did was restate what Top Charts say, rather than offer Apple’s editorial recommendations to users.”
Likely most relevant to the antitrust charges, Apple accused Musk of improperly arguing that “Apple cannot partner with OpenAI to create an innovative feature for iPhone users without simultaneously partnering with every other generative AI chatbot—regardless of quality, privacy or safety considerations, technical feasibility, stage of development, or commercial terms.”
“No facts plausibly” support xAI’s “assertion that Apple intentionally ‘deprioritized'” xAI apps “as part of an illegal conspiracy or monopolization scheme,” Apple argued.
And most glaringly, Apple noted that xAI is not a rival or consumer in the smartphone industry, where it alleges competition is being harmed. Apple urged the court to reject Musk’s theory that Apple is incentivized to boost OpenAI to prevent xAI’s ascent in building a “super app” that would render smartphones obsolete. If Musk’s super app dream is even possible, Apple argued, it’s at least a decade off, insisting that as-yet-undeveloped apps should not serve as the basis for blocking Apple’s measured plan to better serve customers with sophisticated chatbot integration.
“Antitrust laws do not require that, and for good reason: imposing such a rule on businesses would slow innovation, reduce quality, and increase costs, all ultimately harming the very consumers the antitrust laws are meant to protect,” Apple argued.
Musk’s weird smartphone market claim, explained
Apple alleged that Musk’s “grievance” can be “reduced to displeasure that Apple has not yet ‘integrated with any other generative AI chatbots’ beyond ChatGPT, such as those created by xAI, Google, and Anthropic.”
In a footnote, the smartphone giant noted that by xAI’s logic, Musk’s social media platform X “may be required to integrate all other chatbots—including ChatGPT—on its own social media platform.”
But antitrust law doesn’t work that way, Apple argued, urging the court to reject xAI’s claims of alleged market harms that “rely on a multi-step chain of speculation on top of speculation.” As Apple summarized, xAI contends that “if Apple never integrated ChatGPT,” xAI could win in both chatbot and smartphone markets, but only if:
1. Consumers would choose to send additional prompts to Grok (rather than other generative AI chatbots).
2. The additional prompts would result in Grok achieving scale and quality it could not otherwise achieve.
3. As a result, the X app would grow in popularity because it is integrated with Grok.
4. X and xAI would therefore be better positioned to build so-called “super apps” in the future, which the complaint defines as “multi-functional” apps that offer “social connectivity and messaging, financial services, e-commerce, and entertainment.”
5. Once developed, consumers might choose to use X’s “super app” for various functions.
6. “Super apps” would replace much of the functionality of smartphones and consumers would care less about the quality of their physical phones and rely instead on these hypothetical “super apps.”
7. Smartphone manufacturers would respond by offering more basic models of smartphones with less functionality.
8. iPhone users would decide to replace their iPhones with more “basic smartphones” with “super apps.”
Apple insisted that nothing in its OpenAI deal prevents Musk from building his super apps, while noting that, having integrated Grok into X, Musk understands that integrating even a single chatbot is a “major undertaking” that requires “substantial investment.” That “concession” alone “underscores the massive resources Apple would need to devote to integrating every AI chatbot into Apple Intelligence,” while navigating potential user safety risks.
The iPhone maker also reminded the court that it has always planned to integrate other chatbots into its native features after investing in and testing Apple Intelligence’s performance, relying for now on what it deems the best chatbot on the market today.
Backing Apple up, OpenAI noted that Musk’s complaint seemed to cherry-pick testimony from Google CEO Sundar Pichai, claiming that “Google could not reach an agreement to integrate” Gemini “with Apple because Apple had decided to integrate ChatGPT.”
“The full testimony recorded in open court reveals Mr. Pichai attesting to his understanding that ‘Apple plans to expand to other providers for Generative AI distribution’ and that ‘[a]s CEO of Google, [he is] hoping to execute a Gemini distribution agreement with Apple’ later in 2025,” OpenAI argued.
Ashley is a senior policy reporter for Ars Technica, dedicated to tracking social impacts of emerging policies and new technologies. She is a Chicago-based journalist with 20 years of experience.
Caroline Wilson Palow, legal director of the campaign group Privacy International, said the new order might be “just as big a threat to worldwide security and privacy” as the old one.
She said: “If Apple breaks end-to-end encryption for the UK, it breaks it for everyone. The resulting vulnerability can be exploited by hostile states, criminals, and other bad actors the world over.”
Apple made a complaint to the Investigatory Powers Tribunal over the original demand, backed by a parallel legal challenge from Privacy International and Liberty, another campaign group. That case was due to be heard early next year, but the new order may restart the legal process.
TCNs are issued under the UK Investigatory Powers Act, which the government maintains is needed by law enforcement to investigate terrorism and child sexual abuse.
Key figures in Donald Trump’s administration, including vice-president JD Vance and director of national intelligence Tulsi Gabbard, had pressured the UK to retract the January TCN. President Donald Trump has likened the UK’s request to Chinese state surveillance.
In August, Gabbard told the Financial Times that the UK had “agreed to drop” its demand that Apple enable access to “the protected encrypted data of American citizens.”
A person close to the Trump administration said at the time that the request for Apple to break its encryption would have to be dropped altogether to be faithful to the agreement between the two countries. Any back door would weaken protections for US citizens, the person said.
UK Prime Minister Sir Keir Starmer last month hosted Trump for a state visit, during which the two world leaders announced that US tech companies would invest billions of dollars to build artificial intelligence infrastructure in Britain.
Members of the US delegation raised the issue of the request to Apple around the time of Trump’s visit, according to two people briefed on the matter. However, two senior British government figures said the US administration was no longer leaning on the UK government to rescind the order.
Now that iOS 26, macOS 26 Tahoe, and Apple’s other big software updates for the year are out in public, Apple’s efforts for the next few months will shift to fixing bugs and adding individual new features. The first of those bug fix updates has arrived this week in the form of iOS 26.0.1, macOS 26.0.1, iPadOS 26.0.1, and equivalent updates for most of the devices across Apple’s ecosystem.
The release notes for most of the updates focus on device- and platform-specific early adopter problems, particularly for buyers of the new iPhone 17, iPhone 17 Pro, and iPhone Air.
The iOS 26.0.1 update fixes a bug that could prevent phones from connecting to cellular networks, a bug that could cause app icons to appear blank, and a bug that could disable the VoiceOver feature on devices that had it turned on. Camera, Wi-Fi, and Bluetooth bugs with the new iPhones have also been patched. The iPadOS update also fixes a bug that was causing the floating software keyboard to move around.
let’s not confuse “more interesting” with “better”
The least exciting iPhone this year is also the best value for the money.
The iPhone 17 isn’t flashy, but it’s probably the best of this year’s upgrades. Credit: Andrew Cunningham
Apple seems determined to leave a persistent gap between the cameras of its Pro iPhones and the regular ones, but most other features—the edge-to-edge-screen design with Face ID, the Dynamic Island, OLED display panels, Apple Intelligence compatibility—eventually trickle down to the regular-old iPhone after a generation or two of timed exclusivity.
One feature that Apple has been particularly slow to move down the chain is ProMotion, the branding the company uses to refer to a screen that can refresh up to 120 times per second rather than the more typical 60 times per second. ProMotion isn’t a necessary feature, but since Apple added it to the iPhone 13 Pro in 2021, the extra fluidity and smoothness, plus the always-on display feature, have been big selling points for the Pro phones.
This year, ProMotion finally comes to the regular-old iPhone 17, years after midrange and even lower-end Android phones made the swap to 90 or 120 Hz display panels. And it sounds like a small thing, but the screen upgrade—together with a doubling of base storage from 128GB to 256GB—makes the gap between this year’s iPhone and iPhone Pro feel narrower than it’s been in a long time. If you jumped on the Pro train a few years back and don’t want to spend that much again, this might be a good year to switch back. If you’ve ever been tempted by the Pro but never made the upgrade, you can continue not doing that and miss out on relatively little.
The iPhone 17 has very little that we haven’t seen in an iPhone before, compared to the redesigned Pro or the all-new Air. But it’s this year’s best upgrade, and it’s not particularly close.
You’ve seen this one before
Externally, the iPhone 17 is near-identical to the iPhone 16, which itself used the same basic design Apple had been using since the iPhone 12. The most significant update in that five-year span was probably the iPhone 15, which switched from the display notch to the Dynamic Island and from the Lightning port to USB-C.
The iPhone 12 generation was also probably the last time the regular iPhone and the Pro were this similar. Those phones used the same basic design, the same basic chip, and the same basic screen, leaving mostly camera-related improvements and the Max model as the main points of differentiation. That’s all broadly true of the split between the iPhone 17 and the 17 Pro, as well.
The iPhone Air and Pro both depart from the last half-decade of iPhone designs in different ways, but the iPhone 17 sticks with the tried-and-true. Credit: Andrew Cunningham
The iPhone 17’s design has changed just enough since last year that you’ll need to find a new iPhone 17-compatible case and screen protector for your phone rather than buying something that fits a previous-generation model (it’s imperceptibly taller than the iPhone 16). The screen size has been increased from 6.1 inches to 6.3, the same as the iPhone Pro. But the aluminum-framed-glass-sandwich design is much less of a departure from recent precedent than either the iPhone Air or the Pro.
The screen is the real star of the show in the iPhone 17, bringing 120 Hz ProMotion technology and the Pro’s always-on display feature to the regular iPhone for the first time. According to Apple’s spec sheets (and my eyes, admittedly not a scientific measurement), the 17 and the Pro appear to be using identical display panels, with the same functionally infinite contrast, resolution (2622 x 1206), and brightness specs (1,000 nits typical, 1,600 nits for HDR, 3,000 nits peak in outdoor light).
It’s easy to think of the basic iPhone as “the cheap one” because it is the least expensive of the four new phones Apple puts out every year, but $799 is still well into premium-phone range, and even middle-of-the-road phones from the likes of Google and Samsung have shipped high-refresh-rate OLED panels at lower prices for a few years now. By that metric, it’s faintly ridiculous that Apple isn’t shipping something like this in its $600 iPhone 16e, but in Apple’s ecosystem, we’ll take it as a win that the iPhone 17 doesn’t cost more than the 16 did last year.
Holding an iPhone 17 feels like holding any other regular-sized iPhone made within the last five years, with the exceptions of the new iPhone Air and some of the heavier iPhone Pros. It doesn’t have the exceptionally good screen-size-to-weight ratio or the slim profile of the Air, and it doesn’t have the added bulk or huge camera plateau of the iPhone 17 Pro. It feels about like it looks: unremarkable.
Camera
iPhone 15 Pro, main lens, 1x mode, outdoor light. If you’re just shooting with the main lens, the Air and iPhone 17 win out in color and detail thanks to a newer sensor and ISP. Andrew Cunningham
The iPhone Air’s single camera has the same specs and uses the same sensor as the iPhone 17’s main camera, so we’ve already written a bit about how well it does relative to the iPhone Pro and to an iPhone 15 Pro from a couple of years ago.
Like the last few iPhone generations, the iPhone 17’s main camera uses a 48 MP sensor that saves 24 MP images, using a process called “pixel binning” to decide which pixels are saved and which are discarded when shrinking the images down. To enable an “optical quality” 2x telephoto mode, Apple crops a 12 MP image out of the center of that sensor without doing any resizing or pixel binning. The results are a small step down in quality from the regular 1x mode, but they’re still native resolution images with no digital zoom, and the 2x mode on the iPhone Air or iPhone 17 can actually capture fine detail better than an older iPhone Pro in situations where you’re shooting an object that’s close by and the actual telephoto lens isn’t used.
The iPhone 15 Pro. When you shoot a nearby subject in 2x or even 3x mode, the Pro phones give you a crop of the main sensor rather than switching to the telephoto lens. You need to be farther from your subject for the phone to engage the telephoto lens. Andrew Cunningham
One improvement to the iPhone 17’s cameras this year is that the ultrawide camera has also been upgraded to a 48 MP sensor, so it can benefit from the same shrinking-and-pixel-binning strategy Apple uses for the main camera. In the iPhone 16, this secondary sensor was still just 12 MP.
Compared to the iPhone 15 Pro and iPhone 16 we have here, wide shots on the iPhone 17 benefit mainly from the added detail you capture in higher-resolution 24 or 48 MP images. The difference is slightly more noticeable with details in the background of an image than details in the foreground, as visible in the Lego castle surrounding Lego Mario.
The older the phone you’re using, the more you’ll benefit from sensor and image signal processing improvements. Bits of dust and battle damage on Mario are more distinct on the iPhone 17 than on the iPhone 15 Pro, for example, but aside from the resolution, I don’t notice much of a difference between the iPhone 16 and 17.
A true telephoto lens is probably the biggest feature the iPhone 17 Pro has going for it relative to the basic iPhone 17, and Apple has amped it up with its own 48 MP sensor this year. We’ll reuse the 4x and 8x photos from our iPhone Air review to show you what you’re missing—the telephoto camera captures considerably more fine detail on faraway objects, but even as someone who uses the telephoto on the iPhone 15 Pro constantly, I would have to think pretty hard about whether that camera is worth $300, even once you add in the larger battery, ProRAW support, and other things Apple still holds back for the Pro phones.
Specs and speeds and battery
Our iPhone Air review showed that the main difference between the iPhone 17’s Apple A19 chip and the A19 Pro used in the iPhone Air and iPhone Pro is RAM. The iPhone 17 sticks with 8GB of memory, whereas both Air and Pro are bumped up to 12GB.
There are other things that the A19 Pro can enable, including ProRes video support and 10Gbps USB 3 file transfer speeds. But many of those iPhone Pro features, including the sixth GPU core, are mostly switched off for the iPhone Air, suggesting that we could actually be looking at the exact same silicon with a different amount of RAM packaged on top.
Regardless, 8GB of RAM is currently the floor for Apple Intelligence, so there’s no difference in features between the iPhone 17 and the Air or the 17 Pro. Browser tabs and apps may be ejected from memory slightly less frequently, and the 12GB phones may age better as the years wear on. But right now, 8GB of memory puts you above the amount that most iOS 26-compatible phones are using—Apple is still optimizing for plenty of phones with 6GB, 4GB, or even 3GB of memory. 8GB should be more than enough for the foreseeable future, and I noticed zero differences in day-to-day performance between the iPhone 17 and the iPhone Air.
All phones were tested with Adaptive Power turned off.
The iPhone 17 is often actually faster than the iPhone Air, despite both phones using five-core A19-class GPUs. Apple’s thinnest phone has less room to dissipate heat, which leads to more aggressive thermal throttling, especially in 3D apps like games. The upshot is that the iPhone 17 will often outperform Apple’s $999 phone despite costing $200 less.
All of this also ignores one of the iPhone 17’s best internal upgrades: a bump from 128GB of storage to 256GB of storage at the same $799 starting price as the iPhone 16. Apple’s obnoxious $100-or-$200-per-tier upgrade pricing for storage and RAM is usually the worst part about any of its products, so any upgrade that eliminates that upcharge for anyone is worth calling out.
On the battery front, we didn’t run specific tests, but the iPhone 17 did reliably make it from my typical 7:30 or 7:45 am wakeup to my typical 1:00 or 1:30 am bedtime with 15 or 20 percent left over. Even a day with Personal Hotspot use and a few dips into Pokémon Go didn’t push the battery hard enough to require a midday top-up. (Like the other new iPhones this year, the iPhone 17 ships with Adaptive Power enabled, which can selectively reduce performance or dim the screen and automatically enables Low Power Mode at 20 percent, all in the name of stretching the battery out a bit and preventing rapid drops.)
Better battery life out of the box is already a good thing, but it also means more wiggle room for the battery to lose capacity over time without seriously inconveniencing you. This is a line that the iPhone Air can’t quite cross, and it will become more and more relevant as your phone approaches two or three years in service.
The one to beat
Apple’s iPhone 17. Credit: Andrew Cunningham
The screen is one of the iPhone Pro’s best features, and the iPhone 17 gets it this year. That plus the 256GB storage bump is pretty much all you need to know; this will be a more noticeable upgrade for anyone with, say, an iPhone 12, 13, or 14 than the iPhone 15 or 16 was. And for $799—$200 more than the 128GB version of the iPhone 16e and $100 more than the 128GB version of the iPhone 16—it’s by far the iPhone lineup’s best value for money right now.
This is also happening at the same time as the iPhone Pro is getting a much chonkier new design, one I don’t particularly love the look of even though I do appreciate the functional camera and battery upgrades it enables. This year’s Pro feels like a phone targeted toward people who are actually using it in a professional photography or videography context, where in other years, it’s felt more like “the regular iPhone plus a bunch of nice, broadly appealing quality-of-life stuff that may or may not trickle down to the regular iPhone over time.”
In this year’s lineup, you get the iPhone Air, which feels like it’s trying to do something new at the expense of basics like camera and battery life. You get the iPhone 17 Pro, which feels like it was specifically built for anyone who looks at the iPhone Air and thinks, “I just want a phone with a bigger battery and a better camera and I don’t care what it looks like or how light it is” (hello, median Ars Technica readers and employees). And the iPhone 17 is there quietly undercutting them both, as if to say, “Would anyone just like a really good version of the regular iPhone?”
Next and last on our iPhone review list this year: the iPhone 17 Pro. Maybe spending a few days up close with it will help me appreciate the design more?
The good
The exact same screen as this year’s iPhone Pro for $300 less, including 120 Hz ProMotion, variable refresh rates, and an always-on screen.
Same good main camera as the iPhone Air, plus the added flexibility of an improved wide-angle camera.
Good battery life.
A19 is often faster than iPhone Air’s A19 Pro thanks to better heat dissipation.
Jumps from 128GB to 256GB of storage without increasing the starting price.
The bad
8GB of RAM instead of 12GB. 8GB is fine but more is also good!
I slightly prefer last year’s versions of most of these color options.
No two-column layout for apps in landscape mode.
The telephoto lens seems like it will be restricted to the iPhone Pro forever.
The ugly
People probably won’t be able to tell you have a new iPhone?
These features try to turn iPhones into more powerful work and organization tools.
iOS 26 came out last week, bringing a new look and interface alongside some new capabilities and updates aimed squarely at iPhone power users.
We gave you our main iOS 26 review last week. This time around, we’re taking a look at some of the updates targeted at people who rely on their iPhones for much more than making phone calls and browsing the Internet. Many of these features rely on Apple Intelligence, meaning they’re only as reliable and helpful as Apple’s generative AI (and only available on newer iPhones, besides). Other adjustments are smaller but could make a big difference to people who use their phone to do work tasks.
Reminders attempts to get smarter
The Reminders app gets the Apple Intelligence treatment in iOS 26, with the AI primarily focused on making it easier to organize content within Reminders lists. Lines in Reminders lists are often short, quickly jotted-down blurbs rather than lengthy, detailed instructions. With this in mind, it’s easy to see how the AI can sometimes lack enough information to perform certain tasks, like logically grouping different errands into sensible sections.
But Apple also encourages applying the AI-based Reminders features to areas of life that could hold more weight, such as making a list of suggested reminders from emails. For serious or work-critical summaries, Reminders’ new Apple Intelligence capabilities aren’t reliable enough.
Suggested Reminders based on selected text
iOS 26 attempts to elevate Reminders from an app for making lists to an organization tool that helps you identify information or important tasks that you should accomplish. If you share content, such as emails, website text, or a note, with the app, it can create a list of what it thinks are the critical things to remember from the text. But if you’re trying to extract information any more advanced than an ingredients list from a recipe, Reminders misses the mark.
Sometimes I tried sharing longer text with Reminders and didn’t get any suggestions. Credit: Scharon Harding
Sometimes, especially when reviewing longer text, Reminders was unable to come up with any suggested reminders. Other times, the reminders it suggested based on lengthy messages were off-base.
For instance, I had the app pull suggested reminders from a long email with guidelines and instructions from an editor. Highlighting a lot of text can be tedious on a touchscreen, but I did it anyway because the message had lots of helpful information broken up into sections that each had their own bold sub-headings. Additionally, most of those sections had their own lists (some using bullet points, some using numbers). I hoped Reminders would at least gather information from all of the email’s lists. But the suggested reminders ended up just being the same text from three—but not all—of the email’s bold sub-headings.
When I tried getting suggested reminders from a smaller portion of the same email, I surprisingly got five bullet points that covered more than just the email’s sub-headings but that still missed key points, including the email’s primary purpose.
Ultimately, the suggested Reminders feature mostly just boosts the app’s ability to serve as a modern shopping list. Suggested Reminders excels at pulling out ingredients from recipes, turning each ingredient into a suggestion that you can tap to add to a Reminders list. But being able to make a bulleted list out of a bulleted list is far from groundbreaking.
Auto-categorizing lines in Reminders lists
Since iOS 17, Reminders has been able to automatically sort items in grocery lists into distinct categories, like Produce and Proteins. iOS 26 tries to take things further by automatically grouping items in a list into non-culinary sections.
The way Reminders groups user-created tasks in lists is more sensible—and useful—than when it tries to create task suggestions based on shared text.
For example, I made a long list of various errands I needed to do, and Reminders grouped them into these categories: Administrative Tasks, Household Chores, Miscellaneous, Personal Tasks, Shopping, and Travel & Accommodation. The error rate here is respectable, but I would have tweaked some things. For one, I wouldn’t use the word “administrative” to refer to personal errands. The two tasks included under Administrative Tasks would have made more sense to me in Personal Tasks or Miscellaneous, even though those category names are almost too vague to have distinct meaning.
Preview comes to iOS
With Preview’s iOS debut, Apple brings to iPhones an app for viewing and editing PDFs and images that macOS users have had for years. As a result, many iPhone users will find the software easy and familiar to use.
But for iPhone owners who have long relied on Files for viewing, marking, and filling out PDFs and the like, Preview doesn’t bring many new capabilities. Anything that you can do in Preview, you could have done by viewing the same document in Files in an older version of iOS, save for a new crop tool and dedicated button for showing information about the document.
That’s kind of the point, though. When an iPhone has two discrete apps that can read and edit files, it’s far less frustrating to work with multiple documents. While you’re annotating a document in Preview, the Files app is still available, allowing you to have more than one document open at once. It’s a simple adjustment but one that vastly improves multitasking.
More Shortcuts options
Shortcuts gets somewhat more capable in iOS 26. That’s assuming you’re interested in using ChatGPT or Apple Intelligence generative AI in your automated tasks. You can tag in generative AI to create a shortcut that includes summarizing text in bullet points and applying that bulleted list to the shortcut’s next task, for instance.
An example of a Shortcut that uses generative AI. Credit: Apple
There are inherent drawbacks here. For one, Apple Intelligence and ChatGPT, like many generative AI tools, are subject to inaccuracies and can frequently overlook or misinterpret critical information. iOS 26 makes it easier for power users to build a shortcut that, say, rewrites a long text in a more professional tone. But that doesn’t mean the AI will properly communicate the information, especially when the shortcut is used across different scenarios with varied text.
You have three options for building Shortcuts that include use of AI models. Using ChatGPT or Apple Intelligence via Apple’s Private Cloud Compute, which runs the model on an Apple server, requires an Internet connection. Alternatively, you can use an on-device model without connecting to the web.
You can run more advanced models via Private Cloud Compute than you can with Apple Intelligence on-device. In Apple’s testing, models via Private Cloud Compute perform better on things like writing summaries and composition compared to on-device models.
Apple says personal user data sent to Private Cloud Compute “isn’t accessible to anyone other than the user — not even to Apple.” Apple has a strong, but flawed, reputation for being better about user privacy than other Big Tech firms. But by offering three different models to use with Shortcuts, iOS 26 ensures greater functionality, options, and control.
Something for podcasters
It’s likely that more people rely on iPads (or Macs) than iPhones for podcasting. Nevertheless, a new local capture feature introduced to both iOS 26 and iPadOS 26 makes it a touch more feasible to use iPhones (and iPads especially) for recording interviews for podcasts.
Before the latest updates, iOS and iPadOS only allowed one app to access the device’s microphone at a time. So if you were interviewing someone via a videoconferencing app, you couldn’t also use your iPhone or iPad to record the discussion, since the videoconferencing app was already using your mic to share your voice with whoever was on the other end of the call. Local capture on iOS 26 doesn’t include audio input controls, but its inclusion gives podcasters a way to record interviews or conversations on iPhones without needing additional software or hardware. That capability could save the day in a pinch.
Scharon is a Senior Technology Reporter at Ars Technica writing news, reviews, and analysis on consumer gadgets and services. She’s been reporting on technology for over 10 years, with bylines at Tom’s Hardware, Channelnomics, and CRN UK.
In June, the company announced changes to its app store policy in an attempt to avoid being further penalized by Brussels.
Apple argues the bloc’s digital rules have made it harder to do business in Europe and worsened consumers’ experience.
In a post on Thursday, the company said the DMA was leaving European consumers with fewer choices and creating an unfair competitive landscape—contrary to the law’s own goals.
For example, Apple said it had had to delay certain features, such as live translation via its AirPods, to make sure they complied with the DMA’s requirement for “interoperability.” The EU rules specify that apps and devices made by one company need to work with those made by competitors.
“Despite our concerns with the DMA, teams across Apple are spending thousands of hours to bring new features to the European Union while meeting the law’s requirements. But it’s become clear that we can’t solve every problem the DMA creates,” the company said.
A European Commission spokesperson said it was normal that companies sometimes “need more time to make their products compliant” and that the commission was helping companies to do so.
The spokesperson also said that “DMA compliance is not optional, it’s an obligation.”
The EU is set to scrutinize whether Apple, Google, and Microsoft are failing to adequately police financial fraud online, as it steps up efforts to regulate how Big Tech operates.
The EU’s tech chief Henna Virkkunen told the Financial Times that on Tuesday the bloc’s regulators would send formal requests for information to the three US Big Tech groups, as well as to global accommodation platform Booking Holdings, under powers granted by the Digital Services Act to tackle financial scams.
“We see that more and more criminal actions are taking place online,” Virkkunen said. “We have to make sure that online platforms really take all their efforts to detect and prevent that kind of illegal content.”
The move, which could later lead to a formal investigation and potential fines against the companies, comes amid transatlantic tensions over the EU’s digital rulebook. US President Donald Trump has threatened to punish countries that “discriminate” against US companies with higher tariffs.
Virkkunen stressed that the commission looked at the operations of individual companies rather than where they were based. She will scrutinize how Apple and Google are handling fake applications in their app stores, such as fake banking apps.
She said regulators would also look at fake search results in the search engines of Google and Microsoft’s Bing. The bloc wants to have more information about the approach Booking Holdings, whose biggest subsidiary Booking.com is based in Amsterdam, is taking to fake accommodation listings. It is the only Europe-based company among the four set to be scrutinized.
At multiple points over many years, Apple executives have taken great pains to point out that they think touchscreen Macs are a silly idea. But the touchscreen Mac remains one of those persistent rumors that crops up every couple of years, from sources reliable enough that they shouldn’t be dismissed out of hand.
Today’s contribution comes from supply chain analyst Ming-Chi Kuo, who usually has some insight into what Apple is testing and manufacturing. Kuo says that touchscreen MacBook Pros are “expected to enter mass production by late 2026” and that the devices will also shift to OLED display panels instead of the Mini LED panels used in current-generation MacBook Pros.
Kuo says that Apple’s interest in touchscreen Macs comes from “long-term observation of iPad user behavior.” Apple’s tablet hardware launches in the last few years have also included keyboard and touchpad accessories, and this year’s iPadOS 26 update in particular has helped to blur the line between the touch-first iPad and the keyboard-and-pointer-first Mac. In other words, Apple has already acknowledged that both kinds of input can be useful when combined in the same device; taking that same jump on the Mac feels like a natural continuation of work Apple is already doing.
Touchscreens became much more common on Windows PCs starting in 2012 when Windows 8 was released, itself a response to Apple’s introduction of the iPad a couple of years before. Microsoft backed off on almost all of Windows 8’s design decisions in the following years after the dramatic UI shift proved unpopular with traditional mouse-and-keyboard users, but touchscreen PCs like Microsoft’s Surface lineup have persisted even as the software has changed.
Spotlighting the most helpful new features of iOS 26.
The new Clear icon look in iOS 26 can make it hard to identify apps, since they’re all the same color. Credit: Scharon Harding
iOS 26 became publicly available this week, ushering in a new OS naming system and the software’s most overhauled look since 2013. It may take time to get used to the new “Liquid Glass” look, but it’s easier to appreciate the pared-down controls.
Beyond a glassy, bubbly new design, the update’s flashiest features include Apple Intelligence integration that varies in usefulness, from fluffy new Genmoji abilities to a nifty live translation feature for Phone, Messages, and FaceTime.
New tech is often bogged down with AI-based features that prove to be overhyped, unreliable, or just not that useful. iOS 26 brings a little of each, so in this review, we’ll home in on the iOS updates that will benefit both mainstream and power users the most.
Let’s start with Liquid Glass
If we’re talking about changes that you’re going to use a lot, we should start with the new Liquid Glass software design that Apple is applying across all of its operating systems. iOS hasn’t had this much of a makeover since iOS 7. However, where iOS 7 applied a flatter, minimalist effect to windows, icons, and their edges, iOS 26 adds a (sometimes frosted) glassy look and a mildly fluid movement to actions such as pulling down menus or long-pressing controls. All the while, windows look like they’re reflecting the content underneath them. When you pull up Safari’s menu over a webpage, for example, blurred colors from the webpage’s images and text are visible on empty parts of the menu.
Liquid Glass is now part of most of Apple’s consumer devices, including Macs and Apple TVs, but the dynamic visuals and motion are especially pronounced as you use your fingers to poke, slide, and swipe across your iPhone’s screen.
For instance, when you use a tinted color theme or the new clear theme for Home Screen icons, colors from the Home Screen’s background look like they’re refracting from under the translucent icons. It’s especially noticeable when you slide to different Home Screen pages. And in Safari, the address bar shrinks down and becomes more translucent as you scroll to read an article.
Because the theme is incorporated throughout the entire OS, the Liquid Glass effect can be cheesy at times. It feels forced in areas such as Settings, where text that just scrolled past looks slightly blurred at the top of the screen.
Liquid Glass makes the top of the Settings menu look blurred. Credit: Scharon Harding
Other times, the effect feels fitting, like when you pull down the Control Center and its icons appear to stretch toward the bottom of the screen, then quickly bounce back to their standard size as you release your finger. Another place Liquid Glass flows nicely is in Photos: as you browse your pictures, colors subtly pop through the translucent controls at the bottom of the screen.
This is a matter of appearance, so you may have your own take on whether Liquid Glass looks tasteful or not. But overall, it’s the type of redesign that’s distinct enough to be a fun change, yet mild enough that you can grow accustomed to it if you’re not immediately impressed.
Liquid Glass simplifies navigation (mostly)
There’s more to Liquid Glass than translucency. Part of the redesign is simplifying navigation in some apps by displaying fewer controls.
Opening Photos is now cleaner at launch, bringing you to all of your photos instead of the Collections section, like iOS 18 does. At the bottom are translucent tabs for Library and Collections, plus a Search icon. Once you start browsing, the Library and Collections tabs condense into a single icon, and Years, Months, and All tabs appear, maintaining a translucence that helps keep your focus on your pictures.
Similarly, the initial controls displayed at the bottom of the screen when you open Camera are pared down from six different photo- and video-shooting modes to the two that really matter: Photo and Video.
You can still bring up more advanced options (such as Flash, Live, and Timer) with one tap, and at the top of the camera’s field of view are smaller toggles for night mode and flash. But when you want to take a quick photo, iOS 26 makes it easier to focus on the necessities while keeping the extraneous within short reach.
If you long-press Photo, options for the Time-Lapse, Slow-Mo, Cinematic, Portrait, Spatial, and Pano modes appear. Credit: Scharon Harding
iOS 26 takes the same approach with Video mode by focusing on the essentials (zoom, resolution, frame rate, and flash) at launch.
New layout options for navigating Safari, however, slowed me down. In a new Compact view, the address bar lives at the bottom of the screen without a dedicated toolbar, giving the web page more screen space. But this setup makes accessing common tasks, like opening a new or old tab, viewing bookmarks, or sharing a link, tedious because they’re hidden behind a menu button.
If you tend to have multiple browser tabs open, you’ll want to stick with the classic layout, now called Top (where the address bar is at the top of the screen and the toolbar is at the bottom) or the Bottom layout (where the address bar and toolbar are at the bottom of the screen).
On the more practical side of Safari updates is a new ability to turn any webpage into a web app, making favorite and important URLs quickly accessible from a dedicated Home Screen icon. Saving a webpage to the Home Screen has been an iOS feature for a long time, but until now, those pages always opened in Safari. You can still have them open in Safari if you like, but by default, these sites now open as their own distinct apps, with dedicated icons in the app switcher. Web apps open full-screen, but in my experience, back and forward buttons only appear once you navigate to a new website. Swiping left and right replaces dedicated back and forward controls, but swiping isn’t as reliable as just tapping a button.
Viewing Ars Technica as a web app. Credit: Scharon Harding
iOS 26 remembers that iPhones are telephones
With so much focus on smartphone chips, screens, software, and AI lately, it can be easy to forget that these devices are telephones. iOS 26 doesn’t overlook the core purpose of iPhones, though. Instead, the new operating system adds a lot to the process of making and receiving phone calls, video calls, and text messages, starting with the look of the Phone app.
Continuing the streamlined Liquid Glass redesign, the Phone app in iOS 26 consolidates the bottom controls from Favorites, Recents, Contacts, Keypad, and Voicemail down to Calls (where voicemails also live), Contacts, and Keypad, plus Search.
I’d rather have a Voicemails section at the bottom of the screen than Search, though. The Voicemails section is still accessible by opening a menu at the top-right of the screen, but it’s less prominent, and getting to it requires more screen taps than before.
On Phone’s opening screen, you’ll see the names or numbers of missed calls and voicemails in red, but voicemails also get a blue dot next to the red number or name (along with text summarizing or transcribing the voicemail underneath, if those settings are active). This setup caused me to overlook missed calls at first, because the blue dot made missed calls with voicemails look more urgent; at a glance, it seemed as if the blue dots marked unviewed missed calls and that red numbers or names without a blue dot were missed calls I had already seen. It’s taking me time to adjust, but there’s logic to having all missed phone activity in one place.
Fighting spam calls and messages
For someone like me, whose phone number seems to have made it onto the contact list of every marketer and scammer out there, it’s empowering to have iOS 26’s screening features help cut down the time spent dealing with spam.
The phone can be set to automatically ask callers with unsaved numbers to state their name. As this happens, iOS displays the caller’s response on-screen, so you can decide if you want to answer or not. If you’re not around when the phone rings, you can view the transcript later and then mark the caller as known, if desired. This has been my preferred method of screening calls and reduces the likelihood of missing a call I want to answer.
There are also options for silencing calls and voicemails from unknown numbers and having them only show in a section of the app that’s separate from the Calls tab (and accessible via the aforementioned Phone menu).
A new Phone menu helps sort important calls from calls that are likely spam. Credit: Scharon Harding
You could also have iOS send calls that your carrier identifies as spam straight to voicemail and show those missed calls only in the Phone menu’s dedicated Spam list. I found that while the spam blocker is fairly reliable, silencing calls from unsaved numbers resulted in me missing unexpected calls from, say, an interview source or my bank. And combing through my spam and unknown-caller lists sounds like extra work that I’m unlikely to do regularly.
Messages
iOS 26 applies the same approach to Messages. You can now have texts from unknown senders and spam messages automatically placed into folders that are separate from your other texts. It’s helpful for avoiding junk messages, but it can be confusing if you’re waiting for something like a two-factor authentication text.
Elsewhere in Messages is a small but effective change to browsing photos, links, and documents previously exchanged via text. Upon tapping the name of a person in a conversation in Messages, you’ll now see tabs for viewing that conversation’s settings (such as the recipient’s number and a toggle for sending read receipts), as well as separate tabs for photos and links. Previously, this was all under one tab, so if you wanted to find a previously sent link, you had to scroll through the conversation’s settings and photos. Now, you can get to links with a couple of quick taps.
Additionally, with iOS 26 you can finally set up custom iMessage backgrounds, including premade ones and ones that you can make from your own photos or by using generative AI. It’s not an essential update but is an easy way to personalize your iPhone by brightening up texts.
Hold Assist
Another time saver is Hold Assist. It makes calling customer service slightly more tolerable by allowing you to hang up during long wait times and have your iPhone ring when someone’s ready to talk to you. It’s a feature that some customer service departments have offered for years already, but it’s handy to always have it available.
You have to be quick to respond, though. One time I answered the phone after using Hold Assist, and the caller informed me that they had already said “hello” a few times, even though iOS is supposed to let the agent know that you’ll be on the line shortly. If I had waited a couple more seconds to pick up, the customer service rep likely would have hung up.
Live translations
One of the most novel features iOS 26 brings to iPhone communication is real-time translation for Spanish, Mandarin, French, German, Italian, Japanese, Korean, and Portuguese. After downloading the necessary language libraries, iOS can translate one of those languages to another in real time when you’re on a phone call, on FaceTime, or texting.
The feature worked best in texts, where the software doesn’t have to deal with varying accents, people speaking fast or over one another, stuttering, or background noise. Translated texts and phone calls always show the original text written in the sender’s native language, so you can double-check translations or see things that translations can miss, like acronyms, abbreviations, and slang.
Translating some basic Spanish. Credit: Scharon Harding
During calls or FaceTime, Live Translation sometimes struggled to keep up while it tried to manage the nuances and varying speeds of how different people speak, as well as laughs and other interjections.
However, it’s still remarkable that the iPhone can help remove language barriers without any additional hardware, apps, or fees. It will be even better if Apple can improve reliability and add more languages.
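For the curious, Apple exposes the same on-device translation engine to developers through its Translation framework, which gives a rough sense of what is happening under the hood. The following is only a sketch: the sample phrase and language pair are mine, and the type and modifier names reflect my reading of that framework, so treat the details as assumptions rather than a verified listing.

import SwiftUI
import Translation  // Apple's on-device translation framework

// A minimal sketch, assuming the Translation framework's TranslationSession API.
struct TranslateDemo: View {
    @State private var configuration: TranslationSession.Configuration?
    @State private var output = ""
    private let original = "¿Dónde está la estación de tren?"  // illustrative phrase

    var body: some View {
        Text(output.isEmpty ? original : output)
            // Runs once a configuration is set; translation happens on device
            // after the relevant language libraries have been downloaded.
            .translationTask(configuration) { session in
                do {
                    let response = try await session.translate(original)
                    output = response.targetText
                } catch {
                    output = "Translation failed: \(error.localizedDescription)"
                }
            }
            .onAppear {
                configuration = TranslationSession.Configuration(
                    source: Locale.Language(identifier: "es"),
                    target: Locale.Language(identifier: "en"))
            }
    }
}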
Spatial images on the Home and Lock Screen
The new spatial images feature is definitely on the fluffier side of this iOS update, but it is also a practical way to spice up your Lock Screen, Home Screen, and the Home Screen’s Photos widget.
Basically, it applies a 3D effect to any photo in your library, which is visible as you move your phone around in your hand. Apple says that to do this, iOS 26 uses the same generative AI models that the Apple Vision Pro uses and creates a per-pixel depth map that makes parts of the image appear to pop out as you move the phone within six degrees of freedom.
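Apple hasn’t said exactly how the Lock Screen renders these scenes, but the underlying tilt-driven parallax idea is simple to sketch. The snippet below is a minimal illustration, not Apple’s implementation: it reads the device’s attitude with Core Motion and shifts a foreground and a background layer by different amounts, with the depth multipliers picked arbitrarily for the example.

import UIKit
import CoreMotion

// A minimal parallax sketch, not Apple's implementation. The two views are
// assumed to be stacked image layers; the multipliers are arbitrary.
final class ParallaxController {
    private let motion = CMMotionManager()

    func start(foreground: UIView, background: UIView) {
        guard motion.isDeviceMotionAvailable else { return }
        motion.deviceMotionUpdateInterval = 1.0 / 60.0
        motion.startDeviceMotionUpdates(to: .main) { data, _ in
            guard let attitude = data?.attitude else { return }
            // Roll and pitch (in radians) become small screen-space offsets.
            let x = CGFloat(attitude.roll) * 30
            let y = CGFloat(attitude.pitch) * 30
            // "Nearer" content moves more than "farther" content, which is
            // what creates the impression of depth as you tilt the phone.
            foreground.transform = CGAffineTransform(translationX: x * 1.5, y: y * 1.5)
            background.transform = CGAffineTransform(translationX: x * 0.5, y: y * 0.5)
        }
    }

    func stop() {
        motion.stopDeviceMotionUpdates()
    }
}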
The 3D effect is more powerful on some images than others, depending on the picture’s composition. It worked well on a photo of my dog sitting in front of some plants and behind a leaf of another plant. I set it as my Lock Screen wallpaper so that the time appears tucked behind her fur, and when I move the phone around, the dog and the leaf in front of her appear to shift, while the background plants stay still.
But in images with few items and sparser backgrounds, the spatial effect looks unnatural. And oftentimes, the spatial effect can be quite subtle.
Still, for those who like personalizing their iPhone with Home and Lock Screen customization, spatial scenes are a simple and harmless way to liven things up. And, if you like the effect enough, a new spatial mode in the Camera app allows you to create new spatial photos.
A note on Apple Intelligence notification summaries
As we’ve already covered in our macOS 26 Tahoe review, Apple Intelligence-based notification summaries haven’t improved much since their 2024 debut in iOS 18 and macOS 15 Sequoia. After problems with showing inaccurate summaries of news notifications, Apple updated the feature to warn users that the summaries may be inaccurate. But it’s still hit or miss when it comes to how easy it is to decipher the summaries.
I did have occasional success with notification summaries in iOS 26. For instance, I understood a summary of a voicemail that said, “Payment may have appeared twice; refunds have been processed.” Because I had already received a similar message via email (a store had accidentally charged me twice for a purchase and then refunded me), I knew I didn’t need to open that voicemail.
Vague summaries sometimes tipped me off as to whether a notification was important. A summary reading “Townhall meeting was hosted; call [real phone number] to discuss issues” was enough for me to know that I had a voicemail about a meeting that I never expressed interest in. It wasn’t the most informative summary, but in this case, I didn’t need a lot of information.
However, most of the time, it was still easier to just open the notification than try to decipher what Apple Intelligence was trying to tell me. Summaries aren’t really helpful and don’t save time if you can’t fully trust their accuracy or depth.
Playful, yet practical
With iOS 26, iPhones get a playful new design that’s noticeable and effective but not so drastically different that it will offend or distract those who are happy with the way iOS 18 works. It’s exciting to experience one of iOS’s biggest redesigns, but what really stands out are the thoughtful tweaks that bring practical improvements to core features, like making and receiving phone calls and taking pictures.
Some additions and changes are superfluous, but the update generally succeeds at improving functionality without introducing jarring changes that alienate users or force them to relearn how to use their phones.
I can’t guarantee that you’ll like the Liquid Glass design, but the other updates should make it simpler to do some of the most important tasks on an iPhone, and the release should be a welcome improvement for long-time users.
Scharon is a Senior Technology Reporter at Ars Technica writing news, reviews, and analysis on consumer gadgets and services. She’s been reporting on technology for over 10 years, with bylines at Tom’s Hardware, Channelnomics, and CRN UK.
The Game Overlay in macOS Tahoe. Credit: Andrew Cunningham
Tahoe’s new Game Overlay doesn’t add features so much as it groups existing gaming-related features to make them more easily accessible.
The overlay makes itself available any time you start a game, either via a keyboard shortcut or by clicking the rocketship icon in the menu bar while a game is running. The default view includes brightness and volume settings, toggles for your Mac’s energy mode (for turning on high-performance or low-power mode, when they’re available), a toggle for Game Mode, and access to controller settings when you’ve got one connected.
The second tab in the overlay displays achievements, challenges, and leaderboards for the game you’re playing, though only for games that use Apple’s implementation of those features. Achievements for games installed from Steam, for example, aren’t visible. And the last tab is for social features, like seeing your friends list or controlling chat settings (again, when you’re using Apple’s implementation).
More granular notification summaries
I didn’t think the Apple Intelligence notification summaries were very useful when they launched in iOS 18 and macOS 15 Sequoia last year, and I don’t think iOS 26 or Tahoe really changes the quality of those summaries in any immediately appreciable way. But following a controversy earlier this year where the summaries botched major facts in breaking news stories, Apple turned notification summaries for news apps off entirely while it worked on fixes.
Those fixes, as we’ve detailed elsewhere, are more about warning users of potential inaccuracies than about preventing those inaccuracies in the first place.
Apple now provides three broad categories of notification summaries: those for news and entertainment apps, those for communication and social apps, and those for all other kinds of apps. Summaries for each category can be turned on or off independently, and the news and entertainment category has a big red disclaimer warning users to “verify information” in the individual news stories before jumping to conclusions. Summaries are italicized and get a special icon and a “summarized by Apple Intelligence” badge, just to make super-ultra-sure that people are aware they’re not taking in raw data.
Personally, I think if Apple can’t fix the root of the problem in a situation like this, then it’s best to take the feature out of iOS and macOS entirely rather than risk giving even one person information that’s worse or less accurate than the information they already get by being a person on the Internet in 2025.
As we wrote a few months ago, asking a relatively small on-device language model to accurately summarize any stack of notifications covering a wide range of topics across a wide range of contexts is setting it up to fail. It does work OK when summarizing one or two notifications, or when summarizing straightforward texts or emails from a single person. But for anything else, be prepared for hit-or-miss accuracy and usefulness.
Relocated volume and brightness indicators
The pop-ups you see when adjusting the system volume or screen brightness have been redesigned and moved. The indicators used to appear as large rounded squares, centered on the lower half of your primary display. Their design has changed over the years, but that’s where they’ve appeared throughout the 25-year existence of Mac OS X.
Now, both indicators appear in the upper-right corner of the screen as glassy rectangles that pop out from items on the menu bar. They’ll usually appear next to the Control Center menu bar item, but the volume indicator will pop out of the Sound icon if it’s visible.
New low battery alert
Tahoe picks up an iPhone-ish low-battery alert on laptops. Credit: Andrew Cunningham
Tahoe tweaks the design of macOS’ low battery alert notification. A little circle-shaped meter (in the same style as battery meters in Apple’s Batteries widgets) shows you in bright red just how close your battery is to being drained.
This notification still shows up separately from others and can’t be dismissed, though it doesn’t need to be cleared and will go away on its own. It starts firing off when your laptop’s battery hits 10 percent and continues to go off when you drop another percentage point from there (it also notified me without the percentage readout changing, seemingly at random, as if to annoy me badly enough to plug my computer in more quickly).
The notification frequency and thresholds can’t be changed, whether you’d rather not be reminded at all or you’d like the warnings to start even earlier. But you could possibly use the battery level trigger in Shortcuts to customize your Mac’s behavior a bit.
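If Shortcuts feels too limited, a small helper app or script could watch the battery level itself. Here’s a rough sketch that reads the battery percentage through IOKit’s power-source calls; the function name and the 20 percent threshold are mine, chosen purely for illustration.

import Foundation
import IOKit.ps

// A rough sketch: read the battery percentage via IOKit's power-source APIs.
func batteryPercent() -> Int? {
    let snapshot = IOPSCopyPowerSourcesInfo().takeRetainedValue()
    let sources = IOPSCopyPowerSourcesList(snapshot).takeRetainedValue() as NSArray
    for source in sources {
        guard let info = IOPSGetPowerSourceDescription(snapshot, source as CFTypeRef)?
                .takeUnretainedValue() as? [String: Any],
              let current = info[kIOPSCurrentCapacityKey] as? Int,
              let max = info[kIOPSMaxCapacityKey] as? Int, max > 0 else { continue }
        return current * 100 / max
    }
    return nil
}

// Example: warn at 20 percent instead of waiting for Tahoe's 10 percent alert.
if let percent = batteryPercent(), percent <= 20 {
    print("Battery at \(percent) percent -- plug in soon.")
}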
Recovery mode changes
A new automated recovery tool in macOS Tahoe’s recovery volume. Credit: Andrew Cunningham
Tahoe’s version of the macOS Recovery mode gets a new look to match the rest of the OS, but there are a few other things going on, too.
If you’ve ever had a problem getting your Mac to boot, or if you’ve ever just wanted to do a totally fresh install of the operating system, you may have run into the Mac’s built-in recovery environment before. On an Apple Silicon Mac, you can usually access it by pressing and holding the power button when you start up your Mac and clicking the Options button to start up using the hidden recovery volume rather than the main operating system volume.
Tahoe adds a new tool called the Device Recovery Assistant to the recovery environment, accessible from the Utilities menu. This automated tool “will look for any problems” with your system volume “and attempt to resolve them if found.”
Maybe the Recovery Assistant will actually solve your boot problems, and maybe it won’t—it doesn’t tell you much about what it’s doing, beyond needing to unlock FileVault on my system volume to check it out. But it’s one more thing to try if you’re having serious problems with your Mac and you’re not ready to countenance a clean install yet.
The web browser in the recovery environment is still WebKit, but it’s not Safari-branded anymore, and it sheds a lot of Safari features you wouldn’t want or need in a temporary OS. Credit: Andrew Cunningham
Apple has made a couple of other tweaks to the recovery environment, beyond adding a Liquid Glass aesthetic. The recovery environment’s built-in web browser is simply called Web Browser, and while it’s still based on the same WebKit engine as Safari, it doesn’t have Safari’s branding or its settings (or other features that are extraneous to a temporary recovery environment, like a bookmarks menu). The Terminal window picks up the new Clear theme, the new SF Mono Terminal typeface, and the new default 120-column-by-30-row size.
A new disk image format
Not all Mac users interact with disk images regularly, aside from opening one periodically to install an app or restore an old backup. But among other things, disk images are used by Apple’s Virtualization framework, which makes it relatively simple to run macOS and Linux virtual machines for testing and other purposes. The RAW disk image format used by older macOS versions, however, can come with quite severe performance penalties, even with today’s powerful chips and fast PCI Express-connected SSDs.
Enter the Apple Sparse Image Format, or ASIF. Apple’s developer documentation says that because ASIF images’ “intrinsic structure doesn’t depend on the host file system’s capabilities,” they “transfer more efficiently between hosts or disks.” The upshot is that reading files from and writing files to these images should be a bit closer to your SSD’s native performance (Howard Oakley at The Eclectic Light Company has some testing that suggests significant performance improvements in many cases, though it’s hard to make one-to-one comparisons because testing of the older image formats was done on older hardware).
In practice, that means disk images should be capable of better performance in Tahoe, which will especially benefit virtual machines that rely on them. That includes lightweight virtualization apps like VirtualBuddy and Viable, which mostly exist to provide a front end for the Virtualization framework, as well as virtualization apps like Parallels that offer Windows support.
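To make the connection concrete: in the Virtualization framework, a VM’s disk is usually just a disk image file attached as a block device, which is why a faster image format translates directly into faster guest storage. Here’s a minimal Swift sketch of that attachment step; the file path is made up, and a real configuration would also need a platform, boot loader, CPU count, and memory size before it validates.

import Foundation
import Virtualization

// Minimal sketch: back a virtual machine's storage with a disk image file.
// The path is hypothetical; see Apple's Virtualization sample code for a
// complete, bootable configuration.
let diskURL = URL(fileURLWithPath: "/Users/me/VMs/linux.img")
let attachment = try VZDiskImageStorageDeviceAttachment(url: diskURL, readOnly: false)
let disk = VZVirtioBlockDeviceConfiguration(attachment: attachment)

let configuration = VZVirtualMachineConfiguration()
configuration.storageDevices = [disk]
// ...set platform, bootLoader, cpuCount, and memorySize, then:
// try configuration.validate()
// let vm = VZVirtualMachine(configuration: configuration)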
Quantum-safe encryption support
You don’t have a quantum computer on your desk. No one does, outside of labs where this kind of technology is being tested. But when or if they become more widely used, they’ll render many industry-standard forms of encryption relatively easy to break.
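To put “industry-standard encryption” in concrete terms, the elliptic-curve key agreement sketched below (written with Apple’s CryptoKit) is the kind of math that secures most connections today, and it’s exactly what a sufficiently large quantum computer running Shor’s algorithm could unwind. This is a classical example for illustration, not the quantum-safe replacement itself.

import Foundation
import CryptoKit

// Classical Curve25519 Diffie-Hellman key agreement, as used widely today.
// Shor's algorithm on a large quantum computer could recover the private keys
// from the public keys, which is the threat quantum-safe schemes address.
let alicePrivate = Curve25519.KeyAgreement.PrivateKey()
let bobPrivate = Curve25519.KeyAgreement.PrivateKey()

// Each side shares only its public key...
let aliceSecret = try alicePrivate.sharedSecretFromKeyAgreement(with: bobPrivate.publicKey)
let bobSecret = try bobPrivate.sharedSecretFromKeyAgreement(with: alicePrivate.publicKey)

// ...yet both derive the same symmetric key for encrypting traffic.
let aliceKey = aliceSecret.hkdfDerivedSymmetricKey(
    using: SHA256.self, salt: Data(), sharedInfo: Data(), outputByteCount: 32)
let bobKey = bobSecret.hkdfDerivedSymmetricKey(
    using: SHA256.self, salt: Data(), sharedInfo: Data(), outputByteCount: 32)
let match = aliceKey.withUnsafeBytes { Data($0) } == bobKey.withUnsafeBytes { Data($0) }
print(match)  // true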