There’s an app for the keyboard promising new features, but it’s not mandatory for the keyboard to work. Credit: Clicks Technology
I used to be a speed demon on phone keyboards. As with a mechanical keyboard, I could type with such ease that, in the early days of text messaging, people in my household would ask me to write out their longer messages. Those days of carefree cell phone typing ended when I got my first iPhone.
Now, I can’t start typing without first looking at my touchscreen keyboard. And I almost always make at least one typo when writing long texts, emails, or documents. That’s why I’m intrigued by the latest attempt to bring old-school physical keyboards to iPhones.
A snap-on keyboard for the iPhone
On Thursday, Clicks Technology unveiled Clicks, a keyboard available for the iPhone 14 Pro, iPhone 15 Pro, and iPhone 15 Pro Max that snaps to the phone like a case. But instead of adding protection, it adds a physical keyboard. Each key boasts 0.22 mm of travel, Jeff Gadway, SVP of product marketing at Clicks, told Ars via email. That seems like miles compared to the flat nature of touchscreens.
Clicks Technology has hinted at plans to release Clicks in additional colors beyond what’s seen here. Credit: Clicks Technology
The keyboard connects via the iPhone’s Lightning or USB-C port (whichever the iPhone has). It uses iOS’s support for external keyboards, leveraging the human interface device (HID) protocol. According to Clicks’ FAQ page, the company decided to forgo Bluetooth to avoid pairing complications and latency. Users should still be able to charge their phones, including with wireless chargers, with Clicks connected.
But if you’re hoping to pair a traditional-style phone keyboard with traditional wired headphones, you’re out of luck. The company’s website says Clicks Technology is “working on a solution” to allow the keyboard and wired headphones to work simultaneously, but you have to pick one or the other for now. Clicks also isn’t compatible with MagSafe accessories, though the makers hope to change that eventually.
One look at Clicks’ layout, and I already see the appeal of a dedicated Tab key, which the standard integrated iPhone keyboard lacks. The keyboard is also supposed to make it easier to use keyboard shortcuts via its Command (CMD) key. Clicks’ makers highlight shortcuts like launching search (CMD + Space), getting to the home screen (CMD + H), and scrolling through web pages with the space key. According to Thursday’s announcement, Clicks supports keyboard shortcuts across “many” third-party apps.
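Because Clicks registers as a standard hardware keyboard, any app that adopts iOS’s UIKeyCommand API picks up shortcut support on it automatically. Here’s a minimal sketch of what that looks like from a third-party app’s side; the view controller and the CMD + N shortcut are hypothetical examples, not anything from Clicks’ announcement:

```swift
import UIKit

class NotesViewController: UIViewController {
    // iOS delivers presses from any external keyboard -- wired HID
    // devices like Clicks included -- through the responder chain.
    override var keyCommands: [UIKeyCommand]? {
        [UIKeyCommand(title: "New Note",
                      action: #selector(createNote),
                      input: "n",
                      modifierFlags: .command)]
    }

    @objc private func createNote() {
        // Hypothetical action; fires when the user presses CMD + N.
        print("New note created via hardware keyboard shortcut")
    }
}
```

Shortcuts declared this way also appear in the overlay iOS shows when you hold CMD, which is presumably how many third-party apps work with Clicks without any app-specific integration.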
Should the keyboard prove to work well and feel good, it could be a clever way to add more screen real estate for some iPhones since users won’t have a touchscreen keyboard hogging screen space at times. However, I’m curious to see how hard it is to hold and navigate a Clicks-equipped iPhone, including going from the physical keyboard to touchscreen as needed, for longer periods.
But Clicks also impacts iPhone battery life, even though the startup claims the effect is minimal.
“When the backlight is turned off, even on a heavy use day, battery usage will typically be less than ~2 percent. If the backlight is on, usage may increase up to another ~2 percent,” Clicks’ FAQ page, which we’ll have to take with a grain of salt, reads. The keyboard’s backlight turns off automatically after 5 seconds of the keyboard not being used and can be disabled. The keyboard also has an off switch.
When asked for further information, Gadway said the keyboard draws about 4.4 mAh when powered on but not in use.
“The background Wh consumption when the backlight is off is approximately 0.01628 Wh. It’s important to note that Wh is dependent on the voltage the battery uses, therefore we take the average of 3.7V,” he added.
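Those two figures are consistent with each other: multiplying charge by the battery’s nominal voltage converts milliamp-hours into watt-hours. A quick check of the vendor’s math:

```swift
// Energy (Wh) = charge (Ah) x voltage (V)
let chargeAh = 4.4 / 1000.0    // 4.4 mAh converted to amp-hours
let nominalVolts = 3.7         // average Li-ion voltage, per Gadway
let energyWh = chargeAh * nominalVolts
print(energyWh)                // ~0.01628 Wh, matching the quoted figure
```

For scale, an iPhone 15 Pro battery holds roughly 12.7 Wh, so 0.01628 Wh amounts to just over 0.1 percent of a charge, which at least makes the sub-2-percent claim plausible on paper.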
Some might also be disappointed to notice that Clicks lacks a key for emojis, which have become so prominent in today’s culture that some mechanical keyboards and mice have started including integrated emoji buttons. Clicks says the keyboard doesn’t have an emoji button because iOS external keyboards don’t currently support the feature. But Clicks users can still bring up the emoji menu in other ways, including with a combination of the keys the keyboard does have.
Apple’s latest research about running large language models on smartphones offers the clearest signal yet that the iPhone maker plans to catch up with its Silicon Valley rivals in generative artificial intelligence.
The paper, titled “LLM in a Flash,” offers a “solution to a current computational bottleneck,” its researchers write.
Its approach “paves the way for effective inference of LLMs on devices with limited memory,” they said. Inference refers to how large language models, the AI models that power apps like ChatGPT, generate responses to users’ queries. Chatbots and LLMs normally run in vast data centers with much greater computing power than an iPhone.
The paper was published on December 12 but caught wider attention after Hugging Face, a popular site for AI researchers to showcase their work, highlighted it late on Wednesday. It is the second Apple paper on generative AI this month and follows earlier moves to enable image-generating models such as Stable Diffusion to run on its custom chips.
Device manufacturers and chipmakers are hoping that new AI features will help revive the smartphone market, which has had its worst year in a decade, with shipments falling an estimated 5 percent, according to Counterpoint Research.
Despite launching one of the first virtual assistants, Siri, back in 2011, Apple has been largely left out of the wave of excitement about generative AI that has swept through Silicon Valley in the year since OpenAI launched its breakthrough chatbot ChatGPT. Apple has been viewed by many in the AI community as lagging behind its Big Tech rivals, despite hiring Google’s top AI executive, John Giannandrea, in 2018.
While Microsoft and Google have largely focused on delivering chatbots and other generative AI services over the Internet from their vast cloud computing platforms, Apple’s research suggests that it will instead focus on AI that can run directly on an iPhone.
Apple’s rivals, such as Samsung, are gearing up to launch a new kind of “AI smartphone” next year. Counterpoint estimated more than 100 million AI-focused smartphones would be shipped in 2024, with 40 percent of new devices offering such capabilities by 2027.
The head of the world’s largest mobile chipmaker, Qualcomm chief executive Cristiano Amon, forecast that bringing AI to smartphones would create a whole new experience for consumers and reverse declining mobile sales.
“You’re going to see devices launch in early 2024 with a number of generative AI use cases,” he told the Financial Times in a recent interview. “As those things get scaled up, they start to make a meaningful change in the user experience and enable new innovation which has the potential to create a new upgrade cycle in smartphones.”
More sophisticated virtual assistants will be able to anticipate users’ actions such as texting or scheduling a meeting, he said, while devices will also be capable of new kinds of photo editing techniques.
Google this month unveiled a version of its new Gemini LLM that will run “natively” on its Pixel smartphones.
Running the kind of large AI model that powers ChatGPT or Google’s Bard on a personal device brings formidable technical challenges, because smartphones lack the huge computing resources and energy available in a data center. Solving this problem could mean that AI assistants respond more quickly than they do from the cloud and even work offline.
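The paper’s central trick is to keep the model’s parameters in flash storage and pull only the needed pieces into DRAM during inference. Purely as an illustration of that general pattern, and not Apple’s actual implementation (the row-major file layout and caller-supplied “active rows” below are invented for the sketch), a memory-mapped weights file lets the operating system page in only the rows a sparse layer actually touches:

```swift
import Foundation

// A toy sketch of flash-backed inference: memory-map the weight matrix
// so the OS pages rows in from storage only when they are read.
struct FlashWeights {
    let data: Data   // mapped, not loaded; pages fault in lazily
    let rows: Int
    let cols: Int

    init(url: URL, rows: Int, cols: Int) throws {
        self.data = try Data(contentsOf: url, options: .mappedIfSafe)
        self.rows = rows
        self.cols = cols
    }

    // Reading one row touches only the file pages backing that row.
    func row(_ i: Int) -> [Float] {
        let rowBytes = cols * MemoryLayout<Float>.size
        let slice = data.subdata(in: i * rowBytes ..< (i + 1) * rowBytes)
        return slice.withUnsafeBytes { Array($0.bindMemory(to: Float.self)) }
    }

    // Compute only the outputs whose rows are predicted to matter,
    // skipping flash reads (and compute) for everything else.
    func sparseMatVec(activeRows: [Int], input: [Float]) -> [Float] {
        activeRows.map { i in
            zip(row(i), input).reduce(0) { $0 + $1.0 * $1.1 }
        }
    }
}
```

The actual paper goes much further, reusing weights loaded for recent tokens through a sliding “window” and bundling rows and columns so each flash read pulls in larger contiguous chunks.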
Ensuring that queries are answered on an individual’s own device without sending data to the cloud is also likely to bring privacy benefits, a key differentiator for Apple in recent years.
“Our experiment is designed to optimize inference efficiency on personal devices,” its researchers said. Apple tested its approach on models including Falcon 7B, a smaller version of an open source LLM originally developed by the Technology Innovation Institute in Abu Dhabi.
Optimizing LLMs to run on battery-powered devices has been a growing focus for AI researchers. Academic papers are not a direct indicator of how Apple intends to add new features to its products, but they offer a rare glimpse into its secretive research labs and the company’s latest technical breakthroughs.
“Our work not only provides a solution to a current computational bottleneck but also sets a precedent for future research,” wrote Apple’s researchers in the conclusion to their paper. “We believe as LLMs continue to grow in size and complexity, approaches like this work will be essential for harnessing their full potential in a wide range of devices and applications.”
Apple did not immediately respond to a request for comment.
What stung her wasn’t the return to being the Android interloper in the chats. It wasn’t the resulting lower-quality images, loss of encryption, and strange “Emphasized your message” reaction texts. It was losing messages during the outage and never being entirely certain they had been sent or received. There was a gathering on Saturday, and she had to double-check the details with a couple of people after inadvertently showing up early at the wrong spot.
That kind of grievance is why, after Apple on Wednesday appeared to have blocked what Beeper described as “~5% of Beeper Mini users” from accessing iMessage, both co-founder Eric Migicovsky and the app told users they understood if people wanted out. The app had already suspended its plans to charge customers $1.99 per month following the first major outage. But this was more about “how ridiculously annoying this uncertainty is for our users,” Migicovsky posted.
Fighting on two fronts
But Beeper would keep working to ensure access and keep fighting on other fronts. Migicovsky pointed to Epic’s victory at trial against Google’s Play Store (“big tech”) as motivation. “We have a chance. We’re not giving up.” Over the weekend, Migicovsky reposted shows of support from Senators Elizabeth Warren (D-Mass.) and Amy Klobuchar (D-Minn.), who have focused on reining in and regulating large technology companies’ power.
Apple previously issued a (somewhat uncommon) statement about Beeper’s iMessage access, stating that it “took steps to protect our users by blocking techniques that exploit fake credentials in order to gain access to iMessage.” Citing privacy, security, and spam concerns, Apple stated it would “continue to make updates in the future” to protect users. Migicovsky previously denied to Ars that Beeper used “fake credentials” or in any way made iMessages less secure.
I asked Migicovsky by direct message if, given Apple’s stated plan to continually block it, there could ever be a point at which Beeper’s access was “settled,” or “back up and running,” as he put it in his post on X (formerly Twitter). He wrote that it was up to the press and the community. “If there’s enough pressure on Apple, they will have to quit messing with us.” “Us,” he clarified, meant both Apple’s customers using iMessage and Android users trying to chat securely with iPhone friends.
“That’s who they’re penalizing,” he wrote. “It’s not a Beeper vs. Apple fight, it’s Apple versus customers.”
Over the past couple of years of reviewing iPhones, we’ve often jokingly called them “smartcameras” rather than smartphones, as the camera features are really what sell people on upgrading to new models.
So, for our final Apple gift guide, we’ll revisit some of what we explored in our iPhone 15 and iPhone 15 Pro review with a special focus on the cameras. If you’re looking to grab a new iPhone for yourself or someone in your family, which camera is best?
The idea here is to provide a top-level, quick summary of each iPhone camera’s features as they pertain to specific uses: an easy buying guide for last-minute holiday shoppers who want a quick answer. We’ll go over each phone and survey its features, detailing their relevant uses and noting some recommendations and considerations along the way.
If you’re already deeply familiar with this topic, this is a cheat sheet for would-be buyers, not an in-depth analysis.
If you aren’t familiar with these topics and you’re interested in going deeper, our iPhone reviews from the past few years are the place to go; we’ve covered the evolution of SmartHDR, the addition of new lenses and features, and so on as those things have been introduced or tweaked.
But as for today’s quick summary, let’s dive in!
Ars Technica may earn compensation for sales from links on this post through affiliate programs.
A note on computational photography and SmartHDR
The camera bump on the back of each iPhone has been getting bigger with time, but it’s software that has been driving better picture quality. Apple uses a few techniques to improve the pictures you take with your iPhone, and foremost among those is what the company calls SmartHDR.
Introduced in the iPhone XS (though some competing Android flagships did this beforehand and just called it something else), SmartHDR is a complex beast. But the simple description is that when you take a photo with your iPhone with SmartHDR enabled, it will take not one but several shots. It will then use a trained algorithm to combine all the photos’ best aspects into one picture.
The specifics of that algorithm have evolved with time, and Apple has identified a few specific versions of SmartHDR over the past few years. But all that matters when we’re looking at the latest iPhones is, well, the latest version of SmartHDR. And here’s what you can expect: Most of the time, SmartHDR produces drastically better photos, with fewer unwanted artifacts and abnormalities, a clearer picture, better lighting, and so on.
Once in a while, though, it makes a weird call, and you’ll see something anomalous because of SmartHDR. It also sometimes (let’s be real: usually) gives photos a doctored, unreal quality.
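Apple doesn’t expose SmartHDR’s merging logic, but its raw ingredient, a burst of frames at different exposures, is something any app can request through AVFoundation’s bracketed-capture API. A minimal sketch, with exposure biases chosen arbitrarily for illustration:

```swift
import AVFoundation

// Capture three frames at different exposure biases in one request; a
// SmartHDR-style pipeline would then merge the best parts of each.
func captureExposureBracket(output: AVCapturePhotoOutput,
                            delegate: any AVCapturePhotoCaptureDelegate) {
    let biases: [Float] = [-2.0, 0.0, 2.0]   // under-, normal-, overexposed
    let bracket = biases.map {
        AVCaptureAutoExposureBracketedStillImageSettings
            .autoExposureSettings(exposureTargetBias: $0)
    }
    let settings = AVCapturePhotoBracketSettings(
        rawPixelFormatType: 0,   // 0 = skip RAW output
        processedFormat: [AVVideoCodecKey: AVVideoCodecType.hevc],
        bracketedSettings: bracket)
    output.capturePhoto(with: settings, delegate: delegate)
}
```

The hard part, deciding which pixels to keep from which frame, is exactly what Apple’s trained algorithm handles, and what it occasionally gets weirdly wrong.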
The same goes for Night Mode, a feature Apple essentially copied from Google’s Pixel phones. Introduced in iPhones in 2019, Night Mode also takes a lot of photos in a short period (albeit a longer one than SmartHDR; you have to hold the phone still for a few seconds). In this case, the goal is to battle the low-light shortcomings of smartphone cameras, bring out lost detail, and reduce graininess.
It’s very effective but almost too effective in many cases; photos taken in the dark end up with a bright, glowing quality. It’s great if you want to ensure you can see how much you and your friends or family are smiling in a group photo; it’s not so great if your goal is capturing reality accurately.
Below: Shots taken in a very dark room with the iPhone 15, iPhone 15 Pro, iPhone 15 Pro Max, iPhone 14 Pro Max, iPhone 14, and iPhone 13 Pro, from our iPhone 15 and iPhone 15 Pro review. (Photos: Samuel Axon)
Competing flagship phones do much of this, too, so it’s just the state of smartphone camera tech. Mostly, it’s worth the downsides because the laws of optics essentially cap how good these cameras can be without these sorts of computational photography features.
Anyway, when we make the recommendations below, we assume you are all-in on this computational photography stuff. Otherwise, you’ll want to look at alternatives to taking photos with an iPhone if quality matters to you.
iPhone 15 and iPhone 15 Plus
We’ll start with the cheapest phone in Apple’s iPhone 15 lineup because the other two phones (iPhone 15 Pro and iPhone 15 Pro Max) build on what’s seen here. The iPhone 15 Plus is lumped in because its camera system is identical to that of its smaller sibling.
The iPhone 15 has a 48-megapixel main camera with a quad-pixel sensor and an ƒ/1.6 aperture. By default, this camera takes 24-megapixel images, using a computational process that combines the low-light benefit of the large quad pixels (as in a 12 MP shot) with the detail of a 48 MP image.
You can take full 48 MP photos too by going into the Settings app, tapping Camera, tapping Formats, and turning on Resolution Control. When this is enabled, you can tap a toggle in the top-right corner when taking a photo to take one at full resolution.
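That toggle governs Apple’s built-in Camera app; third-party camera apps opt into full sensor resolution separately through AVFoundation. A rough sketch of that opt-in (an iOS 16-era API, with capture-session setup omitted and the assumption that the last listed dimension is the largest):

```swift
import AVFoundation

// Opt a photo output into the camera's largest supported photo size,
// roughly 8064x6048 (~48 MP) on the iPhone 15's main camera.
func fullResolutionSettings(device: AVCaptureDevice,
                            output: AVCapturePhotoOutput) -> AVCapturePhotoSettings {
    if let largest = device.activeFormat.supportedMaxPhotoDimensions.last {
        output.maxPhotoDimensions = largest
    }
    // Each individual capture request must also ask for the large size.
    let settings = AVCapturePhotoSettings()
    settings.maxPhotoDimensions = output.maxPhotoDimensions
    return settings
}
```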
The 48 MP sensor is also used to enable 2x zoom at a quality comparable to the 2x optical zoom seen on prior Pro-model iPhones. Apple does this by cropping the image and applying machine-learning techniques to produce the final result. (I told you it’s all about the computational features!)
This is why we don’t recommend the iPhone 14, iPhone 13, or iPhone SE (all of which are still in Apple’s lineup) for would-be buyers who prioritize the camera abilities. That 2x zoom is a must-have, and those other phones don’t offer it. They offer a digital zoom option, but you see a real hit to quality when you use it.
That covers 1x and 2x zoom with the rear camera. There’s another lens back there, though: a 12 MP ultra-wide camera (ƒ/2.4 aperture). This one enables what Apple labels as 0.5x zoom, allowing you to capture more in tight spaces, like a group of people posing for a selfie in a car or a very small room.
On the front of the phone, you’ll find a 12 MP camera with a ƒ/1.9 aperture; this is the selfie camera. Like the rear camera, it supports several of Apple’s computational photography buzzwords like SmartHDR 5, the Photonic Engine, and Deep Fusion.
The front and rear cameras can record 4K video with Dolby Vision HDR at up to 60 fps. The rear camera system supports Cinematic Video, which adds a depth-of-field effect behind human subjects. It also has Action Mode, which takes lower-than-4K resolution video but has a strong stabilization effect for situations where your hands move a lot.
Altogether, these features make the iPhone 15 an excellent all-around camera system. It has all the features you’d need to take photos of your kids at home or take selfies with friends while on the town—including Night Mode for low-light shots.
It will be enough for most people. This is a particularly good time for the non-Pro iPhone, as Apple introduced a bunch of formerly Pro-only features (like the 48 MP main camera) to the non-Pro phone for the first time during this cycle.
That said, there are still some situations where you might want to spring for the iPhone Pro or even the iPhone Pro Max.
iPhone 15 Pro
Now that we’ve covered the basics of the iPhone 15’s camera system, we can focus on what’s different if you spend extra on the iPhone 15 Pro.
The iPhone 15 Pro has a main-camera sensor with larger pixels (2.44 µm quad pixels versus the iPhone 15’s 2 µm quad pixels), while the aperture goes from ƒ/1.6 in the iPhone 15 to ƒ/1.78 in the Pro. And whereas the iPhone 15 has a 26 mm main-lens focal length, you’re looking at 24, 28, and 35 mm options for the Pro.
Apple says the iPhone 15 Pro has improved optical image stabilization and a flash that produces more natural colors, too. Meanwhile, the Ultra-Wide lens goes from a ƒ/2.4 aperture to ƒ/2.2.
The Pro phone adds a third lens, too: a 12 MP, ƒ/2.8 aperture telephoto lens for 3x zoom. That means the iPhone 15 Pro’s zoom levels are 0.5x, 1x, 2x, and 3x, versus the iPhone 15’s 0.5x, 1x, and 2x.
There are no substantial differences between the front-facing camera in the iPhone 15 and the iPhone 15 Pro.
There are a few Pro-specific features, too, specs aside. The iPhone 15 Pro can use Night Mode for portrait photos (a shooting mode that adds a depth-of-field effect to still images), whereas with the iPhone 15, you have to choose one or the other. It’s an edge case, but there you have it.
The iPhone 15 Pro also supports the ProRAW format, which provides high-quality images with minimal doctoring so that photographers can tweak or enhance the image to their own spec in software later.
Finally, the iPhone 15 Pro supports Macro photography mode. This automatically switches the camera settings when you’re taking an ultra-close-up shot of something detailed, which results in substantially better macro photography in many situations.
On the video side of things, the differences in quality aren’t huge. But there are some Pro-specific features here. The iPhone 15 Pro supports log video recording, macro videos, and a 3D “spatial video” format to be viewed later on Apple’s upcoming Vision Pro headset. When I tried the Vision Pro earlier this year, I wasn’t impressed with these spatial videos, but it’s possible Apple will have improved them by the time the device reaches the public.
You’ll want to go with the Pro if you’re taking close-ups of flowers. You might prefer the Pro to the regular 15 if you want to take ProRAW photos to edit the image to professional standards later. And 3x zoom makes a big difference in situations like concerts where you want to take pictures of something far away.
In general, this makes the iPhone 15 Pro a better fit for content creators of various types, and it offers more options for some unique edge cases. You’ll also see marginally better low-light photography—sometimes.
If you’re not seeing those edge cases often and are not producing professional-quality content, though, the iPhone 15’s camera will serve you just fine. In our experience, the only thing you’ll miss frequently is that 3x zoom.
iPhone 15 Pro Max
Speaking of zoom features, that’s the main thing differentiating the iPhone 15 Pro Max from the smaller iPhone 15 Pro.
The Max replaces the 3x telephoto lens with a 5x one—same megapixels, same aperture. You lose the 3x option, but you can still take advantage of the main camera’s 48 MP sensor for 2x zoom photos, and 5x is more differentiated and arguably better for many situations.
Below: Daytime shots at 2x, 3x, or 5x zoom (as applicable) on the iPhone 15, iPhone 15 Pro, and iPhone 15 Pro Max from our iPhone 15 and iPhone 15 Pro review.
The iPhone 15 Pro Max at 5x zoom; the iPhone 15 Pro at 3x zoom; the iPhone 15 at 2x zoom; the iPhone 15 Pro at 2x zoom; and the iPhone 15 Pro Max at 2x zoom. (Photos: Samuel Axon)
That’s the only difference between the iPhone 15 Pro Max and the iPhone 15 Pro—but it’s significant.
In general, we’d recommend picking between these two Pro models based on screen size, not camera features, but if you find yourself in situations like concerts where you want more powerful zoom, it could be worth the upgrade on that basis.
A quick recap
The iPhone 15 is a good all-around camera, and it will be enough for most use cases. We don’t recommend springing for the more expensive phones for the camera alone unless you have a very specific need in your daily life.
Jump to the iPhone 15 Pro or Pro Max if you’re a professional content creator who needs the best raw image files or the ability to record 4K 60 fps HDR video to external storage, if you like to do macro photography, or if you’re an avid user of Apple’s AI-driven Portrait Mode.
Go for the Max if powerful optical zoom is a top priority. Otherwise, stick with the 15.
The iPhone 15 is part of Apple’s self-repair program now. Credit: Samuel Axon
Apple today expanded the Self Service Repair program it launched in April 2022 to include the iPhone 15 series and M2 Macs, as well as online access to Apple’s diagnostics tool.
The online tool, Apple said in today’s announcement, provides “the same ability as Apple Authorized Service Providers and Independent Repair Providers to test devices for optimal part functionality and performance, as well as identify which parts may need repair.” The troubleshooting tool is only available in the US and will hit Europe in 2024, according to Apple.
Upon visiting the tool’s website, you’ll be prompted to put your device in diagnostic mode before entering the device’s serial number. Then, you’ll have access to a diagnostic suite, including things like a mobile resource inspector for checking software and validating components’ presence, testing for audio output and “display pixel anomalies,” and tests for cameras and Face ID.
Apple’s support page says the tests may “help isolate issues, investigate whether a part needs to be replaced, or verify that a repair has been successfully completed.”
The tool requires iOS 17.0 or later, or macOS Sonoma 14.1 or later.
Apple’s Self Service Repair program relies on parts pairing, though, and critics say this limits the tools’ effectiveness. Repair advocate iFixit has been vocal about its disagreement with Apple’s use of the practice since the tech giant launched its self-repair program, arguing that parts serialization limits the use of third-party parts. In September, iFixit CEO Kyle Wiens called parts pairing “a serious threat to our ability to fix the things we own,” noting that Apple may be trying to force a favorable customer experience but that it’s costing us the environment and “ownership rights.”
In a statement to Ars Technica today, Wiens expressed further disappointment with Apple’s parts serialization:
Apple still has a long way to go to create a robust repair ecosystem, including ending their repair-hostile parts pairing system. This software tool clearly illuminates the problems we’ve identified with parts pairing, where the diagnostic tool fails to recognize the ambient light sensor in a new part we’ve installed.
Users of Apple M2-based MacBook Pro and MacBook Air laptops, as well as the M2 Mac mini, Mac Pro, and Mac Studio, are now all included in the program, which gives customers access to tools, parts, and manuals previously accessible only to Apple and authorized repair partners. Customers can also rent repair tool kits, although those, too, have been criticized for their bulkiness and limited rental period.
Since launching its repair program, though, Apple has made a turnabout on user repairability, even if its approach is still flawed. With the latest additions, Apple’s program now supports 35 products. The company has also become an unexpected proponent of state and national right-to-repair bills. And it has simplified repairs via its Self Service Repair program (somewhat) by no longer requiring fixers to call Apple upon completing a repair; people can instead verify repairs and update firmware with the System Configuration post-repair software tool. Apple also announced today that it’s bringing the program to 24 new European countries, raising the total to 33 countries.
Apple still says its repair program is best reserved for people who are experienced with electronics repairs.
According to a report in Bloomberg, Tang Tan, vice president of Product Design, is leaving Apple, and his departure heralds a shuffle of executives heading up some of the company’s most important products.
Sometimes, you might wonder just how much a specific executive influences the grand scheme of things, but the report claims that people within Apple see Tan’s departure as “a blow,” clarifying that he “made critical decisions about Apple’s most important products.” His team reportedly had “tight control” over the look and functionality of those products.
Tan oversaw major aspects of iPhone and Apple Watch design, and he was the executive overseeing accessories and AirPods, as well. He reported to John Ternus, Apple’s senior vice president of Hardware Engineering, who is likely a more widely known name.
Richard Dinh, “Tan’s top lieutenant and head of iPhone product design,” will report directly to Ternus and take on some of Tan’s duties, while Kate Bergeron, previously involved in Mac hardware engineering, will take on the Apple Watch.
Apple has seen several executive departures from its product design and engineering groups recently, so many aspects of upcoming iPhones and other products will be designed with new eyes and perhaps new sensibilities, though what that might lead to remains to be seen.
Apple recently shifted the iPhone from the company’s proprietary Lightning port to the more standard USB-C, and it changed the materials for its Pro line of phones. Despite tweaks like that, the iPhone’s design and functionality have not changed significantly in the past five or so years.
The iPhone 16 line in 2024 is expected to shake things up a little more, at least regarding the phone’s look and feel. Rumors have suggested that the new phones may have larger screens (and bigger chassis overall) and perhaps haptic buttons instead of the current physical buttons. Other changes could be in store, and Apple’s plans are likely not yet finalized.