Tech


Would Luddites find the gig economy familiar?

Machine Breakers Unite! —

Luddites were hardly the anti-tech dullards historians have painted them to be.

Woman about to swing a hammer at a laptop.

The term Luddite is usually used as an insult. It suggests someone who is backward-looking, averse to progress, afraid of new technology, and, frankly, not that bright. But Brian Merchant argues that this is not who the Luddites were at all. They were organized, articulate in their demands, understood very well how factory owners were using machinery to supplant them, and were highly targeted in their destruction of that machinery.

Their pitiable reputation is the result of a deliberate smear campaign by elites of their own time who (successfully, as it turned out) worked to discredit a coherent and justified movement. In his book Blood in the Machine: The Origins of the Rebellion Against Big Tech, Merchant memorializes the Luddites not as the hapless, head-in-the-sand dolts their name has become synonymous with, but rather as the first labor organizers. Longing for the halcyon days of yore when we were more in touch with nature isn’t Luddism, Merchant writes; that’s pastoralism—totally different thing.

OG Luddites

Weavers used to work at home, using hand-powered looms (i.e., machines). The whole family pitched in to make cloth; they worked on their own schedules and spent their leisure time and meals together. Master weavers apprenticed for seven years to learn their trade. It worked this way in the north of England for hundreds of years.

In 1786, Edmund Cartwright invented the power loom. Now, instead of a master weaver being required to make cloth, an unschooled child could work a loom. Anyone who could afford these “automated” looms (they did still need some human supervision) could cram a bunch of them into a factory and bring in orphans from the poorhouse to oversee them all day long. The orphans could churn out far more cloth, far faster than before, and owners didn’t have to pay the 7-year-olds what they had been paying the master weavers. By the beginning of the 19th century, that is exactly what the factory owners did.

The weavers, centered in Nottinghamshire—Robin Hood country—obviously did not appreciate factory owners using these automated looms to do away with their jobs, their training—their entire way of life, really. They tried to negotiate with the factory owners for fair wages and to get protective legislation enacted to limit the impact of the automated looms and protect their rights and products. But Parliament was having none of it; somewhat freaked out by the French Revolution, it instead passed the Combination Acts of 1799 and 1800, which made unionizing illegal. So the workers took what they saw as their only remaining avenue of recourse: they started smashing the automated looms.

The aristocrats in the House of Lords told them they didn’t understand, that this automation would make things better for everyone. But it wasn’t improving things for anyone the Luddites knew or saw. They watched factory owners get richer and richer, their own families get thinner and thinner, and markets get flooded with inferior cloth made by child slaves working in unsafe conditions. So they continued breaking the machines, even after the House of Lords made it a capital crime in 1812.

Merchant tells his story through the experience of selected individuals. One is Robert Blincoe, an orphan whose memoir of mistreatment in his 10 years of factory work is thought to have inspired Dickens’ Oliver Twist. Another is Lord Byron, who, like other Romantic poets, sympathized with the Luddites and who spoke (beautifully but futilely) in the House of Lords on their behalf. George Mellor, another figure Merchant spends time with, is one of the primary candidates for a real-life General Ludd.

Edward Ludd himself doesn’t qualify, as he was mythical. Supposedly an apprentice in the cloth trade who smashed his master’s device with a hammer in 1799, he became the movement’s figurehead, with the disparate raiders breaking machines all over northern England, leaving notes signed with his name. George Mellor, by contrast, was one of the best writers and organizers the Luddites had. He’d spent the requisite seven years learning his cloth-finishing trade and in 1811 was ready to get to work. The West Riding of Yorkshire, where he lived, had been home to wool weavers for centuries. But now greedy factory owners were using machines and children to do the work he had spent his adolescence mastering. After over a year of pleading with the owners and the government, and then resorting to machine breaking, there was no change and no hope in sight.

Finally, Mellor led a raid in which a friend was killed, and he snapped. He murdered a factory owner and was hanged, along with 14 of his fellows (only four were involved in the murder; the rest were killed for other Luddite activities).

With their bodies practically still swinging on the gallows, the aristocracy and the press were already undermining and reshaping the Luddite story, depicting them as deluded, small-minded men who smashed machines they couldn’t understand—not the strategic, grassroots labor activists they were. That misrepresentation is largely how they are still remembered.



Android 15 might bring back lock screen widgets

A copy of a copy —

After iOS 16 brought widgets to the lock screen, Google is dusting off its old code.

Jelly Bean is back!

Andrew Cunningham

It sure looks like Android 15 is going to have lock screen widgets. The Android 14 QPR2 beta landed the other day, and Mishaal Rahman over at Android Authority found a hidden unfinished feature that brings back lock screen widgets. We’ve expected this to happen since Apple’s big lock screen widget release with iOS 16.

Rahman found a new “communal” space feature that resembles lock screen widgets. After enabling the feature and swiping in from the right of the lock screen, a pencil icon will pop up. Tapping the icon opens up a widget list, allowing you to move some widgets to the lock screen. Right now, in this unfinished state, the default lock screen clock and notification panel UI don’t know how to get out of the way yet, so you get a pile of widgets with the usual lock screen UI on top. It’s a mess.

Lock screen widgets... sort of. It's early.

Any time one smartphone operating system does something, the other tends to copy it, and iOS added lock screen widgets in 2022. Two years later is plenty of time for Google to adjust and copy the feature. The thing is, Android added lock screen widgets in 2012 with Android 4.2. Google removed the feature two years later in Android 5.0, so really, this is Android copying iOS copying Android. Some of this code is apparently making a comeback, as all the widgets available to the lock screen were ones that still had the 10-year-old “keyguard” flag set for Android 4.2.
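
For the curious, that old flag is still queryable today. Here's a minimal, hypothetical Kotlin sketch (not from the article) that lists the widget providers on a device that still declare the Android 4.2-era keyguard category—the same android:widgetCategory="keyguard" attribute apps once set in their appwidget-provider XML:

```kotlin
import android.appwidget.AppWidgetManager
import android.appwidget.AppWidgetProviderInfo
import android.content.Context

// Hypothetical sketch: list the widget providers on a device that still
// declare the Android 4.2-era "keyguard" category, which is reportedly what
// the hidden Android 14 QPR2 feature filters on. Assumes an Android Context.
fun keyguardCapableWidgets(context: Context): List<AppWidgetProviderInfo> {
    val manager = AppWidgetManager.getInstance(context)
    return manager.installedProviders.filter { info ->
        // Apps opt in via android:widgetCategory="keyguard" in their
        // appwidget-provider XML; the framework exposes it as this bit flag.
        (info.widgetCategory and AppWidgetProviderInfo.WIDGET_CATEGORY_KEYGUARD) != 0
    }
}
```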

The widget lock screen has strangely been named the “communal” space, and Rahman speculates this might be because this particular UI experiment was meant for tablets in a dock mode. “Communal” would mean that everyone in your house could see them, and maybe it would be good to limit the amount of personal data displayed without needing to pass the lock screen. This is just one of the feature experiments that happened to slip out the door, though, and it’s hard to imagine Google not letting phones do this, too, when iOS already does it.



Compression Attached Memory Modules may make upgradable laptops a thing again

Samsung shared this rendering of a CAMM ahead of the publication of the CAMM2 standard in September.

Of all the PC-related things to come out of CES this year, my favorite wasn’t Nvidia’s graphics cards or AMD’s newest Ryzens or Intel’s iterative processor refreshes or any one of the oddball PC concept designs or anything to do with the mad dash to cram generative AI into everything.

No, of all things, the thing that I liked the most was this Crucial-branded memory module spotted by Tom’s Hardware. If it looks a little strange to you, it’s because it uses the Compression Attached Memory Module (CAMM) standard—rather than being a standard stick of RAM that you insert into a slot on your motherboard, it lies flat against the motherboard, where pads on the board and on the CAMM module press together.

CAMM memory has been on my radar for a while, since it first cropped up in a handful of Dell laptops, where it was mistakenly identified as a proprietary type of RAM that would give Dell an excuse to charge more for it. In fact, Dell has been pushing for the standardization of CAMM modules for a couple of years now, and JEDEC (the organization that handles all current computer memory standards) formally finalized the spec last month.

Something about seeing an actual in-the-wild CAMM module with a Crucial sticker on it, the same kind of sticker you’d see on any old memory module from Amazon or Newegg, made me more excited about the standard’s future. I had a similar feeling when I started digging into USB-C or when I began seeing M.2 modules show up in actual computers (though CAMM would probably be a bit less transformative than either). Here’s a thing that solves some real problems with the current technology, and it has the industry backing to actually become a viable replacement.

From upgradable to soldered (and back again?)

SO-DIMM memory slots in the Framework Laptop 13. RAM slots used to be the norm in laptop motherboards, though now you need to do a bit of work to seek out laptops that feature them.

Andrew Cunningham

It used to be easy to save some money on a new PC by buying a version without much RAM and performing an upgrade yourself, using third-party RAM sticks that cost a fraction of what manufacturers would charge. But most laptops no longer afford you that luxury.

Most PC makers and laptop PC buyers made an unspoken bargain in the early- to mid-2010s, around when the MacBook Air and the Ultrabook stopped being special thin-and-light outliers and became the standard template for the mainstream laptop: We would jettison nearly any port or internal component in the interest of making a laptop that was thinner, sleeker, and lighter.

The CD/DVD drive was one of the most immediate casualties, though its demise had already been foreshadowed thanks to cheap USB drives, cloud storage, and streaming music and video services. But as laptops got thinner, it also gradually became harder to find Ethernet and most other non-USB ports (and, eventually, even traditional USB-A ports), space for hard drives (not entirely a bad thing, now that M.2 SSDs are cheap and plentiful), socketed laptop CPUs, and room for other easily replaceable or upgradable components. Early Microsoft Surface tablets were some of the worst examples of this era of computer design—thin sandwiches of glass, metal, and glue that were difficult or impossible to open without totally destroying them.

Another casualty of this shift was memory modules, specifically Dual In-line Memory Modules (DIMMs) that could be plugged into a socket on the motherboard and easily swapped out. Most laptops had a pair of SO-DIMM slots, either stacked on top of each other (adding thickness) or placed side by side (taking up valuable horizontal space that could have been used for more battery).

Eventually, these began to go away in favor of soldered-down memory, saving space and making it easier for manufacturers to build the kinds of MacBook Air-alikes that people wanted to buy, but also adding a point of failure to the motherboard and possibly shortening its useful life by setting its maximum memory capacity at the outset.



Google lays off “hundreds” more employees, strips Google Assistant features

Hey Google, it’s been nice knowing you —

Google’s layoffs hit hardware, the Google Assistant, and even the AR division.

Google is looking pretty dilapidated these days.

Aurich Lawson

Google’s cost-cutters are still working overtime, with more layoffs this week and cuts to Google Assistant functionality.

First up, The New York Times reports Google laid off “hundreds” of workers in “several divisions” on Wednesday. Core engineering, the Google Assistant, and the hardware division all lost people. The report says that “Google said that most of the hardware cuts affected a team working on augmented reality.” The AR cuts are eyebrow-raising since that division is quickly going to be one of the highest-profile teams at the company this year, as Google, Samsung, and Qualcomm team up to battle the Apple Vision Pro. Fitbit was apparently also a big loser, with 9to5Google reporting that Fitbit co-founders James Park and Eric Friedman and “other Fitbit leaders” have left Google.

Historically, Google rarely laid off workers, but since January of last year, a new focus on cost-cutting has made layoffs a regular occurrence. The purge started with an announcement of 12,000 layoffs in January 2023, which took until at least March to complete. Then there were more layoffs at Alphabet companies Waymo and Everyday Robots in March, Waze layoffs in June, recruiting layoffs in September, Google News cuts in October, and now these layoffs in January. There are rumors of more layoffs happening this month, too, focused on the ad sales division.

Next up is a Google blog post titled “Changes we’re making to Google Assistant,” which details 17 features that are being removed from Google’s struggling voice assistant. Google says these “underutilized” features will be “no longer supported” at some point in the future, with shutdown warnings coming on January 26.

The Google Search bar, which (depending on your local antitrust laws) is contractually obligated to be on the front of an Android phone, will no longer bring up the Google Assistant.

Ron Amadeo

The full list of cut features—it’s a big list—is here. The biggest and most ominous news is that the Google Assistant is losing its premium, default spot on the homepage of all Android devices. The microphone button in the Google Search bar used to bring up the Assistant, but now it will only send your voice input directly to Google Search. You’ll still be able to bring up the Assistant using what are basically secret, invisible shortcuts, like saying “Hey Google” or long-pressing on the home button (if you have gesture navigation turned off), but this is a massive change that means the Assistant will no longer be front-and-center on Android phones.

The Assistant is from the Google Search division and was once considered the future of the company and the future of Google Search. If the Assistant couldn’t answer a question, it would just forward you to Google Search, so this change makes the microphone button a lot less useful. It also highlights the ongoing death of the Google Assistant, which has fallen out of favor at the company. (Android users unhappy about this should download the Google Assistant shortcut app.) Here are some of the features being removed:

  • The Google Assistant’s messaging feature, where voice messages would be sent to any phones and tablets in your family group, is dead. Audio messages will still play on local network speakers, but Google is no longer sending notifications across the Internet to Android and iOS.
  • Google Play Books voice support sounds like it will be totally gone. You can still use generic audio-cast features from another device, but you can’t ask the Assistant to play an audiobook anymore.
  • Setting music alarms—not regular alarms—is dead. Controlling a stopwatch—not normal timer support—is also gone.
  • The death of Fitbit under Google continues with the removal of voice-control activities for the Fitbit Sense and Versa 3. A wrist-based Google Assistant is exclusive to the Pixel Watch in Google’s lineup, though that probably won’t last long either.

One problem with all voice assistants is that there’s no good way to communicate the hundreds of possible voice commands to users, so there’s a good chance you didn’t know most of these exist. Figuring out whether any given Google Assistant feature is available on a phone, speaker, smart display, car, TV, or headphones is also an impossible task. Some cut features I have personally never heard of include “managing your cookbook”—apparently there is a “Google Cookbook” of saved recipes available on smart displays and nowhere else. Google says it was somehow previously possible to “send a payment, make a reservation, or post to social media” by voice on some platforms. When I ask the Google Assistant to do any of those things right now, it says “I don’t know, but I found these results on search.” I’m not even sure where you would enter payment details for the Assistant to have access to (was this some iteration of Google Pay?), or how you would connect social media accounts.

It increasingly sounds like it’s time to pick out a nice plot of land in the Google Graveyard for the Assistant. On one hand, Google seems to want to shut this one down in exchange for “Pixie,” a voice assistant that will be exclusive to Pixel devices, starting with the Pixel 9. On the other hand, just in October, Google promised the Assistant would be getting Bard generative-AI integration, so none of this lines up perfectly. It’s odd to be removing the Assistant from Android home screens, stripping it of features, planning a big revamp, and also planning a direct competitor.



Detachable Lenovo laptop is two separate computers, runs Windows and Android

ARM Windows and x86 Android would have been really funny —

The Lenovo ThinkBook Plus Gen 5 Hybrid combines the best (?) of both worlds.

  • The Lenovo ThinkBook Plus Gen 5 Hybrid. It has two CPUs, two operating systems, and six speakers, and it generally sounds deeply complicated.

    Lenovo

  • In the previous image, the device was running Android, but click the halves together, and now it runs Windows.

    Lenovo

  • Here’s a closer look at that interface. There’s a proprietary connector in the middle and what looks like some support towers on either side. Note that the hinge is actually on the tablet side.

    Lenovo

  • Ports! We have a bottom USB-C port, but that’s not the connection interface. The tablet’s bottom power button is also covered when the halves are connected. Also, there’s a pen.

    Lenovo

  • The back. Sometimes these touch laptops have a 360-degree hinge action, but that’s not going to work here.

    Lenovo

Have you ever used a Windows laptop and thought, “Gee, I really wish this was also an Android tablet”? Does Lenovo have a product for you!

The Lenovo ThinkBook Plus Gen 5 Hybrid laptop at CES 2024 is both a Windows laptop and an Android tablet. The bottom half contains all the usual Intel laptop parts, while the top half packs a Qualcomm chip and a whole duplicate set of computing components. A detachable screen lets both halves come apart and operate separately, and you’ll be spending your life riding the line between the Windows and Android ecosystems. Because you’re getting two entirely separate computers, you’ll also have to pay for two entirely separate computers—the device costs $2,000.

Because the device houses two entirely separate computers, you can separate them and run them at the same time. Of course, the tablet acts as an Android tablet when it’s detached, but you can also plug the headless laptop base into a monitor and use Windows. Lenovo calls the tablet the “Hybrid Tab” while the bottom is the “Hybrid Station,” and the whole thing voltroned together forms the “ThinkBook Plus Gen 5 Hybrid.” The laptop base runs Windows 11 and has an Intel Core Ultra 7 processor, 32GB of RAM, a 1TB SSD, Intel graphics, and a 75 Wh battery. The tablet runs Android 13 on a Snapdragon 8+ Gen 1 SoC, along with 12GB of RAM, 256GB of UFS 3.1 storage, and a 38 Wh battery.

Peripherals are all over the place. The tablet has four 1 W speakers, while the laptop base has two 2 W speakers. The laptop has a fingerprint reader and the tablet doesn’t, but the tablet has infrared face biometrics and the laptop doesn’t. The tablet has an unspecified front camera and two rear cameras: 13 MP and 5 MP.

The laptop has two USB-C Thunderbolt 4 ports—one on either side—and a headphone jack, while the tablet just has one USB-C port. The tablet’s USB-C port is on the bottom, but that’s not used for connectivity between the two halves; instead, there’s some kind of proprietary port. Interestingly, the laptop hinges are on the tablet half, making the tablet a bit heavier in tablet mode than it needs to be. Speaking of weight, the whole contraption weighs 1755 g (3.8 lbs). The laptop bottom measures 9.4 mm thick, with a weight of 970 g, and the tablet is 6.6 mm thick with a weight of 785 g.

The two halves are so separate that you can just run them separately.

Lenovo

The idea is a strange one, given how heavy, expensive, and complicated this device will be. And Windows already has a full touch interface that works fine in a tablet form factor. Something like the Microsoft Surface Book built a whole PC into the top half of a detachable laptop, which powered the Windows laptop and Windows tablet parts without needing a whole extra computer in the other half. Windows can also run Android apps now, if for some reason you’re in love with the idea of tablet Android apps.

The only reason to have the whole extra Android computer is if you really like the Android tablet OS and Google’s tablet apps, which is more than even most die-hard Android users can say. With an official build of Android, you’ll get the Google apps, which wouldn’t be available through the Windows Android store. Google has to individually approve your device to get those apps, so it’s interesting that the company is apparently OK with this wacky hybrid device.

The inevitable problem with having two totally separate computers in one device is that at some point, there will be data on one device that you want to access on the other. Lenovo says the Windows computer has a “Hybrid Stream” feature that will enable the “streaming” of Android apps from the tablet via a “Picture-in-Picture window.” That sounds a bit like Stadia, where inputs were sent to a remote device for processing and a video stream was sent back. Here, though, the “remote” device is just the top half of your laptop. There is also a “Tablet with Keyboard” mode, indicating that you can use the laptop’s keyboard and mouse with the Android tablet somehow.

Many questions remain. Lenovo doesn’t actually say how you switch OSes. The Android tablet supports pen input, but Lenovo doesn’t say the pen will work with Windows. It also doesn’t say that Windows can use the tablet’s cameras (though that would be possible over USB). Another thing that would be possible over USB but isn’t mentioned is mass storage access of the other device—both Windows and Android could be host devices.

The Lenovo ThinkBook Plus Gen 5 Hybrid will be ready for all your mixed computing tasks in Q2.



Portable monitors could make foldable-screen gadgets finally make sense

  • Asus plans to release this foldable OLED monitor in 2024. Electronics retailer Abt Electronics captured footage of it on display at CES.

  • The monitor has a metal chassis and glossy coating.

  • The monitor could help workers quickly add more screen space to setups.

  • In its video, Abt Electronics showed off different angled views of the monitor.

  • Like other foldables, the crease can seemingly catch reflections and glare when the device is bent.

  • Port selection.

  • The portable monitor will come with a sleeve.

Foldable screens have been bending their way into consumer gadgets over the last few years. But with skepticism about durability, pricing, image quality, and the necessity of such devices, foldable screens aren’t mainstream. With those concerns in mind, I haven’t had much interest in owning a foldable-screen gadget, even after using a foldable laptop for a month. However, the foldable portable monitor that Asus is showing at CES in Las Vegas this week is an application of foldable OLED that makes more sense to me than others.

Asus’ ZenScreen Fold OLED MQ17QH, announced on Tuesday, is a 17.3-inch portable monitor that can fold down to a 12.5-inch size. The monitor has 2560×1920 pixels for a pixel density of 184.97 pixels per inch. Other specs include a claim of 100 percent DCI-P3 coverage and VESA DisplayHDR True Black 500 certification.

When I think of the ways I use portable monitors, foldability makes more sense than it does with other device types. For example, I love working outside when possible, and an extra 17.3-inch screen that’s easy to carry would make long work sessions with an ultraportable laptop more feasible. The Fold OLED’s 17.3 inches is on the larger side for a portable monitor, but the fold and comparatively light weight should make it feel more transportable than similarly sized monitors that don’t fold in half.

Regarding dimensions and weight, Asus compares the monitor to a 13-inch thin-and-light laptop. The monitor weighs 2.58 lbs—that’s notable heft for something meant to be lugged around (the smaller Lenovo ThinkVision M14 weighs 1.3 pounds, for comparison). But 2.58 lbs is still on the lighter side for a 17-inch-class monitor (Asus’ 17-inch ROG Strix XG17AHP is 3.88 lbs), and Asus’ foldable is similarly thin.

Asus credits a “waterdrop-style hinge” for the monitor’s thin size. It’s 0.38 inches (9.7 mm) thick when unfolded, which should translate to about 0.76 inches (19.4 mm) when the monitor is folded shut.
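
For anyone wondering where those numbers come from, both are simple arithmetic from the spec sheet; here's a quick illustrative sketch (Kotlin used purely as a calculator):

```kotlin
import kotlin.math.sqrt

fun main() {
    // Pixel density: diagonal resolution in pixels divided by diagonal size in inches.
    val widthPx = 2560.0
    val heightPx = 1920.0
    val diagonalInches = 17.3
    val ppi = sqrt(widthPx * widthPx + heightPx * heightPx) / diagonalInches
    println("Pixel density: %.2f PPI".format(ppi))  // ≈ 184.97

    // Folded thickness: the unfolded 0.38-inch panel doubled over on itself,
    // ignoring any gap the waterdrop hinge leaves when closed.
    val unfoldedInches = 0.38
    println("Folded thickness: ${unfoldedInches * 2} inches")  // ≈ 0.76
}
```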

It feels more natural for a portable monitor to add a fold for easy transport, since portability is right in the device category’s name.

Plus, a portable monitor doesn’t have the same types of component and cooling concerns as computing devices like laptops and phones do.

Crease concerns

I haven’t seen Asus’ foldable monitor in person, so I can only speculate on image quality. The monitor is still being finalized, but based on images and video from people who’ve seen the ZenScreen Fold OLED in person at CES and my experience using foldables, I expect the display to show a crease that picks up reflections and/or glare when bent. But considering that a portable monitor will typically be open flat, this doesn’t matter the same way it would with other types of foldable devices.

However, what matters is whether that crease is still visible when the monitor’s flat. A portable monitor is likely to be viewed from different angles, which could make even a slight crease pop. For what it’s worth, The Verge reported that the Asus monitor’s crease seemed to “disappear” when flat, but I remain highly cautious.

Asus’ monitor announcement showed confidence that “you’ll hardly be able to tell that there’s a hinge behind the display” when it’s open because of the waterdrop-style hinge, which is the same hinge type that the Samsung Galaxy Z Fold 5 uses, as pointed out by The Verge. The hinge type reportedly makes for a looser feel when the device is closed. Samsung Display has claimed that this puts less stress on the display and minimizes the gap seen when the foldable is shut. Asus’ announcement noted that the foldable monitor’s hinge uses “hundreds of parts,” which “all but eliminat[e] the gap.”

A close-up of the hinge, shown in video footage from Abt Electronics.

Like with any other foldable, though, durability remains a concern. A portable monitor may be moved around frequently, and Ars has seen firsthand how fragile a foldable screen can be, including with those small-gap designs.

Speaking of different viewing angles and visibility outdoors (and in bright rooms), the use of OLED suggests that this monitor won’t be as bright as some LCD portable monitors. That could limit visibility, depending on your use case.  Asus hasn’t shared a brightness spec for the ZenScreen Fold.



Nvidia’s G-Sync Pulsar is anti-blur monitor tech aimed squarely at your eyeball

What will they sync of next? —

Branded monitors can sync pixels to backlighting, refresh rate, and GPU frames.

None of this would be necessary if it weren’t for your inferior eyes, which retain the colors of pixels for fractions of a second longer than is optimal for shooting dudes.

Nvidia

Gaming hardware has done a lot in the last decade to push pixels across screens very quickly. But one piece of hardware has always led to complications: the eyeball. Nvidia is targeting that last part of the visual quality chain with its newest G-Sync offering, Pulsar.

Motion blur, when it’s not caused by slow LCD pixel transitions, is caused by “the persistence of an image on the retina, as our eyes track movement on-screen,” as Nvidia explains it. Prior improvements in display tech, like variable refresh rate, Ultra Low Motion Blur, and Variable Overdrive, have helped with the hardware causes of this deficiency. The eye and its tendency to hold onto an image, however, can only be addressed by strobing a monitor’s backlight.

You can’t just set that light blinking, however. Variable strobing frequencies cause flicker, and timing the strobe to the monitor refresh rate—itself also tied to the graphics card output—was tricky. Nvidia says it has solved that issue with its G-Sync Pulsar tech, employing “a novel algorithm” in “synergizing” its variable refresh smoothing and monitor pulsing. The result is that pixels are transitioned from one color to another at a rate that reduces motion blur and pixel ghosting.

Nvidia also claims that Pulsar can help with the visual discomfort caused by some strobing effects, as the feature “intelligently controls the pulse’s brightness and duration.”
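
Nvidia hasn't published how that algorithm works, but the general idea behind flicker-free variable-rate strobing can be sketched: fire one short backlight pulse per refresh and adjust the pulse's brightness (and, when brightness maxes out, its duration) so the average light output per frame stays constant as the frame interval changes. The sketch below is purely illustrative—every name and number in it is an assumption, not Nvidia's implementation:

```kotlin
// Purely illustrative sketch of brightness-compensated backlight strobing.
// Nvidia hasn't published Pulsar's actual algorithm, so every name and
// number here is an assumption.
data class BacklightPulse(val durationMs: Double, val brightness: Double)

fun pulseForFrame(
    frameIntervalMs: Double,      // time until the next refresh, set by the GPU's frame pacing
    basePulseMs: Double = 0.5,    // short pulse = less retinal smear
    targetAverage: Double = 0.1,  // desired average luminance, as a fraction of max panel brightness
    maxBrightness: Double = 1.0
): BacklightPulse {
    // Average luminance over a frame = brightness * (pulse duration / frame interval).
    // Holding that average constant as the interval varies is what keeps
    // variable-rate strobing from reading as flicker.
    var brightness = targetAverage * frameIntervalMs / basePulseMs
    var duration = basePulseMs
    if (brightness > maxBrightness) {
        // The backlight can't get any brighter, so trade some blur reduction
        // for a longer pulse instead.
        brightness = maxBrightness
        duration = targetAverage * frameIntervalMs / maxBrightness
    }
    return BacklightPulse(duration, brightness)
}

fun main() {
    for (hz in listOf(100, 200, 360)) {
        val intervalMs = 1000.0 / hz
        println("$hz Hz -> ${pulseForFrame(intervalMs)}")
    }
}
```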

  • The featureless axis labels make my brain hurt, but I believe this chart suggests that G-Sync Pulsar does the work of timing out exactly when to refresh screen pixels at 360 Hz.

    Nvidia

  • The same, but this time at 200 Hz.

    Nvidia

  • And again, this time at 100 Hz. Rapidly changing pixels are weird, huh?

    Nvidia

To accommodate this “radical rethinking of display technology,” a monitor will need Nvidia’s own chips built in. There are no such monitors yet, but the Asus ROG Swift PG27 Series G-Sync, with its 360 Hz refresh rate, is coming “later this year.” No price for that monitor is available yet.

It’s hard to verify how this looks and feels without hands-on time. PC Gamer checked out Pulsar at CES this week and verified that, yes, it’s easier to read the name of the guy you’re going to shoot while you’re strafing left and right at an incredibly high refresh rate. Nvidia also provided a video, captured at 1,000 frames per second, for those curious.

Nvidia’s demonstration of G-Sync Pulsar, using Counter-Strike 2 filmed at 1000 fps, on a 360 Hz monitor, with Pulsar on and off, played back at 1/24 speed.

Pulsar signals Nvidia’s desire to once again create an exclusive G-Sync monitor feature designed to encourage a wraparound Nvidia presence on the modern gaming PC. It’s a move that has backfired on the firm before. The company relented to market pressures in 2019 and enabled G-Sync in various variable refresh rate monitors powered by VESA’s DisplayPort Adaptive-Sync tech (more commonly known by its use in AMD’s FreeSync monitors). G-Sync monitors were typically selling for hundreds of dollars more than their FreeSync counterparts, and while they technically had some exclusive additional features, the higher price points likely hurt Nvidia’s appeal when a gamer was looking at the full cost of a new or upgraded system.

There will not be any such cross-standard compatibility with G-Sync Pulsar, which will be offered only on monitors that carry a G-Sync Ultimate badge and specifically support Pulsar. There’s always a chance that another group will develop its own synced-strobe technology that could work across GPUs, but nothing is happening as of yet.

In related frame-rate news, Nvidia also announced this week that its GeForce Now game streaming service will offer G-Sync capabilities to those on Ultimate or Priority memberships and playing on capable screens. Nvidia claims that, paired with its Reflex offering on GeForce Now, the two “make cloud gaming experiences nearly indistinguishable from local ones.” I’ll emphasize here that those are Nvidia’s words, not the author’s.



iPhone survives 16,000-foot fall after door plug blows off Alaska Air flight 1282

the ultimate drop test —

Still-working iPhone is one of two discovered after the airline accident, says NTSB.

The iPhone that fell from Alaska Airlines flight 1282, discovered by Seanathan Bates under a bush on the side of the road.

On Sunday, game developer Seanathan Bates discovered a working iPhone that fell 16,000 feet from Alaska Airlines flight 1282 on Friday. Flight 1282 suffered an explosive decompression event when a door plug blew off the plane. No one was injured during the incident. The iPhone wasn’t injured, either—still unlocked and with a torn charging cable connector plugged in, it appeared largely undamaged and displayed information that matched the flight.

“Found an iPhone on the side of the road,” wrote Bates in a post on X. “Still in airplane mode with half a battery and open to a baggage claim for #AlaskaAirlines ASA1282 Survived a 16,000 foot drop perfectly in tact!”

The discovery location of the iPhone that fell from Alaska Airlines flight 1282.

After the discovery, Bates contacted the NTSB, which took possession of the device and told him the iPhone was actually the second phone that had been found from the flight. During a press conference on Sunday, NTSB chair Jennifer Homendy confirmed that two people had discovered cell phones that fell from flight 1282. The other cell phone was discovered in someone’s yard.

The decompression event started when a door plug used to cover an unused exit door on the Boeing 737 Max 9 unexpectedly detached from the plane. Rapid decompression can suck passengers and objects violently out of an aircraft due to air pressure differences. While no people fell out of the plane, the loose iPhone apparently got ripped away while charging. “In case you didn’t see it, there was a broken-off charger plug still inside it! Thing got *yanked* out the door,” wrote Bates in his X post.

The iPhone that fell from Alaska Airlines flight 1282 had a ripped charging connector still plugged into it.

iPhones surviving harrowing drops from great heights aren’t unheard of. In May, AppleInsider reported on a skydiver’s iPhone that survived a 14,000-foot fall from a plane. Given that air resistance limits an object’s descent speed and that soft landing spots like moist dirt or mud cushion the impact, the survivals aren’t entirely surprising. Landing on a hard surface would likely be a different story, however.
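
To put a rough number on that intuition, here's a back-of-the-envelope terminal-velocity estimate. All of the inputs are assumptions for illustration (approximately an iPhone's mass, its face-down area, a generic drag coefficient, and sea-level air density), and a tumbling phone would fall somewhat faster:

```kotlin
import kotlin.math.sqrt

fun main() {
    // Back-of-the-envelope only; every figure below is an assumed approximation.
    val massKg = 0.2            // roughly an iPhone's mass
    val areaM2 = 0.0105         // ~147 mm x 72 mm footprint, falling face-down
    val dragCoefficient = 1.1   // generic value for a flat plate broadside to the flow
    val airDensityKgM3 = 1.225  // sea-level air
    val g = 9.81

    // Terminal velocity: drag equals weight, so v = sqrt(2 * m * g / (rho * Cd * A)).
    val vTerminal = sqrt(2 * massKg * g / (airDensityKgM3 * dragCoefficient * areaM2))
    // Prints roughly 17 m/s (~60 km/h): fast, but plausibly survivable for a
    // phone landing in soft ground.
    println("Terminal velocity ≈ %.0f m/s (about %.0f km/h)".format(vTerminal, vTerminal * 3.6))
}
```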

At the time of the iPhone’s discovery, the search was still on for the missing door plug, but the plug has since been found. Noting that two small cell phones were apparently easier to find than the door itself, a Hacker News commenter quipped, “Boeing needs ‘Find My Door.’”

No word yet on whether the iPhone has been reunited with its owner.



They’re not cheap, but Nvidia’s new Super GPUs are a step in the right direction

supersize me —

RTX 4080, 4070 Ti, and 4070 Super arrive with price cuts and/or spec bumps.


  • Nvidia’s latest GPUs, apparently dropping out of hyperspace.

    Nvidia

  • The RTX 4080 Super.

    Nvidia

  • Comparing it to the last couple of xx80 GPUs (but not the original 4080).

    Nvidia

  • The 4070 Ti Super.

    Nvidia

  • Comparing to past xx70 Ti generations.

    Nvidia

  • The 4070 Super.

    Nvidia

  • Compared to past xx70 generations.

    Nvidia

If there’s been one consistent criticism of Nvidia’s RTX 40-series graphics cards, it’s been pricing. All of Nvidia’s product tiers have seen their prices creep up over the last few years, but cards like the 4090 raised prices to new heights, while lower-end models like the 4060 and 4060 Ti kept pricing the same but didn’t improve performance much.

Today, Nvidia is sprucing up its 4070 and 4080 tiers with a mid-generation “Super” refresh that at least partially addresses some of these pricing problems. Like older Super GPUs, the 4070 Super, 4070 Ti Super, and 4080 Super use the same architecture and support all the same features as their non-Super versions, but with bumped specs and tweaked prices that might make them more appealing to people who skipped the originals.

The 4070 Super will launch first, on January 17, for $599. The $799 RTX 4070 Ti Super launches on January 24, and the $999 4080 Super follows on January 31.

|                  | RTX 4090    | RTX 4080    | RTX 4080 Super | RTX 4070 Ti | RTX 4070 Ti Super | RTX 4070    | RTX 4070 Super |
| CUDA Cores       | 16,384      | 9,728       | 10,240         | 7,680       | 8,448             | 5,888       | 7,168          |
| Boost Clock      | 2,520 MHz   | 2,505 MHz   | 2,550 MHz      | 2,610 MHz   | 2,610 MHz         | 2,475 MHz   | 2,475 MHz      |
| Memory Bus Width | 384-bit     | 256-bit     | 256-bit        | 192-bit     | 256-bit           | 192-bit     | 192-bit        |
| Memory Clock     | 1,313 MHz   | 1,400 MHz   | 1,437 MHz      | 1,313 MHz   | 1,313 MHz         | 1,313 MHz   | 1,313 MHz      |
| Memory size      | 24GB GDDR6X | 16GB GDDR6X | 16GB GDDR6X    | 12GB GDDR6X | 16GB GDDR6X       | 12GB GDDR6X | 12GB GDDR6X    |
| TGP              | 450 W       | 320 W       | 320 W          | 285 W       | 285 W             | 200 W       | 220 W          |

Of the three cards, the 4080 Super probably brings the least significant spec bump, with a handful of extra CUDA cores and small clock speed increases but the same amount of memory and the same 256-bit memory interface. Its main innovation is its price, which at $999 is $200 lower than the original 4080’s $1,199 launch price. This doesn’t make it a bargain—we’re still talking about a $1,000 graphics card—but the 4080 Super feels like a more proportionate step down from the 4090 and a good competitor to AMD’s flagship Radeon RX 7900 XTX.

The 4070 Ti Super stays at the same $799 price as the 4070 Ti (which, if you’ll recall, was nearly launched at $899 as the “RTX 4080 12GB”) but addresses two major gripes with the original by stepping up to a 256-bit memory interface and 16GB of RAM. It also picks up some extra CUDA cores, while staying within the same power envelope as the original 4070 Ti. These changes should help it keep up with modern 4K games, where the smaller pool of memory and narrower memory interface of the original 4070 Ti could sometimes be a drag on performance.
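
The reason the wider bus matters is simple multiplication: peak memory bandwidth is the per-pin data rate times the bus width. Here's a rough sketch using the table's figures; the 16x effective-rate multiplier for GDDR6X is my assumption about the signaling rather than something stated in the table, though the results happen to line up with the bandwidth figures commonly cited for these cards:

```kotlin
fun main() {
    // Peak memory bandwidth ≈ per-pin data rate × bus width.
    // Assumption for this sketch: the GDDR6X above transfers 16 bits per pin
    // per listed memory-clock cycle, so 1,313 MHz works out to ~21 Gbps per pin.
    fun bandwidthGBps(memClockMHz: Double, busWidthBits: Int): Double =
        memClockMHz * 16 * busWidthBits / 8 / 1000.0  // Mbit/s -> GB/s

    // Same memory clock, different bus widths:
    println("RTX 4070 Ti (192-bit):       %.0f GB/s".format(bandwidthGBps(1313.0, 192)))  // ≈ 504
    println("RTX 4070 Ti Super (256-bit): %.0f GB/s".format(bandwidthGBps(1313.0, 256)))  // ≈ 672
}
```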

Most of the RTX 40-series lineup. The original 4080 and 4070 Ti are going away, while the original 4070 now slots in at $549. It’s not shown here, but Nvidia confirmed that the 16GB 4060 Ti is also sticking around at $449.

Nvidia

Finally, we get to the RTX 4070 Super, which also keeps the 4070’s $599 price tag but sees a substantial uptick in processing hardware, from 5,888 CUDA cores to 7,168 (the power envelope also increases, from 200 W to 220 W). The memory system remains unchanged. The original 4070 was already a decent baseline for entry-level 4K gaming and very good 1440p gaming, and the 4070 Super should make 60 FPS 4K attainable in even more games.

Nvidia says that the original 4070 Ti and 4080 will be phased out. The original 4070 will stick around at a new $549 price, $50 less than before, but not particularly appealing compared to the $599 4070 Super. The 4090, 4060, and the 8GB and 16GB versions of the 4060 Ti all remain available for the same prices as before.

  • The Super cards’ high-level average performance compared to some past generations of GPU, without DLSS 3 frame generation numbers muddying the waters. The 4070 Super should be a bit faster than an RTX 3090 most of the time.

    Nvidia

  • Some RTX 4080 performance comparisons. Note that the games at the top all have DLSS 3 frame generation enabled for the 4080 Super, while the older cards don’t support it.

    Nvidia

  • The 4070 Ti Super vs the 3070 Ti and 2070 Super.

    Nvidia

  • The 4070 Super versus the 3070 and the 2070.

    Nvidia

Nvidia’s performance comparisons focus mostly on older-generation cards rather than the non-Super versions, and per usual for 40-series GPU announcements, they lean heavily on performance numbers that are inflated by DLSS 3 frame generation. In terms of pure rendering performance, Nvidia says the 4070 Super should outperform an RTX 3090—impressive, given that the original 4070 was closer to an RTX 3080. The RTX 4080 Super is said to be roughly twice as fast as an RTX 3080, and Nvidia says the RTX 4070 Ti Super will be roughly 2.5 times faster than a 3070 Ti.

Though all three of these cards provide substantially more value than their non-Super predecessors at the same prices, the fact remains that prices have still gone up compared to past generations. Nvidia last released a Super refresh during the RTX 20-series back in 2019; the RTX 2080 Super went for $699 and the 2070 Super for $499. But the 4080 Super, 4070 Ti Super, and 4070 Super will give you more for your money than you could get before, which is at least a move in the right direction.



$329 Radeon 7600 XT brings 16GB of memory to AMD’s latest midrange GPU

more rams —

Updated 7600 XT also bumps up clock speeds and power requirements.

The new Radeon RX 7600 XT mostly just adds extra memory, though clock speeds and power requirements have also increased somewhat.

AMD

Graphics card buyers seem anxious about making sure they buy a GPU with enough memory, even in midrange graphics cards that aren’t otherwise equipped to play games at super-high resolutions. And while this anxiety tends to be a bit overblown—lots of first- and third-party testing of cards like the GeForce RTX 4060 Ti shows that just a handful of games benefit when all you do is boost GPU memory from 8GB to 16GB—there’s still a market for less-expensive GPUs with big pools of memory, whether you’re playing games that need it or running compute tasks that benefit from it.

That’s the apparent impetus behind AMD’s sole GPU announcement from its slate of CES news today: the $329 Radeon RX 7600 XT, a version of last year’s $269 RX 7600 with twice as much memory, slightly higher clock speeds, and higher power use to go with it.

|                                   | RX 7700 XT | RX 7600    | RX 7600 XT | RX 6600    | RX 6600 XT | RX 6650 XT | RX 6750 XT |
| Compute units (Stream processors) | 54 (3,456) | 32 (2,048) | 32 (2,048) | 28 (1,792) | 32 (2,048) | 32 (2,048) | 40 (2,560) |
| Boost Clock                       | 2,544 MHz  | 2,600 MHz  | 2,760 MHz  | 2,490 MHz  | 2,589 MHz  | 2,635 MHz  | 2,600 MHz  |
| Memory Bus Width                  | 192-bit    | 128-bit    | 128-bit    | 128-bit    | 128-bit    | 128-bit    | 192-bit    |
| Memory Clock                      | 2,250 MHz  | 2,250 MHz  | 2,250 MHz  | 1,750 MHz  | 2,000 MHz  | 2,190 MHz  | 2,250 MHz  |
| Memory size                       | 12GB GDDR6 | 8GB GDDR6  | 16GB GDDR6 | 8GB GDDR6  | 8GB GDDR6  | 8GB GDDR6  | 12GB GDDR6 |
| Total board power (TBP)           | 245 W      | 165 W      | 190 W      | 132 W      | 160 W      | 180 W      | 250 W      |

The core specifications of the 7600 XT remain the same as the regular 7600: 32 of AMD’s compute units (CUs) based on the RDNA3 GPU architecture and the same memory clock speed attached to the same 128-bit memory bus. But RAM has been boosted from 8GB to 16GB, and the GPU’s clock speeds have been boosted a little, ensuring that the card runs games a little faster than the regular 7600, even in games that don’t care about the extra memory.

Images of AMD’s reference design show a slightly larger card than the regular 7600, with a second 8-pin power connector to provide the extra power (total board power increases from 165 W to 190 W). The only other difference between the cards is DisplayPort 2.1 support—it was optional in the regular RX 7600, but all 7600 XTs will have it. That brings it in line with all the other RX 7000-series GPUs.

  • AMD’s hand-picked benchmarks generally show a mild performance improvement over the RX 7600, though Forza is an outlier.

    AMD

  • The 7600 XT’s performance relative to Nvidia’s RTX 4060 is also a little better than the RX 7600’s, thanks to added RAM and higher clocks. But Nvidia should continue to benefit from superior ray-tracing performance in a lot of games.

    AMD

  • Testing against the 4060 at 1440p. Note that the longest bars are coming from games with FSR 3 frame-generation enabled and that Nvidia’s cards also support DLSS 3.

    AMD

  • The complete RX 7000-series lineup.

    AMD

AMD’s provided performance figures show the 7600 XT outrunning the regular 7600 by between 5 and 10 percent in most titles, with one—Forza Horizon 5 with ray-tracing turned all the way up—showing a more significant jump of around 40 percent at 1080p and 1440p. Whether that kind of performance jump is worth the extra $60 depends on the games you play and how worried you are about the system requirements in future games.

AMD says the RX 7600 XT will be available starting on January 24. Pricing and availability for other RX 7000-series GPUs, including the regular RX 7600, aren’t changing.



AMD launches Ryzen 8000G desktop CPUs, with updated iGPUs and AI acceleration

AMD’s first Ryzen 8000 desktop processors are what the company used to call “APUs,” a combination of a fast integrated GPU and a reasonably capable CPU.

AMD

AMD’s G-series Ryzen desktop processors have always been a bit odd—a little behind the curve on AMD’s latest CPU architectures, but with integrated graphics performance that’s enough for a tiny and/or cheap gaming desktop without a dedicated graphics card. They’re also usually updated much more slowly than AMD’s other desktop Ryzens. Today, AMD is announcing a new lineup of Ryzen 8000G processors, chips that should provide a substantial boost over 2021’s Ryzen 5000G chips as long as you don’t mind buying a new socket AM5 motherboard and RAM to go with them.

Three new processors are being released on January 31. The most powerful is the $329 Ryzen 7 8700G, an 8-core CPU with a Radeon 780M GPU. The next step down, and probably the best combination of price and performance, is the $229 6-core Ryzen 5 8600G, which comes with a slightly slower Radeon 760M GPU.

At the bottom of the range is the $179 Ryzen 5 8500G. It also includes six CPU cores, but with a wrinkle: two of those cores are regular Zen 4 cores, while four are smaller “Zen 4c” cores that are optimized to save space rather than run at high clock speeds. Zen 4c can do exactly the same things as Zen 4, but Zen 4c won’t be as fast, something to be aware of when you’re comparing the chips. The 8500G includes a Radeon 740M GPU.

The Radeon 780M uses 12 of AMD’s compute units (CUs), based on the same RDNA3 graphics architecture as the Radeon RX 7000 series dedicated graphics cards. The 760M only has eight of these CUs enabled, while the Radeon 740M uses four. All four CPUs have a TDP of 65W, which can be adjusted up and down if you have a socket AM5 motherboard with a B650 or X670 chipset.

| CPU            | MSRP/Street price | CPU/GPU Arch           | Cores/threads | Radeon GPU     | Clocks (Base/Boost, GHz) | Total cache (L2+L3) |
| Ryzen 7 8700G  | $329              | Zen 4/RDNA3            | 8c/16t        | 780M (12 CU)   | 4.2/5.1                  | 24MB                |
| Ryzen 7 7700   | $329              | Zen 4/RDNA2            | 8c/16t        | Radeon (2 CU)  | 3.8/5.3                  | 40MB                |
| Ryzen 7 5700G  | $198              | Zen 3/Vega             | 8c/16t        | Radeon (8 CU)  | 3.8/4.6                  | 20MB                |
| Ryzen 5 8600G  | $229              | Zen 4/RDNA3            | 6c/12t        | 760M (8 CU)    | 4.3/5.0                  | 22MB                |
| Ryzen 5 7600   | $229              | Zen 4/RDNA2            | 6c/12t        | Radeon (2 CU)  | 3.8/5.1                  | 38MB                |
| Ryzen 5 5600G  | $150              | Zen 3/Vega             | 6c/12t        | Radeon (7 CU)  | 3.9/4.4                  | 19MB                |
| Ryzen 5 5600GT | $140              | Zen 3/Vega             | 6c/12t        | Radeon (7 CU)  | 3.6/4.6                  | 19MB                |
| Ryzen 5 8500G  | $179              | Zen 4 and Zen 4c/RDNA3 | 6c/12t        | 740M (4 CU)    | 3.5/5.0                  | 22MB                |
| Ryzen 5 5500GT | $125              | Zen 3/Vega             | 6c/12t        | Radeon (? CUs) | 3.6/4.4                  | 19MB                |

A fourth processor, the quad-core Ryzen 8300G, will be available exclusively through PC OEMs. Expect to see it in lower-end desktop systems from the likes of HP and others, but you won’t be able to buy it at retail. It uses one large Zen 4 CPU core and three small Zen 4c cores.

The Ryzen 8700G and 8600G are priced at the exact same level as the 7700 and 7600, which have the same CPU architecture and core count. If you’re trying to decide which one to buy, note that the Ryzen 7000 chips’ higher boost clock speeds and larger pools of cache will help them outperform the 8000G processors, so they’re the ones to get if you plan to install a dedicated GPU right away or you just don’t care about integrated graphics performance.



AMD releases even more Ryzen 5000 CPUs, keeps its last-gen AM4 platform alive

the long goodbye —

New-old chips stick with the aging Zen 3, but could be good CPU upgrade options.

Four new Ryzen 5000 CPUs, all riffs on existing Ryzen 5000 CPUs.

AMD

AMD announced the first Ryzen 8000 desktop processors today: a new lineup of socket AM5 CPUs that bring RDNA 3 integrated GPUs and an AI-accelerating NPU to its desktop platform for the first time. But the company also spent some time on new budget chips for its last-generation AM4 platform. The four new Ryzen 5000 processors cover everything from budget office desktops with integrated GPUs to cost-conscious gaming systems.

At the top of the range is the Ryzen 7 5700X3D, an 8-core CPU with an extra 64MB slab of L3 cache stacked on top of the main CPU die. At $249, it will be a little over $100 cheaper than the 5800X3D, with the same core count and cache size but a slightly lower maximum clock speed (4.1 GHz, down from 4.5 GHz). AMD compared it favorably to the Core i5-13600K in gaming workloads, a chip that currently retails for a bit over $280.

The Ryzen 7 5700 is a $175 8-core processor without 3D V-Cache that should still perform reasonably well in most workloads, though AMD’s spec sheet says that it has less cache than the 5700X and only supports PCI Express 3.0 instead of PCIe 4.0. This indicates that the 5700 is actually a 5700G with the integrated graphics disabled; it will be a bit slower than the Ryzen 5700X, despite their similar names, core counts, and clock speeds. The Ryzen 5 5600GT and 5500GT are 6- and 4-core chips with Vega-based integrated graphics, both intended for lower-end systems. At $140 and $125, they essentially amount to minor clock speed bumps for the existing Ryzen 5 5600G and Ryzen 3 5300G.

The new chips are the latest in a surprisingly long line of last hurrahs. Early 2022 brought us some new budget processors and the Ryzen 5800X3D, just a few months before the AM5 platform launched. And in mid-2023, AMD released a limited-edition Ryzen 5600X3D for people who could get to a local Micro Center store and buy one (as of this writing, a quick spot-check of several East Coast Micro Centers showed that 5600X3D chips were still broadly available).

It’s hard to recommend that anyone building a new PC go with the socket AM4 platform at this point—even these “new” chips are still using the old Zen 3 architecture and are broadly similar to older products that have been available since late 2020. But they’re still decent cost-efficient upgrade options for people who already have an AM4 motherboard that they use with a Ryzen 1000, 2000, or 3000 processor; if you upgrade from a Ryzen 1000-series chip, it will also help your PC meet Windows 11’s official system requirements, if that’s something you care about.

“AM4 continues to be a key part of our product portfolio,” AMD PR Manager Matthew Hurwitz told Ars when asked why AMD was still releasing new Ryzen 5000 CPUs. “New SKUs give users more options to fit their budget or use case.”

The complete, small-print list of all the AM4 and AM5 processors AMD will offer as of late January.

AMD

Hurwitz also told us that, unlike the 5600X3D, there would be no availability limitations for any of these new Ryzen 5000 chips. The company also doesn’t immediately plan to discontinue any other Ryzen 5000 CPUs that are still being sold, though “there is always a natural shift from older to newer SKUs as time passes.”

These new-old chips will all be available to purchase starting on January 31. We can at least be thankful that, unlike AMD’s laptop CPUs, the model numbers of these processors aren’t changing just because of the year they were released.

Listing image by AMD
