Author name: Mike M.

M4 iPad Pro review: Well, now you’re just showing off

The 2024, M4-equipped 13-inch iPad Pro.

Samuel Axon

The new iPad Pro is a technical marvel, with one of the best screens I’ve ever seen, performance that few other machines can touch, and a new, thinner design that no one expected.

It’s a prime example of Apple flexing its engineering and design muscles for all to see. Since it marks the company’s first foray into OLED beyond the iPhone or Watch, and the first time a new M-series chip has debuted on something other than a Mac, it comes across as a tech demo for where the company is headed beyond just tablets.

Still, it remains unclear why most people would spend one, two, or even three thousand dollars on a tablet that, despite its amazing hardware, does less than a comparably priced laptop—or at least does it a little more awkwardly, even if it’s impressively quick and has a gorgeous screen.

Specifications

There are some notable design changes in the 2024 iPad Pro, but really, it’s all about the specs—and it’s a bigger spec jump than usual in a couple of areas.

M4

First up, there’s the M4 chip. The previous iPad Pro had an M2 chip, and the latest Mac chip is the M3, so not only did the iPad Pro jump two whole generations, but this is the first time it has debuted the newest iteration of Apple Silicon. (Previously, new M-series chips launched on the Mac first and came to the iPad Pro a few months later.)

Using second-generation 3 nm tech, the M4’s top configuration has a 10-core CPU, a 10-core GPU, and a 16-core NPU. In that configuration, the 10-core CPU has four performance cores and six efficiency cores.

A lower configuration of the M4 has just nine CPU cores—three performance and six efficiency. Which one you get is tied to how much storage you buy. 256GB and 512GB models get nine CPU cores, while 1TB and 2TB get 10. Additionally, the two smaller storage sizes have 8GB of RAM to the larger ones’ 16GB.

This isn’t the first time Apple has tied RAM to storage configurations, but doing that with CPU cores is new for the iPad. Fortunately, the company is upfront about all this in its specs sheet, whereas the RAM differentiation wasn’t always clear to buyers in the past. (Both configurations claim 120GB/s memory bandwidth, though.)
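The storage-to-chip mapping described above can be summarized as data. A small sketch (values are from Apple's published specs as described in this review; the field names are illustrative):

```python
# Mapping of 2024 iPad Pro storage tiers to the M4 configuration each one
# gets, per Apple's spec sheet as summarized above. Field names are
# illustrative, not an Apple API.
M4_CONFIGS = {
    "256GB": {"cpu_cores": 9,  "p_cores": 3, "e_cores": 6, "ram_gb": 8},
    "512GB": {"cpu_cores": 9,  "p_cores": 3, "e_cores": 6, "ram_gb": 8},
    "1TB":   {"cpu_cores": 10, "p_cores": 4, "e_cores": 6, "ram_gb": 16},
    "2TB":   {"cpu_cores": 10, "p_cores": 4, "e_cores": 6, "ram_gb": 16},
}

def config_for(storage: str) -> dict:
    """Return the CPU/RAM configuration tied to a given storage tier."""
    return M4_CONFIGS[storage]
```

Both tiers claim the same 120GB/s memory bandwidth, so only core count and RAM vary with storage.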

Can the M4 help the iPad Pro bridge the gap between laptop and tablet? Mostly, it made me excited to see the M4 in a laptop.

Samuel Axon

Regardless of the specific configuration, the M4 promises substantially better CPU and GPU performance than the M2, and it supports hardware-accelerated ray-tracing via Metal, which some games and applications can take advantage of if developers put in the work to make it happen. (It looked great in a demo of Diablo Immortal I saw, but it’s unclear how often we’ll actually see it in the wild.)

Apple claims 1.5x faster CPU performance than the M2 and up to 4x faster graphics performance specifically on applications that involve new features like ray-tracing or hardware-accelerated mesh shading. It hasn’t made any specific GPU performance claims beyond those narrow cases.

Much of the attention from both Apple and the media is focused on the Neural Engine, which is what Apple calls the NPU in the M-series chips. That’s because the company is expected to announce several large language model-based AI features in iOS, macOS, and iPadOS at its developer conference next month, and this is the chip that will power some of that on the iPad and Mac.

Some neat machine-learning features are already possible on the M4—you can generate audio tracks using certain instruments in your Logic Pro projects, apply tons of image optimizations to photos with just a click or two, and so on.

M2 iPad Air review: The everything iPad

breath of fresh air —

M2 Air won’t draw new buyers in, but if you like iPads, these do all you need.

  • The new 13-inch iPad Air with the Apple M2 processor inside.

    Andrew Cunningham

  • In portrait mode. The 13-inch model is a little large for dedicated tablet use, but if you do want a gigantic tablet, the $799 price is appealing.

    Andrew Cunningham

  • The Apple Pencil Pro attaches, pairs, and charges via a magnetic connection on the edge of the iPad.

    Andrew Cunningham

  • In the Magic Keyboard. This kickstand-less case is still probably the best way to make the iPad into a true laptop replacement, though it’s expensive and iPadOS is still a problem.

    Andrew Cunningham

  • The tablet’s USB-C port, used for charging and connecting to external accessories.

    Andrew Cunningham

  • Apple’s Smart Folio case. The magnets on the cover will scoot up and down the back of the iPad, allowing you a bit of flexibility when angling the screen.

    Andrew Cunningham

  • The Air’s single-lens, flash-free camera, seen here peeking through the Smart Folio case.

    Andrew Cunningham

The iPad Air has been a lot of things in the last decade-plus. In 2013 and 2014, the first iPad Airs were just The iPad, and the “Air” label simply denoted how much lighter and more streamlined they were than the initial 2010 iPad and 2011’s long-lived iPad 2. After that, the iPad Air 2 survived for years as an entry-level model, as Apple focused on introducing and building out the iPad Pro.

The Air disappeared for a while after that, but it returned in 2019 as an in-betweener model to bridge the gap between the $329 iPad (no longer called “Air,” despite reusing the first-gen Air design) and more-expensive and increasingly powerful iPad Pros. It definitely made sense to have a hardware offering to span the gap between the basic no-frills iPad and the iPad Pro, but pricing and specs could make things complicated. The main issue for the last couple of years has been the base Air’s 64GB of storage—scanty enough that memory swapping doesn’t even work on it—and the fact that stepping up to 256GB brought the Air too close to the price of the 11-inch iPad Pro.

Which brings us to the 2024 M2 iPad Air, now available in 11-inch and 13-inch models for $599 and $799, respectively. Apple solved the overlap problem this year partly by bumping the Air’s base storage to a more usable 128GB and partly by making the 11-inch iPad Pro so much more expensive that it almost entirely eliminates any pricing overlap (only the 1TB 11-inch Air, at $1,099, is more expensive than the cheapest 11-inch iPad Pro).

I’m not sure I’d go so far as to call the new Airs the “default” iPad for most buyers—the now-$349 10th-gen iPad still does everything the iPad is best at for less money, and it’s still all you really need if you just want a casual gaming, video streaming, and browsing tablet (or a tablet for a kid). But the M2 Air is the iPad that best covers the totality of everything the iPad can do from its awkward perch, stuck halfway between the form and function of the iPhone and the Mac.

Not quite a last-gen iPad Pro

The new iPad Airs have a lot in common with the M2 iPad Pro from 2022. They have the same screen sizes and resolutions and the same basic design; they work with the same older Magic Keyboard accessories (not the new ones with the function rows, metal palm rests, and larger trackpads, which are reserved for the iPad Pro); and they obviously have the same Apple M2 chip.

Performance-wise, nothing we saw in the benchmarks we ran was surprising; the M2’s CPU and (especially) its GPU are a solid generational jump up from the M1, and the M1 is already generally overkill for the vast majority of iPad apps. The M3 and M4 are both significantly faster than the M2, but the M2 is still unquestionably powerful enough to do everything people currently use iPads to do.

That said, Apple’s decision to use an older chip rather than the M3 or M4 does mean the new Airs come into the world missing some capabilities that have come to other Apple products announced in the last six months or so. That list includes hardware-accelerated ray-tracing on the GPU, hardware-accelerated AV1 video codec decoding, and, most importantly, a faster Neural Engine to help power whatever AI stuff Apple’s products pick up in this fall’s big software updates.

The 13-inch Air’s screen has the same resolution and pixel density (2732×2048, 264 PPI) as the last-generation 12.9-inch iPad Pro. And unlike the 13-inch Pro, which truly is a 13-inch screen, Apple’s tech specs page says the 13-inch Air is still using a 12.9-inch screen, and Apple is just rounding up to get to 13.
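Those numbers are self-consistent: pixel density is just the diagonal pixel count divided by the diagonal size in inches. A quick back-of-the-envelope check:

```python
import math

# Verify the quoted density: a 2732x2048 panel on a 12.9-inch diagonal.
def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count over diagonal length."""
    return math.hypot(width_px, height_px) / diagonal_in

density = ppi(2732, 2048, 12.9)  # ~264.7, which Apple rounds to 264 PPI
```

The same math also shows why Apple can call a 12.9-inch panel "13-inch": the rounding hides less than a tenth of an inch.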

The 13-inch Air display does share some other things with the last-generation iPad Pro screen, including P3 color and a 600-nit peak brightness. Its display panel has been laminated to the front glass, and it has an anti-reflective coating (two of the subtle but important quality improvements the Air has that the $349 10th-gen iPad doesn’t). But otherwise it’s not the same panel as the M2 Pro; there’s no mini LED, no HDR support, and no 120 Hz ProMotion support.

Black Basta ransomware group is imperiling critical infrastructure, groups warn

Getty Images

Federal agencies, health care associations, and security researchers are warning that a ransomware group tracked under the name Black Basta is ravaging critical infrastructure sectors in attacks that have targeted more than 500 organizations in the past two years.

One of the latest casualties of the native Russian-speaking group, according to CNN, is Ascension, a St. Louis-based health care system that includes 140 hospitals in 19 states. A network intrusion that struck the nonprofit last week took down many of its automated processes for handling patient care, including its systems for managing electronic health records and ordering tests, procedures, and medications. In the aftermath, Ascension has diverted ambulances from some of its hospitals and relied on manual processes.

“Severe operational disruptions”

In an advisory published Friday, the FBI and the Cybersecurity and Infrastructure Security Agency said Black Basta has victimized 12 of the country’s 16 critical infrastructure sectors in attacks it has mounted on 500 organizations spanning the globe. The nonprofit health care association Health-ISAC issued its own advisory the same day, warning that the organizations it represents are especially desirable targets of the group.

“The notorious ransomware group, Black Basta, has recently accelerated attacks against the healthcare sector,” the advisory stated. It went on to say: “In the past month, at least two healthcare organizations, in Europe and in the United States, have fallen victim to Black Basta ransomware and have suffered severe operational disruptions.”

Black Basta has been operating since 2022 under what is known as the ransomware-as-a-service model. Under this model, a core group creates the infrastructure and malware that, once an initial intrusion is made, infects systems throughout a network while simultaneously encrypting critical data and exfiltrating it. Affiliates do the actual hacking, which typically involves either phishing or other social engineering, or exploiting security vulnerabilities in software used by the target. The core group and affiliates divide any revenue that results.

Recently, researchers from security firm Rapid7 observed Black Basta using a technique they had never seen before. The end goal was to trick employees at targeted organizations into installing malicious software on their systems. On Monday, Rapid7 analysts Tyler McGraw, Thomas Elkins, and Evan McCann reported:

Since late April 2024, Rapid7 identified multiple cases of a novel social engineering campaign. The attacks begin with a group of users in the target environment receiving a large volume of spam emails. In all observed cases, the spam was significant enough to overwhelm the email protection solutions in place and arrived in the user’s inbox. Rapid7 determined many of the emails themselves were not malicious, but rather consisted of newsletter sign-up confirmation emails from numerous legitimate organizations across the world.

Example spam email

Rapid7

With the emails sent, and the impacted users struggling to handle the volume of the spam, the threat actor then began to cycle through calling impacted users posing as a member of their organization’s IT team reaching out to offer support for their email issues. For each user they called, the threat actor attempted to socially engineer the user into providing remote access to their computer through the use of legitimate remote monitoring and management solutions. In all observed cases, Rapid7 determined initial access was facilitated by either the download and execution of the commonly abused RMM solution AnyDesk, or the built-in Windows remote support utility Quick Assist.

In the event the threat actor’s social engineering attempts were unsuccessful in getting a user to provide remote access, Rapid7 observed they immediately moved on to another user who had been targeted with their mass spam emails.

NOAA says ‘extreme’ solar storm will persist through the weekend

Bright lights —

So far disruptions from the geomagnetic storm appear to be manageable.

Pink lights appear in the sky above College Station, Texas.

ZoeAnn Bailey

After a night of stunning auroras across much of the United States and Europe on Friday, a severe geomagnetic storm is likely to continue through at least Sunday, forecasters said.

The Space Weather Prediction Center at the National Oceanic and Atmospheric Administration observed that “extreme” G5 conditions were ongoing as of Saturday morning due to heightened solar activity.

“The threat of additional strong flares and CMEs (coronal mass ejections) will remain until the large and magnetically complex sunspot cluster rotates out of view over the next several days,” the agency posted in an update on the social media site X on Saturday morning.

Good and bad effects

For many observers on Friday night, the heightened solar activity was welcome. Large areas of the United States, Europe, and other locations unaccustomed to displays of the aurora borealis saw vivid lights as energetic charged particles from the solar storm passed through Earth’s atmosphere. Brilliantly pink skies were observed as far south as Texas. Given the forecast for ongoing solar activity, another night of extended northern lights is possible on Saturday.

There were also some harmful effects. According to NOAA, there have been some irregularities in power grid transmissions, and degraded satellite communications and GPS services. Users of SpaceX’s Starlink satellite internet constellation have reported slower download speeds. Early on Saturday morning, SpaceX founder Elon Musk said the company’s Starlink satellites were “under a lot of pressure, but holding up so far.”

This is the most intense solar storm recorded in more than two decades. The last G5 event—the most extreme category of such storms—occurred in October 2003, when there were electricity issues reported in Sweden and South Africa.

Should this storm intensify over the next day or two, scientists say the major risks include more widespread power blackouts, disabled satellites, and long-term damage to GPS networks.

Cause of these storms

Such storms are triggered when the Sun ejects a significant amount of its magnetic field and plasma into the solar wind. The underlying causes of these coronal mass ejections, deeper in the Sun, are not fully understood. But it is hoped that data collected by NASA’s Parker Solar Probe and other observations will help scientists better understand and predict such phenomena.

When these coronal mass ejections reach Earth’s magnetic field, they disturb it and can induce significant currents in electricity lines and transformers, leading to damage or outages.
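As a toy illustration of that coupling (the numbers below are illustrative assumptions, not NOAA figures): the disturbed field drives a quasi-DC geoelectric field along the ground, and the voltage impressed across a long transmission line is roughly the field strength times the line length.

```python
# Toy sketch of geomagnetically induced currents: a storm-driven geoelectric
# field E (volts per kilometer) along a transmission line of length L
# (kilometers) impresses a quasi-DC voltage of roughly E * L across it.
def induced_voltage(field_v_per_km: float, line_length_km: float) -> float:
    return field_v_per_km * line_length_km

# A strong storm can drive ground-level fields on the order of 1 V/km; over
# a 500 km line, that is hundreds of volts of quasi-DC bias pushed into grid
# transformers, which are designed for AC, not sustained DC offsets.
voltage = induced_voltage(1.0, 500.0)  # 500 volts
```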

The most intense geomagnetic storm on record occurred in 1859, during the so-called Carrington Event. It produced auroral lights around the world and caused fires in multiple telegraph stations—at the time, there were 125,000 miles of telegraph lines in the world.

According to one research paper on the Carrington Event, “At its height, the aurora was described as a blood or deep crimson red that was so bright that one ‘could read a newspaper by’.”

Is dark matter’s main rival theory dead?

Galaxy rotation has long perplexed scientists.

One of the biggest mysteries in astrophysics today is that the forces in galaxies do not seem to add up. Galaxies rotate much faster than predicted by applying Newton’s law of gravity to their visible matter, despite that law working well everywhere in the Solar System.

To prevent galaxies from flying apart, some additional gravity is needed. This is why the idea of an invisible substance called dark matter was first proposed. But nobody has ever seen the stuff. And there are no particles in the hugely successful Standard Model of particle physics that could be the dark matter—it must be something quite exotic.

This has led to the rival idea that the galactic discrepancies are caused instead by a breakdown of Newton’s laws. The most successful such idea is known as Milgromian dynamics or Mond, proposed by Israeli physicist Mordehai Milgrom in 1982. But our recent research shows this theory is in trouble.

The main postulate of Mond is that gravity starts behaving differently from what Newton expected when it becomes very weak, as at the edges of galaxies. Mond is quite successful at predicting galaxy rotation without any dark matter, and it has a few other successes. But many of these can also be explained with dark matter, preserving Newton’s laws.

So how do we put Mond to a definitive test? We have been pursuing this for many years. The key is that Mond only changes the behavior of gravity at low accelerations, not at a specific distance from an object. You’ll feel lower acceleration on the outskirts of any celestial object—a planet, star, or galaxy—than when you are close to it. But it is the amount of acceleration, rather than the distance, that determines where Mond effects should appear.

This means that, although Mond effects would typically kick in several thousand light years away from a galaxy, if we look at an individual star, the effects would become highly significant at a tenth of a light year. That is only a few thousand times larger than an astronomical unit (AU)—the distance between the Earth and the Sun. But weaker Mond effects should also be detectable at even smaller scales, such as in the outer Solar System.
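Those scales follow directly from Milgrom's acceleration constant, a0 ≈ 1.2×10⁻¹⁰ m/s² (the commonly used value, an assumption here rather than a figure from this article). For a Sun-like star, Newtonian gravity GM/r² drops to a0 at r = sqrt(GM/a0), and a quick back-of-the-envelope check reproduces both numbers in the text:

```python
import math

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30       # solar mass, kg
A0 = 1.2e-10           # Milgrom's acceleration constant, m/s^2
LIGHT_YEAR = 9.461e15  # meters
AU = 1.496e11          # meters

# Radius at which the Newtonian acceleration GM/r^2 falls to a0.
r = math.sqrt(G * M_SUN / A0)

r_ly = r / LIGHT_YEAR  # ~0.11 light years -- the "tenth of a light year"
r_au = r / AU          # ~7,000 AU -- "a few thousand times larger than an AU"
```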

This brings us to the Cassini mission, which orbited Saturn between 2004 and its final fiery crash into the planet in 2017. Saturn orbits the Sun at 10 AU. Due to a quirk of Mond, the gravity from the rest of our galaxy should cause Saturn’s orbit to deviate from the Newtonian expectation in a subtle way.

Cassini orbited Saturn from 2004 to 2017.

This can be tested by timing radio pulses between Earth and Cassini. Since Cassini was orbiting Saturn, those pulses measured the Earth-Saturn distance and allowed us to precisely track Saturn’s orbit. But Cassini did not find any anomaly of the kind expected in Mond. Newton still works well for Saturn.

Cryptmaster is a dark, ridiculous RPG test of your typing and guessing skills

A different kind of text adventure —

Ask a necromancer to lick a shield. Type out “HIT,” “YELL,” “ZAP.” It’s funny.

Sometimes you gotta get your nose in there to remember the distinct aroma of 1980s RPG classics.

Akupara Games

There are people who relish the feeling of finally nailing down a cryptic clue in a crossword. There are also people unduly aggravated by a puzzlemaster’s puns and clever deceptions. I’m more the latter kind. I don’t even play the crossword—or Wordle or Connections or Strands—but my wife does, and she’ll feed me clues. Without fail, they leave me in some strange state of being relieved to finally get it, yet also keyed up and irritated.

Cryptmaster, out now on Steam, GOG, and Itch.io for Windows, seems like the worst possible game for people like me, and yet I dig it. It is many things at once: a word-guessing game, a battle typing (or shouting) challenge, a party-of-four first-person grid-based dungeon crawler, and a text-prompt adventure, complete with an extremely goofy sense of humor. It’s also in stark black and white. You cannot fault this game for a lack of originality, even while it evokes Wizardry, Ultima Underworld, and lots of other arrow-key-moving classics, albeit with an active tongue-in-cheek filter.

Cryptmaster announcement trailer.

The Cryptmaster in question has woken up four role-playing figures—fighter, rogue, bard, and wizard—to help him escape from his underground lair to the surface, for reasons that must be really keen and good. As corpses, you don’t remember any of your old skills, but you can guess them. What’s a four-letter action that a fighter might perform, or a three-letter wizard move? Every time you find a box or treasure, the Cryptmaster opens it, gives you a letter count, then lets you ask for clues. “SMELL,” you type, and he says it has that wonderful old-paper smell. “LOOK,” and he notes that there are writings and drawings on one side. Guess “SCROLL,” and he adds those letters to your characters’ next ability clues. Guess wrong, well, better luck next time.
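That treasure-guessing loop boils down to a letter count plus an exact-match check. A toy sketch (purely illustrative; this is not the game's actual code, and the words and rules are my own stand-ins):

```python
# Minimal model of the guess loop described above: the Cryptmaster reveals
# only a letter count, and a correct guess banks those letters toward the
# party's next ability words.
def letter_count_hint(word: str) -> str:
    """What the Cryptmaster shows you: one blank per letter."""
    return "_" * len(word)

def check_guess(secret: str, attempt: str) -> bool:
    """Guesses are case-insensitive exact matches."""
    return attempt.strip().upper() == secret.upper()

hint = letter_count_hint("SCROLL")    # six blanks, as in the SCROLL example
correct = check_guess("SCROLL", "scroll")
```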

  • Okay, so none of my characters can get really good prices through group buying, got it.

    Akupara Games

  • Gelatinous cubes, of course, but this one makes you think on the fly about which verbs you can use.

    Akupara Games

  • A lot of the characters in Cryptmaster are, well, characters.

    Akupara Games

  • In case you didn’t get enough word games from the main gameplay, there is a mini card game you can play with its own letters-and-words mechanics.

    Akupara Games

  • Uncovering more verbs reveals more of your dead characters’ past lives.

    Akupara Games

Once you’ve got a few verbs, you’ll want to learn them and figure out how they fit together, because you’ll have to fight some things. Combat is all about typing but also remembering your words and juggling cooldowns, attack, defense, and ability costs. Strike with your fighter, backstab with the rogue, fling a spell from the wizard, and have your bard reset the fighter’s cooldown, all while a baddie very slowly winds up and swings at random party members. Some fights can be avoided by maneuvering around them, but successful fights also let you choose another letter to potentially reveal new verbs. Apologies for the somewhat vague descriptions here, but I’m trying not to give away any words.

There are a few other mechanics to learn, like smashing wall-crawling bugs to gather their ability-powering essence, and defiling shrines to better suit your undead needs. But let’s talk about the Cryptmaster. Saying the character is “voiced” by the game’s writer and co-designer, Lee Williams, truly undersells it. As with some of the best adventure games, Williams and coder/designer/artist Paul Hart have anticipated so, so many things you might type in when prompted to guess, ask, or interact with their gloomy little world. Maybe there’s a point at which the Cryptmaster—a far more dour version of the HBO Cryptkeeper eternally disappointed in you—stops being surprising in his responses. I have yet to find it after a few hours of play. (How the team pulled off such a huge response range is detailed in an interview at Game Developer.)

Go ahead and recapture some of your childhood sense of wonder: Swear at the Cryptmaster. You won’t be disappointed.

You can play the game in turn-based mode, removing the pressure of remembering and typing out actions, but it’s not the recommended setting. While I played with only typing and relished the chance to give my mechanical keyboard a workout, you can also play with voice prompts. If you’re not sure if this is the kind of game for you, there’s a free demo on Steam that should clue you in.

Was that a pun? Maybe. Cryptmaster gave me a bit more appreciation for word-guessing games—the kind with enjoyments that are not easily, shall we say, spelled out.

How the Moon got a makeover

Putting on a new face —

The Moon’s former surface sank to the depths, until volcanism brought it back.

Our Moon may appear to shine peacefully in the night sky, but billions of years ago, it got a makeover from volcanic turmoil.

One question that has gone unanswered for decades is why there is more titanium-rich volcanic rock, containing minerals such as ilmenite, on the near side as opposed to the far side. Now a team of researchers at the University of Arizona’s Lunar and Planetary Laboratory is proposing a possible explanation.

The lunar surface was once flooded by a bubbling magma ocean, and after the magma ocean had hardened, there was an enormous impact on the far side. Heat from this impact spread to the near side and made the crust unstable, causing sheets of heavier and denser minerals on the surface to gradually sink deep into the mantle. These melted again and were belched out by volcanoes. Lava from these eruptions (more of which happened on the near side) ended up in what are now titanium-rich flows of volcanic rock. In other words, the Moon’s old face vanished, only to resurface.

What lies beneath

The region of the Moon in question is known as the Procellarum KREEP Terrane (PKT). KREEP signifies high concentrations of potassium (K), rare earth elements (REE), and phosphorus (P). This is also where ilmenite-rich basalts are found. Both KREEP and the basalts are thought to have first formed when the Moon was cooling from its magma ocean phase. But the region stayed hot, as KREEP also contains high levels of radioactive uranium and thorium.

“The PKT region… represents the most volcanically active region on the Moon as a natural result of the high abundances of heat-producing elements,” the researchers said in a study recently published in Nature Geoscience.

Why is this region located on the near side, while the far side is lacking in KREEP and ilmenite-rich basalts? There was one existing hypothesis that caught the researchers’ attention: it proposed that after the magma ocean hardened on the near side, sheets of these KREEP minerals were too heavy to stay on the surface. They began to sink into the mantle and down to the border between the mantle and core. As they sank, these mineral sheets were thought to have left behind trace amounts of material throughout the mantle.

If the hypothesis is accurate, there should be traces of minerals from the hardened KREEP magma crust in sheet-like configurations beneath the lunar surface, possibly reaching all the way down to the core-mantle boundary.

How could that be tested? Gravity data from the GRAIL (Gravity Recovery and Interior Laboratory) mission to the Moon possibly had the answer. It would allow them to detect gravitational anomalies caused by the higher density of the KREEP rock compared to surrounding materials.

Coming to the surface

GRAIL data had previously revealed that there was a pattern of subsurface gravitational anomalies in the PKT region. This appeared similar to the pattern that the sheets of volcanic rock were predicted to have made as they sank, which is why the research team decided to run a computer simulation of sinking KREEP to see how well the hypothesis matched up with the GRAIL findings.

Sure enough, the simulation ended up forming just about the same pattern as the anomalies GRAIL found. The polygonal pattern seen in both the simulations and GRAIL data most likely means that traces of heavier KREEP and ilmenite-rich basalt layers were left behind beneath the surface as those layers sank due to their density, and GRAIL detected their residue due to their greater gravitational pull. GRAIL also suggested there were many lesser anomalies in the PKT region, which makes sense considering that a large part of the crust is made of volcanic rocks thought to have sunk and left behind residue before they melted and surfaced again through eruptions.

We now also have an idea of when this phenomenon occurred. Impact basins on the surface date to around 4.22 billion years ago (not to be confused with the earlier far-side impact), and since the magma ocean is thought to have hardened before they formed, the researchers think the crust also began to sink before that time.

“The PKT border anomalies provide the most direct physical evidence for the nature of the post-magma ocean… mantle overturn and sinking of ilmenite into the deep interior,” the team said in the same study.

This is just one more bit of information regarding how the Moon evolved and why it is so uneven. The near side once raged with lava that is now volcanic rock, much of which exists in flows called mare (which translates to “sea” in Latin). Most of this volcanic rock, especially in the PKT region, contains rare earth elements.

We can only confirm that there really are traces of ancient crust inside the Moon by collecting actual lunar material from far beneath the surface. When Artemis astronauts are finally able to gather samples of volcanic material from the Moon in situ, who knows what will come to the surface?

Nature Geoscience, 2024. DOI: 10.1038/s41561-024-01408-2

NASA wants a cheaper Mars Sample Return—Boeing proposes most expensive rocket

The Space Launch System rocket lifts off on the Artemis I mission.

NASA is looking for ways to get rock samples back from Mars for less than the $11 billion the agency would need under its own plan, so last month, officials put out a call to industry to propose ideas.

Boeing is the first company to release details about how it would attempt a Mars Sample Return mission. Its study involves a single flight of the Space Launch System (SLS) rocket, the super heavy-lift launcher designed to send astronauts to the Moon on NASA’s Artemis missions.

Jim Green, NASA’s former chief scientist and longtime head of the agency’s planetary science division, presented Boeing’s concept Wednesday at the Humans to Mars summit, an annual event sponsored primarily by traditional space companies. Boeing is the lead contractor for the SLS core stage and upper stage and has pitched the SLS, primarily a crew launch vehicle, as a rocket for military satellites and deep space probes.

All in one

Green, now retired, said the concept he and Boeing engineers propose would reduce the risks of Mars Sample Return. With one mission, there are fewer points of potential failure, he said.

“To reduce mission complexity, this new concept is doing one launch,” Green said.

This argument makes some sense, but the problem is SLS is the most expensive rocket flying today. Even if NASA and Boeing introduce cost-cutting measures, NASA’s inspector general reported last year it’s unlikely the cost of a single SLS launch would fall below $2 billion. The inspector general recommended NASA consider buying commercial rockets as an alternative to SLS for future Artemis missions.

NASA’s Perseverance rover, operating on Mars since February 2021, is collecting soil and rock core samples and sealing them in 43 cigar-size titanium tubes. The rover has dropped the first 10 of these tubes in a depot on the Martian surface that could be retrieved by a future sample return mission. The remaining tubes will likely remain stowed on Perseverance in hopes the rover will directly hand off the samples to the spacecraft that comes to Mars to get them.

Boeing says a single launch of the Space Launch System rocket could carry everything needed for a Mars Sample Return mission.

Boeing

In his remarks, Green touted the benefits of launching a Mars Sample Return mission with a single rocket and a single spacecraft. NASA’s baseline concept involves two launches: one with a US-built lander and a small rocket to boost the rock samples back off the surface of Mars, and another with a European spacecraft to rendezvous with the sample carrier in orbit around Mars and bring the specimens back to Earth.

“This concept is one launch vehicle,” he said. “It’s the SLS. What does it do? It’s carrying a massive payload. What is the payload? It’s a Mars entry and descent aeroshell. It has a propulsive descent module.”

The lander would carry everything needed to get the samples back to Earth. A fetch rover onboard the lander would deploy to drive out and pick up the sample tubes collected by the Perseverance rover. Then, a robotic arm would transfer the sample tubes to a container at the top of a two-stage rocket called the Mars Ascent Vehicle (MAV) sitting on top of the lander. The MAV would have the oomph needed to boost the samples off the surface of Mars and into orbit, then fire engines to target a course back to Earth.

Boeing has no direct experience as a prime contractor for any Mars mission. SpaceX, with its giant Starship rocket designed for eventual Mars missions, and Lockheed Martin, which has built several Mars landers for NASA, are the companies with the technology and expertise that seem to be most useful for Mars Sample Return.

NASA is also collecting ideas for Mars Sample Return from its space centers across the United States. The agency also tasked the Jet Propulsion Laboratory, which was in charge of developing the original dead-on-arrival concept, to come up with a better idea. Later this year, NASA officials will reference these new proposals as they decide how to proceed with Mars Sample Return, with the goal of getting samples back from Mars in the 2030s.

NASA wants a cheaper Mars Sample Return—Boeing proposes most expensive rocket Read More »

more-children-gain-hearing-as-gene-therapy-for-profound-deafness-advances

More children gain hearing as gene therapy for profound deafness advances

Success —

The therapy treats a rare type of deafness, but experts hope it’s a “jumping point.”

Opal Sandy (center), who was born completely deaf because of a rare genetic condition, can now hear unaided for the first time after receiving gene therapy at 11 months old. She is shown with her mother, father, and sister at their home in Eynsham, Oxfordshire, on May 7, 2024.

There are few things more heartwarming than videos of children with deafness gaining the ability to hear, showing them happily turning their heads at the sound of their parents’ voices and joyfully bobbing to newly discovered music. Thanks to recent advances in gene therapy, more kids are getting those sweet and triumphant moments—with no hearing aids or cochlear implants needed.

At the annual conference of the American Society for Gene & Cell Therapy held in Baltimore this week, researchers showed many of those videos to their audiences of experts. On Wednesday, Larry Lustig, an otolaryngologist at Columbia University, presented clinical trial data on two children with profound deafness—the most severe type of deafness—who are now able to hear at normal levels after receiving an experimental gene therapy. One of the children was 11 months old at the time of treatment, making her the youngest child in the world to date to receive gene therapy for genetic deafness.

On Thursday, Yilai Shu, an otolaryngologist at Fudan University in Shanghai, provided a one-year progress report on six children who were treated in the first in-human trial of gene therapy for genetic deafness. Five of the six had their hearing restored.

That trial, like the one Lustig presented, involved treating just one ear in all of the children—a safety precaution for such early trials. But Shu and colleagues have already moved on to both ears, or bilateral treatment. After presenting a progress report on the first trial, Shu presented unpublished early data on five additional patients who participated in the first in-human trial of bilateral treatment. All had bilateral hearing restoration and speech perception improvement.

“The opportunity of providing the full complexity and spectrum of sound in children born with profound genetic deafness is a phenomenon I did not expect to see in my lifetime,” Lustig said in a statement.

Jumping point

Shu and Lustig’s trials are separate, but the treatments are, in broad strokes, similar. Both aim to restore hearing loss caused by mutations in the OTOF gene, which codes for the protein otoferlin. Otoferlin is critical for transmitting sound signals to the brain, playing a key role in synaptic transmission between the ear’s inner hair cells and the auditory nerve. Using gutted adeno-associated viruses as vectors for gene delivery, the therapies provide the inner ear with a functional version of the OTOF gene. Once in the ear, the gene can be translated into functional otoferlin, restoring auditory signaling.

In the trial Lustig presented, the two patients saw a gradual improvement of hearing as otoferlin protein built up after treatment. For the 11-month-old, normal levels of hearing were restored within 24 weeks of treatment. For the second patient, a 4-year-old, improvements were detected at a six-week assessment. In the trial Shu presented, children began seeing hearing improvements at three- and four-week assessments. The children will continue to be followed into the future, which holds some uncertainties. It’s unclear if they will, at some point in their lives, need additional treatments to sustain their hearing. In mice, at least, the treatment lasts for the duration of the animals’ lives—but they only live for a few years.

“We expect this to last a long time,” Lustig said Wednesday. But “we don’t know what’s going to happen and we don’t know whether we can do a second dose. But, probably, I would guess, at some point that would have to be done.”

For now, the treatment is considered low-hanging fruit for the burgeoning field of gene therapy since it targets a severe condition caused by recessive mutations in a single gene. Otoferlin mutations lead to a very specific type of deafness called auditory neuropathy, in which the ear fails to send signals to the brain but works perfectly fine otherwise. This is an ultra-rare form of deafness affecting 1–8 percent of people with deafness globally. Only about 30 to 50 people in the US are born with this type of deafness each year.

However, Lustig calls it a “jumping point.” Now that researchers have shown that this gene therapy can work, “This is going to really spark, we hope, the development of gene therapy for more common types of deafness,” he said.

More children gain hearing as gene therapy for profound deafness advances Read More »

elon-musk’s-x-can’t-invent-its-own-copyright-law,-judge-says

Elon Musk’s X can’t invent its own copyright law, judge says

Who owns X data? Everyone but X —

Judge rules copyright law governs public data scraping, not X’s terms.

Elon Musk’s X can’t invent its own copyright law, judge says

US District Judge William Alsup has dismissed Elon Musk’s X Corp’s lawsuit against Bright Data, a data-scraping company accused of improperly accessing X (formerly Twitter) systems and violating both X’s terms and state laws by scraping and selling data.

X sued Bright Data to stop the company from scraping and selling X data to academic institutes and businesses, including Fortune 500 companies.

According to Alsup, X failed to state a claim while arguing that companies like Bright Data should have to pay X to access public data posted by X users.

“To the extent the claims are based on access to systems, they fail because X Corp. has alleged no more than threadbare recitals,” parroting laws and findings in other cases without providing any supporting evidence, Alsup wrote. “To the extent the claims are based on scraping and selling of data, they fail because they are preempted by federal law,” specifically standing as an “obstacle to the accomplishment and execution of” the Copyright Act.

The judge found that X Corp’s argument exposed a tension between the platform’s desire to control user data while also enjoying the safe harbor of Section 230 of the Communications Decency Act, which allows X to avoid liability for third-party content. If X owned the data, it could perhaps argue it has exclusive rights to control the data, but then it wouldn’t have safe harbor.

“X Corp. wants it both ways: to keep its safe harbors yet exercise a copyright owner’s right to exclude, wresting fees from those who wish to extract and copy X users’ content,” Alsup wrote.

If X got its way, Alsup warned, “X Corp. would entrench its own private copyright system that rivals, even conflicts with, the actual copyright system enacted by Congress” and “yank into its private domain and hold for sale information open to all, exercising a copyright owner’s right to exclude where it has no such right.”

That “would upend the careful balance Congress struck between what copyright owners own and do not own,” Alsup wrote, potentially shrinking the public domain.

“Applying general principles, this order concludes that the extent to which public data may be freely copied from social media platforms, even under the banner of scraping, should generally be governed by the Copyright Act, not by conflicting, ubiquitous terms,” Alsup wrote.

Bright Data CEO Or Lenchner said in a statement provided to Ars that Alsup’s decision had “profound implications in business, research, training of AI models, and beyond.”

“Bright Data has proven that ethical and transparent scraping practices for legitimate business use and social good initiatives are legally sound,” Lenchner said. “Companies that try to control user data intended for public consumption will not win this legal battle.”

Alsup pointed out that X’s lawsuit was “not looking to protect X users’ privacy” but rather to block Bright Data from interfering with its “own sale of its data through a tiered subscription service.”

“X Corp. is happy to allow the extraction and copying of X users’ content so long as it gets paid,” Alsup wrote.

Amid a sea of vague claims that scraping is “unfair,” the most glaring deficiency in X’s complaint, Alsup suggested, was its failure to allege that Bright Data’s scraping impaired its services or that X suffered any damages.

“There are no allegations of servers harmed or identities misrepresented,” Alsup wrote. “Additionally, there are no allegations of any damage resulting from automated or unauthorized access.”

X will be allowed to amend its complaint and appeal. The case may be strengthened if X can show evidence of damages or prove that the scraping overburdened X or otherwise deprived X users of their use of the platform in a way that could damage X’s reputation.

But as it currently stands, X’s arguments in many ways appear rather “bare,” Alsup wrote, while its terms of service make crystal clear to users that “[w]hat’s yours is yours—you own your Content.”

By attempting to exclude Bright Data from accessing public X posts owned by X users, X also nearly “obliterated” the “fair use” provision of the Copyright Act, “flouting” Congress’ intent in passing the law, Alsup wrote.

“Only by receiving permission and paying X Corp. could Bright Data, its customers, and other X users freely reproduce, adapt, distribute, and display what might (or might not) be available for taking and selling as fair use,” Alsup wrote. “Thus, Bright Data, its customers, and other X users who wanted to make fair use of copyrighted content would not be able to do so.”

A win for X could have had dire consequences for the Internet, Alsup suggested. In dismissing the complaint, Alsup cited an appeals court ruling that giving social media companies “free rein to decide, on any basis, who can collect and use data—data that the companies do not own, that they otherwise make publicly available to viewers, and that the companies themselves collect and use—risks the possible creation of information monopolies that would disserve the public interest.”

Because that outcome was averted, Lenchner is celebrating Bright Data’s win.

“Bright Data’s victory over X makes it clear to the world that public information on the web belongs to all of us, and any attempt to deny the public access will fail,” Lenchner said.

In 2023, Bright Data won a similar lawsuit lobbed by Meta over scraping public Facebook and Instagram data. These lawsuits, Lenchner alleged, “are used as a monetary weapon to discourage collecting public data from sites, so conglomerates can hoard user-generated public data.”

“Courts recognize this and the risks it poses of information monopolies and ownership of the Internet,” Lenchner said.

X did not respond to Ars’ request to comment.

Elon Musk’s X can’t invent its own copyright law, judge says Read More »

how-you-can-make-cold-brew-coffee-in-under-3-minutes-using-ultrasound

How you can make cold-brew coffee in under 3 minutes using ultrasound

Save yourself a few hours —

A “sonication” time between 1 and 3 minutes is ideal to get the perfect cold brew.

UNSW Sydney engineers developed a new way to make cold brew coffee in under three minutes without sacrificing taste.

University of New South Wales, Sydney

Diehard fans of cold-brew coffee put in a lot of time and effort for their preferred caffeinated beverage. But engineers at the University of New South Wales, Sydney, figured out a nifty hack. They rejiggered an existing espresso machine to accommodate an ultrasonic transducer that administers ultrasonic pulses, reducing the brewing time from the typical 12 to 24 hours to just under three minutes, according to a new paper published in the journal Ultrasonics Sonochemistry.

As previously reported, rather than pouring boiling or near-boiling water over coffee grounds and steeping for a few minutes, the cold-brew method involves mixing coffee grounds with room-temperature water and letting the mixture steep for anywhere from several hours to two days. The mixture is then strained through a sieve to remove the sludge-like solids and passed through a finer filter. This can be done at home in a Mason jar, or you can get fancy and use a French press or a more elaborate Toddy system. It’s not necessarily served cold (although it can be)—just brewed cold.

The result is coffee that tastes less bitter than traditionally brewed coffee. “There’s nothing like it,” co-author Francisco Trujillo of UNSW Sydney told New Scientist. “The flavor is nice, the aroma is nice and the mouthfeel is more viscous and there’s less bitterness than a regular espresso shot. And it has a level of acidity that people seem to like. It’s now my favorite way to drink coffee.”

While there have been plenty of scientific studies delving into the chemistry of coffee, only a handful have focused specifically on cold-brew coffee. For instance, a 2018 study by scientists at Thomas Jefferson University in Philadelphia involved measuring levels of acidity and antioxidants in batches of cold- and hot-brew coffee. But those experiments only used lightly roasted coffee beans. The degree of roasting (temperature) makes a significant difference when it comes to hot-brew coffee. Might the same be true for cold-brew coffee?

To find out, the same team decided in 2020 to explore the extraction yields of light-, medium-, and dark-roast coffee beans during the cold-brew process. They used the cold-brew recipe from The New York Times for their experiments, with a water-to-coffee ratio of 10:1 for both cold- and hot-brew batches. (Hot brew normally has a water-to-coffee ratio of 20:1, but the team wanted to control variables as much as possible.) They carefully controlled when water was added to the coffee grounds, how long to shake (or stir) the solution, and how best to press the cold-brew coffee.

The team found that for the lighter roasts, caffeine content and antioxidant levels were roughly the same in both the hot- and cold-brew batches. However, there were significant differences between the two methods when medium- and dark-roast coffee beans were used. Specifically, the hot-brew method extracts more antioxidants from the grind; the darker the bean, the greater the difference. Both hot- and cold-brew batches become less acidic the darker the roast.

The new faster cold brew system subjects coffee grounds in the filter basket to ultrasonic sound waves from a transducer, via a specially adapted horn.

UNSW/Francisco Trujillo

That gives cold brew fans a few handy tips, but the process remains incredibly time-consuming; only true aficionados have the patience required to cold brew their own morning cuppa. Many coffee houses now offer cold brews, but making them requires expensive, large semi-industrial brewing units and a good deal of refrigeration space. According to Trujillo, the inspiration for using ultrasound to speed up the process arose from research attempts to extract more antioxidants from coffee. Those experiments ultimately failed, but the setup produced very good coffee.

Trujillo et al. used a Breville Dual Boiler BES920 espresso machine for their latest experiments, with a few key modifications. They connected a bolt-clamped transducer to the brewing basket with a metal horn, then used the transducer to inject 38.8 kHz sound waves through the basket walls at several different points, transforming the filter basket into a powerful ultrasonic reactor.

The team used the machine’s original boiler but set it up to be independently controlled with an integrated circuit to better manage the temperature of the water. As for the coffee beans, they picked Campos Coffee’s Caramel & Rich Blend (a medium roast). “This blend combines fresh, high-quality specialty coffee beans from Ethiopia, Kenya, and Colombia, and the roasted beans deliver sweet caramel, butterscotch, and milk chocolate flavors,” the authors wrote.

There were three types of samples for the experiments: cold brew hit with ultrasound at room temperature for one minute or for three minutes, and cold brew prepared with the usual 24-hour process. For the ultrasonic brews, the beans were ground into a fine grind typical for espresso, while a slightly coarser grind was used for the traditional cold-brew coffee.

How you can make cold-brew coffee in under 3 minutes using ultrasound Read More »

big-three-carriers-pay-$10m-to-settle-claims-of-false-“unlimited”-advertising

Big Three carriers pay $10M to settle claims of false “unlimited” advertising

False advertising —

States obtain settlement, but it’s unclear whether consumers will get refunds.

T-Mobile, Verizon, and AT&T will pay a combined $10.2 million in a settlement with US states that alleged the carriers falsely advertised wireless plans as “unlimited” and phones as “free.” The deal was announced yesterday by New York Attorney General Letitia James.

“A multistate investigation found that the companies made false claims in advertisements in New York and across the nation, including misrepresentations about ‘unlimited’ data plans that were in fact limited and had reduced quality and speed after a certain limit was reached by the user,” the announcement said.

T-Mobile and Verizon agreed to pay $4.1 million each while AT&T agreed to pay a little over $2 million. The settlement includes AT&T subsidiary Cricket Wireless and Verizon subsidiary TracFone.

The settlement involves 49 of the 50 US states (Florida did not participate) and the District of Columbia. The states’ investigation found that the three major carriers “made several misleading claims in their advertising, including misrepresenting ‘unlimited’ data plans that were actually limited, offering ‘free’ phones that came at a cost, and making false promises about switching to different wireless carrier plans.”

“AT&T, Verizon, and T-Mobile lied to millions of consumers, making false promises of free phones and ‘unlimited’ data plans that were simply untrue,” James said. “Big companies are not excused from following the law and cannot trick consumers into paying for services they will never receive.”

States have options for using money

The carriers denied any illegal conduct despite agreeing to the settlement. In addition to payments to each state, the carriers agreed to changes in their advertising practices. It’s unclear whether consumers will get any refunds out of the settlement, however.

The settlement gives states leeway in how to use the payments from carriers. The payments can be used to cover “attorneys’ fees and other costs of investigation and litigation,” or can go toward “consumer protection law enforcement funds.”

States can use the payments for future consumer protection enforcement, consumer education, litigation, or a consumer aid fund. The money can also be used for “monitoring and potential enforcement” of the settlement terms “or consumer restitution,” the settlement says.

We asked James’ office about whether any consumer restitution is planned and will update this article if we get a response.

Advertising restrictions

The three carriers agreed that all advertisements to consumers must be “truthful, accurate and non-misleading.” They also agreed to the following changes, the NY attorney general’s office said:

  • “Unlimited” mobile data plans can only be marketed if there are no limits on the quantity of data allowed during a billing cycle.
  • Offers to pay for consumers to switch to a different wireless carrier must clearly disclose how much a consumer will be paid, how consumers will be paid, when consumers can expect payment, and any additional requirements consumers have to meet to get paid.
  • Offers of “free” wireless devices or services must clearly state everything a consumer must do to receive the “free” devices or services.
  • Offers to lease wireless devices must clearly state that the consumer will be entering into a lease agreement.
  • All “savings” claims must have a reasonable basis. If a wireless carrier claims that consumers will save using its services compared to another wireless carrier, the claim must be based on similar goods or services or differences must be clearly explained to the consumer.

The advertising restrictions are to be in place for five years.

T-Mobile provided a statement about the settlement to Ars today. “After nine years, we are glad to move on from this industry-wide investigation with this settlement and a continued commitment to the transparent and consumer-friendly advertising practices we’ve undertaken for years,” T-Mobile said.

AT&T and Verizon declined to comment individually and referred us to their lobby group, CTIA. “These voluntary agreements reflect no finding of improper conduct and reaffirm the wireless industry’s longstanding commitment to clarity and integrity in advertising so that consumers can make informed decisions about the products and services that best suit them,” the wireless lobby group said.

Big Three carriers pay $10M to settle claims of false “unlimited” advertising Read More »