retrotech


Engineer creates first custom motherboard for 1990s PlayStation console

The nsOne project joins a growing community of homebrew PlayStation 1 hardware developments. These include Picostation, a Raspberry Pi Pico-based optical disc emulator (ODE) that allows PlayStation 1 consoles to load games from SD cards instead of physical discs. Other ODEs, like MODE and PSIO, have also become popular solutions for retrogaming collectors who play games on original hardware as optical drives age and fail.

From repair job to reverse-engineering project

To understand the classic console’s physical architecture, Brodesco physically sanded down an original motherboard to expose its internal layers, then cross-referenced the exposed traces with component datasheets and service manuals.

“I realized that detailed documentation on the original motherboard was either incomplete or entirely unavailable,” Brodesco explained in his Kickstarter campaign. This discovery launched what would become a comprehensive documentation effort, including tracing every connection on the board and creating multi-layer graphic representations of the circuitry.

A photo of the nsOne PlayStation motherboard. Credit: Lorentio Brodesco

Using optical scanning and manual net-by-net reverse-engineering, Brodesco recreated the PlayStation 1’s schematic in modern PCB design software. This process involved creating component symbols with accurate pin mappings and identifying—or in some cases creating—the correct footprints for each proprietary component that Sony had never publicly documented.

Brodesco also identified what he calls the “minimum architecture” required to boot the console without BIOS modifications, streamlining the design process while maintaining full compatibility.

The mock-up board shown in photos validates the footprints of chips and connectors, all redrawn from scratch. According to Brodesco, a fully routed version with complete multilayer routing and final layout is already in development.

A photo of the nsOne PlayStation motherboard. Credit: Lorentio Brodesco

As Brodesco noted on Kickstarter, his project’s goal is to “create comprehensive documentation, design files, and production-ready blueprints for manufacturing fully functional motherboards.”

Beyond repairs, the documentation and design files Brodesco is creating would preserve the PlayStation 1’s hardware architecture for future generations: “It’s a tribute to the PS1, to retro hardware, and to the belief that one person really can build the impossible.”


US air traffic control still runs on Windows 95 and floppy disks

On Wednesday, acting FAA Administrator Chris Rocheleau told the House Appropriations Committee that the Federal Aviation Administration plans to replace its aging air traffic control systems, which still rely on floppy disks and Windows 95 computers, Tom’s Hardware reports. The agency has issued a Request For Information to gather proposals from companies willing to tackle the massive infrastructure overhaul.

“The whole idea is to replace the system. No more floppy disks or paper strips,” Rocheleau said during the committee hearing. Transportation Secretary Sean Duffy called the project “the most important infrastructure project that we’ve had in this country for decades,” describing it as a bipartisan priority.

Most air traffic control towers and facilities across the US currently operate with technology that seems frozen in the 20th century, although that isn’t necessarily a bad thing—when it works. Some controllers currently use paper strips to track aircraft movements and transfer data between systems using floppy disks, while their computers run Microsoft’s Windows 95 operating system, which launched in 1995.

A pile of floppy disks. Credit: Getty

As Tom’s Hardware notes, modernization of the system is broadly popular. Sheldon Jacobson, a University of Illinois professor who has studied risks in aviation, says that the system works remarkably well as is but that an upgrade is still critical, according to NPR. The aviation industry coalition Modern Skies has been pushing for ATC modernization and recently released an advertisement highlighting the outdated technology.

While the vintage systems may have inadvertently protected air traffic control from widespread outages like the CrowdStrike incident that disrupted modern computer systems globally in 2024, agency officials say 51 of the FAA’s 138 systems are unsustainable due to outdated functionality and a lack of spare parts.

The FAA isn’t alone in clinging to floppy disk technology. San Francisco’s train control system still runs on DOS loaded from 5.25-inch floppy disks, with upgrades not expected until 2030 due to budget constraints. Japan has also struggled in recent years to modernize government record systems that use floppy disks.

If it ain’t broke? (Or maybe it is broke)

Modernizing the air traffic control system presents engineering challenges that extend far beyond simply installing newer computers. Unlike typical IT upgrades, ATC systems must maintain continuous 24/7 operation, because shutting down facilities for maintenance could compromise aviation safety.


Bill Atkinson, architect of the Mac’s graphical soul, dies at 74

Using HyperCard, teachers created interactive lessons, artists built multimedia experiences, and businesses developed custom database applications—all without writing traditional code. The hypermedia environment also had a huge impact on gaming: The 1993 first-person adventure hit Myst originally used HyperCard as its game engine.

An example of graphical dithering, which allows 1-bit color (black and white only) to imitate grayscale. Credit: Benj Edwards / Apple

For the two-color Macintosh (which could only display black or white pixels, with no gradient in between), Atkinson developed an innovative high-contrast dithering algorithm that created the illusion of grayscale images with a characteristic stippled appearance that became synonymous with early Mac graphics. The dithered aesthetic remains popular today among some digital artists and indie game makers, with modern tools like this web converter that allows anyone to transform photos into the classic Atkinson dither style.
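For readers curious how the technique works, here is a minimal sketch of Atkinson-style error diffusion in Python. It is not Atkinson's original QuickDraw code; the 128 threshold and the nested-list image format are illustrative assumptions. The defining trait is that only six eighths of each pixel's quantization error is passed on to neighboring pixels, which is what produces the high-contrast, stippled look.

```python
# A minimal sketch of Atkinson-style error-diffusion dithering (assumes a
# grayscale image as a nested list of 0-255 values; not Atkinson's original code).

def atkinson_dither(pixels):
    """Convert a grayscale image (list of rows of 0-255 ints) to 1-bit black/white."""
    height, width = len(pixels), len(pixels[0])
    # Work on floats so diffused error can accumulate fractionally.
    img = [[float(v) for v in row] for row in pixels]
    # Atkinson spreads only 6/8 of the error to these six neighbors,
    # which preserves contrast compared to full error diffusion.
    neighbors = [(1, 0), (2, 0), (-1, 1), (0, 1), (1, 1), (0, 2)]
    for y in range(height):
        for x in range(width):
            old = img[y][x]
            new = 255.0 if old >= 128 else 0.0
            img[y][x] = new
            error = (old - new) / 8.0
            for dx, dy in neighbors:
                nx, ny = x + dx, y + dy
                if 0 <= nx < width and 0 <= ny < height:
                    img[ny][nx] += error
    return [[int(v) for v in row] for row in img]
```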

Life after Apple

After leaving Apple in 1990, Atkinson co-founded General Magic with Marc Porat and Andy Hertzfeld, attempting to create personal communicators before smartphones existed. Wikipedia notes that in 2007, he joined Numenta, an AI startup, declaring their work on machine intelligence “more fundamentally important to society than the personal computer and the rise of the Internet.”

In his later years, Atkinson pursued nature photography with the same artistry he’d brought to programming. His 2004 book “Within the Stone” featured close-up images of polished rocks that revealed hidden worlds of color and pattern.

Atkinson announced his pancreatic cancer diagnosis in November 2024, writing on Facebook that he had “already led an amazing and wonderful life.” The same disease claimed his friend and collaborator Steve Jobs in 2011.

Given Atkinson’s deep contributions to Apple history, it’s not surprising that Jobs’ successor, Apple CEO Tim Cook, paid tribute to the Mac’s original graphics guru on X on Saturday. “We are deeply saddened by the passing of Bill Atkinson,” Cook wrote. “He was a true visionary whose creativity, heart, and groundbreaking work on the Mac will forever inspire us.”


Endangered classic Mac plastic color returns as 3D-printer filament

On Tuesday, classic computer collector Joe Strosnider announced the availability of a new 3D-printer filament that replicates the iconic “Platinum” color scheme used in classic Macintosh computers from the late 1980s through the 1990s. The PLA filament (PLA is short for polylactic acid) allows hobbyists to 3D-print nostalgic novelties, replacement parts, and accessories that match the original color of vintage Apple computers.

Hobbyists commonly feed this type of filament into commercial desktop 3D printers, which heat the plastic and extrude it in a computer-controlled way to fabricate new plastic parts.

The Platinum color, which Apple used in its desktop and portable computer lines starting with the Apple IIgs in 1986, has become synonymous with a distinctive era of the classic Macintosh aesthetic. Original Macintosh plastics have become brittle and discolored with age, so matching the “original” color can be a somewhat challenging and subjective exercise.

A close-up of “Retro Platinum” PLA filament by Polar Filament. Credit: Polar Filament

Strosnider, who runs a website about his extensive vintage computer collection in Ohio, worked for years to color-match the distinctive beige-gray hue of the Macintosh Platinum scheme. The result is a spool of hobby-ready plastic produced by Polar Filament and priced at $21.99 per kilogram.

According to a forum post, Strosnider paid approximately $900 to develop the color and purchase an initial 25-kilogram supply of the filament. Rather than keeping the formulation proprietary, he arranged for Polar Filament to make the color publicly available.

“I paid them a fee to color match the speaker box from inside my Mac Color Classic,” Strosnider wrote in a Tinkerdifferent forum post on Tuesday. “In exchange, I asked them to release the color to the public so anyone can use it.”


Polish engineer creates postage stamp-sized 1980s Atari computer

In 1979, Atari released the Atari 400 and 800, groundbreaking home computers that included custom graphics and sound chips, four joystick ports, and the ability to run the most advanced home video games of their era. These machines, which retailed for $549 and $999, respectively, represented a leap in consumer-friendly personal computing, with their modular design and serial I/O bus that presaged USB. Now, 46 years later, a hobbyist has shrunk down the system hardware to a size that would have seemed like science fiction in the 1970s.

Polish engineer Piotr “Osa” Ostapowicz recently unveiled “Atarino,” which may be the world’s smallest 8-bit Atari computer re-creation, according to retro computing site Atariteca. The entire system—processor, graphics chips, sound hardware, and memory controllers—fits on a module measuring just 2×1.5 centimeters (about 0.79×0.59 inches), which is roughly the size of a postage stamp.

Ostapowicz’s creation reimplements the classic Atari XL/XE architecture using modern FPGA (field-programmable gate array) technology. Unlike software emulators, which simulate old hardware on a complete computer system of another architecture (the approach taken by modern recreations like the Atari 400 Mini console), Atarino reproduces the original Atari components faithfully at the logic level, allowing it to run vintage software while maintaining compatibility with original peripherals.

The Atarino is only slightly larger than a Polish 1 Grosz coin. Credit: Piotr Ostapowicz

“The current project is not strictly a clone of Atari but basically, well, I’m forming a machine that is compatible with the Atari 8-bit computer itself, but it was created on the basis of the framework that I created some time ago,” Ostapowicz told Atari Online PL in a January 2024 YouTube interview.

An assortment of some of the Atari 8-bit computer systems released in the 1970s and ’80s. Credit: Atari

The project, which began over a decade ago and was first publicly demonstrated in December 2023, packs a 6502C processor, ANTIC and GTIA graphics chips, a POKEY sound chip, and memory controllers onto a single Lattice UP5K FPGA chip. Despite its tiny size, the system can run at clock speeds up to 31 MHz—far faster than the original hardware’s 1.79 MHz.

Smaller, faster, and positioned for future projects

While Atarino maintains broad compatibility with classic Atari software, Ostapowicz says he has enhanced the original design in several ways. For example, the 6502 processor core follows the physical chip specifications but adds new instructions. The memory system uses independent channels rather than the original’s “cycle stealing” approach (where the graphics chip temporarily halts the CPU to access memory), improving performance.


The timeless genius of a 1980s Atari developer and his swimming salmon masterpiece

Williams’ success with APX led him to create several games for Synapse Software, including the beloved Alley Cat and the incomprehensible fantasy masterpiece Necromancer, before moving to the Amiga, where he created the experimental Mind Walker and his ambitious “cultural simulation” Knights of the Crystallion.

Necromancer, Williams’ later creation for the Atari 800, plays like a fever dream—you control a druid fighting off spiders while growing magic trees and battling an undead wizard. It makes absolutely no sense by conventional standards, but it’s brilliant in its otherworldliness.

“The first games that I did were very hard to explain to people and they just kind of bought it on faith,” Williams said in a 1989 interview with YAAM (Yet Another Amiga Magazine), suggesting this unconventional approach started early. That willingness to create deeply personal, almost surreal experiences defined Williams’ work throughout his career.

An Atari 800 (the big brother of the Atari 400) that Benj Edwards set up to play M.U.L.E. at his mom’s house in 2015, for nostalgia purposes. Credit: Benj Edwards

After a brief stint making licensed games (like Bart’s Nightmare) for the Super Nintendo at Sculptured Software, he left the industry entirely to pursue his calling as a pastor, attending seminary in Chicago with his wife Martha, before declining health forced him to move to Rockport, Texas. Perhaps reflecting on the choices that led him down this path, Williams had noted years earlier in that 1989 interview, “Sometimes in this industry we tend to forget that life is a lot more interesting than computers.”

Bill Williams died on May 28, 1998, one day before his 38th birthday. He died young, but he outlived his doctors’ prediction that he wouldn’t reach age 13, and created cultural works that stand the test of time. Like Sam the Salmon, Williams pushed forward relentlessly—in his case, creating powerful digital art that was uniquely his own.

In our current era of photorealistic graphics and cinematic game experiences, Salmon Run‘s blocky pixels might seem quaint. But its core themes—persistence, natural beauty, and finding purpose against long odds—remain as relevant as ever. We all face bears in life—whether they come from natural adversity or from those who might seek to do us harm. The beauty of Williams’ game is in showing us that, despite their menacing presence, there’s still a reward waiting upstream for those willing to keep swimming.

If you want to try Salmon Run, you can potentially play it in your browser through an emulated Atari 800, hosted on The Internet Archive. Press F1 to start the game.


The voice of America Online’s “You’ve got mail” has died at age 74

In 1995, Wired Magazine’s AOL forum asked Edwards to record 10 humorous sound files using his iconic voice. The results, which include classics such as “You want fries with that,” “You’ve got credit card debt,” and “Stop touching me!” still live on in the depths of The Internet Archive. He also ran a side business recording custom sound files for AOL users.

A screenshot of America Online’s version 2.5 client in 1995.

Over time, the “You’ve got mail” line became something of a cultural reference point, as tech journalist Harry McCracken pointed out in 2011 on his Technologizer blog, with various news headlines often borrowing the “You’ve got [something]” structure for humorous effect.

Edwards’ voice greeting became so embedded in American popular culture that it inspired the 1998 romantic comedy You’ve Got Mail. The film stars Tom Hanks and Meg Ryan as rival bookstore owners who unknowingly fall in love through anonymous email exchanges. Director Nora Ephron built the movie’s narrative around the anticipation that AOL users felt when hearing Edwards’ voice announce new messages, with the film grossing $250 million worldwide.

Elwood Edwards’ 2015 appearance on The Tonight Show Starring Jimmy Fallon.

At WKYC, Edwards had worked behind the scenes as a graphics specialist, camera operator, and general production staff member since 2002. His voice work brought him occasional moments in the spotlight, including an appearance in a 2000 episode of The Simpsons, where he played a virtual doctor announcing, “You’ve got leprosy.” He appeared on The Tonight Show Starring Jimmy Fallon in 2015, performing his classic greeting along with phrases suggested by the audience.

Before his death, Edwards worked as an Uber driver. His voice continues to greet users of AOL’s current email service, maintaining an enduring connection to the early days of consumer Internet access.


Ward Christensen, BBS inventor and architect of our online age, dies at age 78

Their new system allowed personal computer owners with modems to dial up a dedicated machine and leave messages that others would see later. The BBS concept represented a digital version of a push-pin bulletin board that might flank a grocery store entrance, town hall, or college dorm hallway.

Christensen and Suess openly shared the concept of the BBS, and others began writing their own BBS software. As these programs grew in complexity over time, the often hobbyist-run BBS systems that resulted allowed callers to transfer computer files and play games as well as leave messages.

BBSes introduced many home computer users to multiplayer online gaming, message boards, and online community building in an era before the Internet became widely available to people outside of science and academia. They also gave rise to the shareware gaming scene that led to companies like Epic Games today.

A low-key giant

Suess died in 2019, and with the passing of both BBS originators, we find ourselves at the symbolic end of an era, although many BBSes still run today. These are typically piped through the Internet instead of a dial-up telephone line.

While Christensen himself was always humble about his role in creating the first BBS, his contributions to the field did not go unrecognized. In 1992, Christensen received two Dvorak Awards, including a lifetime achievement award for “outstanding contributions to PC telecommunications.” The following year, the Electronic Frontier Foundation honored him with the Pioneer Award.

Professionally, Christensen enjoyed a long and successful career at IBM, where he worked from 1968 until his retirement in 2012. His final position at the company was as a field technical sales specialist.

A still image of Ward Christensen in 2002 being interviewed for BBS: The Documentary. Credit: Jason Scott

But mostly, Christensen kept a low profile. When visiting online communities in his later years, he showed no ostentation and never bragged about having made much of it possible. This amazed Scott, who said, “I was always fascinated that Ward kept a Twitter account, just messing around.”

Scott feels like humility, openness, and the spirit of sharing are key legacies that Christensen has left behind.

“It would be like a person who was in a high school band saying, ‘Eh, never really got into touring, never really had the urge to record albums or become a rock star,'” Scott said.  “And then later people come and go, ‘Oh, you made the first [whatever] in your high school band,’ but that sense of being at that locus of history and the fact that his immediate urge was to share all the code everywhere—that’s to me what I think people should remember about this guy.”


Linux boots in 4.76 days on the Intel 4004

To pull oneself up by one’s bootstraps —

Historic 4-bit microprocessor from 1971 can execute Linux commands over days or weeks.

A photo of Dmitry Grinberg’s custom Linux/4004 circuit board.

Hardware hacker Dmitry Grinberg recently achieved what might sound impossible: booting Linux on the Intel 4004, the world’s first commercial microprocessor. With just 2,300 transistors and an original clock speed of 740 kHz, the 1971 CPU is incredibly primitive by modern standards. And it’s slow—it takes about 4.76 days for the Linux kernel to boot.

Initially designed for a Japanese calculator called the Busicom 141-PF, the 4-bit 4004 found limited use in commercial products of the 1970s before being superseded by more powerful Intel chips, such as the 8008 and 8080 that powered early personal computers—and then the 8086 and 8088 that launched the IBM PC era.

If you’re skeptical that this feat is possible with a raw 4004, you’re right: The 4004 itself is far too limited to run Linux directly. Instead, Grinberg created a solution that is equally impressive: an emulator that runs on the 4004 and emulates a MIPS R3000 processor—the architecture used in the DECstation 2100 workstation that Linux was originally ported to. This emulator, along with minimal hardware emulation, allows a stripped-down Debian Linux to boot to a command prompt.

Linux/4004.

Grinberg is no stranger to feats of running Linux in unlikely places. As he explains on his website, “In 2012, I ran real Linux on an 8-bit microcontroller (AVR), setting a new world record for lowest-end-machine to ever run Linux.” After others improved on that record in recent years, he decided to surpass himself and others by targeting the very first microprocessor.

The long, slow boot

To make Linux on the 4004 work, Grinberg had to overcome numerous challenges. The 4004 has extremely limited ROM and RAM, no interrupts, and lacks even basic logical operations like AND and OR. Grinberg’s emulator makes clever use of lookup tables and other tricks to squeeze maximum performance out of the primitive CPU.
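As an illustration of the lookup-table idea (a general technique shown here in Python, not Grinberg's actual 4004 assembly, which is documented on his website), the sketch below shows how a CPU with no AND or OR instructions can recover them by precomputing every 4-bit result ahead of time and replacing each “missing” operation with a table read:

```python
# A minimal sketch of replacing missing logical instructions with lookup tables:
# precompute results for every pair of 4-bit operands, so AND/OR become table reads.

AND_TABLE = [[a & b for b in range(16)] for a in range(16)]  # 16x16 nibble table
OR_TABLE  = [[a | b for b in range(16)] for a in range(16)]

def nibble_and(a, b):
    # On real hardware this would be an indexed fetch from ROM, not a computed "&".
    return AND_TABLE[a & 0xF][b & 0xF]

def nibble_or(a, b):
    return OR_TABLE[a & 0xF][b & 0xF]

# Wider values are handled one 4-bit nibble at a time, e.g. an 8-bit AND:
def byte_and(x, y):
    return (nibble_and(x >> 4, y >> 4) << 4) | nibble_and(x & 0xF, y & 0xF)
```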

The final hardware uses the 4004 (overclocked to 790 kHz) along with several other period-correct support chips from Intel’s MCS-4 chipset. It includes a VFD display to show Linux output and can accept input over a serial connection. The whole setup draws about 6 W of power.

To pull it all together, Grinberg designed a custom circuit board with no vias (paths from one side of the circuit board to the other) and only right-angle traces for a retro aesthetic. It’s meant to be wall-mountable as an art piece, slowly executing Linux commands over the course of days or weeks.

While it has no practical purpose, the Linux/4004 project demonstrates the flexibility of Linux and pushes emulation to its limits. Grinberg is considering the possibility of offering kits or fully assembled boards for others who want to experience Linux at its slowest, though this is not yet definite.

The full details of the project, including schematics and source code, are available on Grinberg’s website. For those interested in vintage computing or extreme Linux implementations, it’s a fascinating look at what’s possible with 1970s technology and a lot of clever engineering.


Why 1994’s Lair of Squid was the weirdest pack-in game of all time

digital archaeology —

The HP 200LX included a mysterious maze game called Lair of Squid. We tracked down the author.

Artist’s impression of a squid jumping forth from an HP 200LX. Credit: Aurich Lawson / HP

In 1994, Hewlett-Packard released a miracle machine: the HP 200LX pocket-size PC. In the depths of the device, among the MS-DOS productivity apps built into its fixed memory, there lurked a first-person maze game called Lair of Squid. Intrigued by the game, we tracked down its author, Andy Gryc, and probed into the title’s mysterious undersea origins.

“If you ask my family, they’ll confirm that I’ve been obsessed with squid for a long time,” Gryc told Ars Technica. “It’s admittedly very goofy—and that’s my fault—although I was inspired by Doom, which had come out relatively recently.”

In Lair of Squid, you’re trapped in an underwater labyrinth, seeking a way out while avoiding squid roaming the corridors. A collision with any cephalopod results in death. To progress through each stage and ascend to the surface, you locate the exit and provide a hidden, scrambled code word. The password is initially displayed as asterisks, with letters revealed as you encounter them within the maze.
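As a toy illustration of that password-reveal bookkeeping (a guess at the general idea, not Gryc's actual code; the word “SQUID” is just an example), a sketch might track which letters the player has encountered and mask the rest:

```python
# A toy sketch of the password-reveal mechanic described above (illustrative only).

def masked_password(word, found_letters):
    """Show letters the player has encountered; keep the rest as asterisks."""
    return "".join(ch if ch in found_letters else "*" for ch in word)

found = set()
print(masked_password("SQUID", found))   # *****
found.update({"S", "U"})
print(masked_password("SQUID", found))   # S*U**
```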

A photo of Lair of Squid running on the author’s HP 200LX, shortly after the moment of discovery. Credit: Benj Edwards

Buckle up for a tale of rogue coding, cephalopod obsession, and the most unexpected Easter egg in palmtop history. This is no fish story—it’s the saga of Lair of Squid.

A computer in the palm of your hand

Introduced in 1994, the HP 200LX palmtop PC put desktop functionality in a pocket-size package. With a small QWERTY keyboard, MS-DOS compatibility, and a suite of productivity apps, the clamshell 200LX offered a vision of one potential future of mobile computing. It featured a 7.91 MHz 80186 CPU, a monochrome 640×200 CGA display, and 1–4 megabytes of RAM.

The cover of the HP 200LX User’s Guide (1994). Credit: Hewlett Packard

I’ve collected vintage computers since 1993, and people frequently offer to send me old devices they’d rather not throw away. Recently, a former HP engineer sent me his small but nice collection of ’90s HP handheld palmtop computers, including a 95LX (1991), 100LX (1993), and 200LX.

HP designed its portable LX series to run many MS-DOS programs that feature text mode or CGA graphics, and each includes built-in versions of the Lotus 1-2-3 spreadsheet, a word processor, terminal program, calculator, and more.

I owned a 95LX as a kid (a hand-me-down from my dad’s friend), which came with a simplistic overhead maze game called TigerFox. So imagine my surprise in 2024, when trawling through the productivity and personal organization apps on that 200LX, to find a richly detailed first-person maze game based around cephalopods, of all things.

(I was less surprised to find an excellent built-in Minesweeper clone, Hearts and Bones, which is definitely a more natural fit for the power and form of the 200LX itself.)

Lair of Squid isn’t a true Doom clone since it’s not a first-person shooter (in some ways, it’s more like a first-person Pac-Man without pellets), but its mere existence—on a black-and-white device best suited for storing phone numbers and text notes—deserves note as one of the weirdest and most interesting pack-in games to ever exist.

Just after discovering Lair of Squid on my device earlier this year, I tweeted about it, and I extracted the file for the game (called “maze.exe”) from the internal ROM drive and sent it to DOS gaming historian Anatoly Shashkin, who put the game on The Internet Archive so anyone can play it in their browser.

After that, I realized that I wanted to figure out who wrote this quirky game, and thanks to a post on RGB Classic Games, I found a name: Andy Gryc. With some luck in cold-emailing, I found him.


Gordon Bell, an architect of our digital age, dies at age 89

the great memory register in the sky —

Bell architected DEC’s VAX minicomputers, championed computer history, mentored at Microsoft.

A photo of Gordon Bell speaking at the annual PC Forum in Palm Springs, California, March 1989.

Computer pioneer Gordon Bell, who as an early employee of Digital Equipment Corporation (DEC) played a key role in the development of several influential minicomputer systems and also co-founded the first major computer museum, passed away on Friday, according to Bell Labs veteran John Mashey. Mashey announced Bell’s passing in a social media post on Tuesday morning.

“I am very sad to report [the] death May 17 at age 89 of Gordon Bell, famous computer pioneer, a founder of Computer Museum in Boston, and a force behind the @ComputerHistory here in Silicon Valley, and good friend since the 1980s,” wrote Mashey in his announcement. “He succumbed to aspiration pneumonia in Coronado, CA.”

Bell was a pivotal figure in the history of computing and a notable champion of tech history, having founded Boston’s Computer Museum with his wife Gwen Bell in 1979; the museum later became the heart of the Computer History Museum in Mountain View. He was also the namesake of the ACM’s prestigious Gordon Bell Prize, created to spur innovations in parallel processing.

Born in 1934 in Kirksville, Missouri, Gordon Bell earned degrees in electrical engineering from MIT before being recruited in 1960 by DEC founders Ken Olsen and Harlan Anderson. As the second computer engineer hired at DEC, Bell worked on various components for the PDP-1 system, including floating-point subroutines, tape controllers, and a drum controller.

Bell also invented the first UART (Universal Asynchronous Receiver-Transmitter) for serial communication during his time at DEC. He went on to architect several influential DEC systems, including the PDP-4 and PDP-6. In the 1970s, he played a key role in overseeing the aforementioned VAX minicomputer line as the engineering manager, with Bill Strecker serving as the primary architect for the VAX architecture.

After retiring from DEC in 1983, Bell remained active as an entrepreneur, policy adviser, and researcher. He co-founded Encore Computer and helped establish the NSF’s Computing and Information Science and Engineering Directorate.

In 1995, Bell joined Microsoft Research where he studied telepresence technologies and served as the subject of the MyLifeBits life-logging project. The initiative aimed to realize Vannevar Bush’s vision of a system that could store all the documents, photos, and audio a person experienced in their lifetime.

Bell was elected to the National Academy of Engineering, National Academy of Sciences, and American Academy of Arts and Sciences. He received the National Medal of Technology from President George H.W. Bush in 1991 and the IEEE’s John von Neumann medal in 1992.

“He was immeasurably helpful”

As news of Bell’s passing spread on social media Tuesday, industry veterans began sharing their memories and condolences. Former Microsoft CTO Ray Ozzie wrote, “I can’t adequately describe how much I loved Gordon and respected what he did for the industry. As a kid I first ran into him at Digital (I was then at DG) when he and Dave were working on VAX. So brilliant, so calm, so very upbeat and optimistic about what the future might hold.”

Ozzie also recalled Bell’s role as a helpful mentor. “The number of times Gordon and I met while at Microsoft – acting as a sounding board, helping me through challenges I was facing – is uncountable,” he wrote.

Former Windows VP Steven Sinofsky also paid tribute to Bell on X, writing, “He was immeasurably helpful at Microsoft where he was a founding advisor and later full time leader in Microsoft Research. He advised and supported countless researchers, projects, and product teams. He was always supportive and insightful beyond words. He never hesitated to provide insights and a few sparks at so many of the offsites that were so important to the evolution of Microsoft.”

“His memory is a blessing to so many,” wrote Sinofsky in his tweet memorializing Bell. “His impact on all of us in technology will be felt for generations. May he rest in peace.”


Here’s your chance to own a decommissioned US government supercomputer

But can it run Crysis? —

145,152-core Cheyenne supercomputer was 20th most powerful in the world in 2016.

A photo of the Cheyenne supercomputer, which is now up for auction.

Enlarge / A photo of the Cheyenne supercomputer, which is now up for auction.

On Tuesday, the US General Services Administration began an auction for the decommissioned Cheyenne supercomputer, located in Cheyenne, Wyoming. The 5.34-petaflop supercomputer ranked as the 20th most powerful in the world at the time of its installation in 2016. Bidding started at $2,500, but its price is currently $27,643, with the reserve not yet met.

The supercomputer, which officially operated between January 12, 2017, and December 31, 2023, at the NCAR-Wyoming Supercomputing Center, was a powerful (and once considered energy-efficient) system that significantly advanced atmospheric and Earth system sciences research.

“In its lifetime, Cheyenne delivered over 7 billion core-hours, served over 4,400 users, and supported nearly 1,300 NSF awards,” writes the University Corporation for Atmospheric Research (UCAR) on its official Cheyenne information page. “It played a key role in education, supporting more than 80 university courses and training events. Nearly 1,000 projects were awarded for early-career graduate students and postdocs. Perhaps most tellingly, Cheyenne-powered research generated over 4,500 peer-reviewed publications, dissertations and theses, and other works.”

UCAR says that Cheyenne was originally slated to be replaced after five years, but the COVID-19 pandemic severely disrupted supply chains, and it logged two extra years in its tour of duty. The auction page says that Cheyenne recently experienced maintenance limitations due to faulty quick disconnects in its cooling system. As a result, approximately 1 percent of the compute nodes have failed, primarily due to ECC errors in the DIMMs. Given the expense and downtime associated with repairs, the decision was made to auction off the components.

  • A photo gallery of the Cheyenne supercomputer up for auction.

With a peak performance of 5,340 teraflops (4,788 Linpack teraflops), this SGI ICE XA system was capable of performing over 3 billion calculations per second for every watt of energy consumed, making it three times more energy-efficient than its predecessor, Yellowstone. The system featured 4,032 dual-socket nodes, each with two 18-core, 2.3-GHz Intel Xeon E5-2697v4 processors, for a total of 145,152 CPU cores. It also included 313 terabytes of memory and 40 petabytes of storage. The entire system in operation consumed about 1.7 megawatts of power.
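As a quick back-of-the-envelope check, the “over 3 billion calculations per second for every watt” figure follows from dividing the quoted peak performance by the quoted power draw:

```python
# Back-of-the-envelope check of the flops-per-watt figure, using the numbers quoted above.
peak_flops = 5_340e12      # 5,340 teraflops peak performance
power_watts = 1.7e6        # about 1.7 megawatts in operation
print(peak_flops / power_watts)  # ~3.14e9, i.e. just over 3 billion calculations per second per watt
```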

Just to compare, the world’s top-rated supercomputer at the moment—Frontier at Oak Ridge National Labs in Tennessee—features a theoretical peak performance of 1,679.82 petaflops, includes 8,699,904 CPU cores, and uses 22.7 megawatts of power.

The GSA notes that potential buyers of Cheyenne should be aware that professional movers with appropriate equipment will be required to handle the heavy racks and components. The auction includes seven E-Cell pairs (14 total), each with a cooling distribution unit (CDU). Each E-Cell weighs approximately 1,500 lbs. Additionally, the auction features two air-cooled Cheyenne Management Racks, each weighing 2,500 lbs, that contain servers, switches, and power units.

As of this writing, 12 potential buyers have bid on this computing monster so far. If you’re interested in bidding, the auction closes on May 5 at 6:11 pm Central Time. But don’t get too excited by photos of the extensive cabling: As the auction site notes, “fiber optic and CAT5/6 cabling are excluded from the resale package.”
