retrotech

Stewart Cheifet, PBS host who chronicled the PC revolution, dies at 87

Stewart Cheifet, the television producer and host who documented the personal computer revolution for nearly two decades on PBS, died on December 28, 2025, at age 87 in Philadelphia. Cheifet created and hosted Computer Chronicles, which ran on the public television network from 1983 to 2002 and helped demystify personal computing for millions of American viewers.

Computer Chronicles covered everything from the earliest IBM PCs and Apple Macintosh models to the rise of the World Wide Web and the dot-com boom. Cheifet conducted interviews with computing industry figures, including Bill Gates, Steve Jobs, and Jeff Bezos, while demonstrating hardware and software for a general audience.

From 1983 to 1990, he co-hosted the show with Gary Kildall, the Digital Research founder who created the popular CP/M operating system that predated MS-DOS on early personal computer systems.

Computer Chronicles – 01×25 – Artificial Intelligence (1984)

From 1996 to 2002, Cheifet also produced and hosted Net Cafe, a companion series that documented the early Internet boom and introduced viewers to then-new websites like Yahoo, Google, and eBay.

A legacy worth preserving

Computer Chronicles began as a local weekly series in 1981 when Cheifet served as station manager at KCSM-TV, the College of San Mateo’s public television station. It became a national PBS series in 1983 and ran continuously until 2002, producing 433 episodes across 19 seasons. The format remained consistent throughout: product demonstrations, guest interviews, and a closing news segment called “Random Access” that covered industry developments.

After the show’s run ended and Cheifet left television production, he worked to preserve its legacy as a consultant for the Internet Archive, helping to make episodes of Computer Chronicles and Net Cafe publicly available.

In 1995, a Netscape employee wrote a hack in 10 days that now runs the Internet

Thirty years ago today, Netscape Communications and Sun Microsystems issued a joint press release announcing JavaScript, an object scripting language designed for creating interactive web applications. The language emerged from a frantic 10-day sprint at pioneering browser company Netscape, where engineer Brendan Eich hacked together a working internal prototype during May 1995.

While the JavaScript language didn’t ship publicly until that September and didn’t reach a 1.0 release until March 1996, the descendants of Eich’s initial 10-day hack now run on approximately 98.9 percent of all websites with client-side code, making JavaScript the dominant programming language of the web. It’s wildly popular; beyond the browser, JavaScript powers server backends, mobile apps, desktop software, and even some embedded systems. According to several surveys, JavaScript consistently ranks among the most widely used programming languages in the world.

In crafting JavaScript, Netscape wanted a scripting language that could make webpages interactive, something lightweight that would appeal to web designers and non-professional programmers. Eich drew from several influences: The syntax looked like a trendy new programming language called Java to satisfy Netscape management, but its guts borrowed concepts from Scheme, a language Eich admired, and Self, which contributed JavaScript’s prototype-based object model.
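
Of those influences, the Self-derived prototype model is the most distinctive: objects inherit directly from other objects rather than from classes. As a rough, language-neutral sketch of the concept (written in Python with invented names; this illustrates the idea, not JavaScript's actual machinery):

    class ProtoObject:
        # Toy model of prototype-based delegation, the Self idea that
        # JavaScript borrowed: objects inherit directly from other
        # objects, with no classes involved.
        def __init__(self, proto=None, **slots):
            self.proto = proto        # parent object to delegate lookups to
            self.slots = dict(slots)  # this object's own properties

        def get(self, name):
            if name in self.slots:
                return self.slots[name]
            if self.proto is not None:
                return self.proto.get(name)  # walk the prototype chain
            raise AttributeError(name)

    # "dog" serves directly as the prototype for "rex"; no class is involved.
    dog = ProtoObject(speak=lambda: "Woof!")
    rex = ProtoObject(proto=dog, name="Rex")
    print(rex.get("name"))     # found on rex itself
    print(rex.get("speak")())  # found by delegating to dog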

A screenshot of the Netscape Navigator 2.0 interface. Credit: Benj Edwards

The JavaScript partnership secured endorsements from 28 major tech companies, but amusingly, the December 1995 announcement now reads like a tech industry epitaph. The endorsing companies included Digital Equipment Corporation (absorbed by Compaq, then HP), Silicon Graphics (bankrupt), and Netscape itself (bought by AOL, dismantled). Sun Microsystems, co-creator of JavaScript and owner of Java, was acquired by Oracle in 2010. JavaScript outlived them all.

What’s in a name?

The 10-day creation story has become programming folklore, but even with its kernel of truth, it tends to oversimplify the timeline. Eich’s sprint produced a working demo, not a finished language, and over the next year, Netscape continued tweaking the design. The rushed development left JavaScript with quirks and inconsistencies that developers still complain about today. So many changes were coming down the pipeline, in fact, that it began to annoy one of the industry’s most prominent figures at the time.

In 1982, a physics joke gone wrong sparked the invention of the emoticon


A simple proposal on a 1982 electronic bulletin board helped sarcasm flourish online. Credit: Benj Edwards / DEC

On September 19, 1982, Carnegie Mellon University computer science research assistant professor Scott Fahlman posted a message to the university’s bulletin board software that would come to shape how people communicate online. His proposal: use :-) and :-( as markers to distinguish jokes from serious comments. While Fahlman describes himself as “the inventor… or at least one of the inventors” of what would later be called the smiley face emoticon, the full story reveals something more interesting than a lone genius moment.

The whole episode started three days earlier when computer scientist Neil Swartz posed a physics problem to colleagues on Carnegie Mellon’s “bboard,” which was an early online message board. The discussion thread had been exploring what happens to objects in a free-falling elevator, and Swartz presented a specific scenario involving a lit candle and a drop of mercury.

That evening, computer scientist Howard Gayle responded with a facetious message titled “WARNING!” He claimed that an elevator had been “contaminated with mercury” and suffered “some slight fire damage” due to a physics experiment. Despite clarifying posts noting the warning was a joke, some people took it seriously.

A DECSYSTEM-20 KL-10 (1974) seen at the Living Computer Museum in Seattle. Scott Fahlman used a similar system with a terminal to propose his smiley concept. Credit: Jason Scott

The incident sparked immediate discussion about how to prevent such misunderstandings and the “flame wars” (heated arguments) that could result from misread intent.

“This problem caused some of us to suggest (only half seriously) that maybe it would be a good idea to explicitly mark posts that were not to be taken seriously,” Fahlman later wrote in a retrospective post published on his CMU website. “After all, when using text-based online communication, we lack the body language or tone-of-voice cues that convey this information when we talk in person or on the phone.”

On September 17, 1982, the day after the misunderstanding on the CMU bboard, Swartz made the first concrete proposal: “Maybe we should adopt a convention of putting a star (*) in the subject field of any notice which is to be taken as a joke.”

Within hours, multiple Carnegie Mellon computer scientists weighed in with alternative proposals. Joseph Ginder suggested using % instead of *. Anthony Stentz proposed a nuanced system: “How about using * for good jokes and % for bad jokes?” Keith Wright championed the ampersand (&), arguing it “looks funny” and “sounds funny.” Leonard Hamey suggested # because “it looks like two lips with teeth showing between them.”

Meanwhile, some Carnegie Mellon users were already using their own solution. A group on the Gandalf VAX system later revealed they had been using __/ as “universally known as a smile” to mark jokes. But it apparently didn’t catch on beyond that local system.

The winning formula

Two days after Swartz’s initial proposal, Fahlman entered the discussion with his now-famous post: “I propose that the following character sequence for joke markers: :-) Read it sideways.” He added that serious messages could use :-(, noting, “Maybe we should mark things that are NOT jokes, given current trends.”

What made Fahlman’s proposal work wasn’t that he invented the concept of joke markers—Swartz had done that. It wasn’t that he invented smile symbols at Carnegie Mellon, since the __/ already existed. Rather, Fahlman synthesized the best elements from the ongoing discussion: the simplicity of single-character proposals, the visual clarity of face-like symbols, the sideways-reading principle hinted at by Hamey’s #, and a complete binary system that covered both humor :-) and seriousness :-(.

Early computer terminals like the DEC VT-100 did not support graphics, requiring typographic solutions for displaying “images.” Credit: Digital Equipment Corporation

The simplicity of Fahlman’s emoticons was key to their adoption. The university’s network ran on large DEC mainframes accessed via video terminals (Fahlman himself made his posts from a terminal attached to a DECSYSTEM-20) that were strictly limited to the 95 printable characters of the US-ASCII set. With no ability to display graphics or draw pixels, Fahlman’s solution used the only tools available: standard punctuation marks, arranged so that the strict character grid of the terminal screen read as a “picture.”

The emoticons spread quickly across ARPAnet, the precursor to the modern Internet, reaching other universities and research labs. By November 10, 1982—less than two months later—Carnegie Mellon researcher James Morris began introducing the smiley emoticon concept to colleagues at Xerox PARC, complete with a growing list of variations. What started as an internal Carnegie Mellon convention became, over time, a standard feature of online communication, often simplified without the hyphen nose to :) or :(, among many other variations.

Lost backup tapes

There’s an interesting coda to this story: For years, the original bboard thread existed only in fading memory. The bulletin board posts had been deleted, and Carnegie Mellon’s computer science department had moved to new systems. The old messages seemed lost forever.

Between 2001 and 2002, Mike Jones, a former Carnegie Mellon researcher then working at Microsoft, sponsored what Fahlman calls a “digital archaeology” project. Jeff Baird and the Carnegie Mellon facilities staff undertook a painstaking effort: locating backup tapes from 1982, finding working tape drives that could read the obsolete media, decoding old file formats, and searching for the actual posts. The team recovered the thread, revealing not just Fahlman’s famous post but the entire three-day community discussion that led to it.

The recovered messages, which you can read here, show how collaboratively the emoticon was developed—not a lone genius moment but an ongoing conversation in which the group proposed, refined, and built on one another’s ideas. Fahlman had no idea his synthesis would become a fundamental part of how humans express themselves in digital text, but neither did Swartz, who first suggested marking jokes, or the Gandalf VAX users who were already using their own smile symbols.

From emoticon to emoji

While Fahlman’s text-based emoticons spread across Western online culture and remained text-character-based for a long time, Japanese mobile phone users in the late 1990s developed a parallel system: emoji. For years, Shigetaka Kurita’s 1999 set for NTT DoCoMo was widely cited as the original. However, recent discoveries have revealed earlier origins. SoftBank released a picture-based character set on mobile phones in 1997, and the Sharp PA-8500 personal organizer featured selectable icon characters as early as 1988.

Unlike emoticons that required reading sideways, emoji were small pictographic images that could convey emotion, objects, and ideas with more detail. When Unicode standardized emoji in 2010 and Apple added an emoji keyboard to iOS in 2011, the format exploded globally. Today, emoji have largely replaced emoticons in casual communication, though Fahlman’s sideways faces still appear regularly in text messages and social media posts.

IBM’s Code Page 437 character set included a smiley face as early as 1981. Credit: Matt Giuca

As Fahlman himself notes on his website, he may not have been “the first person ever to type these three letters in sequence.” Others, including teletype operators and private correspondents, may have used similar symbols before 1982, perhaps even as far back as 1648. Author Vladimir Nabokov suggested in a 1969 interview that “there should exist a special typographical sign for a smile.” And the original IBM PC included a dedicated smiley character as early as 1981 (perhaps that should be considered the first emoji).

What made Fahlman’s contribution significant wasn’t absolute originality but rather proposing the right solution at the right time in the right context. From there, the smiley could spread across the emerging global computer network, and no one would ever misunderstand a joke online again. :-)

Celebrated game developer Rebecca Heineman dies at age 62

From champion to advocate

During her later career, Heineman served as a mentor and advisor to many, never shy about celebrating her past as a game developer during the golden age of the home computer.

Her mentoring skills became doubly important when she publicly came out as transgender in 2003. She became a vocal advocate for LGBTQ+ representation in gaming and served on the board of directors for GLAAD. Earlier this year, she received the Gayming Icon Award from Gayming Magazine.

Andrew Borman, who serves as director of digital preservation at The Strong National Museum of Play in Rochester, New York, told Ars Technica that her influence reached well beyond electronic entertainment. “Her legacy goes beyond her groundbreaking work in video games,” he told Ars. “She was a fierce advocate for LGBTQ rights and an inspiration to people around the world, including myself.”

The front cover of Dragon Wars on the Commodore 64, released in 1989. Credit: MobyGames

In the Netflix documentary series High Score, Heineman explained her early connection to video games. “It allowed me to be myself,” she said. “It allowed me to play as female.”

“I think her legend grew as she got older, in part because of her openness and approachability,” journalist Ernie Smith told Ars. “As the culture of gaming grew into an online culture of people ready to dig into the past, she remained a part of it in a big way, where her war stories helped fill in the lore about gaming’s formative eras.”

Celebrated to the end

Heineman was diagnosed with adenocarcinoma in October 2025 after experiencing shortness of breath at the PAX game convention. After diagnostic testing, doctors found cancer in her lungs and liver. That same month, she launched a GoFundMe campaign to help with medical costs. The campaign quickly surpassed its $75,000 goal, raising more than $157,000 from fans, friends, and industry colleagues.

Modder injects AI dialogue into 2002’s Animal Crossing using memory hack

But discovering the addresses was only half the problem. When you talk to a villager in Animal Crossing, the game normally displays dialogue instantly. Calling an AI model over the Internet takes several seconds. Willison examined the code and found Fonseca’s solution: a watch_dialogue() function that polls memory 10 times per second. When it detects a conversation starting, it immediately writes placeholder text: three dots with hidden pause commands between them, followed by a “Press A to continue” prompt.

“So the user gets a ‘press A to continue’ button and hopefully the LLM has finished by the time they press that button,” Willison noted in a Hacker News comment. While players watch dots appear and reach for the A button, the mod races to get a response from the AI model and translate it into the game’s dialog format.
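
Fonseca’s actual implementation lives in the project’s GitHub repository; the pattern Willison describes looks roughly like this sketch (the addresses and helper functions here are invented stand-ins, not the mod’s real API):

    import time

    # Hypothetical addresses; the real mod uses locations discovered in
    # the GameCube's emulated RAM.
    DIALOGUE_STATE_ADDR = 0x0100  # flag that a conversation has started
    DIALOGUE_TEXT_ADDR = 0x0200   # buffer the game reads dialogue from

    def watch_dialogue(read_mem, write_mem, fetch_llm_reply, encode):
        # Poll memory ~10 times per second. When a conversation begins,
        # stall the player with placeholder text, then swap in the LLM's
        # reply once the slow network call finishes.
        while True:
            if read_mem(DIALOGUE_STATE_ADDR) == 1:
                # Dots plus a "press A" prompt buy a few seconds of time.
                write_mem(DIALOGUE_TEXT_ADDR, encode(".<pause>.<pause>.<press_a>"))
                reply = fetch_llm_reply()  # seconds-long API call
                write_mem(DIALOGUE_TEXT_ADDR, encode(reply))
            time.sleep(0.1)  # 10 polls per second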

Learning the game’s secret language

Simply writing text to memory froze the game. Animal Crossing uses an encoded format with control codes that manage everything from text color to character emotions. A special prefix byte (0x7F) signals commands rather than characters. Without the proper end-of-conversation control code, the game waits forever.

“Think of it like HTML,” Fonseca explains. “Your browser doesn’t just display words; it interprets tags … to make text bold.” The decompilation community had documented these codes, allowing Fonseca to build encoder and decoder tools that translate between a human-readable format and the GameCube’s expected byte sequences.
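
In that spirit, a toy encoder might look like the following. The 0x7F prefix byte is described above, but the tag names and code values here are invented placeholders for the codes the decompilation community actually documented:

    CONTROL_PREFIX = 0x7F  # real prefix byte; the values below are made up
    CONTROL_CODES = {
        "<pause>": 0x01,    # hypothetical code value
        "<press_a>": 0x02,  # hypothetical code value
        "<end>": 0x00,      # hypothetical end-of-conversation code
    }

    def encode(text: str) -> bytes:
        # Translate "<tag>" markup into 0x7F-prefixed byte pairs and pass
        # ordinary characters through unchanged.
        out = bytearray()
        i = 0
        while i < len(text):
            for tag, code in CONTROL_CODES.items():
                if text.startswith(tag, i):
                    out += bytes([CONTROL_PREFIX, code])
                    i += len(tag)
                    break
            else:  # no tag matched: emit a plain character byte
                out.append(ord(text[i]))
                i += 1
        # Without a terminating control code, the game waits forever.
        out += bytes([CONTROL_PREFIX, CONTROL_CODES["<end>"]])
        return bytes(out)

    print(encode("Hello!<pause> Welcome.").hex())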

A screenshot of LLM-powered dialog injected into Animal Crossing for the GameCube. Credit: Joshua Fonseca

Initially, he tried using a single AI model to handle both creative writing and technical formatting. “The results were a mess,” he notes. “The AI was trying to be a creative writer and a technical programmer simultaneously and was bad at both.”

The solution: split the work between two models. A Writer AI creates dialogue using character sheets scraped from the Animal Crossing fan wiki. A Director AI then adds technical elements, including pauses, color changes, character expressions, and sound effects.
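
A sketch of that split might look like the following, where call_llm stands in for a Gemini or OpenAI API call and the prompts are invented for illustration:

    def generate_dialogue(call_llm, villager_sheet: str, player_line: str) -> str:
        # Writer AI: creative, in-character prose only.
        raw = call_llm(
            f"You are this villager: {villager_sheet}\n"
            f"Reply in character to the player, who said: {player_line}"
        )
        # Director AI: a purely technical pass that adds pause, color,
        # expression, and sound-effect tags without changing the words.
        return call_llm(
            "Insert <pause>, color, expression, and sound-effect tags "
            f"into this dialogue, changing no words: {raw}"
        )

    # Example with a canned stand-in for the real API call:
    print(generate_dialogue(lambda prompt: "Hi!<pause> Welcome back!",
                            "Tom Nook, a businesslike raccoon", "Hello!"))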

The code is available on GitHub, though Fonseca warns it contains known bugs and has only been tested on macOS. The mod requires Python 3.8+, API keys for either Google Gemini or OpenAI, and the Dolphin emulator. Have fun sticking it to the man—or the raccoon, as the case may be.

Microsoft open-sources Bill Gates’ 6502 BASIC from 1978

On Wednesday, Microsoft released the complete source code for Microsoft BASIC for 6502 Version 1.1, the 1978 interpreter that powered the Commodore PET, VIC-20, Commodore 64, and Apple II through custom adaptations. The company posted 6,955 lines of assembly language code to GitHub under an MIT license, allowing anyone to freely use, modify, and distribute the code that helped launch the personal computer revolution.

“Rick Weiland and I (Bill Gates) wrote the 6502 BASIC,” Gates commented on the Page Table blog in 2010. “I put the WAIT command in.”

For millions of people in the late 1970s and early 1980s, variations of Microsoft’s BASIC interpreter provided their first experience with programming. Users could type simple commands like “10 PRINT ‘HELLO'” and “20 GOTO 10” to create an endless loop of text on their screens, for example—often their first taste of controlling a computer directly. The interpreter translated these human-readable commands into instructions that the processor could execute, one line at a time.
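
That two-liner works because a BASIC interpreter stores the program as a table keyed by line number and lets GOTO move execution back to an earlier entry. A toy dispatch loop in Python shows the model (purely illustrative; Microsoft's real interpreter was hand-written 6502 assembly):

    # Enough of a line-numbered interpreter to run the classic two-liner.
    program = {
        10: ("PRINT", "HELLO"),
        20: ("GOTO", 10),
    }

    lines = sorted(program)  # execution order by line number
    pc = 0                   # index of the current line
    for _ in range(6):       # bounded here; the real program loops forever
        op, arg = program[lines[pc]]
        if op == "PRINT":
            print(arg)
            pc += 1
        elif op == "GOTO":
            pc = lines.index(arg)  # jump to the target line number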

The Commodore PET (Personal Electronic Transactor), released in January 1977, used the MOS 6502 processor and ran a variation of Microsoft BASIC. Credit: SSPL/Getty Images

At just 6,955 lines of assembly language, Microsoft’s low-level 6502 code talked almost directly to the processor, squeezing remarkable functionality into minimal memory, a key achievement when RAM cost hundreds of dollars per kilobyte.

In the early personal computer space, cost was king. The MOS 6502 processor that ran this BASIC cost about $25, while competitors charged $200 for similar chips. Designer Chuck Peddle created the 6502 specifically to bring computing to the masses, and manufacturers built variations of the chip into the Atari 2600, Nintendo Entertainment System, and millions of Commodore computers.

The deal that got away

In 1977, Commodore licensed Microsoft’s 6502 BASIC for a flat fee of $25,000. Jack Tramiel’s company got perpetual rights to ship the software in unlimited machines—no royalties, no per-unit fees. While $25,000 seemed substantial then, Commodore went on to sell millions of computers with Microsoft BASIC inside. Had Microsoft negotiated a per-unit licensing fee like they did with later products, the deal could have generated tens of millions in revenue.

The version Microsoft released—labeled 1.1—contains bug fixes that Commodore engineer John Feagans and Bill Gates jointly implemented in 1978 when Feagans traveled to Microsoft’s Bellevue offices. The code includes memory management improvements (called “garbage collection” in programming terms) and shipped as “BASIC V2” on the Commodore PET.

AOL announces September shutdown for dial-up Internet access

A screenshot of America Online’s version 2.5 client in 1995.

The company’s cultural impact extended far beyond mere connectivity. AOL Instant Messenger introduced many users to real-time digital communication. Chat rooms created some of the Internet’s first social networks. The famous “You’ve Got Mail” notification became so iconic that it lent its title to a 1998 romantic comedy. For better or worse, AOL keywords trained a generation to navigate the web through corporate-curated portals rather than open searching.

Over the years, Ars Technica documented numerous dial-up developments and disasters that plagued AOL users. In 2015, 83-year-old Ron Dorff received phone bills totaling $24,298.93 after his AOL modem started dialing a long-distance number instead of a local access point—a problem that had plagued users since at least 2002, when New York’s attorney general received more than 50 complaints about similar billing disasters.

The financial risks weren’t limited to technical mishaps: AOL itself contributed to user frustration by repeatedly adjusting its pricing strategy. In 2006, the company raised dial-up rates to $25.90 per month—the same price as broadband—in an attempt to push users toward faster connections. This followed years of subscriber losses that saw AOL’s user base fall over time as the company struggled with conflicting strategies that included launching a $9.95 Netscape-branded service in 2003 while maintaining premium pricing for its main offering.

The infrastructure that remains

AOL’s shutdown doesn’t mean dial-up is completely dead. Several niche providers like NetZero, Juno, and Dialup 4 Less continue to offer dial-up service, particularly in areas where it remains the only option. Some customers have kept dial-up as a backup connection for emergencies, and others still use it for specific tasks that don’t require high bandwidth, like processing credit card payments.

The Public Switched Telephone Network that carries dial-up signals still exists, though telephone companies increasingly route calls through modern packet-switched networks rather than traditional circuit-switched systems. As long as traditional phone service exists, dial-up remains technically possible—just increasingly impractical as the web grows more demanding.

For AOL, maintaining dial-up service likely became more about serving a dwindling but dependent user base than generating meaningful revenue. The infrastructure requirements, customer support needs, and technical maintenance for such a legacy system eventually outweigh the benefits.

The September 30 shutdown date gives remaining dial-up users just over a month to find alternative Internet access—a challenge for those in areas where alternatives don’t exist. Some may switch to satellite or cellular services despite higher costs. Others may lose Internet access entirely, further widening the digital divide that dial-up, for all its limitations, helped bridge for three decades.

This article was updated on August 12, 2025 at 10:45 AM Eastern to add details about when AOL began offering true Internet access.

After 27 years, engineer discovers how to display secret photo in Power Mac ROM

“If you double-click the file, SimpleText will open it,” Brown explains on his blog just before displaying the hidden team photo that emerges after following the steps.

The discovery represents one of the last undocumented Easter eggs from the pre-Steve Jobs return era at Apple. The Easter egg works through Mac OS 9.0.4 but appears to have been disabled by version 9.1, Brown notes. The timing aligns with Jobs’ reported ban on Easter eggs when he returned to Apple in 1997, though Brown wonders whether Jobs ever knew about this particular secret.

The ungainly G3 All-in-One set the stage for the smaller and much bluer iMac soon after. Credit: Jonathan Zufi

In his post, Brown expressed hope that he might connect with the Apple employees featured in the photo—a hope that was quickly fulfilled. In the comments, a man named Bill Saperstein identified himself as the leader of the G3 team (pictured fourth from left in the second row) in the hidden image.

“We all knew about the Easter egg, but as you mention; the technique to extract it changed from previous Macs (although the location was the same),” Saperstein wrote in the comment. “This resulted from an Easter egg in the original PowerMac that contained Paula Abdul (without permissions, of course). So the G3 team wanted to still have our pictures in the ROM, but we had to keep it very secret.”

He also shared behind-the-scenes details in another comment, noting that his “bunch of ragtag engineers” developed the successful G3 line as a skunk works project, with hardware that Jobs later turned into the groundbreaking iMac series of computers. “The team was really a group of talented people (both hw and sw) that were believers in the architecture I presented,” Saperstein wrote, “and executed the design behind the scenes for a year until Jon Rubenstein got wind of it and presented it to Steve and the rest is ‘history.'”

Engineer creates first custom motherboard for 1990s PlayStation console

The nsOne project joins a growing community of homebrew PlayStation 1 hardware developments. Other recent projects include Picostation, a Raspberry Pi Pico-based optical disc emulator (ODE) that allows PlayStation 1 consoles to load games from SD cards instead of physical discs. Other ODEs like MODE and PSIO have also become popular solutions for retrogaming collectors who play games on original hardware as optical drives age and fail.

From repair job to reverse-engineering project

To understand the classic console’s physical architecture, Brodesco physically sanded down an original motherboard to expose its internal layers, then cross-referenced the exposed traces with component datasheets and service manuals.

“I realized that detailed documentation on the original motherboard was either incomplete or entirely unavailable,” Brodesco explained in his Kickstarter campaign. This discovery launched what would become a comprehensive documentation effort, including tracing every connection on the board and creating multi-layer graphic representations of the circuitry.

A photo of the nsOne PlayStation motherboard. Credit: Lorentio Brodesco

Using optical scanning and manual net-by-net reverse-engineering, Brodesco recreated the PlayStation 1’s schematic in modern PCB design software. This process involved creating component symbols with accurate pin mappings and identifying—or in some cases creating—the correct footprints for each proprietary component that Sony had never publicly documented.

Brodesco also identified what he calls the “minimum architecture” required to boot the console without BIOS modifications, streamlining the design process while maintaining full compatibility.

The mock-up board shown in photos validates the footprints of chips and connectors, all redrawn from scratch. According to Brodesco, a fully routed version with complete multilayer routing and final layout is already in development.

A photo of the nsOne PlayStation motherboard. Credit: Lorentio Brodesco

As Brodesco noted on Kickstarter, his project’s goal is to “create comprehensive documentation, design files, and production-ready blueprints for manufacturing fully functional motherboards.”

Beyond repairs, the documentation and design files Brodesco is creating would preserve the PlayStation 1’s hardware architecture for future generations: “It’s a tribute to the PS1, to retro hardware, and to the belief that one person really can build the impossible.”

US air traffic control still runs on Windows 95 and floppy disks

On Wednesday, acting FAA Administrator Chris Rocheleau told the House Appropriations Committee that the Federal Aviation Administration plans to replace its aging air traffic control systems, which still rely on floppy disks and Windows 95 computers, Tom’s Hardware reports. The agency has issued a Request For Information to gather proposals from companies willing to tackle the massive infrastructure overhaul.

“The whole idea is to replace the system. No more floppy disks or paper strips,” Rocheleau said during the committee hearing. Transportation Secretary Sean Duffy called the project “the most important infrastructure project that we’ve had in this country for decades,” describing it as a bipartisan priority.

Most air traffic control towers and facilities across the US currently operate with technology that seems frozen in the 20th century, although that isn’t necessarily a bad thing—when it works. Some controllers currently use paper strips to track aircraft movements and transfer data between systems using floppy disks, while their computers run Microsoft’s Windows 95 operating system, which launched in 1995.

A pile of floppy disks. Credit: Getty

As Tom’s Hardware notes, modernization of the system is broadly popular. Sheldon Jacobson, a University of Illinois professor who has studied risks in aviation, told NPR that the system works remarkably well as is but that an upgrade is still critical. The aviation industry coalition Modern Skies has been pushing for ATC modernization and recently released an advertisement highlighting the outdated technology.

While the vintage systems may have inadvertently protected air traffic control from widespread outages like the CrowdStrike incident that disrupted modern computer systems globally in 2024, agency officials say 51 of the FAA’s 138 systems are unsustainable due to outdated functionality and a lack of spare parts.

The FAA isn’t alone in clinging to floppy disk technology. San Francisco’s train control system still runs on DOS loaded from 5.25-inch floppy disks, with upgrades not expected until 2030 due to budget constraints. Japan has also struggled in recent years to modernize government record systems that use floppy disks.

If it ain’t broke? (Or maybe it is broke)

Modernizing the air traffic control system presents engineering challenges that extend far beyond simply installing newer computers. Unlike typical IT upgrades, ATC systems must maintain continuous 24/7 operation, because shutting down facilities for maintenance could compromise aviation safety.

Bill Atkinson, architect of the Mac’s graphical soul, dies at 74

Using HyperCard, teachers created interactive lessons, artists built multimedia experiences, and businesses developed custom database applications—all without writing traditional code. The hypermedia environment also had a huge impact on gaming: The 1993 first-person adventure hit Myst originally used HyperCard as its game engine.

An example of graphical dithering, which allows 1-bit color (black and white only) to imitate grayscale. Credit: Benj Edwards / Apple

For the black-and-white Macintosh (which could display only black or white pixels, with no gradient in between), Atkinson developed an innovative high-contrast dithering algorithm that created the illusion of grayscale images, with a characteristic stippled appearance that became synonymous with early Mac graphics. The dithered aesthetic remains popular today among some digital artists and indie game makers, and modern web converters let anyone transform photos into the classic Atkinson dither style.
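
The algorithm itself is well documented and compact: each pixel is snapped to black or white, and only six-eighths of the resulting quantization error is pushed onto six nearby pixels (the rest is discarded, which is what crushes midtones into high contrast). A minimal Python sketch over a 2D grayscale array:

    def atkinson_dither(pixels):
        # Dither a 2D list of 0-255 grayscale values to black/white in
        # place using Atkinson's error-diffusion pattern. Each of six
        # neighbors receives 1/8 of the error; 2/8 is simply dropped.
        h, w = len(pixels), len(pixels[0])
        neighbors = [(1, 0), (2, 0), (-1, 1), (0, 1), (1, 1), (0, 2)]
        for y in range(h):
            for x in range(w):
                old = pixels[y][x]
                new = 255 if old > 127 else 0
                pixels[y][x] = new
                err = (old - new) // 8
                for dx, dy in neighbors:
                    nx, ny = x + dx, y + dy
                    if 0 <= nx < w and 0 <= ny < h:
                        pixels[ny][nx] += err
        return pixels

    # Render a small left-to-right gradient as ASCII art.
    gradient = [[int(255 * x / 19) for x in range(20)] for _ in range(6)]
    for row in atkinson_dither(gradient):
        print("".join("#" if v == 0 else " " for v in row))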

Life after Apple

After leaving Apple in 1990, Atkinson co-founded General Magic with Marc Porat and Andy Hertzfeld, attempting to create personal communicators before smartphones existed. Wikipedia notes that in 2007, he joined Numenta, an AI startup, declaring their work on machine intelligence “more fundamentally important to society than the personal computer and the rise of the Internet.”

In his later years, Atkinson pursued nature photography with the same artistry he’d brought to programming. His 2004 book “Within the Stone” featured close-up images of polished rocks that revealed hidden worlds of color and pattern.

Atkinson announced his pancreatic cancer diagnosis in November 2024, writing on Facebook that he had “already led an amazing and wonderful life.” The same disease claimed his friend and collaborator Steve Jobs in 2011.

Given Atkinson’s deep contributions to Apple history, it’s not surprising that Jobs’ successor, Apple CEO Tim Cook, paid tribute to the Mac’s original graphics guru on X on Saturday. “We are deeply saddened by the passing of Bill Atkinson,” Cook wrote. “He was a true visionary whose creativity, heart, and groundbreaking work on the Mac will forever inspire us.”

Endangered classic Mac plastic color returns as 3D-printer filament

On Tuesday, classic computer collector Joe Strosnider announced the availability of a new 3D-printer filament that replicates the iconic “Platinum” color scheme used in classic Macintosh computers from the late 1980s through the 1990s. The PLA filament (PLA is short for polylactic acid) allows hobbyists to 3D-print nostalgic novelties, replacement parts, and accessories that match the original color of vintage Apple computers.

Hobbyists commonly feed this type of filament into commercial desktop 3D printers, which heat the plastic and extrude it in a computer-controlled way to fabricate new plastic parts.

The Platinum color, which Apple used in its desktop and portable computer lines starting with the Apple IIgs in 1986, has become synonymous with a distinctive era of Macintosh design. Original Macintosh plastics grow brittle and discolored with age, so matching the “original” color can be a somewhat challenging and subjective exercise.

A close-up of “Retro Platinum” PLA filament by Polar Filament. Credit: Polar Filament

Strosnider, who runs a website about his extensive vintage computer collection in Ohio, worked for years to color-match the distinctive beige-gray hue of the Macintosh Platinum scheme. The result is a spool of hobby-ready plastic produced by Polar Filament and priced at $21.99 per kilogram.

According to a forum post, Strosnider paid approximately $900 to develop the color and purchase an initial 25-kilogram supply of the filament. Rather than keeping the formulation proprietary, he arranged for Polar Filament to make the color publicly available.

“I paid them a fee to color match the speaker box from inside my Mac Color Classic,” Strosnider wrote in a Tinkerdifferent forum post on Tuesday. “In exchange, I asked them to release the color to the public so anyone can use it.”
