The Computer Industry For All Mankind Never Needed to Build
And the one that we may be rebuilding.

Disclosure: This reflects my personal experience and interpretation of publicly available information. It represents my views alone—not any employer or organization—and is not professional advice.

Fifty-three years. That’s how long it took us to go back to the Moon. Apollo 17 returned in December 1972. Artemis II splashed down today, April 10, 2026, in the Pacific off San Diego. Four astronauts. Ten days. A 695,000-mile round trip that carried them farther from Earth than any humans have ever been.

In the gap between those two missions, we dismantled the computing infrastructure that made the first one possible and rebuilt it from scratch. Twice. We are arguably in the middle of building it a third time. At some point this stops being progress and starts being a renovation project that never ends.

This is a what-if essay. I’m over-indulging myself. I know that. But I think the thought exercise has real relevance to where the technology industry is actually ending up, and the timing of Artemis II makes it impossible not to write.

There’s a throwaway line in For All Mankind that stuck with me. An astronaut casually notes that a 20-year-old spacecraft they’re refurbishing has less compute than his Newton. It lands like a joke. It isn’t. It’s a glimpse into a different trajectory for computing, one where the industry we take for granted never fully forms.

Most people watch that show for the geopolitics. I keep watching for the computers.

The show is careful about this. You see people using laptops, desktops, devices. But you never get a clear look at the systems behind them. The production design keeps it deliberately ambiguous, showing just enough to suggest a different world without ever confirming what it looks like. You’re reverse-engineering an entire industry from set dressing and a single throwaway line about a Newton. This is either brilliant speculation or the nerdiest possible use of a television show. Probably both.

The real divergence isn’t who gets to Mars first. It’s structural. In that timeline, the space race never stops being the primary driver of computing. And that single difference changes everything.

Years ago, I wrote a series for ZDNet called “To the Moon,” which included a piece documenting IBM and UNIVAC’s roles as the system integrators behind Apollo 11. IBM’s involvement was staggering. The System/360s in Houston processed telemetry in real time. UNIVAC’s 1230 systems handled decoding at tracking stations around the world. These weren’t peripheral contributors. They were the infrastructure. What struck me then, and strikes me harder now watching the Artemis II coverage, is how completely that infrastructure disappeared from the story we tell about computing. We remember the Apollo Guidance Computer, the plucky little machine in the spacecraft. We forget the mainframes and tracking arrays that made the whole thing work. In the For All Mankind timeline, those systems never become a footnote. They remain the foundation.

The Break We Don’t Talk About

In our world, Apollo ends and something breaks. Funding shifts. Urgency fades. The large-scale systems that defined computing lose their central purpose. Into that vacuum comes the personal computer, commodity hardware, distributed systems, the internet, the cloud. We broke computing into pieces. Then we spent 40 years putting it back together.

We may be finishing that reassembly right now. The AI infrastructure being built today looks nothing like the distributed web that preceded it. It looks like the centralized systems we dismantled. The timeline depicted in For All Mankind is, in effect, one in which those systems were never dismantled in the first place.

The Collapse That Enabled Everything

The modern stack only exists because several pillars fell at once. IBM lost cultural dominance. Digital Equipment Corporation disappeared. Sperry, UNIVAC’s parent, merged with Burroughs to form Unisys and faded into irrelevance. Centralized computing gave way to independent machines, and from that collapse came a genuinely new idea: anyone can own a computer, anyone can write software, anyone can build something new.

That idea reshaped the world. But it wasn’t inevitable. It required a specific sequence of institutional failures, and in a world where those institutions never fail, the idea never arrives in the same form.

A World Where the Stack Never Flattened

In the world implied by For All Mankind, the industry doesn’t flatten. It stratifies, and stays that way. Three layers persist, each doing the work our timeline compressed into a single device on every desk.

At the top, IBM, UNIVAC, and Burroughs don’t fade. They evolve into planetary-scale coordination engines, mission-critical infrastructure, real-time always-on compute. Not relics. The core.

In the middle, DEC survives. When DEC dies in our timeline, the entire middle layer collapses. Everything becomes a PC or nothing. In that alternate world, the middle holds. Shared computing, interactive systems, departmental-scale infrastructure all persist. The personal computer never takes over because the problems it solves are already solved.

At the bottom, Apple’s Newton doesn’t fail. It evolves into something we never built: an iPad mini-scale device, stylus-first, with a paper-like display, days-long battery, and instant-on responsiveness. Not a phone. Not a tablet. A paper-native computer. And critically, not a standalone device but a window into the systems above it. Think of it as a very expensive piece of glass that lets you see into the mainframe. Which, if you think about it, is basically what your iPhone is now. We just took a 40-year detour to get there.
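
To make the “glass” idea concrete, here is a minimal sketch in Python, entirely invented for illustration (the show specifies none of this): a terminal that owns nothing but a framebuffer, while all application state lives upstream.

```python
# Hypothetical sketch of the "expensive glass" model. Every name here
# is invented for illustration; the point is only where state lives.

class MainframeSession:
    """Stand-in for the shared institutional system doing all real work."""
    def __init__(self):
        self.document = []  # application state lives here, never on the device

    def handle_input(self, stroke):
        self.document.append(stroke)
        # The host renders; the device merely displays the result.
        return f"rendered page, {len(self.document)} stroke(s)"

class GlassTerminal:
    """The paper-native handheld: input goes up, pixels come back."""
    def __init__(self, host):
        self.host = host
        self.framebuffer = None  # the only thing the device "owns"

    def on_stylus(self, stroke):
        # No local interpretation of input: raw strokes go up the wire.
        self.framebuffer = self.host.handle_input(stroke)

    def paint(self):
        return self.framebuffer

host = MainframeSession()
pad = GlassTerminal(host)
pad.on_stylus("hello")
print(pad.paint())  # -> rendered page, 1 stroke(s)
```

Swap “mainframe” for “cloud backend” and the sketch describes most phone apps today, which is rather the point.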

The Companies That Evolve

In our timeline, Apple’s story is collapse and reinvention. Jobs leaves, struggles, returns through NeXT, and NeXT becomes the foundation of everything modern Apple builds. In this world, that second act never happens. Apple buys BeOS instead of NeXT, gets a modern OS with real-time performance, and continues rather than reinventing itself. The Newton evolves. Apple becomes a company focused on thinking, writing, and interaction rather than engagement and platform dominance. In a world dominated by centralized systems, the unsolved problem is interface, not operating systems. Apple doesn’t conquer computing. It integrates into it.

The GUI theft narrative only works because the Macintosh exists as a vehicle. Xerox PARC invents the future, Jobs visits, Apple ships it, Microsoft copies it. In this world, the PARC research doesn’t get stolen and democratized. It gets absorbed into institutional computing. The GUI still happens. It just happens on terminals and smart devices connected to the systems that already run everything. Xerox may remain a significant technology company, which would certainly confuse anyone from our timeline who associates the name primarily with copy machines and missed opportunities.

Hewlett-Packard is the wildcard. The laser printer business is durable regardless of timeline. LaserJet doesn’t care what’s sending it a print job. HP’s acquisition of Apollo Computer gives it a serious workstation line, and HP-UX gives it a credible Unix. In a world where the middle layer survives, HP sits comfortably in the DEC tier: departmental workstations, engineering systems, HP-UX where IBM’s mainframes are overkill. HP is one of the few companies that looks recognizable in both worlds. It just doesn’t have to spend three decades figuring out what it is.

Motorola becomes the dominant chipmaker. The 68000 was arguably the superior architecture, cleaner and more elegant than x86, but it lost because Intel had the volume play. The 68000 didn’t need a consumer PC explosion. It was already the processor of choice for aerospace, defense, telecommunications, and industrial control. In a world where those markets remain the center of computing, Motorola becomes the Intel of that timeline. Not because it won a price war, but because the institutions never stopped buying it. If you want a sense of how different the semiconductor landscape looks: PowerPC never exists. In our timeline it’s born from desperation in 1991, when Apple, IBM, and Motorola ally to counter x86. Without that war, the alliance never forms. IBM keeps POWER on its RS/6000s. Motorola keeps the 68000 line. Each serves its tier. Nobody panics.

Computer Associates becomes the Microsoft of that timeline. Charles Wang’s model of acquiring mainframe software companies and consolidating maintenance contracts works even better when the mainframe market doesn’t shrink. CA controls the software layer on top of the platforms that actually matter. Control Data Corporation survives, because centralized high-performance computing remains the norm. Cray still leaves to form his own company, because that’s who Cray is, but the market never evaporates.

EDS thrives. Ross Perot’s insight that organizations need someone to run their computers never gets commoditized by cheap distributed IT or the cloud. EDS becomes the operator class. If you’ve ever wondered what a world looks like where Ross Perot is a more consequential figure than Bill Gates, this is it.

The Lives That Change

This is where the alternate history stops being about companies and starts being about people.

Jobs becomes a movie mogul. Pixar still happens. The Disney acquisition likely still happens. He may end up running the most powerful entertainment company in the world. He just never returns to computing. In our timeline, the NeXT acquisition saves both Apple and Jobs. In theirs, Jobs doesn’t need saving. He’s fine. He’s just in a different industry. And when pancreatic cancer kills him in 2011, the world mourns a visionary filmmaker and entertainment executive, not the man who put a computer in every pocket. His obituary leads with Pixar, not the iPhone. His closest friends are George Lucas and Steven Spielberg, not the founders of Silicon Valley companies that in this world were never founded.

Then there is Tim Cook. In our timeline, Cook spends 12 years at IBM’s Personal Computer Company, rising to director of North American fulfillment. He leaves for Compaq. Jobs recruits him to Apple, where he transforms the supply chain and eventually becomes CEO, the first openly LGBTQ CEO of a Fortune 500 company. Remove the PC industry and Cook’s trajectory changes completely. He’s still at IBM, running logistics for the mainframe business or services division. He might still rise. IBM added sexual orientation to its nondiscrimination policy in 1984. But Cook as a public figure, the face of the most valuable company on earth, is a product of a specific sequence: IBM PC division to Compaq to Apple. Without the PC, that sequence never forms. Cook is brilliant and probably runs something significant. He just never becomes a household name.

I know something about this. I spent years at IBM myself, leaving in 2012. In the alternate timeline, there’s no ZDNet as we know it, no consumer tech media ecosystem, no “To the Moon” series documenting Apollo infrastructure from the outside. I’m probably still at IBM. Still inside the system. Probably writing internal white papers about mainframe optimization and wondering what my life would be like if I’d gone into journalism. The personal computer didn’t just create companies. It created careers, identities, entire professional categories that didn’t exist before the stack flattened. Every technology journalist, every VAR sales engineer, every independent consultant, every startup founder is a product of the collapse. Remove it, and we don’t disappear. We just never become who we became.

The Companies That Never Form

Intel never becomes the center of gravity. Without the PC creating insatiable x86 demand, semiconductor advancement is driven by institutional procurement rather than consumer retail. The x86 architecture may never achieve dominance at all. Microsoft never captures the platform. Dell and Compaq never dominate hardware. Three titans of our world, reduced to footnotes in theirs.

Sun Microsystems doesn’t exist. “The network is the computer” is a thesis about distributed computing. In a world where computing never distributes, that thesis has no audience. It would be like coining the slogan “the horse is the automobile” after the Model T ships.

Oracle likely doesn’t exist either. Ellison builds Oracle by commercializing IBM’s own relational research on hardware IBM doesn’t control, filling a gap IBM was slow to close. Without that gap, DB2 is the leading relational database because IBM controls the platforms it runs on. Larry Ellison is probably still rich. He’s just rich from something else. Maybe yachts. Actually, probably yachts regardless.

Silicon Graphics never forms. In a world where IBM and HP own the workstation tier, there is no gap for Jim Clark to fill. Jurassic Park still needs rendering. Toy Story still needs computing. But those machines are HP Apollos and IBM RS/6000s. Pixar buys its rendering hardware from HP rather than from a company that was never founded.

Cisco is equally unlikely. Bosack and Lerner found it at Stanford to route traffic between incompatible networks. The multiprotocol router is a product for a world where nobody agrees on anything. In a world where IBM sets the networking standard, there is less heterogeneity to bridge. Cisco’s core insight, that the money is in connecting the chaos, has no chaos to connect.

The Network That Never Opens

Ethernet wins in our timeline because it is cheap and good enough. Token Ring loses because it is expensive and designed for environments where reliability matters more than cost. In a world where IBM never fades, Token Ring scales. Deterministic access, where every node gets a guaranteed turn to transmit, is exactly what you want for mission-critical planetary-scale operations. Ethernet’s approach, where nodes shout into the wire and hope for the best, looks reckless by comparison. It’s the networking equivalent of a four-way stop versus a traffic light. One is cheaper. The other doesn’t result in collisions.
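
For the curious, here is a deliberately toy Python simulation of that difference, with made-up parameters. It models bare round-robin token passing (a crude stand-in for 802.5, not a faithful model) against ALOHA-style contention (cruder still than real CSMA/CD, which listens before transmitting and backs off after collisions):

```python
# Toy model, not a faithful simulation of Token Ring or Ethernet.
# It only illustrates the structural difference: scheduled access
# cannot collide; contention-based access can.
import random

NODES = 8
SLOTS = 10_000   # time slots to simulate
P_READY = 0.3    # chance a given node has a frame ready in a slot

def token_passing():
    """Each slot belongs to exactly one node (the token holder), so
    the worst-case wait is bounded at NODES slots and overlapping
    transmissions are impossible by construction."""
    sent = 0
    for _ in range(SLOTS):
        if random.random() < P_READY:  # does the holder have a frame?
            sent += 1
    return sent, 0                     # collisions: structurally zero

def contention():
    """ALOHA-style: every ready node transmits immediately. A slot
    with two or more talkers is a collision and carries nothing."""
    sent = collisions = 0
    for _ in range(SLOTS):
        talkers = sum(random.random() < P_READY for _ in range(NODES))
        if talkers == 1:
            sent += 1
        elif talkers > 1:
            collisions += 1
    return sent, collisions

random.seed(1972)  # the year the divergence starts to matter
print("token passing (sent, collisions):", token_passing())
print("contention    (sent, collisions):", contention())
```

This toy setup flatters Token Ring; real Ethernet’s collision detection and backoff are how “cheap and good enough” actually won. But determinism, not raw throughput, is what an institution buying planetary-scale infrastructure pays for.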

The internet still exists. ARPANET is a government project. TCP/IP is a government protocol. But the World Wide Web doesn’t arrive the same way. Berners-Lee proposes it at CERN in 1989 and builds the first browser and server on a NeXT workstation. NeXT exists because Jobs founds it after leaving Apple. In the alternate timeline, Apple buys BeOS instead. NeXT either never matures or never exists in the same form. Berners-Lee still wants to solve the problem of linked documents. But the tool he reaches for isn’t there.

Without the web, there are no browsers. Without browsers, no Netscape. Without Netscape, no browser war. Without the browser war, no antitrust case against Microsoft. Without the consumer web, no Google, no Amazon as we know it, no Facebook, no social media, no ad-supported internet economy. Just that sentence alone should make you sit with this for a minute. No ad-supported internet economy. Imagine it. Take your time.

What exists instead is something IBM would build. A networked information system: institutional, credentialed, structured. An evolved Prodigy or CompuServe run at scale by the dominant computing vendor. Access is mediated. Identity is institutional. The network is a service, not a commons. Whether that is better or worse depends entirely on what you value, but at minimum you’d never have to see a popup ad or accept a cookie policy.

We don’t even know if AT&T gets broken up in that world. The 1984 divestiture is a political and regulatory decision, and the political landscape in For All Mankind diverges significantly from ours. The show’s timeline suggests less regulation, not more. If AT&T stays intact, Bell Labs remains unified, and that changes the entire software landscape.

I wrote another piece for ZDNet years ago arguing that without Dennis Ritchie, there would be no Steve Jobs. Ritchie and Ken Thompson, working at Bell Labs, created C and UNIX, the two technologies that form the DNA of virtually every modern operating system and programming language. C begat C++ and Objective-C, and C++ begat Java. UNIX begat BSD and inspired Linux, which begat Android. Every iPhone runs on a descendant of work done by a bearded computer scientist in a cardigan in New Jersey.

But C and UNIX succeed in our timeline because the computing world fragments. UNIX’s entire value proposition is portability: one operating system that runs on anyone’s hardware. In a world where IBM dominates and the stack never splinters, you don’t need a portable operating system. You need an operating system that runs on IBM hardware, and IBM already has one. Ritchie and Thompson still work at Bell Labs. They’re still brilliant. They may still create C and UNIX. But UNIX doesn’t spread through universities on cheap PDP-11s because the cheap PDP-11 ecosystem doesn’t exist in the same way. C doesn’t become the lingua franca of a thousand incompatible platforms because there aren’t a thousand incompatible platforms.

COBOL and Fortran dominate instead. COBOL runs the business logic on the mainframes. Fortran runs the scientific computing and the space program simulations. These are not glamorous languages. Nobody gets a tattoo of a COBOL subroutine. But they are the languages of institutional computing, and in a world where institutional computing never loses its grip, they never get displaced. The programmers writing them are professionals with degrees and institutional affiliations, not self-taught hackers building things in garages. The entire culture of software development looks different: more engineering discipline, less move-fast-and-break-things. Whether that produces better or worse software is an argument that could outlast this essay. But at least there’s no Java. And no JavaScript. So it’s not all bad.

And without the consumer web, the cloud never needs to exist. The cloud isn’t innovation. It’s reconstruction. We fragmented computing into millions of nodes, then rebuilt centralized systems on top of them. In a world where that fragmentation never happens, there is nothing to rebuild. Which, when you say it out loud, makes the cloud sound less like the future and more like a very expensive apology.

The Culture That Never Forms

The entire channel ecosystem that defines computing culture in the 1980s and 1990s never materializes. No VARs, no storefront dealers, no mail-order catalogs, no CompUSA, no Micro Center. When computing stays institutional, that work is done by the vendors themselves or by companies like EDS. Nobody is building custom rigs in a strip mall.

The media landscape is unrecognizable. PC Magazine, Computer Shopper, InfoWorld, PC World, MacUser, Macworld: none of them exist. The publications that define how a generation thinks about technology are all products of a consumer computing market that never forms.

Byte survives. It predates the PC explosion and covers computing as a discipline rather than a consumer product category. In a world where institutional computing remains dominant, Byte’s editorial model is exactly what the market wants. It may still be in actual print. The economics that kill print magazines depend on a consumer electronics market generating enough volume for glossy full-page hardware ads. Without that market, print doesn’t collapse the same way. Byte becomes the journal of record for a computing industry that still thinks of itself as a profession rather than a lifestyle. A world where the most important tech publication is Byte rather than The Verge is a world with very different vibes.

Netflix doesn’t exist. Streaming requires a consumer internet built on cheap Ethernet, commodity PCs, and open protocols. Without that infrastructure, Blockbuster never gets disrupted. The death of appointment television, the rise of binge culture, the collapse of the theatrical window: none of it happens. The entertainment industry looks more like 1995 than 2025. Video rental stores still exist. You’re still arguing with your spouse about late fees on a Friday night. Some things are better in our timeline.

No Linux, But SHARE Scales Instead

Linux emerges from cheap hardware, fragmented systems, and individual access to development tools. Remove those conditions and open source changes shape entirely. But collaboration doesn’t disappear. It evolves differently.

SHARE, founded in 1955 as an IBM mainframe user group, was one of the earliest examples of organized software sharing. Member organizations contributed code, documentation, and operational knowledge into a common pool. Vendors participated directly. Contributions were identity-bound, institutionally backed, curated. It was collaborative development before anyone called it that.

In the For All Mankind timeline, SHARE becomes the dominant model. Instead of GitHub and anonymous contributors, you get institutional collaboration, identity-bound contributions, curated ecosystems, and vendor-aligned standards. Not open source as we know it. Structured collaboration inside the system.

Here’s the thing: look at open source today. It is dominated by corporate contributors, governed by foundations, aligned with vendor roadmaps. The Linux Foundation’s membership roster reads like a Fortune 500 directory. Kubernetes, TensorFlow, PyTorch were all born inside corporations, developed by salaried engineers, governed by institutional processes. The romantic image of a lone hacker in a dorm room changing the world is mostly a founding myth at this point. The center of gravity shifted years ago from individuals building outside the system to organizations building together inside it.

We may be rediscovering SHARE. We just call it something else.

The Mainframes Are Back

The canonical story of computing: mainframes gave way to minicomputers, which gave way to PCs, which gave way to distributed systems, which gave way to the cloud. Smaller and more distributed wins. Every CTO presentation for 30 years has told some version of this story, usually with an arrow pointing down and to the right.

AI workloads don’t follow that arc. They require massive centralized compute, tightly coupled systems, specialized hardware, enormous power density, coordinated execution at scale. The systems being built are hyperscale clusters, GPU superpods, training rigs costing hundreds of millions of dollars. Centralized, expensive, controlled by a few companies, inaccessible to individuals.

They are mainframes. We just don’t call them that.

The parallels are hard to ignore. In the For All Mankind timeline, institutional computing never fragments. IBM, DEC, and their peers build increasingly powerful centralized systems, connected by deterministic networks, managed by professional operators, accessed through smart terminals and Newtons. In our timeline, we fragmented everything, spent decades rebuilding, and are now converging on a model where a handful of companies operate enormous centralized computing facilities that individuals access through smart glass rectangles in their pockets. The architectures are different. The economics are different. The end state is starting to look uncomfortably similar.

The difference is that their world built it intentionally. Ours stumbled back into it after trying everything else first. There’s a line, often attributed to Winston Churchill, about Americans always doing the right thing after exhausting all other options. The computing industry may be proving it right.

The Long Way Around

The difference between their world and ours may not be the destination. It may be the path.

They never fragmented. They stayed centralized. They built structured collaboration from the start. We fragmented everything, democratized access, built extraordinary things in the chaos, and are now re-centralizing under new constraints. Different tradeoffs, not obviously better or worse ones. We got the internet, open source, and the iPhone. They probably got stable infrastructure, fewer billionaires, and a Byte magazine subscription.

The consolidation pattern tells it most clearly. In our world, the mergers of the 2000s are attempts to undo the fragmentation of the previous decades. HP buys Compaq in 2002, and Compaq had already swallowed DEC in 1998, so what HP is really doing is reassembling the institutional computing stack that collapsed in the first place. Oracle buys Sun. Dell buys EMC. IBM buys Red Hat. Every one of these is an attempt to reconstruct vertical integration that the PC era destroyed. In their world, DEC never collapses, Compaq never forms, and HP never needs to buy either of them. The companies that would have been acquired were never created. There is less to consolidate because there was less to fragment.

Today, Artemis II splashed down in the Pacific. Four astronauts returned from ten days around the Moon, the first humans to travel there since 1972. They broke the distance record set by Apollo 13. They saw a solar eclipse from beyond the far side of the Moon.

Fifty-three years passed between the last Apollo crew and this one. In that gap, we dismantled the computing infrastructure that put them there, rebuilt it in a completely different form, and are now rebuilding it again in a form that looks remarkably like what we started with. We fragmented IBM’s mainframes into PCs, reassembled them into cloud data centers, and are now concentrating them into AI superclusters that would be recognizable to the engineers who built the System/360s in Houston.

The computer industry we got wasn’t inevitable. It was one path, shaped by collapse, fragmentation, and reconstruction. In another timeline, the space race never stopped, and the infrastructure I documented in that ZDNet series never became a footnote. It just kept evolving.

We took the long way around. But tonight, at least, we ended up in the same place: bringing people home from the Moon.