Apple at 50: Own the Whole Stack
Fifty years ago today — April 1, 1976 — Steve Jobs, Steve Wozniak, and Ron Wayne signed the Apple Computer Company partnership agreement in a garage in Los Altos, California. Woz had built both the hardware and the software for the Apple I. One person. The whole stack.
Half a century later, Apple is worth $3.8 trillion and every product it sells — from the MacBook Neo to the iPhone 17 to the Apple Watch — runs on silicon it designed itself, inside hardware it designed itself, running software it designed itself. The integration of the full stack is the thread that connects a circuit board in a garage to a 194-billion-transistor system-on-chip. It is the only idea Apple has never abandoned.
And the most interesting part of the story isn’t when integration worked. It’s what happened the one time they stopped.
The Rise Before the Fall
The first two decades were extraordinary. The Apple II, released in 1977, dominated American classrooms so completely that by the mid-1980s Apple held roughly two-thirds of the education computer market. MECC, the Minnesota state consortium that ported The Oregon Trail, became the single largest purchaser of Apple II computers. Jobs’ “Kids Can’t Wait” programme donated machines to 9,000 California schools at a net cost, after a tax-credit loophole, of about a million dollars. An entire generation learned to type on Apple hardware.
In 1982, Apple became the first personal computer company to hit $1 billion in annual revenue. Then the Macintosh arrived in 1984, and with it came desktop publishing — the LaserWriter, Aldus PageMaker, and a graphical interface that made professional typesetting possible on a desk. An entirely new industry, born from the integration of Apple hardware, Adobe’s PostScript, and third-party software. Mac market share surged to 8%. The LaserWriter cost more than the Mac itself, but the combination was irresistible.
This is what kept Apple alive through what came next. Desktop publishing was expensive, and the people who needed it had no alternative. Integration created a niche with pricing power.
The One Time Apple Stopped Integrating
After Jobs was forced out in 1985, Apple gradually abandoned the principle that had made it. Three successive CEOs — Sculley, Spindler, Amelio — looked at Microsoft’s dominance and drew the obvious conclusion: modular wins. Let other companies build the hardware. License the OS. Compete on volume.
In 1994, Apple began licensing Mac OS to clone makers like Power Computing and Umax. It was the most significant strategic reversal in the company’s history. Apple was no longer a vertically integrated hardware-software company. It was becoming a software licensor: Microsoft’s model.
The results were catastrophic. The clones didn’t expand the Mac market. They cannibalised it, undercutting Apple’s own machines on price while Apple bore all the R&D costs. Meanwhile, without the discipline of integration, the product line metastasised. By 1997 Apple was selling roughly 350 products, including a gaming console (the Pippin, somewhere between 12,000 and 42,000 units sold), a PDA (the Newton) whose handwriting recognition was mocked in Doonesbury and The Simpsons, and dozens of confusingly named Performa models that even Apple’s own salespeople couldn’t tell apart.
Three CEOs in four years. Market share collapsed from 12% to 3%. The company lost over a billion dollars in a single fiscal year. At one point, lawyers from two different Apple divisions showed up at the US Patent Office to sue each other. Sun Microsystems was, by Scott McNealy’s own admission, “literally hours away” from buying the company at $5-6 per share.
When Jobs returned, he understood immediately that the problem wasn’t execution. It was architecture. Apple had tried to be a modular company, and modular was killing it.
He drew a four-quadrant grid on a whiteboard. Consumer and professional. Desktop and portable. Four products. Everything else was killed. The clone licences contained a loophole Apple could exploit: they covered “System 7,” so when Apple released Mac OS 8, the cloners had no legal right to ship it.
Integration was back. And Apple never abandoned it again.
The iMac and the Air Freight
The turnaround wasn’t just product focus. It was operational integration.
Tim Cook joined Apple in March 1998, five months before the iMac G3 shipped. His first move: spend $100 million reserving nearly all available holiday-season air freight capacity months in advance. Dell and Compaq literally could not book freight during the critical shipping window. Within seven months, Cook had slashed Apple’s inventory from $400 million to $78 million and closed 10 of 19 warehouses. “Inventory is fundamentally evil,” he said.
The iMac itself was integration made physical. Bondi Blue translucent polycarbonate, designed by a 31-year-old Jony Ive who’d spent five years in obscurity at Apple before Jobs noticed him. It sold 800,000 units in its first 20 weeks. 32% of buyers had never owned a computer. The stock finished 1998 up 200%.
But the real story was what Apple dropped. The floppy drive. ADB. SCSI. Serial ports. Every legacy connection, replaced with USB — a standard almost nobody made peripherals for yet. The outcry was fierce. Apple was right. Within two years, nobody missed any of it.
The iPod and the Trojan Horse
The iMac saved Apple. The iPod transformed it. And the critical move was one that seemed to break Apple’s own rules.
In October 2001, Apple launched the iPod — hardware and software integrated so tightly that the scroll wheel, the interface, and the iTunes sync experience felt like a single product. No other MP3 player came close. But the iPod was Mac-only, and Macs were 3% of the market.
In October 2003, Apple released iTunes for Windows. It was heresy. Apple’s entire identity was built on controlling both sides. But Jobs understood something subtle: iTunes wasn’t the product. The iPod was. And putting iTunes on Windows meant every PC user in the world could buy an iPod. By 2005, the iPod was generating more revenue than the Mac.
More importantly, the iPod put Apple hardware in the pockets of people who had never owned a Mac. When the iPhone arrived two years later, those 100 million iPod owners already trusted Apple to make beautiful, integrated devices. The iPod was a trojan horse for the iPhone.
A Computer That Made Calls
The iPhone, launched in June 2007, was the purest expression of Apple’s integration thesis. RIM, Palm, and Nokia all made smartphones that integrated hardware and software. But they built phones that could run some apps. Apple built a full computer — running a scaled-down version of Mac OS X, with a multi-touch interface nobody had seen before — that happened to make calls.
The distinction mattered. A phone OS hits a ceiling. A computer OS doesn’t. Within a year, the App Store turned the iPhone into a platform, and the integration advantage compounded: Apple controlled the chip, the OS, the development tools, the distribution, and the hardware. No Android manufacturer could match that. Samsung made the hardware but didn’t control the OS. Google controlled the OS but didn’t make most of the hardware.
The results were staggering. The iPhone generated over $2 trillion in cumulative revenue across its first two decades. By 2012, it was producing more profit than all of Microsoft. It made Apple the most valuable company on earth. And it funded everything that came next — including the silicon team that would bring integration full circle.
The Thermal Corner
Integration doesn’t always work.
In 2013, Phil Schiller unveiled the cylindrical Mac Pro with the line “Can’t innovate anymore, my ass.” It was stunning. A radical thermal core design that delivered 7 teraflops from dual AMD GPUs in one-eighth the volume of its predecessor.
It was also a dead end. The dual-GPU design assumed symmetric workloads. The industry moved to single, larger GPUs. The thermal envelope couldn’t accommodate the change. Apple didn’t update the Mac Pro for six years. The same 2013 specs were still being sold at 2013 prices in 2017. Professionals fled to Windows and NVIDIA. Hackintosh builds became common in creative studios.
In April 2017, Apple did something it essentially never does. It invited five journalists to an on-the-record briefing where Craig Federighi admitted they had “designed ourselves into a bit of a thermal corner.” Phil Schiller said: “If we’ve had a pause in upgrades and updates, we’re sorry for that.”
An apology. From Apple. About a product they were still selling.
Forty Engineers and a Billion Transistors
While the Mac Pro languished, something extraordinary was happening quietly inside Apple.
In 2008, Bob Mansfield recruited Johny Srouji from IBM to build Apple’s chip design team. That year, Apple also acquired P.A. Semi for $278 million, bringing 150 engineers with backgrounds in Alpha, StrongARM, and Itanium processors. Two years later, they bought Intrinsity for $121 million, gaining circuit-level tricks that pushed a standard ARM core from 650 MHz to 1 GHz.
For a decade, this team designed chips for iPhones and iPads. The A-series got better every year. Each generation was proof that Apple’s integration of chip design and software could beat generic processors. By 2019, the iPhone’s A13 chip matched Intel’s best desktop CPUs in single-core performance at roughly one-tenth the price per chip.
Then, at WWDC 2020, Apple announced the Mac would move to its own silicon.
The M1 arrived in November 2020. The benchmark results were absurd. Running x86 code through Rosetta 2 emulation — not even native — the M1 scored 1,313 on Geekbench 5 single-core. Intel’s own i9-10910 running native x86 scored 1,251. Apple’s translated code was faster than Intel’s native code.
A year later, the M1 Max MacBook Pro could play 30 simultaneous streams of 4K ProRes video. The 28-core Mac Pro with a $2,000 Afterburner card couldn’t match it. A $2,499 laptop had made a $25,000 workstation obsolete.
This is what happens when one company controls the chip, the operating system, the hardware, and the codec. Apple didn’t just build a faster processor. It built a hardware ProRes encoder into the silicon, wrote the software to use it, and designed the laptop around the thermal envelope of its own chip. No other company on earth can do this. Not because they lack talent. Because they don’t own all the layers.
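What “wrote the software to use it” looks like from the application side: the encoder is exposed through VideoToolbox, and any app can ask for it. The sketch below is my own minimal illustration, not Apple’s shipping code; the calls are real VideoToolbox APIs, and on chips with a media engine (M1 Pro and later) a session like this is served by the dedicated ProRes block.

```swift
import VideoToolbox

// Ask VideoToolbox for a ProRes 422 compression session. On Apple silicon
// with a media engine, frames sent to this session are compressed by
// fixed-function hardware; the CPU and GPU stay free for everything else.
var session: VTCompressionSession?
let spec = [
    // Prefer the hardware encoder if one exists.
    kVTVideoEncoderSpecification_EnableHardwareAcceleratedVideoEncoder: true
] as CFDictionary

let status = VTCompressionSessionCreate(
    allocator: kCFAllocatorDefault,
    width: 3840, height: 2160,                   // 4K frames
    codecType: kCMVideoCodecType_AppleProRes422,
    encoderSpecification: spec,
    imageBufferAttributes: nil,
    compressedDataAllocator: nil,
    outputCallback: nil,                         // deliver frames via the
    refcon: nil,                                 // output-handler variant of
    compressionSessionOut: &session              // VTCompressionSessionEncodeFrame
)

if status == noErr, let session {
    // Tune for real-time capture rather than offline export.
    VTSessionSetProperty(session, key: kVTCompressionPropertyKey_RealTime,
                         value: kCFBooleanTrue)
}
```

The point is that the codec block isn’t a private Final Cut feature. It’s a system service, and any app gets it by asking.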
The Accidental AI Machine
The punchline is that Apple’s most consequential architectural decision may have been an accident.
Unified memory — where the CPU, GPU, and Neural Engine share a single pool of RAM with zero-copy access — was designed for creative workloads. Video editors and 3D artists need the GPU to see what the CPU is doing without expensive data transfers across a PCIe bus. Apple’s solution was elegant: put everything on one die, sharing one memory pool.
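Here is a minimal sketch of what zero-copy means in practice. It’s my illustration, assuming an Apple silicon Mac; the Metal calls are standard, and the point is that one buffer is written by the CPU and read by the GPU with no upload step in between.

```swift
import Metal

// A shared buffer on Apple silicon is one physical allocation that the
// CPU and GPU both address directly; nothing is staged across a bus.
let device = MTLCreateSystemDefaultDevice()!
let queue = device.makeCommandQueue()!

// Trivial kernel: double every element in place.
let source = """
#include <metal_stdlib>
using namespace metal;
kernel void double_it(device float *x [[buffer(0)]],
                      uint i [[thread_position_in_grid]]) {
    x[i] = x[i] * 2.0;
}
"""
let library = try! device.makeLibrary(source: source, options: nil)
let pipeline = try! device.makeComputePipelineState(
    function: library.makeFunction(name: "double_it")!)

let count = 1024
let buffer = device.makeBuffer(length: count * MemoryLayout<Float>.stride,
                               options: .storageModeShared)!

// CPU writes straight into the allocation.
let ptr = buffer.contents().bindMemory(to: Float.self, capacity: count)
for i in 0..<count { ptr[i] = Float(i) }

// GPU reads and writes the same bytes; no blit or upload ever happens.
let cmd = queue.makeCommandBuffer()!
let enc = cmd.makeComputeCommandEncoder()!
enc.setComputePipelineState(pipeline)
enc.setBuffer(buffer, offset: 0, index: 0)
enc.dispatchThreads(MTLSize(width: count, height: 1, depth: 1),
                    threadsPerThreadgroup: MTLSize(width: 64, height: 1, depth: 1))
enc.endEncoding()
cmd.commit()
cmd.waitUntilCompleted()

// The CPU sees the GPU's result in place.
print(ptr[3])  // 6.0
```

On a discrete-GPU machine the same program needs a staging copy into VRAM before the kernel runs and another copy back afterwards. On unified memory, both copies simply don’t exist.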
Then large language models arrived. And it turned out that LLM inference is memory-bandwidth-bound, not compute-bound. Every token generated requires reading the entire model from memory. What matters isn’t how many teraflops you have. It’s how much memory you have and how fast you can read it.
A Mac Studio with 192GB of unified memory can run a 70-billion-parameter model locally. A $1,600 NVIDIA RTX 4090 has 24GB of VRAM. The model doesn’t fit. You’d need multiple $25,000+ data centre GPUs to match what a single Mac does.
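The back-of-the-envelope version, with illustrative numbers (the 800 GB/s figure is the M2 Ultra’s quoted memory bandwidth; the rest are assumptions for the sketch, not benchmarks):

```swift
// Decode speed for a dense LLM is bounded by how fast you can stream the
// weights: each generated token reads the whole model once.
let params        = 70e9     // 70B parameters
let bytesPerParam = 2.0      // 16-bit weights
let modelBytes    = params * bytesPerParam   // 140 GB: fits in 192 GB of
                                             // unified memory, nowhere near
                                             // a 24 GB RTX 4090
let bandwidth     = 800e9    // bytes/sec, M2 Ultra's quoted bandwidth

let ceiling = bandwidth / modelBytes         // ≈ 5.7 tokens/sec, best case
print(ceiling)
```

The same arithmetic shows why quantisation matters: at 4 bits per weight the model shrinks to 35 GB and the ceiling roughly quadruples. Teraflops never appear in the calculation.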
Apple didn’t design unified memory for AI. It designed it for Final Cut Pro. The architecture just happened to be exactly right for a workload nobody anticipated at the consumer level. Integration has a way of creating advantages you didn’t plan for.
The Screw in Austin
Apple tried to build a Mac Pro in Texas. The Austin factory, run by Flextronics, was meant to prove American manufacturing could work. It mostly proved the opposite.
Apple needed a custom screw. In Shenzhen, this is trivial. In Texas, they searched for months. They eventually found Caldwell Manufacturing, a 20-employee shop in Lockhart. Maximum output: 1,000 screws per day. The owner, Stephen Melo, personally drove 28,000 screws to the factory across 22 trips. They still weren’t quite right. Apple ended up sourcing from China anyway.
A former senior Apple manager described the experiment as proving “that the U.S. supply chain could work as good as China’s, and it failed miserably.”
Integration isn’t just a product philosophy. It’s a manufacturing reality. Apple owns the world’s largest fleet of CNC milling machines — roughly 40,000 across its supply chain. It books 90% of TSMC’s cutting-edge 3nm capacity. It prepays billions to lock out competitors from component supplies. The company that controls the whole stack controls the whole supply chain.
Fifty Years, One Idea
Every competitor has been missing a piece. IBM outsourced software. Microsoft outsourced hardware. Android is modular by design. The smartphone makers Apple destroyed — RIM, Palm, Nokia — all integrated hardware and software, but they built phones. Apple built a computer that happened to make calls, running a scaled-down version of its desktop operating system.
The open question is whether AI breaks this pattern. If the point of integration shifts from the device to the model, Apple’s position weakens. OpenAI is hiring Apple engineers and has Jony Ive designing a dedicated AI device. Apple’s response is characteristically Apple: open Siri to third-party AI providers, take 30% of the subscriptions, and let the model makers fight over who sits on top. The company that owns the integration point commoditises its complements. It’s the same playbook it has run for fifty years.
I don’t know if it’ll work for the next fifty. But I know the pattern. Every time Apple has bet on owning the whole stack, it has won. The one time it tried modular, it nearly went bankrupt.
Fifty years to the day. One idea. The screw has to be right, and you have to make it yourself.