If you really want to understand why Arm was so successful as a company, I'd suggest spending some time listening to their first CEO - Robin Saxby - in this engaging interview by Charbax from 2017.
And they are unlikely winners not because they came from the UK (seriously?) but because of the weakness of their commercial position when they started. It's a really interesting story.
This video has no auto-subtitles, but in general, if a video has them, you can click on the 3 dots next to 'Share' and then 'Open transcript' to get a full transcript with clickable timestamps.
I had an Archimedes, the first Arm machine, in 1988 or so. It was marketed as the fastest microcomputer in the world at the time, so I think Acorn knew they'd hit on something special. I was 14 and used to write games on the BBC Micro Model B, so upgrading to the Archimedes with its speed, sprite system and millions of colours was like suddenly being let loose in a huge unbounded creative arena. Good times.
I grew up programming the BBC Micro Model B. When I started a new school, they had a computer lab kitted out with Archimedes machines, and once the computer teacher realised I could use them better than anyone else in the school (mostly programming in Basic), she left me to my own devices in the corner of the lab whilst the other students played around with "Turtles" (if you remember those robots).
In 1997, when I was in 5th grade, the first computer I used was an Acorn (either an Electron or another Acorn model) that was a hand-me-down from some UK schools that were done with them. We had a professor who got them as a donation for our school in Romania and set up a lab with these things. They were pretty old by that time, I assume, but they were my first contact with such a beast and I was hooked.
I did not pay attention to their make and did not know the history; I just got to learn by interacting with them. It would be about 4 more years before I managed to persuade my parents to buy me an actual PC.
I took a 6502 assembly language class when I was in high school. I already had some experience with it, so I spent my time writing a machine language monitor with a disassembler and helping others. It was the best class I had at the time.
My very first computer was a 4-bit TRS-80 pocket computer (actually made by Sharp). Then I moved to the 6502 with the KIM-1, then 'upgraded' to a better processor with the Dragon 32, which had reasonably crappy graphics that were still much better than the KIM's (essentially just six 7-segment LED displays); the 6809 was so much nicer to program for. Then the BBC Micro hit general availability, and the software that shipped with it was light years ahead of whatever else was available for personal computers. BBC Basic was actually a fairly good language to learn structured programming in (can you imagine an era before functions?).
I too owe my career to the 8 bit machines that allowed me to own a proper computer.
Firstly, it's unapologetically super-British in a way you can only experience in person (games, for instance, live in the "diversions" folder, settings are called "choices", the desktop background is called the "backdrop", and the media player is subscribed to the BBC by default. Instead of asking for localization, the clock defaults to British time, the prices are in pounds, and you need to go out of your way to tell it you have a US keyboard. There are apps for British football news, unmounting disks is called "dismounting", etc... it's fun)
It's also unique in its own special way, not really fitting clearly into the UNIX or DOS lineage.
Hmm, non-British AmigaOS also used backdrop¹. Was that perhaps the common name back then?
Plus, you can experience /some/ of the quirkiness of RiscOS on Linux by playing around with rox². rox-filer is packaged for Debian and it spits a Choices directory into ~, even though it is now just a bunch of symlinks to $XDG_CONFIG_HOME³.
AmigaOS was Canadian. We all know that's like halfway to being British. :)
Rox is a great theatrical production but I want to encourage people to spend some time with RiscOS ... it becomes less and less normal the more time you spend with it. You can do shell scripting with BBC Basic for instance and the coding styles honestly look closer to some elaborate preprocessor language than Basic.
RiscOS is essentially what I think would happen if you took a bunch of programmers and threw them on an island in 1985, told them to write an operating system and that you'd be back in a few years.
Meanwhile, an occasional soiled computer book would wash up on the shore, like Tanenbaum's operating systems book with 80% of the pages missing, and the programmers pieced the incomplete passages together based on conjecture and consensus, like a bunch of biblical scholars with a torn ancient codex.
I read a comment on a forum where someone said programming in it reminded them of GEM (https://en.wikipedia.org/wiki/GEM_(desktop_environment)) but I haven't spent any significant time programming applications on either system so I can't really confirm.
As a Brit I agree, I muttered my way through the parochial tone. Now imagine the parallel universe where it had been the Dragon 32 whose CPU had taken over the world :)
That said, the degree to which the CoCo used software where other computers used hardware was very impressive and allowed for some pretty neat modifications without even touching a soldering iron. Oh, and the Dragon 32 actually had 64K of RAM which you could unlock entirely from software (it took a buddy of mine a couple of months to figure out how to keep the computer running when enabling that upper 32K after I figured out that it had the extra memory).
I'd doubt a lot of people who knew about the BBC Micro would know who made it. Acorn was obscure as the company wasn't front and center of the BBC initiative.
I want a proper "open architecture" arm computer form factor to blossom sooo much. Imagine building ARM computers like we build our X86/X64 boxes today....
I have a lot of doubts this will ever happen. If I'm not mistaken, the relative openness of modern PCs (being able to build a gaming PC, for example) is basically an accident. It would be nice to have some regulation backing it up, too.
It all started with IBM publishing detailed technical information for the very first PCs, i.e. the PC, XT, and AT. This included full schematics for the motherboard as well as the BIOS source code listing (it was copyrighted, but only in the "copy" sense.) Companies like Intel were also far less secretive about their processors' documentation in those days. This started the whole "PC clone" industry, and things like the ISA bus became a de-facto standard as a result. There was far more cooperation between companies to enable expansion and compatibility, but I guess these days they are more interested in the amount of additional profit made by forcing consumers to buy an entire new system instead of upgrading parts, hence trends like soldered RAM and storage, and some companies (notably, one with a fruit logo...) outright attempting to squash the aftermarket/third-party ecosystem.
ARM (and many of its competitors, including MIPS and RISC-V) being only an IP core means they get put into lots of different SoC hardware where the only thing they have in common is the instruction set.
(There is "embedded x86", but as the reason for choosing that is compatibility with existing hardware and software, those do tend to be quite PC compatible, if only missing some of the normal peripherals.)
RiscOS definitely had significant influence on early Windows releases, but the Taskbar a la Windows 95 was primarily a derivative of Windows 1.0 (minimized apps at the bottom of the screen), Cairo (a reserved screen area) and Taskman (a list of running apps).
The Win95 UI was a copy of NeXTSTEP, down to the teal desktop; every nerd understood that one at the time. Sure, some elements from prior Windows versions survived, like the DOS icon.
> Windows 1.0 (minimized apps at the bottom of the screen)
This could very well be the case. The Win95 Taskbar could be positioned at the other three sides of the screen, IIRC. At least it could be positioned at the top.
Mac was more of an influence on Win95 than NeXT for obvious reasons. We had a couple of NeXT machines but didn't use them much. We had a couple of Archimedes machines and at least one team member was addicted to Lemmings :)
Yes, the Taskbar could be docked on any screen edge. FWIW, I wrote some of that docking code. I preferred the left edge (a la NeXT), the designers preferred the top (a la Mac), but that broke too many apps so it ended up back at the bottom.
"But what if, instead, you decided to make those CPUs all hail from a barely-known company from a country usually not the first to come to mind as a global leader in high-tech innovations (well, not since, say, the 1800s)?"
LEDs, TV, the Bombe & Colossus, microprogramming, first jet liner, first compiled computer language, DNA, caesium clock, first commercial transistor computer, first electronic desktop calculator, the RSA cipher, lithium-ion battery, the WWW.
I grew up on a beach in California and I'm aware of the amazing scientific and engineering achievements of the UK. I would also like to point out that the 30th anniversary of the web is on the front page, and it was invented by a Brit, so the author seems to me like he's trolling.
I am glad the first comment on HN is pointing this out.
I still think it is unfortunate the UK couldn't bring its tech to the next level with IMG, ARM, Icera and arguably Dialog Semiconductor.
Or the interesting unikernel project from Cambridge, UK [1]. I mean, there is a lot of innovation happening in the UK even right now, in both hardware and software.
> I still think it is unfortunate the UK couldn't bring its tech to the next level with IMG, ARM, Icera and arguably Dialog Semiconductor.
In large part I think it's simply down to the size of the local market. If you're successful in the US you're already 5-10x the size of a successful British company. This is in large part why Britain advocated for the creation of the Common Market, which it is now inexplicably leaving.
The UK was essentially the France of the information technology industrial revolution. It made major technological contributions, but failed to capitalise on them successfully.
Among British computer enthusiasts, Acorn was as well-known as Dell might be here. They're only obscure to the Yanks, who think they run everything in the computer trade because they have IBM, Microsoft, and Apple.
We had a bunch of BBC Micros at school and they were really cool. A working LAN, a proper file server, logins - and ways to circumvent security ;-) Good old times!
> Acorn was well-suited to design a RISC CPU since the chip they were most familiar with, the 6502, is often said to be a sort of proto-RISC design.
I keep on hearing this claim, and I still don't buy it.
The 6502 was not "proto-RISC". It was a pretty traditional accumulator design, with an instruction set which lacked most of the typical features of a RISC design like minimal addressing modes (6502 had plenty of weird modes, 65C02 added even more) or general-purpose registers (6502 had only A, X, and Y, all of which were special-purpose to some degree or another).
Plus, the 6502 was hardly unique on the market. It was nearly a clone of the Motorola 6800, and wasn't all that far off from the Intel 8080 or its clone, the Z80.
Very annoyed by the comment about no innovation since the 1800s. Basically the computer was invented there, developed there, and is now sort of being re-invented there.
Maybe Intel and Motorola are just another Nokia, if one has to fight this undeserved fake war. An article about UK innovation does not deserve to include an attack on a country which invented so many things, the computer itself among them.
It's really sad to hear Acorn Computers described as "obscure" nowadays... I wonder how many people aren't even aware of the British heritage of ARM. Anyway, the Nvidia acquisition has brought everything to an end.
The greatness of British inventors is only exceeded by the small-mindedness, short-termism and self-sabotage of British investment and successive governments.
Let's not forget that Britain could have had mega fast fibre broadband in the 1990s if Margaret Thatcher had allowed British Telecom to proceed instead of 'leaving it to the market to decide'.
Plenty of similar stories in the US. In particular our Superconducting Super Collider which would have been around 3 times the energy of the LHC. We spent $2 Billion on it, dug 14 miles of tunnel, and then shut it all down and let it fill up with water. What a colossal waste.
To be fair it was a bit early technology-wise and they would have had to rip the entire thing out and start again three times since. We did pretty good out of copper in the end.
The expensive bit of putting in a fibre network is the holes and ducts.
Once you've got that, you can replace the fibre relatively cheaply. Guess what: BT's labs were well ahead there too. They also invented the tech for blown fibre installation before anyone else.
BT were the world leaders until the Tories threw that away, for who knows what reason.
IMHO twisted pair copper or coax is nothing more than a slow kludge compared to real fibre.
...and by market she really meant wholly foreign owned corporations in the form of NTL and Telewest. Personally I blame Norman Tebbit for screwing the pooch on that one.
Still, it's not like the current bunch of "classics" graduates have any more foresight, sadly.
I would like to learn a bit more of the details of how this developed from start to finish. Were there any other bidders apart from NTL and Telewest? This must be a matter of public record somewhere.
There are just so many projects that we've either canned or try to do on the cheap with disastrous results.
We're the only nation to develop launch capability not to use it, for example.
British soldiers had a barely functional, unreliable rifle for years (the SA80) because the phrase "Buy cheap, buy twice" still hasn't entered the minds of our managing classes.
We are of course also running straight off a cliff soon, thanks to Brexit.
> British soldiers had a barely functional, unreliable, rifle for years (the SA80)
This is true, although the problems were all with the original A1 variant which, from everything I've read and watched, was absolutely an under-designed PoS. The subsequent A2 upgrade programme addressed these issues but it took about a decade longer than it should have done before it was implemented.
More so, tbh. There are surely a lot more Brits who thought they were cool in 1977 who couldn't name a Clash song than Brits who went to school in 1997 who haven't used an Archimedes.
BBC Micro and its evolution to mobile dominance is one of my go-to examples for how a lot of private sector innovations were originally high risk investments subsidized by the public.
That's pretty debatable. The British government never explicitly supported development of Arm technology. Acorn had a commercial arrangement with the BBC, and schools bought BBC Micros (including Archimedes machines but also others such as the PCs), but it's a stretch to say that Arm was subsidized by the public.
Inmos, which was a government-supported CPU designer at about that time, is long gone.
If anything the lesson from Arm is that it succeeded because of acute awareness of the commercial needs of its customers - e.g. the addition of Thumb to the ISA - and the need to build an international client base.
Inmos and the transputer morphed into xmos[1] - same basic ideas (the David May[2] connection) but refined, and the programmable gpio is pretty nice - seriously there aren’t many chips where you write your SDRAM controller in software...
They’re doing pretty well in audio, iirc. I tend to use FPGAs for most of the things I’d use an xmos for, but I’m probably not their market. If you want a microcontroller with guaranteed low latency input/output, they’re pretty neat, and the links between chips can be very useful.
I don't know but I have a portable dac from iFi that uses a XMOS processor for some operations. They warn against flashing firmware meant for other XMOS-based dacs, so it must have a certain market share in the segment.
Back in NZ in the 90's, my first computers were, at home, an IBM AT dad had salvaged, then a Compaq 386 laptop which ran DOS, had one of those GUI app loaders, and Windows. I spent more time in DOS. In the classrooms we had Acorns, until the Pentium was released.
That's a 'personal computer' only in the sense that IBM put 'PC' in the name - it's a $20k workstation. (Not suggesting that there's any point editing the title further)
I know a lot less about these (the name and the fact they existed covers most of it) than the Wikipedia page, but it's such an early commercial RISC microprocessor that 'first desktop' seems like a plausible title. This made me wonder whether ROMP isn't a sort of 'lost' architecture. There are probably a lot of other failed RISC designs still kicking around the world (the 88k, i960, etc.)
I think outside the UK Acorn has been obscure, at least before the Archimedes. First time I learned about Acorn in Germany was when they started marketing the Archimedes against the Amiga, and I only learned a few years ago when diving into emulator programming that Acorn had built 6502-based computers before the Archimedes.
AFAIK the BBC Micro wasn't sold outside the UK, or if it was, it was vastly overshadowed by the Sinclair and Amstrad machines.
> I think outside the UK Acorn has been obscure, at least before the Archimedes
No way. The BBC Micro was seen all over Europe, if you could afford one you had one, and plenty of schools signed on to the econet and installed them by the dozens.
Commodore was seen as not appropriate for classroom use ('a gaming machine') whereas the BBC Micro was seen as an educational tool. Which it actually was (it also made for a pretty good games machine, as many students found out to their joy).
My first micro was a BBC, purchased all the way down in South Africa. Maybe it was not known in the USA, but computer magazines everywhere else were stuffed with BBC ads, accessories and software at the time.
Yes, but outside the UK the Beeb and Acorn were very obscure. Sinclair and Amstrad had some degree of recognition outside the UK, but Acorn was pretty much entirely UK specific.
Interestingly when I researched the Acorn Atom (very simple 6502-based home computer from the early 80's) for an emulator, I found a very active Dutch community which helped me out with some questions I had. But AFAIK the Netherlands seemed to be the only '8-bit Acorn enclave' outside the UK, in other European countries Acorn only became somewhat popular with the Archimedes 3000 which was marketed as an "Amiga killer".
I've seen them in Germany, Belgium, France, and those were the only countries I had regular access to at the time besides Poland where I never saw any western machines other than an Atari 520 and the occasional whitebox IBM PC clone.
Because there were multiple relative giants in the 80's and 90's that had their own processor designs. The competition was broad, and much of it from far more well funded companies.
For all of them to falter and ARM to succeed was a real long-shot.
Even well into the 90's, when several companies were looking at switching CPU architectures or supporting alternative architectures, ARM was just not on the radar for most developments, because people were looking to compete with x86 on performance.
People talked about the Alpha, MIPS, PPC, PA-RISC, not ARM. When ARM started appearing on the radar as more than a low end embedded option towards the end of the 90's, it was DEC's StrongARM family.
It was a very long way coming. Which makes it all the more impressive that they made it.
Isn't that mostly because all the RISC's either killed themselves or got killed, rather than ARM outcompeting them per se?
With the Alpha especially, it was a good architecture but (before my time, so I may be wrong) DEC seemed to have no idea how to actually sell it sustainably. Those companies probably would've been shocked at the idea of today's world, where your average consumer can buy a 20-billion-transistor chip just to play games on and everyone owns about 10 other smaller processors (as opposed to a computer being a thing you either bought for your family, or a massive workstation that cost as much as a car).
Sort of, but the point is that ARM avoided that, while half a dozen companies with financial muscle allowing them to try and fail far more ended up failing outright or relegating themselves to niches.
ARM could have just as easily ended up making similar mistakes and died. That they didn't is pretty remarkable.
> Isn't that mostly because all the RISC's either killed themselves or got killed
Every ecosystem where Windows was relevant ended up consolidating on x86, because that was the platform with the best Windows support (Microsoft didn't even make their own software available on most of the others).
POWER survived because it has its own ecosystem. Alpha was axed by Compaq, because Windows on Alpha sucked and they knew little else; PA-RISC was starved to death by HP through lack of investment, then buried by the move to Itanium (with the spectacular results we all know). SPARC wandered in the desert until Oracle acquired Sun and promptly bled it to death.
The main benefit of the ARM processor design was very high performance per watt, which carved out a significant niche that then grew into a massive market.
In the beginning, it had great power characteristics, but it was also much faster than any x86.
That it evolved in the direction of power efficiency is more an effect of it already being power efficient and popular with embedded manufacturers, which created a feedback loop that guided it to extreme power efficiency with reasonable performance, while x86 went the route of performance, power be damned.
They didn't try to compete on performance and didn't try to compete with Intel - both would have been losing strategies. Instead they competed on cost, power and on the ability to work better with other firms who didn't want to support their own architecture.
The comparison with MIPS I think is most interesting. MIPS could easily have taken the markets that Arm dominated but the strategy wasn't right.
Even so, going from designing computers using off-the-shelf components to creating your own processor design with an innovative architecture was a bold and risky bet. Not many computer manufacturers took this route.
I remember UK microcomputers like the BBC Micro, Sinclair Spectrum and Acorn micros being mentioned often in UK computer magazines and even in books back in the day.
"It's nice to just take a moment and reflect: because the British felt they were being left behind by the computer revolution, they decided to make TV shows about computers. To do that, they needed a computer, so an underdog British company came up with a good one. And when that little company needed to build a faster CPU, because Intel couldn't be bothered to answer their calls, they made their own. This in-house CPU just so happened to not use much power or make much heat, which got the attention of Apple, who used it to power what most people consider to be its biggest failure. From there, of course, the company went on to take over the fucking world.
If I made that up, you'd say I was trying too hard to be quirky or that I'd seen too many Wes Anderson movies. But that's reality."
Um, not really; with the Sinclair machines, the UK had more computers per capita than anywhere else at the time. They were very, very limited, and ultimately the bet on ubiquity over capability didn't pay off, but pretty much every kid I knew had one.
I was a docker's brat in a run-down Northern city; we didn't have much growing up, but my parents bought me the ZX81 kit (£49) and a second-hand black-and-white TV (£20) for Xmas when I was 11, and that year at school, computers were the thing. Like, everyone had one.
I’d suggest the number of British expats in Silicon Valley who started programming on a BBC Micro or Spectrum suggests that it very much did pay off, just not in the way anticipated.
That’s a good point. I originally meant that the PC took over the world, but if you take a step back and look at the influence of that ubiquity, yeah. That’s a good point :)
Yup, and me ;-) I probably encounter at least one or two more expats I wasn't aware of per week at the FAANG I work for, and always make a point of asking them about this - certainly seems that the ubiquitous small machines that someone could completely understand have made their mark!
Did you have a Multiface? Not all that cheap, but probably ended up teaching me more about low-level programming than anything else, short of writing a Spectrum 128 emulator for Risc OS :)
> ultimately the bet on ubiquity over capability didn’t pay off
Because they didn't know where to go from there. Instead of capitalizing on their existing userbase by making better machines that were also compatible with the currently popular model, they alienated their fans by making incompatible or barely-improved crap that nobody asked for: the ZX Spectrum 128, with just slightly more memory and a music chip but still using horrible graphics and tape storage, the QL, the Commodore 128...
The masters of both Sinclair and Commodore were tone-deaf suit-zombies with no clue about the zeitgeist, no idea what gamers, the majority of their users, really wanted. Most of those users were people who couldn't afford to run multiple computers (only having one TV or computer desk to spare) or to part with their existing software library for minimal benefits.
On the other hand you had consoles like the NES and SNES, which offered enough new features and came after a long enough time for people to not mind the incompatibility.
That's not actually how it went, either. One of the major aims of the BBC Micro was educational use, with use in TV an afterthought, and the processor was designed after they'd already made several MOS-based computers.
The ARM chip was low power essentially by accident. The designers wanted to keep it under (I think) 10 watts which would let them use cheap plastic packages for the chips. But they didn't have good power simulation tools so they needed to be very conservative with the design. When they manufactured the chips, they found the power consumption was closer to 1 watt.
Edit: as beagle3 pointed out, I got the numbers wrong; they aimed for 1 watt and came in under 0.1 watt. For more details, see Steve Furber's oral history, page 15. He says "Effectively, we used Victorian engineering margins on the power consumption of the chip." https://archive.computerhistory.org/resources/access/text/20...
I may find it later when I'm not on mobile, but IIRC they aimed for 1W and ended up achieving 0.1W or so - at first they were sure something was wrong with their test rig
because it seemed like an unpowered CPU was actually working - and then they realized that it was getting enough power through the data and address lines, even though the Vcc was not connected.
> The power test tools they were using were unreliable and approximate, but good enough to ensure this rule of thumb power requirement. When the first test chips came back from the lab on the 26 April 1985, Furber plugged one into a development board, and was happy to see it working perfectly first time.
> Deeply puzzling, though, was the reading on the multimeter connected in series with the power supply. The needle was at zero: the processor seemed to be consuming no power whatsoever.
> As Wilson tells it: “The development board plugged the chip into had a fault: there was no current being sent down the power supply lines at all. The processor was actually running on leakage from the logic circuits. So the low-power big thing that the ARM is most valued for today, the reason that it's on all your mobile phones, was a complete accident."
> Wilson had, it turned out, designed a powerful 32-bit processor that consumed no more than a tenth of a Watt.
I remember one quote from Hermann Hauser (one of the founders of Acorn)... "I gave them two important things to develop the Arm chip. I gave them no money, and I gave them no people".
That's reminiscent of the famous "janitor" letter written by IBM's president Watson in 1963. He asked why the CDC 6600 was a better computer than IBM's when Seymour Cray's team had "34 people including the janitor" while IBM had a "vast development effort".
Cray's response (probably apocryphal) was, "It seems like Mr. Watson has answered his own question."
I'd love to see a deep analysis of this, but I think a confluence of a few elements:
- fixed instruction width, small instruction set => simple decode
- instructions wide enough to load many constants into registers in immediate mode
- comparatively large register file (15 general + PC) compared to the 80386 (effectively 8 + PC, plus the segment registers), which meant a reduced need to fill/spill intermediate values; a tradeoff which gets better over time, as the power consumption of the bus doesn't benefit from Moore's law since it's off-chip
- any-to-any register instructions saves space and time shuffling values within the CPU (again, a hassle on the x86 series)
- conditional execution bits were a nice pipelining-assistance trick; it's not clear how widely they were used, as they weren't included in Thumb (see the sketch below for how a compiler can exploit them)
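To make the conditional-execution point concrete, here's a minimal sketch of my own (not from the article or the thread; the assembly in the comment is the usual hand-written textbook mapping rather than verified compiler output): a branchy-looking C function that a compiler targeting classic 32-bit ARM can turn into straight-line, predicated code.

    #include <stdio.h>

    /* On classic 32-bit ARM every instruction carries a 4-bit condition
     * field, so a compiler can emit this function with no branch at all,
     * roughly:
     *
     *     CMP   r0, r1        ; set flags from a - b
     *     MOVGT r2, r0        ; result = a, runs only if a > b
     *     MOVLE r2, r1        ; result = b, runs only if a <= b
     *
     * No taken branch means no pipeline flush, which is part of why the
     * condition bits suited the original ARM pipeline (and why their
     * absence in Thumb, noted above, was a real trade-off).
     */
    static int max_of(int a, int b)
    {
        return (a > b) ? a : b;
    }

    int main(void)
    {
        printf("max_of(3, 7) = %d\n", max_of(3, 7)); /* prints 7 */
        return 0;
    }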
I don't think RISC makes software more complex, rather the contrary. You need to use more instructions to do the same task, but they are simpler, more regular. Especially when writing compilers, RISC is actually a nice target.
It was about packaging. If you run cool, you can get away with much cheaper materials, thereby saving a significant portion of the total manufacturing cost. This was an explicit design goal.
https://www.youtube.com/watch?v=FO5PsAY5aaI