Brian Kernighan on the Origins of Unix (lwn.net)
210 points by signa11 on Jan 28, 2022 | hide | past | favorite | 44 comments



I read Brian's "UNIX: A History and a Memoir"[0] last year, which wound up also being an excellent snapshot of the goings-on in Bell Labs around that era. Kicked off a long month of Wikipedia binging. If you want more detail from a guy who was there, it's a good read.

I recommend not buying the print copy though: Amazon prints it themselves on demand and it's far and away the worst quality book I've ever held. The cover is pixellated and skewed, and most of the images inside look like they started life as 50×50px thumbnails.

Edit: apparently the Kindle version is "just the PDF of the paper version"[1] per Brian himself and is thus also terribly formatted. Welp, damned if you do. Still worth a read.

[0] https://www.amazon.com/dp/B07ZQHX3R1/

[1] https://www.cs.princeton.edu/~bwk/memoir.html


I got the Kindle version and couldn't read it on the Kindle due to how it was formatted, but I agree it's a great book. I'm pretty sure it says it was created with troff, which is neat because a lot of the early-days material he covers is about printing and laying out text.


my print copy didn't have the issues you mention.


I ordered it from Amazon and had the same issues: the cover was terrible quality and looked like it had been printed on someone's inkjet from the '90s. It was still a great read, though.


Interesting! Maybe they changed the process at some point. I don’t own the book, but a while ago I had a quick look at my brother-in-law’s copy and it also had terrible printing quality.


> Maybe they changed the process at some point.

I think it is more likely that, depending on your geographical location, they use different printing shops (they pick the one which is closer to you). And printing shops have widely disparate qualities, impossible to predict beforehand.

In recent years I've returned several 100-dollar Springer "hardcovers" because they were of laughably bad quality. Now I only take second-hand copies or buy them in bookstores; it's the only way to be sure.


Could be that Amazon sold you a counterfeit copy — it happens a lot with popular titles because of their inventory commingling.


Wow. I knew they had issues with counterfeit stuff on Amazon, but I didn’t know it also applied to the books they’re selling. Fascinating and horrifying :)


I know there was a widespread issue of counterfeit copies of "The Art of Electronics" to the point where the book's official website has a large warning on the front page.

Luckily you can return stuff to Amazon, but it makes me wonder: does that mean someone else will end up having to deal with that returned book?


I bought it new, which means Amazon printed it on demand for me. So I'm not sure how counterfeiting would be possible.


Maybe the good copy was the counterfeit.


Does not matter. Get the book! It’s a concise and entertaining read! I loved it.


I just bought OSTEP from Amazon and it also has the problems you’re describing.


Video of the talk is available on YouTube https://www.youtube.com/watch?v=ECCr_KFl41E


> Sixth-edition Unix, released in 1975, was the first widely used version of the operating system. It included a number of other core Unix concepts, including the hierarchical filesystem, devices represented as files, a programmable shell, regular expressions, and more. All of this was implemented in about 9,000 lines of C code.

Probably old news but fascinating all the same - clearly it was a simpler time.
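Those V6 concepts from the quote are still directly visible in any modern shell. A quick illustrative sketch of two of them, devices-as-files and regular expressions (note: /dev/urandom is a modern Linux path used here for illustration, not something V6 had, and the user names are made-up sample input):

```shell
# Devices represented as files: reading random bytes is just reading a path
head -c 8 /dev/urandom | od -An -tx1

# ...and discarding output is just writing to a path
echo scratch > /dev/null

# Regular expressions, via grep (its name comes from ed's g/re/p command):
# match only the lines consisting exactly of "root" or "bin"
printf 'root\ndaemon\nbin\n' | grep -E '^(root|bin)$'
```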


This was before OOP and "overabstraction" really took off and distorted our perception of how much code is actually necessary to accomplish something. It's important to remember that Kernighan and the others who worked on it at the time weren't really "code-golfing"; this was just their normal "code density".


I mean... original Unix was multi-process but single-threaded. It time-shared by swapping the current process out to disk entirely and swapping in the next runnable task. Getting to ignore concurrency makes life simpler, but I don't think you'd enjoy using a system designed that way today. And that's just one part of what most of us consider a "minimal" OS today, yet it was absent from Unix originally.

I guess this is the coding equivalent of 1950s Nuclear Family nostalgia? Pining for something you'd hate if you had to suffer under it, idealizing the past, and ignoring how and why we got to modern times (the past sucked in a lot of ways).


You might want to look up Xv6. Very similar to the original V6 kernel, just as compact, but it runs on standard x86 PCs and also supports SMP. There are a few articles on HN about it.


The article quotes the model of the first (rather than zeroth) edition system as a PDP-11/20; the architecture page on Wikipedia says: "So originally, a fully expanded PDP-11 had 28K words, or 56 kbytes in modern terms."

That's 28K words of instructions and data for the 'OS' and all guest programs, plus 4K words (also 16-bit) of memory-mapped I/O and device registers.

It's clear the entire system would have been constructed in what today might be thought of as a 'Minimum Viable Product' format: powerful functions and operations, but with a very narrow tolerance for inputs. The intelligence of even 'dumb' terminals makes a ton of sense, and this gives me a greater perspective on the relative power required by early laser printers (rasterizing vector graphics on the fly) in the '80s.


It is also because the code didn't include much error handling or documentation. Everybody can remember their first experience with the editor 'ed'. Nor did it handle complex things like non-English characters or smart terminals. The system was minimal in the extreme because the hardware they were working on was shockingly underpowered by today's standards. A full lab worth of users would be sharing 56kB of RAM on their PDP11. When your environment is that constrained you don't have the luxury of writing bloated code.


More importantly, they had no major backward-compatibility requirements, only a small amount of hardware to write drivers for, and fairly simple algorithmic choices would work.

("When in doubt, use brute force" -- Ken Thompson).


And no need to worry about the Internet (as we know it) or cybersecurity.


Exactly. A large chunk of their security needs were taken care of by locking the server and terminal rooms.


I don't find SerenityOS, where the kernel and userland are all C++, really that "overabstracted".


Are you sure you got the right culprit here?

The Alto ran a full-fledged mouse-driven GUI, an IDE and programming language with a rich class library, e-mail, and text processing in 64KB of memory (128KB of RAM, half of that for the screen) and a 2.5MB disk.


And safe systems programming languages on top of that.


They were actually "code-squeezing", and that was normal, because computing power was really scarce in those days.


People keep doing it to this day on embedded systems.


Isn't it amazing to think that the Sixth Edition of Unix was 2.3% the size of a base WordPress install?


I just checked the line count of a simple tar implementation I'm working on, and so far it is around 4K lines and I'm not even finished yet.


For anyone interested in Unix history, I'm currently reading his book "UNIX: A History and a Memoir". A pleasant, well-written book. He explains many interesting details about the people, the naming, the challenges, and the rationale behind technical choices. I would recommend the book. The only downside is the very bad print by "Kindle Direct Publishing", unfortunately. The first page reads "Printed in America" and the last page reads "Printed in Poland", and the pictures are printed in grayscale.


This short video demonstrates why pipes were crucial for UNIX's success:

https://www.youtube.com/watch?v=3Ea3pkTCYx4
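The video's point fits in a few lines of shell. A sketch in the spirit of Doug McIlroy's famous word-frequency pipeline, where each small tool does one job and pipes compose them (the input text here is just made-up sample data):

```shell
# Top three most frequent words in a stream of text:
printf 'the quick fox and the lazy dog and the cat\n' |
  tr -cs 'A-Za-z' '\n' |  # split into one word per line
  tr 'A-Z' 'a-z' |        # normalize case
  sort |                  # group identical words together
  uniq -c |               # count each distinct word
  sort -rn |              # most frequent first
  head -3                 # keep the top three
```

None of the individual tools knows anything about word frequencies; the pipeline is the program. Here "the" (3 occurrences) comes out first, then "and" (2).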


I like this interview with Brian from the CoRecursive podcast. He explains what it was like for him working at Bell Labs: https://corecursive.com/brian-kernighan-unix-bell-labs1/


Brian is great at painting a picture of what it was like in "The Unix Room", and it's interesting to think about working in a world where you and your colleagues are all sharing a machine, adding things to /bin for others to use, etc.

    And in fact, the only real rule there was, you changed it last, it’s yours. 

    At one point, I had an idea for improving the text editor, ed at the time. And so I went in and added some things to ed, very small stuff, but in that sense. And I was just perfectly fine, but I guess technically at least for a brief period, I owned ed. That encouraged you to be somewhat careful. 

    But I think that having everything on the same computer contributed to a sense of community as well. And part of the community was the shared information, but the other part of it was simply knowing that other people were logged in at the same time.


I’ve been fascinated with the origins of FOSS. Wikipedia explains that in the early days the OS was free, because it was all about the hardware.

It’s fantastic for the heroes of the early days of software to have their time in the sun. It really is the age of software.


Oh, it's still a thing with hardware companies. I recently had to warn a dev away from using a library from ST because it didn't have a license, which created a legal vulnerability they might have to deal with down the line. That would have been fine in a hack project; in a commercial product (which is what they were working on) it was unacceptable. Hardware companies rarely understand or care about the software side of things, and it makes for all sorts of fun in places like the Linux kernel, where their code actually has to do what the end user expects.


> Could something like Unix happen again?

I wish Plan 9 had taken off.


I'm sorry they don't talk more about the New Jersey area where all this was taking place. Obviously, Jersey is an inspirational technological landscape.


I'm sitting here with a baffled look, unsure whether to laugh, so I'll ask: are you joking or serious? I've never been to New Jersey, but I've never heard anything to suggest it's inspiring, technologically or otherwise.


Not OP, but Jersey is where Marconi first set up trans-Atlantic radio towers, and also home to Tesla and Edison's experiments. Then of course there's Princeton and Bell Labs. Must be something in the air.


The story of how they reverse-engineered the brand-new typesetting printer, which was so buggy that they wrote their own software for it, is most fascinating. A modern equivalent would be rewriting your laser printer's firmware to make it perform better. Made me grin, for the hell of it.


"In 1991, Linus Torvalds announced his work on Linux, and "the rest is history"."


Today we have software that spans 1M LOC or more, because there are companies (software companies) who just sold software, so they could be paid millions of dollars. Can a software company be paid well if it just ships a solution with 1K LOC or less?


Great video of Brian Kernighan & friends at Bell Labs explaining the UNIX system:

https://www.youtube.com/watch?v=tc4ROCJYbm0



