I read Brian's "UNIX: A History and a Memoir"[0] last year, which wound up also being an excellent snapshot of the goings-on in Bell Labs around that era. Kicked off a long month of Wikipedia binging. If you want more detail from a guy who was there, it's a good read.
I recommend not buying the print copy though: Amazon prints it themselves on demand and it's far and away the worst quality book I've ever held. The cover is pixellated and skewed, and most of the images inside look like they started life as 50×50px thumbnails.
Edit: apparently the Kindle version is "just the PDF of the paper version"[1] per Brian himself and is thus also terribly formatted. Welp, damned if you do. Still worth a read.
I got the Kindle version and couldn't read it on the Kindle due to how it was formatted, but I agree it's a great book. I'm pretty sure it says it was created with troff, which is neat because a lot of the early-days stuff he covers is about printing and laying out text.
I ordered it from Amazon and had the same issues...the cover was terrible quality and looked like it had been printed on someone's inkjet from the 90s. It was still a great read, though.
Interesting! Maybe they changed the process at some point. I don’t own the book, but a while ago I had a quick look at my brother-in-law’s copy and it also had terrible printing quality.
I think it is more likely that, depending on your geographical location, they use different printing shops (they pick the one which is closer to you). And printing shops have widely disparate qualities, impossible to predict beforehand.
In recent years I've returned several 100-dollar Springer "hardcovers" because they were of laughably bad quality. Now I take only second-hand copies or buy them in bookstores; it's the only way to be sure.
Wow. I knew they had issues with counterfeit stuff on Amazon, but I didn’t know it also applied to the books they’re selling. Fascinating and horrifying :)
I know there was a widespread issue of counterfeit copies of "The Art of Electronics" to the point where the book's official website has a large warning on the front page.
Luckily you can return stuff to Amazon, but it makes me wonder: does that mean someone else will end up purchasing that returned book?
> Sixth-edition Unix, released in 1975, was the first widely used version of the operating system. It included a number of other core Unix concepts, including the hierarchical filesystem, devices represented as files, a programmable shell, regular expressions, and more. All of this was implemented in about 9,000 lines of C code.
Probably old news but fascinating all the same - clearly it was a simpler time.
This was before OOP and "overabstraction" really took off and distorted our perception of how much code is actually necessary to accomplish something. It's important to remember that Kernighan and the others who worked on it at the time weren't really "code-golfing"; this was just their normal "code density".
I mean... original UNIX was multi-process but single-thread. It time shared by swapping the current process out to disk entirely and swapping in the next runnable task. When you get to ignore concurrency that makes life simpler but I don't think you'd enjoy using a system designed that way today. And that's just one part of what most of us consider a "minimal" OS today yet was absent in UNIX originally.
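That swap-everything scheme is simple enough to sketch in a few lines. This is a toy illustration only (the names and structure are mine, not V6's): at most one process image is "in core" at a time, and every context switch means swapping the whole image out to disk and reading the next runnable one back in.

```python
from collections import deque

# Toy model of early-UNIX-style time sharing: at most ONE process image
# is resident at a time; switching processes means writing the entire
# image to the swap device and reading the next runnable one back in.

class Process:
    def __init__(self, name, steps):
        self.name = name
        self.steps = steps  # remaining "time slices" of work

def run(procs, quantum=1):
    ready = deque(procs)
    log = []
    while ready:
        p = ready.popleft()          # "swap in" p (whole image from disk)
        for _ in range(quantum):     # run until the quantum expires
            if p.steps == 0:
                break
            p.steps -= 1
        log.append(p.name)
        if p.steps > 0:              # "swap out" the entire image again
            ready.append(p)
    return log

print(run([Process("ed", 2), Process("cc", 1)]))
# → ['ed', 'cc', 'ed']
```

With only one resident process there is no in-core concurrency to reason about, which is exactly why the kernel could stay so small.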
I guess this is the coding equivalent of 1950s Nuclear Family nostalgia? Pining for something you'd hate if you had to suffer under it, idealizing the past, and ignoring how and why we got to modern times (the past sucked in a lot of ways).
You might want to look up Xv6. Very similar to the original V6 kernel, just as compact, but runs on standard x86 PCs, and also supports SMP. There's a few articles on HN about it.
The article quotes the model of the first (rather than 0th) edition system as a PDP-11/20, of which the architecture page on Wikipedia says: "So originally, a fully expanded PDP-11 had 28K words, or 56 kbytes in modern terms."
That's 28K words for instructions and data, shared by the 'OS' and all user programs, with the remaining 4K words (also 16-bit) of the address space taken up by memory-mapped I/O device registers.
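The arithmetic checks out against the 56 kB figure quoted above (back-of-the-envelope, using the 16-bit address space and 4K-word I/O page):

```python
# PDP-11: 16-bit addresses, byte-addressable, 16-bit (2-byte) words.
ADDRESS_BITS = 16
WORD_BYTES = 2

address_space_bytes = 2 ** ADDRESS_BITS                   # 65,536 bytes
address_space_words = address_space_bytes // WORD_BYTES   # 32K words total

io_page_words = 4 * 1024          # top 4K words reserved for
                                  # memory-mapped device registers
usable_words = address_space_words - io_page_words        # 28K words
usable_bytes = usable_words * WORD_BYTES                  # 57,344 bytes

print(f"{usable_words // 1024}K words usable = {usable_bytes // 1024} kB")
# → 28K words usable = 56 kB
```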
It's clear the entire system would have been constructed in what today might be thought of as a 'Minimum Viable Product' format. Powerful functions and operations, but focused on a very narrow tolerance for inputs. The intelligence of even 'dumb' terminals makes a ton of sense and this gives me a greater perspective of the relative power required for early laser printers (rasterizing vector graphics on the fly) in the 80s.
It is also because the code didn't include much error handling or documentation. Everybody can remember their first experience with the editor 'ed'. Nor did it handle complex things like non-English characters or smart terminals. The system was minimal in the extreme because the hardware they were working on was shockingly underpowered by today's standards. A full lab worth of users would be sharing 56kB of RAM on their PDP11. When your environment is that constrained you don't have the luxury of writing bloated code.
More importantly, they had no major backward-compatibility requirements, only a small range of hardware to write drivers for, and fairly simple algorithmic choices would work.
("When in doubt, use brute force" -- Ken Thompson).
The Alto ran a full-fledged mouse-driven GUI, IDE + programming language with rich class library, E-Mail, text-processing in 64KB memory (128KB RAM, half of that for the screen) and 2.5MB disk.
For anyone interested in Unix history, I'm currently reading his book "UNIX: A History and a Memoir". A pleasant, well-written book. He explains many interesting details about people, naming, challenges, and the rationale behind technical choices. I would recommend it. The only downside is the very bad print by "Kindle Direct Publishing", unfortunately. The first page reads "Printed in America" and the last page reads "Printed in Poland", and the pictures are printed in grayscale.
Brian is great at painting a picture of what it was like in "The Unix Room", and it's interesting to think about working in a world where you and your colleagues are all sharing a machine, adding things to /bin for others to use, etc.
And in fact, the only real rule there was, you changed it last, it’s yours.
At one point, I had an idea for improving the text editor, ed at the time. And so I went in and added some things to ed, very small stuff, but in that sense. And I was just perfectly fine, but I guess technically at least for a brief period, I owned ed. That encouraged you to be somewhat careful.
But I think that having everything on the same computer contributed to a sense of community as well. And part of the community was the shared information, but the other part of it was simply knowing that other people were logged in at the same time.
Oh, it's still a thing from hardware companies. I recently had to warn a dev away from using a lib by ST because it didn't have a license, and that created a legal vulnerability they might have had to deal with down the line. While that would have been fine in a hack project, in a commercial product (which is what they were working on) it was unacceptable. Hardware companies rarely understand or care about the software side of things, and it makes for all sorts of fun in places like the Linux kernel, where people actually have to make that hardware do what the end user expects.
I'm sitting here with a baffled look, unsure whether to laugh, so I'll ask: are you joking or serious? I've never been to New Jersey, but I've never heard anything to suggest it's inspiring, technologically or otherwise.
Not OP, but Jersey is where Marconi first set up trans-Atlantic radio towers, and also home to Tesla and Edison's experiments. Then of course there's Princeton and Bell Labs. Must be something in the air.
The story of how they reverse engineered the brand-new typesetting printer is fascinating: it was so buggy that they ended up writing their own software for it. A modern equivalent would be rewriting your laser printer's firmware to make it perform better. Made me grin.
Today we have software that spans 1M LOC or more, because there are companies (software companies) whose whole business is selling software, so they can be paid millions of dollars. Can a software company be paid well if it ships a solution of just 1K LOC or less?
[0] https://www.amazon.com/dp/B07ZQHX3R1/
[1] https://www.cs.princeton.edu/~bwk/memoir.html