MIT Missing Semester 2026 (csail.mit.edu)
90 points by vismit2000 1 day ago | 66 comments




There's definitely a tension at top STEM schools (probably especially in CS) between assuming students have some baseline knowledge of whatever field and just tossing them into the deep end of the pool and figuring out the practicalities on their own.

I did take one of the MIT intro CS MOOCs at one point for kicks. Very good. But it was more or less learn Python on your own if you don't already know it (or how to program more broadly). That doesn't really happen in many other disciplines, apart from some areas of the arts.


At one university I went to, the head of the CS department was quoted as saying "[We don't need to care about the job market,] Our job is to create researchers."

I thought that was pretty strange at the time because like 5% of the students end up going into research. So that was basically like him saying I'm totally cool with our educational program being misaligned for 95% of our customers...

Maybe it makes sense for the big picture though. If all the breakthroughs come from those 5%, it might benefit everyone to optimize for them. (I don't expect they would have called the program particularly optimized either though ;)


Well you can say there is a difference between "computer science" and "software engineering", plus many "universities" are particularly research focused.

A chemistry, physics, or even MechE BS grad is coming out only at the very beginning of their training, and will require lots of specific on-the-job training if they go into industry. School is about the principles of the field and how to think critically / experimentally. E.g. software debugging requires an understanding of hypothesis testing and isolation before the details of specific tech ever come into play. This is easy to take for granted because many people have that skill naturally; others need to be trained and still never quite get it.
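The hypothesis-testing-and-isolation point can be made concrete with a toy sketch: binary-searching for the first broken version, which is the same loop that `git bisect` automates. The `first_bad` function and `is_broken` predicate here are hypothetical illustrations, not anything from the course.

```python
# Toy sketch of debugging as hypothesis testing: binary-search for the
# first "bad" version, the isolation loop that `git bisect` automates.
# `is_broken` is a stand-in for actually running your test suite.

def first_bad(versions, is_broken):
    """Return the first version for which is_broken(v) is True.

    Assumes versions are ordered and the regression is monotonic:
    once a version is broken, every later one is too.
    """
    lo, hi = 0, len(versions) - 1
    while lo < hi:
        mid = (lo + hi) // 2
        if is_broken(versions[mid]):
            hi = mid          # hypothesis confirmed: bug is at or before mid
        else:
            lo = mid + 1      # hypothesis rejected: bug is after mid
    return versions[lo]

print(first_bad(list(range(1, 101)), lambda v: v >= 42))  # -> 42
```

Each iteration halves the search space, so 100 suspect commits need only ~7 test runs; the specific tech (git, Mercurial, a build farm) never changes that reasoning.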

Edit: of course if only 5% of grads are going on to research then maybe the department is confused. A lot of prestigious schools market themselves as research institutions and advertise the undergrad research opportunities etc. If you choose to go there then you know what you're getting into.


>A lot of prestigious schools market themselves as research institutions

Out of one side of their mouth maybe.

Out of the other, they absolutely are not telling potential undergrads that they may tolerate them but they're really focused on research.


> I don't expect [the 5% of students who end up going into research] would have called the program particularly optimized either

This. I went to the University of Iowa in the aughts. My experience was that because they didn't cover a lot of the same material in this MIT Missing Semester 2026 list, a lot of the classes went poorly. They had trouble moving students through the material on the syllabus because most students would trip over these kinds of computing basics, which are necessary to experiment with the DS+A theory via actual programming. And the department neither added a prereq that covers these basics nor incorporated them into other courses' syllabi. Instead, they kept trying what wasn't working: having a huge gap between the nominal material and what the average student actually got (but somehow kept going on to the next course). I don't think it did any service to anyone. They could have taken time to actually help most students understand the basics, they could have proceeded at a quicker pace through the theoretical material for the students who actually did understand the basics, they could have ensured their degree actually was a mark of quality in the job market, etc.

It's nice that someone at MIT is recognizing this and putting together this material. The name and about page suggest, though, that it's not something the department has long recognized and uncontroversially integrated into the program (perhaps as an intro class you can test out of), which is still weird.


>It's nice that someone at MIT is recognizing this and putting together this material. The name and about page suggest, though, that it's not something the department has long recognized and uncontroversially integrated into the program (perhaps as an intro class you can test out of), which is still weird.

While this comes out of CSAIL, I wouldn't ascribe too much institutional recognition to this. Given the existence of the Independent Activities Period, it's probably a reasonable place for it within MIT's setup. Other institutions have "math camp" and the like before classes start.

It's probably a reasonable compromise. Good schools have limited bandwidth or interest in remedial education/hand-holding and academics don't have a lot of interest in putting together materials that will be outdated next year.


> Good schools have limited bandwidth or interest in remedial education/hand-holding and academics don't have a lot of interest in putting together materials that will be outdated next year.

I think they rarely escape doing this hand-holding unless they're actually willing to flunk out students en masse. Maybe MIT is; the University of Iowa certainly wasn't. So they end up just in a state of denial in which they say they're teaching all this great theoretical material but they're doing a half-assed job of teaching either body of knowledge.

I also don't think this knowledge gets outdated that quickly. I'd say if they'd put together a topic list like this for 2006, more than half the specific tools would still be useful, and the concepts from the rest would still transfer over pretty well to what people use today. For example, yeah, we didn't have VS Code and LSP back then, but IDEs didn't look that different. We didn't (quite) have tmux but used screen for the same purpose. etc. Some things are arguably new (devcontainers have evolved well beyond setting up a chroot jail, AI tools are new) but it's mostly additive. If you stay away from the most bleeding-edge stuff (I'm not sure the "AI for the shell (Warp, Zummoner)" is wise to spend much time on) you never have to throw much out.


The whole container universe is pretty different even if the process/threads/etc. foundations haven't changed that much. Certainly <umm> a book I wrote about the state of computing in the early 2010s--largely derived from things I had written over a few prior years--was hopelessly out of date within just a few years.

There certainly are fits and starts in the industry. I'm not sure things 5 years or so ago looked THAT different from today. (Leaving aside LLMs.)

From my peripheral knowledge, MIT does try to hand-hold to some degree. It isn't the look-left-and-look-right, one-of-those-people-won't-be-here-next-year sort of place. But, certainly, people do get in over their heads at some places. I tutored/TAd in (business) grad school and some people just didn't have the basics. I couldn't teach remedial high school arithmetic from the ground up--especially for some people who weren't even willing to try seriously.


> Certainly <umm> a book I wrote about the state of computing in the early 2010s--largely derived from things I had written over a few prior years--was hopelessly out of date within just a few years.

I could see it being obsolete quickly to the extent that when someone was trying to learn devops and saw a book on the (virtual) shelf that didn't cover containers next to one that did, they'd pick the latter every time. You probably saw this in your sales tanking. But I'm not sure many of the words you actually did write became wrong or unimportant either. That's what I mean by additive. And in the context of a CS program, even if their students were trying out these algorithms with ridiculously out-of-date, turn-of-the-century tools like CVS, they'd still have something that works, as opposed to fumbling because they have no concept of how to manage their computing environment.


I didn't care about sales :-) It was free and I did a couple of book-signings at sponsored conferences that other people paid for. A lot of the historical content remained accurate but the going-forward trajectory shifted a lot.

The way DevOps evolved was sort of a mess anyway but welcome to tech.

I sort of agree more broadly but I can also see a lot of students rolling their eyes at using outdated tools which is probably less the case in other disciplines.


I could definitely see eye-rolls if students who know (of) git are being taught about CVS. But I'm not sure it matters that much. This stuff is tangential to the core course material, so a student (or small project group) can pick the tool of their choice. If they know something newer or better than suggested, great.

> It isn't the look-left-and-look-right, one-of-those-people-won't-be-here-next-year sort of place.

the same MIT that doesn't give out grades in the first year? (just Pass / NoPass)

the high achievers who scored solid grades to get there literally kill themselves when they pull Cs and Ds, even though it's a hard class and is sort of "look left, look right"


Not sure of your point. Pass/Fail was intended to ease freshmen in. (Most people didn't fail.)

Yes, poor grades were often a shock to people accustomed to being straight A students in high school. Though most made it through or ended up, in some cases, going elsewhere.


Probably one of those thoughts you should self-filter (and the alumni association sure wishes you would).

But it's also the case that (only half-joking) a lot of faculty at research universities regard most undergrads as an inconvenience at best.


In some schools they have a separate degree program in informatics or computer technology, for precisely this reason -- computer science is a different field.

They like to say things like that or some version of "we want to teach the concepts, the specific technology changes too fast". Does it? Just seems lazy to me.

Historically, the point of a university is not to be a jobs training program.

It kind of depends on how you define "history". Before STEM dominated the hiring landscape, universities were less career focused. No employers in these fields, as far as I know, have ever offered apprenticeships to teach new hires chemical engineering or applied mathematics from the ground up. University will not prepare you for a corporate job, exactly, but it gives you a background that lets you step into that, or go into research, etc. Lots of employers expect new hires to have research skills as well.

I think there are a number of ways in which financial incentives and University culture are misaligned with this reality.


Historically that's true, but I don't think it's true in 2025.

I'm not gonna recommend them to anyone then, because the number one problem most of my friends have is having crappy jobs

Youre not going to recommend college? Or jobs?

Personally, I do not recommend jobs. Avoid them as much as possible.

So true

It's tough for me to judge 'cause I've been programming for 30 years, so maybe I'm underestimating how hard it is, but I look at learning a new language very differently than trying to understand the graduate-level CS work I've seen at a top STEM school.

Git, shell, basics.. even simple Python if you have any programming experience at all - not nearly as hard as what they're teaching in the class.

Most of the time with something like that, like learning LaTeX or git basics, they'll say you'll pick up what you need. They're not gonna spend 12 weeks on those subjects; they aren't hard enough.


Discrete tools are fairly easy. On the other hand, I think a lot of people here would laugh at the "text book" for the introductory FORTRAN course I took at said school.

Of course, you were struggling with fairly primitive tools at the time as well. Made a typo? Time to beg the grad students running the facility for some more compute cycles.

Although it's out of print I don't immediately see a full copy online. https://www2.seas.gwu.edu/~kaufman1/FortranColoringBook/Colo...


I feel like most first intro classes in computer science are "learn the coding language on your own." At first I was like, why? Why don't they hold our hands while we do this? But since I have had some space to look back, it really is a pretty good representation of our industry. You are going to need to learn new languages. So getting thrown in the deep end is a pretty good precursor for what work is going to look like.

This isn’t a bad idea, just not for the intro course. When I was an undergrad, “programming languages” was this course. You were given a brief introduction to new languages and paradigms, then expected to figure it out from there. But at this point you had a foundation of experience to build on.

I don't totally disagree. On the other hand, based on the MOOC I took, had I been going in literally cold (as in college, new experiences, this is my chance to dive into CS and programming), I'd have been completely lost in a way that wouldn't have been the case in other engineering disciplines.

Now, I'm sure some would argue "tough." What are you doing at MIT then? And certainly, there are SO many opportunities these days to get some grounding in a way that may not be as readily possible with chemistry much less nuclear engineering for example. But it is something I think about now and then.


What makes you think this would not have been the case in other engineering disciplines?

I'm also a CS guy so I can't directly challenge this on the whole, but my experiences in some classes outside of this in other domains didn't feel like they were 'comfortably' paced at all. Without extensive out-of-class work I'd have been completely lost in no time. In fact one electrical engineering course I took was ironically considered a weed out course, for computer science, as it was required, and was probably the most brutal (and amazing) class I've ever taken in my life.


Personal experience?

I had basically a machine shop course in mechanical engineering in college. OK, it was a bit more than that but I had no "shop" in high school.

Certainly nothing in high school would have really prepared me for a civil engineering or chemical engineering degree.

I had actually done a little bit of fiddling around with electronics (and maybe should have majored in that). But certainly college would have been a whole different level. (With a whole lot more math which was never my strong suit.)

So, yeah, these days I think there's a different baseline assumption for CS/programming than many other majors.


Is the MOOC the same as the actual MIT course though? I went through one of the old Grimson Guttag Intro to CS courses on MIT OCW years ago; with zero programming background, I found it a very gentle on-ramp with all the basics explained.

I think it was this one, unfortunately archived now. I don't know the new one

https://ocw.mit.edu/courses/6-00-introduction-to-computer-sc...


No idea how similar it was to what's taught in the classroom. Of course you have access to TAs and other students IRL. And I have no doubt that assumptions about prior exposure and skills have changed over time.

I can only report that, had you dumped me into that content with those assignments, with no prior background I'd probably have been dropping that class.

The online version was more Grimson on the algorithms and Guttag (who wrote the Python book) on a bit of the programming. But the emphasis was more on the algorithms.


> There's definitely a tension at top STEM schools (probably especially in CS) between assuming students have some baseline knowledge of whatever field and just tossing them into the deep end of the pool and figuring out the practicalities on their own.

Pretty sure most college CS programs have an optional class for those new to programming (Introduction to Java, C, or Python). But after that, you are expected to learn new languages/tools mostly on your own.


Most? Probably.

Not sure how common at what are considered top schools without looking at course catalogs. I expect if you're really new to programming, jumping into a CS program at an elite school could be a bumpy ride given 90% of the class will have a fair bit of experience and the class will be pitched to that level.


> Not sure how common at what are considered top schools without looking at course catalogs.

I am fairly certain 100% of the top CS programs (and 99% certain that every CS program) in the country have an intro to programming class for incoming freshmen with no background in programming - usually Python, Java or C. MIT does. Besides, there are tons of material online to learn programming on your own.

> I expect if you're really new to programming, jumping into a CS program at an elite school could be a bumpy ride given 90% of the class will have a fair bit of experience and the class will be pitched to that level.

Agreed. But the challenge isn't insurmountable.


That’s generally how CS is taught at many top schools. If someone needs handholding just to figure out python then that’s bad news for the someone

It's interesting to see that MIT is still like this. Canonically, there were no classes that taught programming per se: if you needed that, there were (often volunteer-taught) courses over IAP, the January Independent Activities Period, that would attempt to fill the gap - but you were still expected to pick it up on your own. I taught the Caffeinated Crash Course in C way back when. Good times.

Way back in the day, you did have a few programming classes, especially outside of CS/EE, given that it was perfectly reasonable for students to have little or no prior exposure to computers and programming. See FORTRAN coloring book. And, as you say, although I haven't dropped by since pre-COVID, there was a smattering of stuff during IAP.

But my general sense based on some level of connections is you're expected to figure out a lot of, for lack of a better term, practicalities on your own. I don't think there's a lot of hand-holding in many cases--probably more so in some domains than others.


I feel like anyone with enough talent to get into MIT will have no problem picking up a programming language in a month or two on their own. Heck there are freshmen there who write programming languages for fun

I think that assumes a base level of programming knowledge in the generic which may be a reasonable assumption in this day and age if you're applying to MIT/Stanford/etc. It wasn't going back a decade or two but may be today. Perhaps if you've never written a program, you're just not a candidate for some undergraduate programs today whatever your other talents.

Yup. Back in my day there was 1.00, a Civil Engineering course, a pretty standard intro to programming in plain old C. I don't know if it still exists. There was nothing of that sort in EECS, though there are lots of IAP courses (which take place in January, before spring semester starts). IMO a month is about right to spend on (leisurely) picking up a programming language for fun. A friend and I learned APL that way.

In 2004 or so, 1.00 was an intro to Java course. I took it very cynically to pad out my units; I was a course 6 senior at the time. I got side-eyed by TAs a lot.

when I took 1.00 it was FORTRAN IV on IBM 370... with actual punchcards, batch.

During my time there (late 2000s) there was a Software Lab (6.170) that focused on programming fundamentals and culminated in a four-person, month-or-so long project. At least at the time, it was one of the more notorious courses in terms of time investment. It was common for people to live like monks during project time.

Unfortunately I heard that class was retired and there was no direct replacement, which is a shame. It was an excellent crash course in shipping.


Project courses were pretty notorious. I had a few. 2.70 (which I think is a different number now) in mechanical engineering was a HUGE time sink. [For others: was a design challenge competition with a live context.] Did another all-terrain vehicle competition in grad school which was probably an even bigger time sink.

when i took 6.170 it was CLU. it was 15 units because of lab, but yeah, time sink.

There was this one grad class taught by a professor who was also a capable programmer, and the class incidentally used this one programming language that many grad students wanted to learn.

So the word on the street was that his was a good class to take if you wanted a chance to learn the programming language. (Because you have only so much time in the day to allocate to labs.)

And rumor was also not to say to the professor that you want to learn that language, because word had gotten back to him about the off-label draw of his class to many, and he didn't like it.


you are allowed to say the language name now, tho right? or is he still prowling about?

Wasn't the SICP course a course in programming per se?

i've often marvelled that hardly anybody on the MIT CS faculty has ever considered RDBMS / SQL / Codd-Date / relational model worthy of pedagogical consideration. you might cover some crud when surveying semaphores and mutexes, but actually learn SQL? nope.

I was an undergrad at Rutgers in the late 1990s as they transitioned from teaching Pascal to Java for the introductory CS classes.

The lectures were primarily about algorithms, basic data structures, etc., and the extra "labs", taught by teaching assistants, were almost always for reviewing the lecture notes with a focus on answering questions.

At no point was there any discussion around "hey, here is a good way to design, write and test a program in a compiled language". My prior experience was with BASIC so just figuring out how to compile a program was a skill to pick up. I thankfully picked it up quickly but others struggled.

Another thing I saw often was people writing ENTIRE programs and then trying to compile them and getting "you have 500 compilation errors". I never wrote programs this way; I was more "write a couple lines, compile, see what happens" etc., but it always struck me that even just suggesting that option in class would have helped a lot of folks.

(This being HN, I'm sure some people will say that students figuring this stuff out on their own helps weed out non-serious people but I still don't 100% buy that argument)
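That "write a couple lines, compile, see what happens" habit can be sketched in a few lines; this is a toy illustration of checking each piece the moment it's written, with made-up function names:

```python
# Toy illustration of "write a little, test a little": each function is
# verified immediately after it's written, so errors surface one at a
# time instead of 500 at once after the whole program exists.

def mean(xs):
    return sum(xs) / len(xs)

assert mean([1, 2, 3]) == 2.0   # check right away, before writing more

def variance(xs):
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

assert variance([1, 2, 3]) == 2 / 3   # again: verify before moving on

print("all checks passed")
```

In a compiled language the same loop is just "compile and run after every small addition"; the point is the feedback cadence, not the language.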


Link to the About page that clearly describes the effort and rationale.

https://missing.csail.mit.edu/about/


Anyone know how/if this differs from the 2020 one?

Edit: Nvm, they comment on it. https://missing.csail.mit.edu/2026/development-environment/


good link but I want to call out how they expect you to use AI like Claude.

wonder how that's gonna work and if students will learn...


This looks like it is part of MIT's Independent Activities Period (IAP)?

Which seems like a brilliant idea (part of their 4-1-4 academic calendar.)

https://elo.mit.edu/iap/


I always thought this practical side of development was missing in a CS or engineering curriculum. This is awesome.

For similar reasons I think arts and humanities students should take marketing and business courses.


There are probably books out there that do a better job than most discrete courses I can think of--i.e. an accounting course is probably too in the weeds for most people--but I agree that some sort of business/marketing/finance 101 would be useful for a lot of people (engineers as well).

Awesome course and I encourage everyone to check out the previous iteration (and the corresponding discussions on HN)

If you're interested, see also https://bernsteinbear.com/isdt/ by me and Tom

There should be something like this available for any student at University, regardless of field. Perhaps less geared towards programming tasks but basic computing productivity.

very useful, took me a couple months of brute forcing to grasp the know-how because my school doesn't teach it. glad to see a course for it now getting out there

Basic FLOSS desktop knowledge must be taught in high school to everyone; you can't study in modern times without contemporary tools. LaTeX must also be in the game, because we need people who know how to express themselves by crafting good-quality documents.

Conspicuously missing is a direct mention of AI tools. Is MIT, like others, side-stepping the use of AI by students to (help them) complete homework assignments and projects?

A question. If you think AI use by students to "bypass homework" is anything remotely approaching a problem, then I must ask you how you felt/feel about:

- University being cost prohibitive to 90 percent of all humans, since universities are financially driven institutions, not performance driven.

- Before AI, 20 + years of google data indexing/searches fueling academia

- study groups before that allowing group completion (or, cheating, in your view)

- The textbook that costs 500 dollars, or the textbook software from Pearson that costs 500 and has the homework answers.

I think it's a silly posit that students using AI is...anything to even think about. I use it at my Fortune 500 job every day, and have learned more about my field's practical day-to-day from it than from any textbook, homework assignment, practical, etc.


>study groups before that allowing group completion (or, cheating, in your view)

Totally dependent on school/department/professor policy.

Some are very strict. Others allow working together on assignments. (And then there are specific group projects.)


If you click through the lectures they are mentioned in several of them.


