
No one really goes to college and grad school to learn. Higher ed is commoditized as:

* an insurance policy, where students use it to avoid limiting themselves to low-status and low-paying careers

* a lottery ticket, the gateway to a "prestige" career track [1]

* a status-signaling mechanism, where young people flex their educational elitism [2]

And higher ed is arguably bad at what it's supposed to be good at - a delivery mechanism for relevant knowledge. If you have ever interviewed a fresh grad who didn't study "Cracking the Coding Interview" or did not have a few "side projects" under their belt, then you know what I am talking about.[3] The internet is the most effective information-delivery mechanism, and it's much cheaper; no wonder the value of college is in question.

[1] Consulting, finance, corporate middle management, grad school

[2] How many times have you been on a date and you tell or ask the other person where they went to college, what they majored in, etc

[3] Arguably, the standard that hiring managers set for new grads is too high. Perhaps it's the employer that should be training fresh grads, but that is not how the current job market works. I think about this a lot because it causes problems for both young people and employers.



> If you have ever interviewed a fresh grad who didn't study "Cracking the Coding Interview"

Goodhart's Law.

> or did not have a few "side projects" under their belt

IME: end-of-course projects at places like MIT and Carnegie Mellon are often much more impressive than the CRUD web apps or copy-pasta Jupyter notebooks I tend to see from e.g. coding bootcamps or especially self-taught applicants. And students usually have a half dozen or more of those, in addition to internship projects.

Sometimes a self-taught person comes along with a genuinely impressive project, and I push hard to hire those folks. But for the most part it's silly little single-person-project web apps and such.

> The internet is the most effective information distribution delivery and it's much cheaper, no wonder the value of college is in question.

The questioning only really has a loud voice in places like the USA with horrendously expensive higher ed systems. I don't really hear a lot of griping about the cost of education or opining on internet alternatives when I'm in Munich or Vienna.


> end-of-course projects at places like MIT and Carnegie Mellon

Very few students study at places like MIT and Carnegie Mellon. The typical CS program, even the typical "good but not exemplary" CS program, does not direct students to produce "projects" on nearly the same level. I even went to a college that has a well-regarded senior capstone project for its traditional engineering majors, but the CS equivalent felt far more low-effort and inconsequential. Among my classmates, the ones who had significant side projects and internship/co-op results were the ones with preferential outcomes; the ones who only had end-of-course projects to show were the ones with "typical" outcomes.


Certainly, but lots of places do.

The non-flagship state uni a lot of my friends went to had fantastic project-based coursework in CS. Acceptance rate is like 70% and tuition is very affordable.

Agreed that ENG departments tend to do better at this, especially at mid-tier state schools. CS departments need to take project-based work and especially capstones much more seriously.


> Sometimes a self-taught person comes along with a genuinely impressive project, and I push hard to hire those folks. But for the most part it's silly little single-person-project web apps and such.

In your view, what distinguishes the former from the latter? CS is not my primary skillset, but I do know "how to program". And, because I'm a grad student, I want to do something with a very high impressive:time ratio, if that makes sense.


> In your view, what distinguishes the former from the latter?

Hard to give a generic answer.

Basically, anything that makes me say "yeah there's no youtube tutorial for that, no public github repo you could copy ideas from, and you had to have solved a lot of difficult gritty problems in creative ways to get it to work".

For that reason, a lot of the most impressive projects I've seen demonstrate the potential of a new tech stack. Demonstrations of promise can be either really cool demos or pieces of infrastructure that lower the barrier to entry, iteration time, etc.

Back when 3D printing and laser sintering were new technologies, here are examples of cool projects around that emerging stack:

* digging into the firmware and/or hacking around the firmware to fix some limitation of the machine. This is probably not relevant anymore, but was when 3D printers were still new and had annoying bugs.

* A domain-specific CAD-like tool that did a bunch of "physically possible" checks for certain types of objects by using numerical analysis to do a bunch of ad hoc checks. Fantastic project because you can look at it and say "if you took two years and did this in a more systematic way it'd be a great product"

* auto-generated statues/art

* etc.
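The "physically possible" checks in the CAD-style project above can start as simply as an overhang test: a face that points too steeply downward can't be printed without support material. A minimal sketch (the 45° limit, the function name, and the normal-based test are illustrative assumptions, not details from the original project):

```python
import math

def needs_support(normal, max_overhang_deg=45.0):
    """Hypothetical printability check: flag a triangle face as
    unprintable-without-supports if it faces downward more steeply
    than the printer's overhang limit (assumed 45 degrees here)."""
    nx, ny, nz = normal
    length = math.sqrt(nx * nx + ny * ny + nz * nz)
    if length == 0:
        raise ValueError("degenerate face normal")
    # Cosine of the angle between the face normal and straight down (0, 0, -1).
    cos_down = -nz / length
    angle_from_down = math.degrees(math.acos(max(-1.0, min(1.0, cos_down))))
    # Faces pointing within (90 - limit) degrees of straight down overhang too much.
    return angle_from_down < 90.0 - max_overhang_deg

print(needs_support((0, 0, -1)))  # True: face points straight down
print(needs_support((1, 0, 0)))   # False: vertical wall is fine
```

A real tool would run this over every triangle in an STL mesh and accumulate the flagged regions; the point is that each individual check is small and ad hoc.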

NB: that would be a bit less impressive these days, because 3D printing is now mainstream and a lot of these projects can be done via copy-pasta development from github repos, or even by following a youtube tutorial.

What is your primary skillset, and what's an emerging capability in that space? How can you use software to either demonstrate the promise of that capability or else build useful infrastructure that could make that capability easier to access?

NB: this is mostly for undergrad and maybe non-thesis masters projects. If you're in a thesis masters program or phd program, focus on doing science. Pick a good advisor and listen.


Thanks for the ideas. I'm well into a physics PhD program, specializing in biophysics. I want to work in biotech/pharma afterward, so I want to "prove my skill" with R/Python/C++. I'm trying to find a doable project that overlaps and doesn't just look like one of the standard genomics/bioinformatics practice problems.


Cracking the Coding Interview isn't really needed if the candidate has had a real algorithms and data structures class.

What is great about MIT/CMU-style projects is that they demonstrate a good knowledge of the fundamentals of engineering and often required a significant amount of learning on a very scoped subject to complete. That is something I want to hire for, because I know that even if these folks know nothing about my specific tech stack and problem space, they will be able to figure it out quickly.

I've repeated it a couple of times: bootcamp projects are often worthless, and more often than not the goal of the bootcamp is simply to complete the project, which is identical to the one the other 40 students completed. They are often made to be flashy and to show (non-technical) hiring managers that applicants have the right applied skills to churn out code immediately, but I'm pretty sure a lot of bootcamp grads could not figure out how to translate their toy app to a different framework or ecosystem in any reasonable amount of time.


>The questioning only really has a loud voice in places like the USA with horrendously expensive higher ed systems. I don't really hear a lot of griping about the cost of education or opining on internet alternatives when I'm in Munich or Vienna.

There's also a growing anti-intellectual movement in the US; you even see it here on Hacker News. If you start with the premise that education is at best useless and at worst actively harmful, of course the cost will be questioned.


What sort of things do you see in these end-of-course projects? Genuinely curious


At the University of Washington (which consistently ranks at the level of MIT and Stanford for Computer Science), someone with a bachelor's in Computer Science would generally be expected to complete at least two of:

* Operating Systems: Either implement lock/fork, assorted system calls, and virtual memory for OS/161; or implement a device driver.

* Networks: Implement the Tor protocol or a project of similar complexity.

* Compilers: Implement a compiler for simplified Java, including at least constant propagation or a similar optimization.

* Animation: A year-long course series that culminates in animating a few-minute-long movie.

* A small video game.

* A Maps-style program for finding the shortest walk between any two locations on campus, including displaying that information. This one was required for Computer Science.

All but the last of these were done in groups of 2-3 people. It might have been theoretically possible to graduate with only one big project, if you took machine learning, security (the final project for this one was finding an exploit in Firefox, but no actual code was required), and some heavy theory classes. Most people would have done at least 3-4 big projects like this.
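The Maps-style shortest-walk project above reduces to Dijkstra's algorithm over a weighted graph of campus locations. A minimal sketch of that core (the location names and distances below are made up for illustration, not from the actual course):

```python
import heapq

def shortest_walk(graph, start, goal):
    """Dijkstra's algorithm: return (distance, path) for the shortest
    walk between two nodes in a weighted graph given as an adjacency
    dict of node -> [(neighbor, edge_weight), ...]."""
    # Priority queue of (distance-so-far, node, path-so-far).
    queue = [(0, start, [start])]
    seen = set()
    while queue:
        dist, node, path = heapq.heappop(queue)
        if node == goal:
            return dist, path
        if node in seen:
            continue
        seen.add(node)
        for neighbor, weight in graph.get(node, []):
            if neighbor not in seen:
                heapq.heappush(queue, (dist + weight, neighbor, path + [neighbor]))
    return float("inf"), []  # goal unreachable

# Hypothetical campus locations; edge weights are walking distances in meters.
campus = {
    "Library":     [("Quad", 120), ("Cafe", 300)],
    "Quad":        [("Library", 120), ("CS Building", 90)],
    "Cafe":        [("Library", 300), ("CS Building", 60)],
    "CS Building": [("Quad", 90), ("Cafe", 60)],
}

print(shortest_walk(campus, "Library", "CS Building"))
# (210, ['Library', 'Quad', 'CS Building'])
```

The course version adds the hard parts on top of this kernel: parsing real map data, displaying the route, and scaling the search to thousands of nodes.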


Some random classes that come to mind:

Electronics: https://www.google.com/search?q=6.115+final+projects

Interactive music: https://vimeo.com/user42907764

Product design: https://www.youtube.com/watch?v=RqKSAz3yxMM


> No one really goes to college and grad school to learn

I did. As did many of my classmates. I've found engineering fascinating since I was a child so I enjoyed the opportunity.

> How many times have you been on a date and you tell or ask the other person where they went to college, what they majored in, etc

How is this "[flexing] educational elitism"? That is standard small talk to gauge another person's life experience and interests.


> I did. As did many of my classmates. I've found engineering fascinating since I was a child so I enjoyed the opportunity.

I was told from a young age college was about learning and not grades. I believed that lie, aggressively optimized for learning over grades and that was a huge mistake.

Several times I skipped doing boring unnecessary homework or going to a useless class to work on projects that would teach me more. I would routinely do things like skip class to go above and beyond on interesting projects.

Despite getting above a 90 on all of my projects and exams, I graduated with a 2.4. I would routinely have calls with potential employers where they would say "wow, after talking to you we're really impressed with your knowledge of c.s. and the projects you've completed. Oh, btw, I didn't see your gpa on your resume, what was it? Oh, I'm sorry. We only take people who have a gpa above 2.5/3.0/3.5. But don't worry, you seem super personable and smart, I know you're going to find something".

Now I tell all of my younger siblings: optimize for grades; that's the only thing that matters. You can always pick up a book, but you'll never be able to fix your gpa. My 2.4 and my wife's 4.0 continue to influence our lives (albeit now much more mildly) 10 years later.


Apparently, I'm among the select few who learned things during my college years and actually apply it professionally. Go me.

The fact that most software developer jobs of the present day don't really make use of a college education says more about the profession of software development than about the value of getting a degree. Software development is becoming a commoditized, blue-collar job that anybody off the street can do, and that will have a long-term impact on the market value of ordinary developers.


> No one really goes to college and grad school to learn.

I went to college and grad school to learn.


> Perhaps it's the employer that should be training fresh grads, but that is not how the current job market works.

Funnily enough that is effectively what happens in many cases anyway. You still need to shell out the 40k for the undergrad degree to even get the opportunity to learn on the job.


> And higher ed is arguably bad at what it's supposed to be good at - a delivery mechanism for relevant knowledge. If you have ever interviewed a fresh grad who didn't study "Cracking the Coding Interview" or did not have a few "side projects" under their belt, then you know what I am talking about.[3] The internet is the most effective information-delivery mechanism, and it's much cheaper; no wonder the value of college is in question.

I don't think that's true. Colleges, and education in general, are teaching people. But it's more like drinking from a firehose.

And nobody rewards the effort to retain what you drank from said firehose, so 90% to maybe 99% of it basically disappears into the ether, unless you're in a profession that actually makes use of what you're taught.


Status signaling has a bigger impact on employers as a basic signal of competency.

I think the deluge of undergrad degrees is the cause of this? A degree isn't a meaningful differentiator, rather just an expected baseline all prospective employees must have.



