
I fully support the author’s point, but it’s hard to argue with the economics and hurdles around obtaining degrees. Most people view a degree as just a hurdle to getting a decent job; that’s simply the economics of it. And unfortunately, employers these days are encouraging this kind of copy/paste work. Look at how Meta and Google claim that the majority of new code written there is AI-generated.

The world will be consumed by AI.






You get what you measure, and you should expect people to game your metric.

Once upon a time only the brightest (and/or richest) went to college. So a college degree became a proxy for cleverness.

Now since college graduates get the good jobs, the way to give everyone a good job is to give everyone a degree.

And since most people are only interested in the job, not the learning that underpins the degree, well, you get a bunch of students that care only for the pass mark and the certificate at the end.

When people are only there to play the game, then you can't expect them to learn.

However, while 90% will miss the opportunity right there in front of them, 10% will grab it and suck the marrow. If you are in college I recommend you take advantage of the chance to interact with the knowledge on offer. College may be offered to all, but only a lucky few see the gold on offer, and really learn.

That's the thing about the game. It's not just about the final score. There's so much more on offer.


> However, while 90% will miss the opportunity right there in front of them, 10% will grab it and suck the marrow.

Learning is not just a function of aptitude and/or effort. Interest is a huge factor as well, and even for a single person, what they find interesting changes over time.

I don't think it's really possible to have a large cohort of people pass through a liberal arts education, with everyone learning the same stuff at the same time, and have a majority of them "suck the marrow" out of the opportunity.


I did a comp science degree, so I can't speak for the liberal arts. However I imagine the same experience could apply.

For us the curriculum was the start of the learning, not the end. We'd get a weekly assignment that could be done in an afternoon. Most of the class did the assignments, and that was enough.

There was a small group of us that lived (pretty much) in the lab. We'd take the assignment and run with it, for days, nights, spare periods, whatever. That 10 line assignment? We turned it into 1000 lines every week.

For example the class on sorting might specify a specific algorithm. We'd do all of them. Compete against each other to make the fastest one. Compare one dataset to another. Investigate data distributions. You know, suck the marrow.
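That kind of extracurricular comparison might look something like the sketch below (a minimal illustration; the algorithms, sizes, and datasets are my own choices, not anything from the original coursework):

```python
import random
import time

def bubble_sort(a):
    # O(n^2): repeatedly swap adjacent out-of-order pairs
    a = list(a)
    for i in range(len(a)):
        for j in range(len(a) - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return a

def merge_sort(a):
    # O(n log n): split, sort each half, merge the results
    if len(a) <= 1:
        return list(a)
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]

# Different data distributions expose different behavior:
# bubble sort degrades badly on reversed input, for example.
datasets = {
    "random":   [random.randrange(10_000) for _ in range(2_000)],
    "sorted":   list(range(2_000)),
    "reversed": list(range(2_000, 0, -1)),
}

for name, data in datasets.items():
    for sort in (bubble_sort, merge_sort, sorted):
        start = time.perf_counter()
        result = sort(data)
        elapsed = time.perf_counter() - start
        assert list(result) == sorted(data)
        print(f"{name:>8} {sort.__name__:>11} {elapsed * 1000:7.2f} ms")
```

Racing your own implementations against the builtin `sorted` on each distribution is exactly the "compare one dataset to another" exercise described above.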

(Our professors would also swing by the lab from time to time to see how things were going, drop the odd hint, or prod the bear in a direction, and so on. And this is all still undergrad.)

I can imagine a History major doing the same. Researching beyond the curriculum. Going down rabbit holes.

My point, though, is that you're right. You need to be interested. You need to have this compulsion. You can't tell a person "go, learn". All you can do is offer the environment, sit back, and see who grabs the opportunity.

I get that you can't imagine this playing out. To those interested only in the degree, it's unimaginable. And no, as long as burning desire is not among the entry requirements, it most certainly will not be the majority.

In truth, the lab resources would never have coped if the majority had done what we did.


> I did a comp science degree, so I can't speak for the liberal arts.

By 'liberal arts' I meant the common 4 year, non-vocational education. My major was CS too, but well over half of the time was spent on other subjects.

> I get that you can't imagine this playing out. To those interested only in the degree, it's unimaginable

I can easily imagine what you describe playing out. I just wouldn't call it 'sucking the marrow' (unless you were equally avid in all your classes, which time likely would not permit).

But as you allude to in your last point, the system isn't really designed for that. It's nice when it does effectively support the few who have developed the interest, and have extra time to devote to it, as it did for you.

I'd rather see systems that were designed for it though.


> you get a bunch of students that care only for the pass mark and the certificate at the end.

This is because that is what companies care about. It's not a proxy for cleverness or intelligence - it's a box to check.


That's entirely the point. If you see the degree only as a stepping stone to the company job, then that's all you see and that's all you get.

If you accept that the degree/job relationship is the start, not end, of the reason for being there, then you see other things too.

There are opportunities around the student which are for them, not for their degree, not for their job. There are things you can learn, and never be graded. There are toys to play with you'll never see again. There are whole departments of living experts happy to answer questions.

For example, (this is pre google) I wrote a program and so needed to understand international copyright. I could have gone to the library and read about it. Instead I went to the law faculty, knocked on the door, and found their professor who specialized in intellectual property.

Since the program I wrote was in the medical space, I went to the medical campus, to the medical research library, and found tomes that listed researchers who might benefit. I basically learned about marketing.

If all you care about is the company job, then all you'll see is the degree.


Right, and having a family is also just a box to check, and eating food is a box to check, and brushing my teeth is just a box to check, and on it goes for every single thing in life. If we all just checked boxes, we'd not be human anymore.

> Most people do view obtaining a degree as just a hurdle to getting a decent job

Then they fail to actually learn anything, apply for jobs, and try to cheat the interviewers using the same AI that helped them graduate. I fear that LLMs have already fostered the first batch of developers who cannot function without them. I don't even mind if you use an LLM for parts of your job, but you need to be able to function without it. Not all data is allowed to go into an AI prompt, some problems aren't solvable with LLMs, and you're not building your own skills if you rely on generated code/configuration for the simpler issues.


I think, rather than saying they can’t do their job without an LLM, we should just say some can’t do their jobs.

That is, the job of a professional programmer includes having produced code that they understand the behavior of. Otherwise you’ve failed to do your due diligence.

If people are using LLMs to generate code, and then actually doing the work of understanding how that code works… that’s fine! Who cares!

If people are just vibe coding and pushing the results to customers without understanding them, they are wildly unethical and irresponsible. (People have been doing this for decades; before they had AI to optimize the process, they managed by copy-pasting from Stack Overflow.)


> That is, the job of a professional programmer includes having produced code that they understand the behavior of.

I have met maybe two people who truly understood the behaviour of their code, and both employed formal methods. Everyone else, myself included, is at varying levels of confusion.


If you want to put the goalposts there, why program instead of building transistor networks?

> I fear that LLMs have already fostered the first batch of developers who cannot function without it.

Playing the contrarian here, but I'm from a batch of developers that can't function without a compiler, and I'm at 10% of what I can do without an IDE and static analysis.


That's really curious: I've never felt that much empowered by an IDE or static analysis.

Sure, there's a huge jump from a line editor like `ed` to a screen editor like `vi` or `emacs`, but from there on it was diminishing returns really (a good debugger was usually the next biggest benefit). I've also had the "pleasure" of having to use `echo`, `cat` and `sed` to edit complex code in a restricted, embedded environment, and while it made iterations slower, it was not that much slower than if I had a full IDE at my disposal.

In general, if I am in a good mood (and thus not annoyed at having to do so many things "manually"), I am probably only 20% slower than with my fully configured IDE at coding things up, which translates to less than 5% of slow down on actually delivering the thing I am working on.


I think there’s a factor of speed there, not a factor of insight or knowledge. If all you have is ‘ed’ and a printer, then I think most of the time you will spend is with the printout. ‘vi’ eliminates the printout and the tediousness of going back and forth.

Same with more advanced editors and IDEs. They help with tediousness, which can hinder insight, but they do not provide insight if you do not have the foundation.


Apples and oranges (or stochastic vs deterministic)

Look inside a compiler, you'll find some AI.

You won't find an LLM.

What would you consider AI in it?


I've seen this comparison a few times already, but IMHO it's totally wrong.

A compiler translates _what you have already implemented_ into another, computer-runnable language. An actual grammar defines the rules. It does not generate new business logic or assumptions. You have already done the work and made all the decisions that needed critical thought; it's just being translated _instruction by instruction_. (BTW, you should check out how compilers work, it's fun.)
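The determinism being described can be shown with a toy sketch (this is an illustrative stack-machine translator I made up, not any real compiler): the same source always produces the same instructions, with no judgment calls involved.

```python
import operator

# A toy "compiler" for reverse-Polish arithmetic. The grammar is fixed
# (numbers and three operators), so translation is fully deterministic.
OPS = {"+": "ADD", "-": "SUB", "*": "MUL"}

def compile_rpn(source):
    """Translate an RPN expression into stack-machine instructions."""
    code = []
    for token in source.split():
        if token in OPS:
            code.append((OPS[token],))
        else:
            code.append(("PUSH", int(token)))
    return code

def run(code):
    """Execute the instruction list on a simple stack machine."""
    apply = {"ADD": operator.add, "SUB": operator.sub, "MUL": operator.mul}
    stack = []
    for instr in code:
        if instr[0] == "PUSH":
            stack.append(instr[1])
        else:
            b, a = stack.pop(), stack.pop()
            stack.append(apply[instr[0]](a, b))
    return stack.pop()

code = compile_rpn("2 3 + 4 *")  # (2 + 3) * 4
print(code)   # same input -> same instruction list, every time
print(run(code))
```

Contrast this with an LLM, where the same prompt can yield different outputs on different runs.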

Using an LLM is more akin to copying from Stackoverflow than using a compiler/transpiler.

In the same way, I see org charts that put developers above AI managers, which are above AI developers. This is just smoke. You can't have LLMs generating thousands of lines of code independently. Unless you want a dumpster fire very quickly...


Yeah, OK. I was viewing AI as "a tool to help you code better", not as "you literally can't do anything without it generating everything for you". I could do some assembly if I really had to, but it would not be efficient at all. I wonder if there are actually "developers" who are only prompting an LLM and not understanding anything in the output? Must be generating dumpster fires, as you said.

LLMs have been popular for like 2 years... if you can't code without one, you couldn't code 2 years ago. Given 2 years you might be able to learn to.

Lots and lots of developers can't program at all. As in literally - can't write a simple function like "fizzbuzz" even if you let them use reference documentation. Many don't even know what a "function" even is.
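For reference, the canonical exercise being referred to, sketched in Python:

```python
def fizzbuzz(n):
    """Return the FizzBuzz sequence for 1..n as a list of strings:
    multiples of 3 become "Fizz", of 5 "Buzz", of both "FizzBuzz"."""
    out = []
    for i in range(1, n + 1):
        if i % 15 == 0:
            out.append("FizzBuzz")
        elif i % 3 == 0:
            out.append("Fizz")
        elif i % 5 == 0:
            out.append("Buzz")
        else:
            out.append(str(i))
    return out

print("\n".join(fizzbuzz(15)))
```

The point of the claim above is that a function at this level of difficulty is beyond some working developers, not that the problem itself is interesting.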

(Yes, these are people with developer jobs, often at "serious" companies.)


This is half the point of interviewing. I've been at places that just skip interviewing if the person comes highly recommended, has a great CV, or whatever.

Predictably, they end up with some people in the range from "can't code at all" to "newbie coder without talent".


I've never met someone like that and don't believe the claim.

Maybe you mean people who are bad at interviews? Or people whose job isn't actually programming? Or maybe "lots" means "at least one"? Or maybe they can strictly speaking do fizzbuzz, but are "in any case bad programmers"? If your claim is true, what do these people do all day (or, let's say, did before LLMs were a thing...)?


I've definitely worked with a person who struggled to write if statements (let alone anything more complex). This was just one guy, so I wouldn't say "lots and lots" like the other poster did, but they do exist.

Yeah I’ve been doing this for a while now and I’ve never met an employed developer who didn’t know what a function is or couldn’t write a basic program.

I’ve met some really terrible programmers, and some programmers who freeze during interviews.


By "lots" I estimate about 40 percent of the software developer workforce. (Not a scientific estimate.)

> Maybe you mean people who are bad at interviews?

No, the opposite. These developers learn the relevant buzzwords and can string them together convincingly, but fail to actually understand what they're regurgitating. (Very similar to an LLM, actually.)

E.g., these people will throw words like "Dunder method" around with great confidence, but then will completely melt down for fifteen minutes if a function argument has the same name as a module.
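The shadowing trap mentioned here is real in Python; a minimal reproduction (the function and argument names are mine, purely for illustration):

```python
import json

def save(json):
    # The parameter shadows the imported `json` module: inside this
    # function, `json` is whatever was passed in, so the stdlib module
    # is no longer reachable by that name.
    try:
        return json.dumps(json)  # a dict has no .dumps -> AttributeError
    except AttributeError as err:
        return f"shadowed: {err}"

print(save({"a": 1}))
```

Nothing mysterious is happening, which is the point: it takes only a basic grasp of name binding to diagnose, not fifteen minutes of meltdown.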

When on the job these people just copy-paste existing code from the "serious company" monorepo all day, every day. They call it "teamwork".


> Most people do view obtaining a degree as just a hurdle to getting a decent job, that’s just the economics of it.

Because those who recruit based on the degree aren't worth more than those who get a degree by using LLMs.

Maybe it will force a big change in the way students are graded. Maybe, after they have handed in their essay, the teacher should just have a discussion about it, to see how much they actually absorbed from the topic.

Or not, and LLMs will just make everything worse. That's more likely IMO.


> I fully support the author’s point

I don't. I think the world is falling into two camps with these tools and models.

> I now circle back to my main point: I have never seen any form of creative generative model output (be that image, text, audio, or video) which I would rather see than the original prompt. The resulting output has less substance than the prompt and lacks any human vision in its creation. The whole point of making creative work is to share one’s own experience

Strong disagree with Clayton's conclusion.

We just made this with AI, and I'm pretty sure you don't want to see the raw inputs unless you're a creator:

https://www.youtube.com/watch?v=H4NFXGMuwpY

I think the world will be segregated into two types of AI user:

- Those that use the AI as a complete end-to-end tool

- Those that leverage the AI as tool for their own creativity and workflows, that use it to enhance the work they already do

The latter is absolutely a great use case for AI.


> We just made this with AI, and I'm pretty sure you don't want to see the raw inputs unless you're a creator:

I am not a creator, but I am interested in generative AI capabilities and their limits, and I even suffered through the entire video, which tries to be funny but really isn't (and it would be easier to skim through as a script than as a full video).

So even in this case, I would be more interested in the prompt than in this video.


Yes, depending on the model being used, endless text of this flavor isn't all that compelling to read:

"Tall man, armor that is robotic and mechanical in appearance, NFL logo on chest, blue legs."

And so on, embedded in node-wiring diagrams, fiddly configs, specialized models for bespoke purposes, "camera" movements, etc.


TBH, this video is not that compelling either, though — obviously — I am aware that others might have a different opinion.

Seeing this non-compelling prompt would tell me right off the bat that I wouldn't be interested in the video either.


> The latter is absolutely a great use case for AI.

The video is not exactly great, IMO.





