
Or alternatively you don't know how to use AI to help you code, and are in the 2020s equivalent of the "Why do I need Google when I have the Yellow Pages?" phase a lot of adults went through in the 2000s.

This is not a bad thing, since you can improve, but constantly dismissing something that a lot of people are finding to be an amazing productivity boost should give you some pause.




It's like blockchain right now. I'm sure there is some killer feature out there that justifies all the attention on its problem space.

But as of now the field is full of swamps: of grifters, of people with solutions looking for problems, of outright scams of questionable legality being challenged as we speak.

I'll wait until the swamps work themselves out before evaluating an LLM workflow.


Blockchain was always a solution looking for a problem.

LLMs are being used right now by a lot of people, myself included, to do tasks which we would have never bothered with before.

Again, if you don't know how to use them you can learn.


And the same was said about the last fad, when blockchain was all investors wanted to hear about (and "Big Data" before that, I suppose). It's all a pattern.

It's a legal nightmare in my ___domain as of now, so I'll wait until the Sam Bankman-Frieds are weeded out. If it's really worth all the hype, it won't be going anywhere in 5 years.


It's been 5 years since GPT-2. I'm really struggling to understand the amount of negativity towards the biggest breakthrough in computing since the WWW.


If you're unaware of the general mood towards big tech in the 2020s, the downward trend of the economy, the extreme speculation across the tech sector over AI (which, again, is not new), and the dozens of ethical quandaries about how LLMs obtain their training data, then yes, I can see why you're struggling to understand. There's so much literature on each point that I will only implore you to research these things on your own time if you care to.

In a purely technical vacuum, it is truly amazing tech; I will give it that. Still, it both excites and alarms me that the power demand projected to run this at scale has tech companies considering investments in nuclear power.


Yes, it's wonderful that AI will solve global warming as a side effect.


This is kind of why I'm skeptical of AI. When supposed tech experts are wearing rose-tinted lenses and missing the red flags, it's either because they want to wear them or because their livelihood depends on wearing them.

I won't blame people for the latter; I'd love a good, quick way out of traditional work as well (it would give me more time to hack on stuff without money troubles). But it's not a good model for curiosity and scrutiny. Again, I'll wait it out. Take care.


The rose-tinted glasses are everyone expecting batteries to become a major part of the grid so we don't have to shut it down when the sun isn't shining.

Investing trillions in carbon-free energy for AI is the most benign form of bubble I can imagine. If the bubble pops, we have enough base load for the next century and don't die from climate change. If it doesn't, we have the expertise to keep building large nuclear power plants.



