parasti's comments

That's very funny. My first guess would have been the womb. Probably the safest environment most of us have ever been in.

Why do you find it shocking and disturbing? If you go to the doctor, the average process of diagnosis and treatment is much like printf debugging - just sprinkle some in based on instinct and run it again. We're surrounded by technological advancement that makes us feel like we're far in the future, but there's still so much we don't know.

This is a critical issue for pretty much every EU citizen.

I am doing this on macOS with no problem.


Funny how this describes my workplace's current approach to web dev as ancient history.


Kids' school "teaches" it, meaning the kids have to memorize where to click to do X in various Microsoft products. So you're stuck with Microsoft if you want decent grades for your child.


Almost every place on Earth is populated by humans via "expansionism".


I've released an album via Distrokid which distributes the release to YouTube as well. You can look at detailed reports there. YouTube revenue is split into Ads, ContentID and Red (which I believe is the old name for YouTube Premium). I just checked and I am currently getting a bigger share from Ads than from Red, per play.


Is "per play" the correct metric to use? What I'd like to compare are the hypotheticals "everyone is on Premium" and "everyone runs all the ads", but I'm not sure how to extrapolate this from some random split (I assume you don't see the ratio of your viewers) of Premium-to-ad..


In my experience the biggest roadblock to continuous conversation is context length. It fills up and the LLM starts forgetting parts of the conversation. Most tools don't even tell you that this has happened. But if you keep in mind that there is a buffer there filling up, you can massively improve the quality of the output.
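A minimal sketch of keeping an eye on that buffer, assuming tiktoken and a hypothetical 128k-token window (the real tokenizer and limit depend on the model):

    import tiktoken

    CONTEXT_WINDOW = 128_000  # assumed limit; varies by model
    enc = tiktoken.get_encoding("cl100k_base")

    def context_usage(messages):
        """Approximate fraction of the window used so far (ignores per-message overhead)."""
        used = sum(len(enc.encode(m["content"])) for m in messages)
        return used / CONTEXT_WINDOW

    conversation = [
        {"role": "user", "content": "Refactor this function to avoid the nested loops."},
        {"role": "assistant", "content": "Here is a flatter version using a lookup table."},
    ]
    if context_usage(conversation) > 0.8:
        print("Context nearly full - summarize or start a fresh thread.")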


Many prompters don't practice hygienic reversal: rolling the outcome of a conversation back up the thread as input. If 10 comments are spent arriving at some clean code, go back up the thread and include that code as the status "back then".
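A sketch of what that roll-up could look like, assuming the usual chat-message list shape (roles and keys vary by tool; roll_up and its arguments are just illustrative names):

    def roll_up(messages, final_code, keep_last=1):
        """Replace a long back-and-forth with one message stating the outcome as the new baseline."""
        system = [m for m in messages if m["role"] == "system"]
        tail = [m for m in messages[-keep_last:] if m["role"] != "system"] if keep_last else []
        baseline = {
            "role": "user",
            "content": "Current state of the code after our earlier discussion:\n\n" + final_code,
        }
        return system + [baseline] + tail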


At least on macOS you don't even need to add entries to /etc/hosts; just use .localhost as the TLD.


How does that work with binding the same port to it? Does it automatically assign a new IP to every new ___domain that resolv() gets (and then cache it)?
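One quick way to check what a given .localhost name resolves to on a particular machine (the hostname and port below are just placeholders); on systems that special-case .localhost you'd expect only loopback addresses back:

    import socket

    for family, _, _, _, sockaddr in socket.getaddrinfo("myapp.localhost", 8080):
        print(family, sockaddr)
    # Expect loopback (127.0.0.1 / ::1) if the resolver special-cases .localhost;
    # if this raises a resolution error instead, an /etc/hosts entry is still needed.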

