
Doctors can be pattern-matching on what they’ve learned, often with heavy biases from hypochondriacs and not enough time per patient to really consider the options.

I’ve had multiple friends get seriously ill before a doctor took their symptoms seriously, and this is a country with decent healthcare by all accounts.

Human biases are bad too.




> Doctors can be pattern-matching on what they’ve learned, often with heavy biases from hypochondriacs

So true. And it's hard to question a doctor's advice because of their aura of authority, whereas it's easy to do further validation of an LLM's diagnosis.

I had to change doctors recently when moving towns. It was only when chancing on a good doctor that I realised how bad my old doctor was - a nice guy, but cruising to retirement. And my experience with cardiologists has been the same.

Happy to get medical advice from an LLM, though I'd certainly want prescriptions and action plans vetted by a human.


    > It was only when chancing on a good doctor that I realised how bad my old doctor was
How did you determine the new doctor is "good"?


By the time a doctor paid me enough attention to realise something was wrong, I had suffered a spinal cord injury whose damage can never be reversed. I’m not falling all over myself to trust ChatGPT, but I got practically zero from doctors either. Nobody moved until I threatened to start suing.



