The Internet Isn’t Making Us Dumber – It’s Making Us More ‘Meta-Ignorant’ (nymag.com)
184 points by vezycash on April 22, 2017 | 144 comments



The problem with ignorance in the internet age is that you can now find sources that confirm your wrongheaded ideas, entire ecosystems even.

If you haven't learned how to think before becoming an adult, I fear that you won't. Sorry for sounding condescending, but I've run into so many people who are misinformed. And by "misinformed" I mean they don't know what the academic orthodoxy is in a given field, yet they think they know the facts, and they even have theories about why other people are wrong.

I've tried to explain to a friend why vaccinations are a good idea, and I get a bunch of crap about how modern medicine doesn't work for her, and how she took a homeopathic vaccine, went to India, and didn't get ill. Half a dozen of her friends jump in with articles about how vaccines work, and she ends it with "sure, but I'm skeptical".

I've been invited to a friend's house to look at his "Time Waver" machine, which supposedly connects to one of your auras and has a nice animation of how it scans every single one of someone's organs. Remotely. In fact, he showed me a woman in Italy whom he was helping out. First he asked me if I knew anything about quantum theory, which I don't really, beyond undergrad, and then he got excited and spouted something about how I'd appreciate a cleansing. Good thing I can stay polite. But someone in his 40s who thinks this is how the world works is not going to have the veil of ignorance lifted.

These are just a couple of recent examples. Common to them is that there's a mass of readily accessible material that supports them. If you have an opinion about just about anything, you can find support for it, in fact a web of support, which will really test your reasoning skills.


> The problem with ignorance in the internet age is that you can now find sources that confirm your wrongheaded ideas, entire ecosystems even.

When I was young, lonely, and isolated, the internet was a godsend: it connected me to like-minded individuals and expanded my horizons. It exposed me to a world of ideas about science, technology, history, politics, music, etc. at a time when everyone around me was mentally stagnant.

That same power to connect can lead people down a dark or dangerous path. Approach the internet with an uncritical mind and you could emerge as a flat-earther, anti-vaxxer, or worse.

> I've tried to explain to a friend why vaccinations are a good idea, and I get a bunch of crap about how modern medicine doesn't work for her, and how she took a homeopathic vaccine, went to India, and didn't get ill.

I once knew someone who bought an expensive electronic detoxification device. The idea was you held on to the handles and it sent purifying waves through your body, killing any parasites. She insisted on showing off her stool after the process was complete. She said she could see the little "bugs" that had been in her system. Then we discovered that she had forgotten to put in the batteries.

I knew someone else who wanted to invest in a $50,000 machine that made similarly dubious claims. The idea was to make the money back by selling treatments. Thankfully her daughter talked her out of it.

Gullibility is a dangerous thing, especially at the intersection of personal health and second-party financial incentives.


It's not just gullibility, it's being taken in by the siren song of some compelling hypothesis and then being slow-boiled by these increasingly ridiculous "facts" that are used to sustain that trance.

Normal, grounded people don't wake up one morning and go "I get it, the Jews and Soros are conspiring to suppress the truth about the Flat Earth", they get there by baby steps, each one veering more and more into the illogical.

Maybe it was when they were unemployed or in medical trouble and stressed out beyond their ability to cope; they wanted explanations, however implausible, that gave them at least an idea of what was wrong with their world. From there it took root, and by degrees they slipped into this other world of nonsense and batshittery.

Crystal therapy and detox can lead to healing rituals can lead to cults can lead to suddenly selling everything and donating it all to some fraudulent healer.

Like you observe, I think we're all in need of a best friend, family member, or spouse who can say "You've got a problem, this has to stop now."

The problem with the internet is it delivers. If you want to know more about quantum physics or molecular biology or cancer research, you can dig and dig and keep on digging; you will never run out of material. Likewise, if you're looking for harmful material, it will unquestioningly keep delivering, all the way down.

If Google started to detect the pattern of someone slipping into madness and did what they could to start to refute things, to sprinkle some helpful counterpoints into their diet of pure nonsense, maybe we could save people who'd otherwise be lost.


> If Google started to detect the pattern of someone slipping into madness and did what they could to start to refute things, to sprinkle some helpful counterpoints into their diet of pure nonsense, maybe we could save people who'd otherwise be lost.

To which the response is: who are Google to influence how or what people think?

It's easy to make up pure-nonsense examples that nobody believes - like Soros suppressing the truth about the flat Earth - and say believers should be saved.

What if someone questions the official version of 9/11? What if they believe/d the NSA taps internet, social networks and world leaders? What if they think an omnipotent sky fairy watches over us all and blesses America?

Google's "did you mean" is irritating enough when applied to assumed typos; its purpose should be to give people what they're looking for, not tell people what they should be looking for instead.


>To which the response is: who are Google to influence how or what people think?

But they are already doing just that. I watched a few YouTube videos related to quantum mechanics, and my recommended queue was flooded with videos full of pseudo-scientific nonsense.

It took considerable effort to get these weird videos out of my recommendations. I think we can just be honest about what caused it: a lack of intelligence in the sorting algorithm, which currently isn't able to distinguish between true and false statements.

I understand there are areas where this line is vague, but if I watch a few videos about astronomical events, why am I getting recommendations about aliens visiting earth?
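
For what it's worth, this behavior falls out of how these recommenders are built. Here's a toy sketch in Python (hypothetical watch data, plain item-to-item cosine similarity): nothing in the objective measures whether a video is true, only whether the same people watched it.

    from math import sqrt

    # Which users watched which videos (1 = watched). Hypothetical data.
    watches = {
        "black_holes_explained": {"alice": 1, "bob": 1, "carol": 1},
        "quantum_mechanics_101": {"alice": 1, "bob": 1, "dave": 1},
        "aliens_built_pyramids": {"bob": 1, "carol": 1, "dave": 1},
        "cute_cat_compilation":  {"eve": 1},
    }

    def cosine(a, b):
        # Co-watch similarity between two videos' audience vectors.
        dot = sum(a.get(u, 0) * b[u] for u in b)
        norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
        return dot / norm

    seed = "quantum_mechanics_101"
    scores = {v: cosine(watches[seed], w) for v, w in watches.items() if v != seed}
    for video, score in sorted(scores.items(), key=lambda kv: -kv[1]):
        print(f"{score:.2f}  {video}")
    # The pseudoscience video ties the legitimate one (0.67 each): the same
    # curious viewers co-watched both, and similarity is all the ranker sees.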


You watch one "free energy" video for chuckles and all of a sudden YouTube keeps surfacing more and more of these things, there's an infinite supply of them apparently. It just won't stop.


> To which the response is: who are Google to influence how or what people think?

Give people a long leash, but when they're way off at the end of the bell curve of normality, throw them a lifeline.

> It's easy to make up pure-nonsense examples that nobody believes...

That's actually an example of things people do believe. If you engage with some of those wild outliers you can learn a lot, quickly. Some of them are trolling, but many of them have extremely firm convictions. Maybe it's Sandy Hook. Maybe it's chemtrails. It's usually a bunch of stuff, all tangled together.


> Give people a long leash, but when they're way off at the end of the bell curve of normality, throw them a lifeline.

Hello User. We've noticed you've been spending more time on StackOverflow than 99% of our other users. May we suggest you spend some time looking at cat videos to balance things out? -- Your friends at Google


I appreciate your intent, but to me the concept of Google operating as an arbiter of truth is unsettling.


They're already doing this, though currently for good. [https://theintercept.com/2016/09/07/google-program-to-deradi...]


They do it when they detect someone seeking to physically harm themselves (e.g. suicide). This could be the same for severe mental harm.


The most terrifying part of the flat-earth movement is that most people don't have the basic knowledge necessary to argue against its adherents.


It won't matter, because flat-earthers don't have the basic knowledge to understand arguments.

It's like I usually say

> You can't fix crazy and you can't fix stupid.


> That same power to connect can lead people down a dark or dangerous path. Approach the internet with an uncritical mind and you could emerge as a flat-earther, anti-vaxxer, or worse.

Maybe. Or perhaps it's the tendency of irrationality to beg more irrationality today, when a large number of vectors exist through which the irrationality can spread. If someone said something irrational and accusatory 30 years ago, a few people might hear it and think, "huh, that's a weird thought". Most would probably not repeat the thought to anyone later, because most people don't want to be viewed as irrational or weird by others. Maybe a few might, because they were already irrational, but the chances were low that enough would repeat it for it to spread.

30 years later, two things have happened to increase the likelihood of a "viral irrationality" taking root and growing at increasing rates: 1. lots of people on the Internet, and 2. lots of irrational things being said on the Internet. I would classify most of the irrationality on the Internet as people speaking for others' thoughts or feelings. Check out Alex Jones just pulling shit out of his ass, for example, under the guise of "news" but really just being a performance artist. He's "entertaining" people by pretending to be trustworthy, and keeping the schtick going by making anyone who listens to him question the "real" news (of which he is not a member).

I look at it more like a weird NP problem, where increasing amounts of time are needed to unravel the logic being presented as it spreads further into the aggregate, given its self-referential nature: leading questions get answered with more irrationality. The blame game people play might be likened to two traveling salesmen trying to solve each other's shortest-path problems while also traveling paths that avoid or overlap each other. Given certain paths, the salesmen may find themselves in a loop, blaming each other for being stuck, but unable to stop doing the calculations to get themselves out of the pickle.

Check out the concept of a double bind. It's relevant to this discussion.


> She insisted on showing off her stool after the process was complete. She said she could see the little "bugs" that had been in her system. Then we discovered that she had forgotten to put in the batteries.

There's so much wrong with this...

Reminds me of the audiophile who was very happy with the top-of-the-line amplifier he custom-built. It was clearly much better than the previous one.

Turns out he had forgotten to turn it on and was using the old one.


Wow, homeopathy. It's a carryover from alchemy. You identify a poison, which causes some effect. You titrate the dose-response. Then you argue that some minuscule amount, orders of magnitude below an effective dose, relieves that poison's effect. That's from alchemy. And then people who believe you feel better, given how strong the placebo effect can be.
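
For a sense of scale, here's the back-of-the-envelope arithmetic for a typical 30C dilution (the starting amount is a hypothetical, generous assumption):

    # A "30C" remedy is diluted 1:100, thirty times over.
    AVOGADRO = 6.022e23            # molecules per mole
    start_moles = 1.0              # assume a generous 1 mol of active substance
    dilution = (1 / 100) ** 30     # total dilution factor: 1e-60

    molecules_left = start_moles * AVOGADRO * dilution
    print(molecules_left)          # ~6e-37, i.e. effectively zero molecules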


In his book, "Amusing Ourselves to Death," Neil Postman suggests that the age of the printing press coincided with the age of reason.

The Internet is perhaps the invention that most closely parallels the invention of the printing press. But, unfortunately, the Internet appears to be working paradoxically against the development of knowledge.

Is the age of the internet the harbinger of an age of disinformation, alternative facts, and ignorance?

Is it possible that the Internet could cause our cognitive devolution as a species?

https://jimdroberts.wordpress.com/2017/04/23/the-information...


I personally know a man in his late thirties who nearly died of AIDS a few years ago because he had read "credible sources" on the Internet and convinced himself that HIV is harmless.

This began back in the early 2000s. For over a decade, he ignored his HIV status (not knowing either way) thanks to the "advice" of contrarian experts on the Internet. One day he finally got a flu that didn't seem to go away. A doctor took one look at the lesions on his face and told him the bad news.

He survived, and now he wants to warn everyone about the dangers of hopeful biases when combined with the immense amount of misinformation on the Internet. He's even appeared on national media here in Finland to tell his story, and I admire his courage.


That truly is admirable.

These examples of ignorance remind me of my own similar anecdote:

I attended a dinner, out of politeness, with my grandparents and their friends. A man staying with them saw me puffing on my personal vaporizer, cornered me, and proceeded to tell me that he was diagnosed with lung cancer after trying to switch to a PV himself.

"I smoked cigarettes for 30 years, never had a problem. Those electric things cause cancer!"

Now, I know there is no long-term research into electronic cigarettes or their health effects. However, he was clear that he had quit cigarettes only about 2 months before being diagnosed.

To suggest, with a straight face, that 30 years of cigarette smoke couldn't possibly be responsible for his lung cancer... I'm not sure how he came to such a conclusion other than through misinformation.


Lung cancer is a strange one. As smoking levels drop, in some populations one can say that "most" lung cancer has nothing to do with smoking. So the doc has 10 patients with lung cancer, but only one of them is a smoker. With such numbers it gets hard to say that smoking was the cause. It was easy when the same doc had 30 patients and 21 were smokers, but times change. Statistics are hard and need to be revisited regularly.


That kind of reasoning doesn't hold up. You would need to see whether smokers have a higher prevalence of lung cancer than the general population to prove anything, not "the other way around".

Edit: hmm, unsure of that grammar. (Non-native)
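
To make that concrete, a toy Bayes calculation (all numbers hypothetical): even if only 1 in 10 of the doctor's lung-cancer patients smokes, smokers can still be at elevated risk, provided smoking is rare in that population.

    # What relative risk makes smokers only 10% of lung-cancer cases
    # when just 5% of the population smokes? (Hypothetical numbers.)
    p_smoke = 0.05

    for rr in (1.0, 2.1, 15.0):  # candidate relative risks
        # P(smoker | cancer) = p*rr / (p*rr + (1 - p)) by Bayes' rule
        frac = p_smoke * rr / (p_smoke * rr + (1 - p_smoke))
        print(f"relative risk {rr:>4}: {frac:.0%} of cases are smokers")
    # rr = 2.1 already reproduces "1 smoker out of 10 patients", so that
    # headcount alone can't tell you smoking isn't a cause.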


AIDS denialism is a curious thing. There was this magazine Continuum which claimed AIDS was a conspiracy unrelated to HIV. The tragic irony is that the magazine stopped publishing because all the major contributors died of AIDS.


https://en.wikipedia.org/wiki/Continuum_(magazine)

It ran from December 1992 until February 2001, ceasing publication because all the contributors had died of AIDS-defining clinical conditions.



I'm deeply disturbed by the parallels between AIDS denialism and climate change denialism.

Take this article:

http://edition.cnn.com/2017/04/20/us/louisiana-climate-chang...

These fishermen seem just like my acquaintance who didn't want to know that his immune system was collapsing, until finally the signs were evident as bright red cancerous lesions on his face.


Please don't link that shit here. It increases its PageRank. Or at least put a space in the URL or something.

It's completely infuriating. I mean, if I didn't know anything about HIV or medical research, that would sound completely plausible. I went in expecting to read batshit-insane conspiracy theories, and I'm surprised how sane it sounds. That's scary.


It doesn't increase its PageRank, at least on Google; the link has the rel="nofollow" attribute.


> they don't know what the academic orthodoxy is in a given field

Academic orthodoxy is sometimes wrong (re: history); but even when it isn't, we all ignore academic orthodoxy in some field of life: diet, travel, wars, entertainment choices, etc.

So who decides which academic orthodoxy we need to force people to conform to? If you leave room for free thought, you have to expect bad thoughts.


Are you sure that traditional sources of knowledge, and traditional ways of making people believe things, result in greater accuracy? Consider that over half of the papers in many fields, including medicine, are statistically bogus: http://journals.plos.org/plosmedicine/article?id=10.1371/jou... http://biorxiv.org/content/early/2016/08/25/071530

On average the results in science papers are probably more accurate than random stuff you read on the internet. But I suspect the difference in error rate is uncomfortably small, much smaller than those who are smug about science would like to believe.
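
The arithmetic behind such estimates is simple enough to sketch (illustrative numbers, in the spirit of the Ioannidis paper linked above):

    # Chance that a published positive finding is actually true (PPV).
    # alpha: false-positive rate; power: chance of detecting a real effect;
    # prior_odds: ratio of true to null hypotheses tested in the field.
    def ppv(alpha, power, prior_odds):
        true_pos = prior_odds * power
        false_pos = alpha
        return true_pos / (true_pos + false_pos)

    print(ppv(alpha=0.05, power=0.4, prior_odds=0.1))  # ~0.44: most findings false
    print(ppv(alpha=0.05, power=0.8, prior_odds=1.0))  # ~0.94: much better odds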


While it may be true that 50% of papers are wrong in some fields, the papers that are wrong are probably not the most highly publicized ones or the ones in the most prestigious journals (and when those papers are wrong the incentive to publish corrective papers is high), so this probably isn't quite as big a problem as it's made out to be.


> the papers that are wrong are probably not the most highly publicized ones or the ones in the most prestigious journals

If you google around about the replication crisis you will discover that the opposite is true.


This is a fairly good example of the sorts of biases one even finds among scientists: http://www.npr.org/sections/health-shots/2015/01/12/37566392...


The kinds of things that are in current journals are on the cutting edge. We should expect a fair few lines of inquiry to turn out to be negative or wrongly done.

The sort of knowledge I'm talking about is the kind of stuff that's been known for decades, acknowledged by everyone who's studied it.


> If you have an opinion about just about anything, you can find support for it, in fact a web of support, which will really test your reasoning skills.

Maybe this is a technological problem with a technological solution?

This strikes me as a sort of blindness to the epistemic web of supporting knowledge. What we claim to know is ultimately supported and reinforced by other propositions that are consistent with it. There is a view of reality defined by the proposition that vaccines do more harm than good, and there is a small web of propositions that stand in support of this; but it's a very small web, and it stands in opposition to far more propositions than are in its web.

What if there were some kind of page-rank for ideas we see on the internet? What if it were made obvious, in your web browser or something, that you were viewing information that had a very low epistemic probability based on the amount of supporting information on the web? Maybe we're 10 or 20 years out from whatever technological solution could accomplish this (AI, clever algorithms, ...?), but I really think that sometimes people just have a hard time seeing or understanding the sheer scale of evidence that is opposed to their flawed beliefs.
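
As a toy illustration of the idea (made-up propositions, plain power-iteration PageRank; a real system would need vastly more than this):

    # Toy "epistemic PageRank": propositions link to the claims they support,
    # and score flows toward claims with the larger web of support.
    supports = {
        "controlled trials show vaccines reduce disease": ["vaccines do more good than harm"],
        "smallpox was eradicated after mass vaccination": ["vaccines do more good than harm"],
        "one blog post blames a vaccine for an illness":  ["vaccines do more harm than good"],
        "vaccines do more good than harm": [],
        "vaccines do more harm than good": [],
    }

    nodes = list(supports)
    rank = {n: 1 / len(nodes) for n in nodes}
    DAMPING = 0.85

    for _ in range(50):  # power iteration
        new = {n: (1 - DAMPING) / len(nodes) for n in nodes}
        for n, outs in supports.items():
            for m in outs:
                new[m] += DAMPING * rank[n] / len(outs)
        rank = new

    for n, r in sorted(rank.items(), key=lambda kv: -kv[1]):
        print(f"{r:.3f}  {n}")
    # The better-supported claim ends up with the higher score.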


> I really think that sometimes people just have a hard time seeing or understanding the sheer scale of evidence that is opposed to their flawed beliefs.

No, they would discount the credibility of that evidence before entertaining the possibility that they might be wrong.

If you created a page rank for ideas, to prove to them how unlikely their beliefs were to be true, such people would dismiss the attempt as propaganda. Just look at how often various points of view on HN (which one would expect to self-select for rational, scientifically minded thinkers) are derided as fabrications, "fake news", the work of shills, disinformation, or something similar.


Google is trying that with their "single answer" box (I forgot the name). Sadly, that tends to lead to worse misconceptions than before.

There are a few examples in: https://youtu.be/hgEPHIaScck


This is basically Wikipedia. Wikipedia gives you the consensus view on pretty much every area of human knowledge. People who reject Wikipedia as a reliable source would reject the solution you propose as well.


"Wikipedia gives you the consensus view on pretty much every area of human knowledge."

Doesn't Wikipedia say that climate change is a thing? I bring that up because it contradicts much consensus.


What? Are you trying to say that the consensus view is that climate change isn't a thing?


> which supposedly connects to one of your auras, and has a nice animation of how it scans every single one of someone's organs.

This kind of thing makes me feel sick.

I knew a family whose child got some minor infection; they didn't believe in antibiotics and tried to cure him with "alternative" methods. Suffice it to say, that kid is no longer alive.

Edit: I wish they were convicted for manslaughter.


I am not supporting their ignorance. But I imagine their belief arose from some deep issue with human agency, i.e. the apparent arrogance of trying to control natural forces, which would have been ascribed to the Divine just one or two generations ago.

Nevertheless, I take issue with your edit. I believe your wish is in fact rooted in your own desire for agency. Those parents have lost a child, albeit from their own ignorance. In light of this, how do you think their prosecution would be a gain to society? We can surely educate others without inflicting even more damage on those poor, grieving parents.


Doesn't this whole argument hinge on the notion that children don't have a right to life/medical care outside of the context of the agency of their parents?

If they do, doesn't it follow that parents denying appropriate medical care to their children (on whatever grounds) is tortiously negligent and therefore ought to be prosecuted?

"We can surely educate others without inflicting even more damage on those poor, grieving parents."

On the contrary, I think we should make an example of people like this. The penalty for doing something like this really ought to be high enough to disincentivize people from carrying out these kinds of actions based on their idiotic beliefs.


I can scarcely imagine the pain of losing a child and knowing that it was due to my negligence. Do you seriously think that, on top of all this suffering, the risk of a custodial sentence would add further disincentive?

I am not arguing that these parents were not idiotic, just that their further punishment can serve no purpose.

We may have to agree to disagree though.


I mean, for the parents in question the ship has already sailed. I don't think it's a bad idea for parents who are engaged in clearly negligent behavior to be prosecuted (hopefully before something like this happens.)


But that probably includes you. You are not immune to this. Actually, the entire human species is subject to it, and regularly discovers it's been wrong about incredibly important topics, at the level of a small group, a community, a country, or the entire planet.

As an individual, you most probably have a huge number of things you just believe and never try to challenge. And among them, many that you would still hold to if challenged by someone else.

All in all, you can't start to change the world unless you recognize your own failures in other people. They may do stuff you know is perfectly dangerous. And you may know better about a lot of things. But in the end, you are like them, and you need to see that in order to have real influence, instead of thinking you are above it all.


>But that probably includes you.

Absolutely, there must be things I think are true which are not. And particularly there must be opinions I would find hard to change. I don't deny that.

For instance the whole religion thing, I'd probably take it with a rather large grain of salt if god appeared and started giving evidence for himself, even though I think of myself as someone who is swayed mainly by evidence.

But I also think, perhaps wrongly, that we can actually learn things about our world, and the way to do that is roughly what we call science. And currently, a lot of people simply cannot think in this way. They don't have the evidence that hordes of scientists have gathered, and they don't possess the thinking skills to process what they're told when they come across new evidence. Irrespective of what my own personal knowledge gaps are.

>All in all, you can't start to change the world unless you recognize your own failures in other people.

I don't believe in this pseudo-psychological idea that we need to understand ourselves before we can learn anything about the world, or start to change it. Plenty of people have changed the world for the better without having a complete understanding of their own biases. How could we know anything, if not in small pieces?


>But I also think, perhaps wrongly, that we can actually learn things about our world, and the way to do that is roughly what we call science. And currently, a lot of people simply cannot think in this way.

I disagree. I think that people contort themselves to find ways to see themselves as internally consistent precisely because they are rational within the inescapable prison of their own perception.

The issue really comes down to causation/credibility. Human psychology will cause each person to strongly favor whatever causal explanations best satisfy their biases.

Your example with religion is a good one. For purposes of this argument, let's assume that event actually did occur and it was not a hallucination.

To a religious person, the event would be a well-deserved manifestation given as a reward for their years of faithful observance, and they would likely continue stronger than ever in their convictions, fully satisfied that their position had been irrefutably proven.

But to a strongly non-religious person, the event you've described would be interpreted as a mental breakdown and cause them to seek medical treatment. After being duly medicated, this person would also likely continue stronger than ever in their convictions, having personally "overcome" a hallucinatory episode.

The same objective event occurred, but the context of one's psychology makes all the difference in its perception. This works with practically everything. No data, no event, no observation exists in a vacuum.

In the case of science/academics, people with agendas and biases are involved every step of the way, from designing the inputs to summarizing and drawing conclusions from the results. And there have been, and surely will be in the future, many significant incidents where the accepted consensus is proven disastrously wrong. Before that happens, the small handful of contrarians attempting to warn about or discuss them are typically considered fringe alarmists, bitter saboteurs, or establishment puppets.

This creates logical space for anyone to disregard the naysayers, set up camp in the contrary position, and gear up for an "I told you so" moment down the road.

If you know this person to be wrong, you will not win by attacking their conclusions, no matter how right you are. Rather, you will win by properly estimating their perspective, understanding why they choose to discredit the particular authority that proves out your position and credit a separate one that doesn't, and then enlightening the situation for them such that their own natural reasoning process will arrive at the "correct" conclusion.

This is called "sales" when it's done on a 1:1 basis and "marketing" when it's done en masse. It can be very difficult when there are powerful entities competing to cast the psychological context in their favor.


'This is called "sales" when it's done on a 1:1 basis and "marketing" when it's done en masse. It can be very difficult when there are powerful entities competing to cast the psychological context in their favor.'

Uh, marketing hasn't attempted to provide any arguments that attack conclusions for a long time. It's been primarily about creating subconscious associations that are completely separate from arguments.


I agree, that's the point I was trying to get across.


> If you haven't learned how to think before becoming an adult, I fear that you won't.

I find this a really scary thought. It's not that people can't change because of their age; it's that, intuitively, we know people just won't be bothered to put in the effort, especially when they've spent their whole life in a certain frame of mind. What's the incentive to change after so much time? For the vast majority of people there will be none, and that's scary.


"Spotting Bullshit" should be a mandatory elementary school class. First lesson is assume every statement spoken, written or thought by anyone, including yourself, no matter how charismatic, smart or authoritative they seem, is false until at least some small amount of first-hand evidence is presented. Second lesson is the definition of "evidence".


Both of your examples are things that have been common forever. There were germ-theory skeptics, still are, who ignored the science showing how many deaths doctors' hand-washing prevented. And radiation healers were massive in the early twentieth century.


It wasn't just that there were germ-theory skeptics; it was that its detractors were the consensus (ahem), unwilling to listen. Read the wiki on germ theory and you'll see its timeline consists of one heretic after another being shot down by the 97%.


> There were germ-theory skeptics, still are, who ignored the science showing how many deaths doctors' hand-washing prevented

Don't forget to mention that one of the early proponents was involuntarily committed and killed as a result of social exclusion by his colleagues.

Later history was rewritten and a diagnosis of syphilis was used as justification.

https://en.wikipedia.org/wiki/Ignaz_Semmelweis#Breakdown_and...


But before the internet you had to go out of your way to find a second snakeoil salesman. In the internet age, they can be your whole village.


Nah, this stuff was rife and widespread any time you had a sufficient population of people. Please note that it's just as easy for an empiricist to be misled as it is for a religionist. You simply tailor the argument to their respective dogmas.

The internet has changed things in that it allows one to obtain information that appeals to their biases in a totally isolated and exclusionary manner.

Like it or not, if you're sharing space with someone in meatspace, their actions, choices, and reactions are going to be observed and processed by everyone else in the same space, having some type of influence, and the only way around that is for one of the parties to leave the physical area, which is usually not a trivial thing to make happen.

On the internet, if someone exerts an influence that causes discomfort for you, you just press "Mute"/"Hide"/"Block"/"Close" and it's gone forever. Our platforms learn that your biases oppose that type of influence and of course, tailor your experience to better appeal so you don't choose to use a competitor's platform.

The same kind of thing happens with search; someone who wants to reinforce their anti-vax bias will search for "reasons vaccines are bad" and find just what they've asked for. Someone pro-vax will do the same. The truth is that very few people in either camp are legitimately qualified to seriously critique either side's position, since you'd have to be a drug chemist yourself to know if the claims are valid, so as long as it looks like some semi-credible mumbo-jumbo at the surface level, it's good to go and the user will run with the one that works in favor of their bias. They never have to go through the discomfort of hearing from the other side. It's much harder to get this type of isolation in the real world for several obvious reasons.

The net effect is increasingly isolated social bubbles, which means increasingly diminished tolerance, which means increased hostility and polarization. Others can speculate on what comes after that, but it's probably not good.


I think it is entirely possible for people on one side of a particular discussion to be on more epistemologically solid ground than people on another side of a discussion in a general sense. If I'm reading you right (which I may not be) it seems that you're saying this is not the case. It's not a controversial or particularly interesting fact that everyone is human and therefore prone to weaknesses in reasoning that they will never be 100% free of, but it is also true that one can make a moderately successful effort to reduce these weaknesses somewhat through disciplined practice in critical thinking, and that there are people who can therefore reason better than others. Additionally, the scientific method itself is engineered to reduce the effect of these natural human biases that will always plague the scientists themselves. Religions, on the other hand, consistently enhance and exploit these biases to promote their narrative.


>I think it is entirely possible for people on one side of a particular discussion to be on more epistemologically solid ground than people on another side of a discussion in a general sense.

Sure, that's possible and common. But it's still just as easy to mislead secular persons as it is to mislead religious people.

>one can make a moderately successful effort to reduce these weaknesses somewhat through disciplined practice in critical thinking

Yes, I agree and I think that's important, but it should be acknowledged that wherever you're dedicating energy to this effort, it's coming out of something else, and that when you're not aware, you are basically falling back to default decision-making and credibility determination mechanisms. One should not make the mistake of believing that because they are good analysts, their decisions automatically qualify as more deliberate.

>Additionally, the scientific method itself is engineered to reduce the effect of these natural human biases that will always plague the scientists themselves. Religions, on the other hand, consistently enhance and exploit these biases to promote their narrative.

This is where you're starting to lose me.

Academia is just as vulnerable to bias and groupthink as anything else. It is laughably naive to say that the principles of the scientific method keep academia more objective than any other field.

In fact, the opacity and density of academic papers will frequently be used to obscure what is otherwise plainly false information, allowing it to be weaponized into propaganda (with varying degrees of subtlety) that common sense and/or decency would not allow people to accept any other way. "There are lies, damned lies, and statistics", as the saying goes.

"The diocese says..." and "the university says..." are identical lines of thought, applied to a different authority. In both cases, the technical background behind the proclamations from either institution is serious and well outside the grasp of the average devotee or proselyte.

You may be operating under the belief that religious proclamations occur based on the whims of religious leaders, but this is no more true than the claim that academic proclamations occur based on the whims of academic leaders. Both groups have significant influence and control, but people do not just follow these things without an apparent technical/rational justification, and credible religious edicts are works of real and serious scholarship.


> Academia is just as vulnerable to bias and groupthink as anything else.

I don’t equate “academia” with the scientific method. In fact, the two are often opposed to each other. But since that is where you want to go with it, I would question this sentence of yours as a false equivalency. I submit that “academia”, as much as you can think of it as a single institution (of dirty liberal elites or whatever), is definitely vulnerable to bias and groupthink (even to an astonishing amount, depending on your preconceptions), but your idea that it is just as vulnerable to bias and groupthink as literally anything else is not true. For example, I can substitute “flat-earthers” for your “anything else” and it becomes very clear that flat-earthers as a group are much more susceptible to bad ideas than “academia” as a group.

Merely pointing out the failings of scientific practitioners, of which there are copious amounts, is not an argument for putting the scientific method on equal ground with religion. Appeals to authority are not equivalent to the scientific method, and they are not the only reason why one would trust it over the declarations of religions. Perhaps you might be one of those people who thinks that because science refines its theories over time, scientists are “always changing their minds” rather than creating more accurate models. Maybe you might be one to say that evolution, for example, is “just a theory”?

One can easily see the kind of bullshit that exists within the scientific community without being so unwise as to think that makes it just as credible as any other source of information. I could talk your ears off about what annoys me about the research community, but you’re probably fairly well versed in that, since you make it an important part of discrediting them as a whole. Ultimately what annoys me about them is precisely the ways in which they deviate from the scientific method in order to advance their goals of getting published and recognized. There are bad incentive systems in place in academia that make it so that papers with no real substance get published; negative results are polished like turds to pretend like they were looking for a different result the whole time. Bad statistical methods are used to obfuscate; no actual reproduction of results are being done in large swaths of the field; scientific journalism hypes single paper results to the detriment of science in the public’s eye. I could keep going. Yet still these failings do not even hold a candle to the actively misleading aspects of religion.

> You may be operating under the belief that religious proclamations occur based on the whims of religious leaders, but this is no more true than the claim that academic proclamations occur based on the whims of academic leaders.

Many religious proclamations actually are made up whole cloth by religious leaders. Take Joseph Smith, for instance. When someone invents their own religion, it is called a cult in the beginning, but that pejorative eventually goes away after the founder dies and the organization evolves and achieves more mainstream success.

However, it isn’t terribly important whether religious proclamations come freshly made up from a single person or if they are evolved from legends and handed down from earlier generations. What’s important is their epistemological foundation. “God says so” seems to be the sum of it. The evidence? Emotional reasoning and appeals to just having faith.

> credible religious edicts are works of real and serious scholarship.

I don’t know what “scholarship” is supposed to mean in this sentence. Just because a person studies something does not mean that the studying was done properly or that the subject matter or learned material has any merit whatsoever. I’m not aware of any religious edicts that have revealed any new knowledge to the world and have met their burden of proof. I am aware of a whole lot of edicts that have not.


>Many religious proclamations actually are made up whole cloth by religious leaders. Take Joseph Smith, for instance. When someone invents their own religion, it is called a cult in the beginning, but that pejorative eventually goes away after the founder dies and the organization evolves and achieves more mainstream success.

It's funny you mention Joseph Smith because he disproves your case. Whether someone believes his claims or not, he offered something to substantiate them: a lengthy, coherent book with a reasonably-consistent internal narrative that claims to underpin his theological innovations.

Same is true of L. Ron Hubbard; Scientology is a thing that has gathered a following and continued on solely because he provided something substantial to anchor it against.

Thousands of rambling fringe religionists have come and gone since Joseph Smith, and they fizzled out precisely because they had nothing significant to undergird their claims. The small followings these people gather are held together by force of personality, and as soon as that person is gone or diminished in the group's eyes, the group dissolves.

>However, it isn’t terribly important whether religious proclamations come freshly made up from a single person or if they are evolved from legends and handed down from earlier generations. What’s important is their epistemological foundation. “God says so” seems to be the sum of it. The evidence? Emotional reasoning and appeals to just having faith.

To the layman, the evidence for either is equally convincing. Compare Clarke's Third Law: "Any sufficiently advanced technology is indistinguishable from magic."

Joe Q. Public truly has no idea if the claims of a scientific paper have a valid epistemological foundation. One must undergo many years of training to become a subject matter expert and provide a serious or credible critique/endorsement of such works.

Until that point, the reader is at the mercy of his credibility heuristics to determine which subject-matter experts appear to know what they're talking about.

Serious religious works function the same way. The average member of the public cannot go into something like the Book of Mormon and make an extensive critique of it, weigh its archaeological consistency against similar theological documents, etc., just like they can't go through a scientific work and validate that the scientist sampled his/her data wisely, chose the appropriate calculations, accurately summarized the data, and so on.

If you have the dense tome for people to gloss over and use to make their own connections, it becomes 100% about projecting an image of trustworthiness and believability.

You can say "Well they can see my science works", but then, the religionist could also say "well they can see my religion works", and then it's all about impression/perception. For example, one could perform a magic trick and claim to have invented a hat-based rabbit replicator. Provided you had enough bunnies and interactivity to fool the senses, this would be highly persuasive unless there happened to be someone who was competent to get into the mechanics of it. "Seeing", or thinking you're seeing, is believing.

People are, in fact, much more excited to experiment and engage with religious content because it addresses an area of their lives that they feel is much more important and accessible (their personal/emotional/familial well-being).

Religion and science are much nearer to each other than you think. Their separation is fairly recent; before the last few centuries, it was typically religious institutions driving innovation, understanding, and discovery.

Religion and science both must be accepted based on trust heuristics (called "faith" in religious contexts) by everyone who is not an expert in the specific speciality, which means that everyone is relying on these 99% of the time, since one can only be expert in a small portion of subjects.

Because acceptance of these principles operates identically, you can hijack either group by using the same method: produce a tome that is surface-plausible, and then change things to match the trust heuristics most commonly deployed throughout the group. This will require some changes in framing and vocabulary, but not in the fundamentals.


I'd like to first try to summarize the thesis of your comment so that you can understand my conception of what I’m arguing against and correct me if I misread you.

I think you're saying that since humans have limited capacity for time, attention, and learning, and since there is so much information and claims out there in the world, we all naturally have to rely on what you call trust or credibility heuristics in order to determine what we will accept as truth and what we will reject. They are heuristics in the sense that they are shortcuts that make up for the fact that we cannot go and learn everything there is to know about every subject that crosses our path, and these shortcuts are necessary because we must make decisions every day concerning what we will believe when faced with claims and new information regarding subjects we know precious little about. These shortcuts or heuristics often take the form of placing trust in previously accepted authorities on the subject, the opinions of trusted acquaintances, or one's own senses.

I agree with all of that.

Further, I think you're saying that the credibility heuristics that people use are all the same, and are equally effective in determining truth from fiction, and equally vulnerable to being hijacked by con artists with books and large followings. This therefore must mean that people who accept scientific consensus on topics like evolution do so for the same reasons that people think Joseph Smith talked to angels.

I would dispute that.

This goes back to my previous comment where I said that it is possible to find oneself on more epistemologically solid ground than another on a given topic. This is possible because some credibility heuristics are measurably better than others, and one can improve one's own credibility heuristics with the application of standard critical thinking skills.

So maybe you have an authority you trust on a given topic X. One question to ask yourself is, why do you trust this person on topic X? There are good answers to this question and there are bad answers. A good answer might be because this person has a degree in X and has worked in the X field for several years. A bad answer might be because this person knows a lot about topic Y, or perhaps because this person talks about X a lot and claims to know about X. A well trained set of credibility heuristics seems to avoid potential problems here. One type of heuristic is more likely to be hijacked by bad information than the other.

Perhaps you accept proposition P. You might ask yourself why you accept this proposition. A good reason for accepting P would be that proposition Z is an observable fact and that Z => P is a result claimed by several diverse trusted sources. A bad reason would be that P => Z is claimed by several diverse trusted sources, and that Z is an observable fact (the latter is the fallacy of affirming the consequent).

A proper set of credibility heuristics would also take into account the weight of the claim, namely a measure of how important and/or extraordinary a claim is. For example, if you were to claim in your next comment that you own a cat, I would accept that claim at face value. The reason I would so easily accept this claim is that it is both mundane and unimportant. There is no risk to me in accepting this claim if it happened to be false, and it is an entirely ordinary thing to own a cat.

Now if you were to tell me in your next comment that you were abducted by aliens, then I would dismiss the claim. It is an extraordinary claim which therefore requires extraordinary evidence that cannot be provided in this medium, and accepting such a claim would have a huge effect on my worldview, and therefore incurs considerable risk on my part. Any credibility heuristic that fails to account for this variance is going to be more susceptible to being hijacked.
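
That's essentially Bayes' rule with the prior doing the work. A toy version (made-up priors and evidence strength):

    # Posterior odds = prior odds * likelihood ratio of the evidence.
    def posterior(prior_prob, likelihood_ratio):
        odds = prior_prob / (1 - prior_prob) * likelihood_ratio
        return odds / (1 + odds)

    LR = 100  # hypothetical strength of "a stranger asserts it in a comment"
    print(posterior(0.30, LR))   # "I own a cat":        ~0.98, accept it
    print(posterior(1e-9, LR))   # "aliens abducted me": ~1e-7, still dismiss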

With basic application of critical thinking, one would understand that a hat-based rabbit replicator is an extraordinary, groundbreaking claim requiring extraordinary evidence, including a full explanation of the mechanism, independent verification, and a consensus from a variety of diverse external sources. One would understand that the established truth of such a claim would revolutionize science almost overnight and would have a huge effect on the way we all live. It would be an extremely difficult task to convince a critically thinking person using merely "enough bunnies and interactivity to fool the senses". We understand that the senses can be fooled in various ways. We go to magic shows specifically for this reason. A good credibility heuristic would take this into account.

There are a myriad of critical thinking tools that can be used to improve your credibility heuristics in a way that does not require you to become a subject matter expert in every area that affects you. To the extent that you pursue these techniques and apply them, the less likely you are to be fooled by false claims dressed up to look plausible. You can avoid falling for conspiracy theories like the flat-earthers, the moon landing truthers, and Holocaust deniers.

Well tuned credibility heuristics would help you understand that a story appearing to be "coherent" or "reasonably-consistent" does not support the claim of it being a translation of ancient golden plates scried from a magical peepstone. It would also help you avoid falling for the idea that praying about it and having a good feeling afterward is evidence that a ghost is telling you that all of the claims are true, or that a lot of people believing in a set of claims is by itself evidence for the truth of the claims.

Everyone's credibility heuristics could be improved by understanding that Sturgeon's Law still applies to individual scientific papers hyped by the media, and that independent repeated verification is a necessary filter that takes time and patience. A basic understanding of how the scientific method works, why it works, and how it is generally applied in academia and research institutions (warts and all) is an important part of a healthy set of credibility heuristics.

Carl Sagan called these types of well-tuned credibility heuristics a "baloney detection kit". He has a whole book about it, which I highly recommend.


> But before the internet you had to go out of your way to find a second snakeoil salesman. In the internet age, they can be your whole village.

Well, they'd pass through your village at regular intervals. But I understand what you're saying - in the old days, a snake-oil salesman had to travel through each village, and risk his business and life. Today, you can hire a botnet to reach ten million people with your claims of a magic stock about to rise, or a pill that makes you irresistible to women, or a device that cures your cancer.


Pretty sophisticated people can fall victim to this sort of thing, like Steve Jobs refusing cancer treatment in favor of a vegan diet and exercise.


It's not that people "haven't learned to think". It's that their perception of reality is driven by their ego.

It must be understood that this is the case with all mentally competent people. An ego's appetite can be trained and controlled, but generally speaking, everyone wants to see themselves in the same basic, overwhelmingly positive way.

When an "empiricist" comes into the fray, they've trained their ego's appetite to prefer circumstances that allow themselves to be on the side of "hard data", "scientific consensus", or, as you've put it, "academic orthodoxy". Thus, the empiricist will dismiss theories that seem to be "outside of the orthodoxy" and ridicule their purveyors/supporters/believers.

This is your ego operating to give you the room to feel superior and keep your self-image adequately positive. The same thing is happening in your friend who has the quantum healer machine. They see themselves as someone who is able to spot frauds even when many other people cannot and use this information to their advantage (something you have in common).

Most likely the key difference here is the amount of trust each party has learned to place in secular/technological/academic institutions. You believe you can see the fraud perpetrated on the public by exploitative marketers and snake oil salesmen. They believe they can see the fraud perpetrated on the public by exploitative elitists and the corporate upper class (that's currently suing them for an unpaid medical bill).

The truth is that both perspectives are sometimes valid. "Learning how to think" is learning how to recognize your biases and think about an issue from a neutral, dispassionate perspective, without discounting any potential argument without consideration.

It's never too late to learn that, but most people have so internalized their emotional disdain for "the bad guy" they believe is oppressing them, whomever they assume that is, that they'd rather make up a justification to discredit the other side's arguments than admit that they're potentially, though not necessarily, reasonable. And while overriding this impulse is a great skill to practice, it must be understood that very few people will ever do so to an extent that makes it usable.

---

The key to understanding people, IMO, is recognizing that human decision-making does not operate on objectivity or observation, but rather on the unavoidable and undeviating need to see oneself as intelligent, aware, and discerning.

People must see themselves as important and significant to justify their existence. This is true of all people; it's part of human self-preservation. Do not believe there are exceptions. Most people don't like this explanation because it implies that their self-importance is illusory, which funnily enough, really only reinforces its correctness.

The way to control your ego is to train it to value the right things as signals of importance.

If you want to change someone's mind, you must not argue that their conclusion is incorrect; rather, you must discover what biases and emotional mechanisms are at play in shaping the perception that leads them to that conclusion.

Once you accept this, the world is much easier to understand, social situations are much simpler, sales and marketing become both much more practicable and much easier to notice, and a lot of bitterness and antipathy goes away.

Disclaimer: I have no research or supporting evidence, this is pure conjecture, and there is no reason at all why you should listen to me.


> Disclaimer: I have no research or supporting evidence, this is pure conjecture, and there is no reason at all why you should listen to me.

Damn... while I was reading your post I was yelling "I buy it! I buy it!" in my head, and then it turns out you don't have anything more to sell? :(


This reminds me a bit of brazenautomaton's philosophy, but with less depression about popularity.


It's not just an internet thing. The entire profession of chiropractic has long been established to be a pseudo-scientific field. Research on the effectiveness of the treatment is at best conflicting, and its effectiveness is not scientifically established, except to say that it may be as effective as Tylenol for certain types of lower back pain. That hasn't stopped people from rushing to chiropractors, or governments from licensing and regulating them, giving what are essentially snake-oil salespeople the credibility of state backing.

There are many other examples of pre-internet mass gullibility, like homeopathy, Reiki and assorted foolishness. Maybe the internet has made it easier to market foolishness, but I doubt we can say it has made us more foolish or gullible.


>Maybe the internet has made it easier to market foolishness, but I doubt we can say it has made us more foolish or gullible.

No, but it does help connect everyone with the particular flavor of bullshit they're going to enjoy eating up.


It seems like it should be possible to at least quantify this phenomenon for each topic by doing some internet segmentation and classification. I assume telling someone that they are in a fringe bucket is not actually an effective way to get them to reconsider their position, though.
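
A minimal sketch of what that segmentation might look like (hypothetical engagement logs and source labels; a real system would need far more care):

    # Bucket users by the share of their engagement that goes to sources
    # previously labeled fringe for a given topic. All data hypothetical.
    FRINGE = {"flatearth.example", "notavax.example"}
    MAINSTREAM = {"nasa.example", "who.example"}

    visits = {
        "user_a": ["nasa.example", "who.example", "nasa.example"],
        "user_b": ["flatearth.example", "notavax.example", "nasa.example"],
    }

    for user, sites in visits.items():
        labeled = [s for s in sites if s in FRINGE | MAINSTREAM]
        share = sum(s in FRINGE for s in labeled) / len(labeled)
        bucket = "fringe" if share > 0.5 else "mainstream"
        print(user, f"{share:.0%} fringe ->", bucket)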


Those people existed before the Internet was widely available. They just shared information via books, leaflets, etc. Arguably, I'd say the Internet makes it less sustainable, as they can't help but be exposed to contradictory beliefs.

However it also makes it easier for them to organize and spread their ideas...

I'd love to see data on whether this is a net positive or negative.


40s, huh? i'm in my 30s and i learned about 5 years ago that changing people's minds is an impossible, pointless task. people come to their own conclusions based on the bullshit they choose to surround themselves with.

you could think the moon is square, doesn't bother me one bit. i'm living my life, not yours.

it's far better to focus on your own knowledge and self improvement. it yields a higher return on investment.

if you want to effect change, throw your time and money into actually influential things, not social media arguments.


As someone who has changed his mind, I've found that it is wrong to think that minds can't be changed. When people complain about how minds aren't changed, they are typically complaining about minds not changing now, i.e. at the point of debate. But minds can change over time; it's just that you don't get the immediate feedback that you want.


Exactly. I'm tired of self-proclaimed internet veterans witnessing people digging in on individual online conversations and then concluding that nobody ever changes their mind and all debate/discussion is pointless. I have done a 180 on so many positions I used to have thanks in no small part to people online who did not take this asinine defeatist perspective and went ahead and engaged in debate while the other side (including me) just dug their heels in during the actual conversation.

I think a lot of people need to be educated about what the benefits of debate are and how they manifest slowly over time.


I feel that in many ways, there is a take-a-side mentality bred into all of us via social cues from a young age: your football team, the city/country you are from, the education provider you attended. And as I've gotten older, I've become more and more convinced that in many instances, people view position switching as some sort of deadly sin, so they stay locked into their opinion.

I have a mate who is pretty smart, but a lot of his ideas and beliefs come from his dad, and as a result, you are no longer just breaking down his opinion, you are attacking the father figure's opinion too. Over the course of our friendship, we have always had running debates on various things, and I was struck recently by his comment of "Well, you didn't always think that". The comment was made in a pretty smug way, like I had somehow lost the debate because at some point I had realised that I was probably incorrect. I think this is the most damaging thing: there is a certain level of pride taken in just taking A position, not necessarily arriving at one through debate or critical thinking.


This needs to be said. I just wish there were a better way to retroactively deliver that feedback. That could diminish defeatism.


The issue is that most people on social media aren't looking for enlightenment, they're looking for validation. If they don't find it from one source they'll likely seek out a more suitable one. The only one who can change a person's mind is that person, all the rest of us can do is encourage.


If the people are participating in social media forums where debate happens or can happen (without it being considered rude), then they have not yet fully enclosed themselves into a comfortable echo chamber. If they aren't just there to troll, and seem to be engaging honestly, then debating with them is not a waste of time. Nobody is going to change their mind to match yours during one single online discussion (unless they come in wanting to have their minds changed and experience a lot of social pressure to admit to it, a la /r/changemyview). The audience matters, and even the person you're arguing with will likely remember the points you made, even if only to counter them, the next time they voice their opinion. The same goes for us. We learn more about their side and where they're coming from each time, and that counts for something.


> I have done a 180 on so many positions

Care to give any examples and the reasons for your first position and subsequent change?


I used to be a very conservative religious person. I had a whole package of beliefs that were ingrained into me from birth, and I grew into them and claimed them as my own. I thought that homosexuality was very damaging to society. I thought that human caused climate change was either fake or that there was nothing to worry about even if it were true. I did not believe that humans evolved, or at least I was somewhat conflicted about that. I thought that creationism belonged in schools. At one point in my early twenties I even listened to Rush Limbaugh and agreed with him (I cringe every time I remember that). I knew so many things, and I was happy to know them.

I also liked to think analytically and enjoyed the intellectual stimulation I got from certain online forums (this one included). I liked how people would trade counterarguments and keep each other honest by pointing out bad thinking or logical fallacies. When someone would argue against something I believed in, I would argue back. From their perspective, my mind was not being changed and they were wasting their time, but from my perspective, the things they would call me out on stuck with me. I tried to avoid falling into logical fallacies and bad arguments because I knew I'd be called out on them. This restricted the ways I had to justify my beliefs, so that made it challenging. Then at church I noticed that nobody was being held to these standards at all, and fallacious reasoning was being let fly freely and constantly. I experienced a lot of cognitive dissonance during that point in my life which ultimately led to deeper investigation and study. I give a lot of credit to honest online debaters who pushed back on my ideas and kept me honest about the substance of my arguments.


Minds can be changed. For one, propaganda does work. If only minds could be changed using more honest means... Granted, education works too, to some degree. But that requires even more work on both parts.


> you could think the moon is square, doesn't bother me one bit. i'm living my life, not yours.

The problem is when this wrong-headedness starts to interfere at a public policy level, like climate change.


It's also extremely difficult to talk to someone who truly thinks the moon is square. Suppose I actually want to talk about the moon? Or suppose I admit there's a lot I don't know about astronomy (a fascinating subject!), and this person thinks it's an invitation to speak about astrology?

What I mean is, it doesn't even have to affect public policy to be bothersome. It also affects informal small-talk with acquaintances.


There is, however, a rather large difference between "bothersome" and "a bunch of idiots voting for things that are going to kill people, possibly even me... and winning".


Yes, fully agreed, the latter is dangerous. The former is still annoying, though.


For what it's worth, I've had luck moving some people's anti-science opinions by meeting them at their level of understanding, and offering some compassion. Might help that they are relatives and not total strangers though.


To me it's more like wanting your city to be a clean place. You see some trash, you pick it up and throw it in a trash can, even if it's not yours.

When people have clearly wrong beliefs (e.g. a square moon), it might not matter all that much to you, but they might harm someone else by spreading the trash, or worse, connect with like-minded people and create a real danger for others (legislate pi to be 3). That mental trash is no less a part of your environment than the physical street, and I think it's a very good thing if you make an effort to pick up the trash, even if it's not yours.


>i learned about 5 years ago that changing peoples' minds is an impossible, pointless task.

And you learned the wrong thing. It is possible to change people's minds and behavior with online conversation. It just isn't easy, and it doesn't happen quickly (a single line directly pointing out a flaw often isn't enough; you have to convey that you exerted effort, to show you care). If over time enough people invest time and effort into pointing out that you are wrong about something, you will change your mind. You might not admit it, you might not even notice it, but it will happen.

You might be right that it is perhaps not the fastest or most efficient approach; you suggested spending time and money on actually influential things instead (though I'm not sure what you have in mind there).

Just look at the responses you have received so far here. Several people with different backgrounds have responded to you and said "you are wrong, you can change people's minds in online social interactions".

Now you have a choice. Will you hold on to your (really not very good for mankind as a whole) statement, so that it self-fulfills and you can still consider your own opinion correct and your thoughts and behavior consistent?

Or will you take that leap and believe people can change and act accordingly going forward?

Edit: I hope you realize that if you respond to my plea with a convincing statement and I believe you and change my mind, we both enter seriously troubled waters.

After all, do you really want to change my mind on the fact that one cannot change someone's mind in a social media setting, in a social media setting?


Ignorance and scepticism are a double-edged sword. The problem is the absence of curated information. The fact that she believed in how homeopathic medicine worked, or that he believed in aura cleansing, has nothing to do with the capability of the method itself, but with the source of knowledge for the carrier of the method. There are alternative medicines that work wonders, and there are concepts of aura. Sometimes the application of the method is wrong: the non-physical aura concept is presented via means foreign to the method itself, a device. Or the context is wrong: a general immune system boost from alternative medicine can also mask the fact that her body could fight off a specific infection on its own. While she is ignorant of why that might have worked, I would not equate the method with the application of the method in your post.


"The fundamental cause of the trouble in the modern world today is that the stupid are cocksure while the intelligent are full of doubt.” — Bertrand Russell


That, and our psychological quirk of mistaking self-confidence for competence. The two together make a pretty combustible mixture and explain a lot, IMHO.



Ah, Dunning-Kruger at scale; summing up populism as a startup.


The brain must constantly be doing triage on memories, without conscious intervention. And apparently it recognizes that there is less need to stock our minds with information that can be readily retrieved.

Or as my grandmother always said, "if you want to forget something, just write it down."


Ok, if the article is right, then:

  - our minds don't bother to remember things when we know the information is stored externally and easy to access
  - given the aggressive way our minds exploit this, remembering must be expensive
  - given that we can now get by while remembering less than before
  - to what end can we put memory capacity we've now freed up?
One answer might be "living longer". As our lifespans extend, we may find ourselves accumulating too much knowledge to quickly remember things when we need them. If we're able to conserve memory more during our early lives, we may be able to stave off senility longer.

Or maybe we can put that capacity to other uses. Our brains are pretty adaptable; maybe capacity once devoted to remembering facts and experiences could instead go to remembering skills or languages, or to thinking about the present.


Ranking systems help solve this, for example in games. Even in 5v5 games, better players will eventually rise to the top, because they will cause more games to be won in aggregate over their career. Similar dynamics appear in systems of apprenticeship and in the rankings found in martial arts and sports.

The problem is that we need 100,000 doctors, and some of them will be better informed than others. The less informed ones will make the others look worse. A doctor is still more informed than a layperson, but when doctors disagree, it would be useful to know their relative standing. Ranking would help with these sorts of problems.
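
For a concrete picture of why aggregate results surface skill, here is a minimal Elo-style rating update, the classic scheme behind chess ladders (a sketch only; the K factor of 32 and the 400-point logistic scale are conventional choices, not anything specified in this thread):

  # Sketch of an Elo-style update in Python.
  def expected_score(r_a, r_b):
      # Modeled probability that the player rated r_a beats r_b.
      return 1.0 / (1.0 + 10 ** ((r_b - r_a) / 400.0))

  def update(r_a, r_b, a_won, k=32):
      # Nudge A's rating toward the observed result; over many games
      # the noise averages out and stronger players rise.
      return r_a + k * ((1.0 if a_won else 0.0) - expected_score(r_a, r_b))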


The problem with ranking is that the worse you are, the more you're put into toxic teams with players who don't cooperate or are griefing, and who are thus much easier to blame for your failure. What's worse, that blame is often legitimate.


Games have a solution for this. Toxic players have their own separate section where they are ranked against each other.


We used to remember information, now we remember keywords to retrieve the information. We haven't lost it, it's just one step away. The information is stored implicitly.

The upshot is that we have become skilled information seekers and are used to evaluating the quality and credibility of our sources.

We might not remember all the trivia, but when it comes to discovering new interesting domains and quickly learning a lot about them, and finding people with similar rare interests, the internet (google) is king.


I'll have to remember to recall that, once, I saw a very insightful HN comment about how we used to remember information, and now we remember... -- bah, I forgot, let me go look that up.


Today, people around the world marched in the name of science. Back in 1969 we put a man on the moon, and in 2017 we're having to march in the name of science. This appears to be one giant leap backwards for mankind.

The age of the printing press coincided with the age of reason. Is the age of the internet the harbinger of an age of disinformation, alternative facts, and ignorance?

This post looks at that very possibility and, if so, asks whether the future is taking us backwards. Are we devolving as a species?

https://jimdroberts.wordpress.com/2017/04/23/the-information...


I've often wondered whether McArthur Wheeler, the bank robber discussed in this article who disguised himself with lemon juice, actually had some mental challenges rather than simply lacking common sense. It seems a whole branch of academia has grown off his unfortunate back...


Is this real? It is unbelievable.


The lemon juice story first appeared in the entertainment section of the Pittsburgh Post-Gazette, in a compilation-style article about dumb criminals. That article was referenced in Dunning and Kruger's paper, and the story spread from there. The original article looks suspicious to me as well.

https://news.google.com/newspapers?id=ZNlRAAAAIBAJ&sjid=DXAD...



It's the first paragraph of the article.


Differences in skill are clearest in games. When you can see someone winning and someone losing, you know who's better, despite perhaps not understanding why the winner won.


It's only totally clear in one-on-one, perfect-information games like chess. Take a 5-on-5 game with hidden information, such as League of Legends, and people come up with excuses for losing: bad luck, bad teammates, bad game balance. All it takes is a little bit of ambiguity for people to seize upon an escape hatch for their threatened ego.


I think a variation on this is a key factor of life and culture in engineering - at some point, either what you made works, or it doesn't. You can argue, debate, philosophize and talk, but at the end of the day - there's either a working thing someone else can look at, or there's not.

(I imagine this is true in other places - sales, business, design - it just gets increasingly hard to tell "if it worked")


You're right, but sales, business and design have too many variables involved in the apparent success of what you make, so you're not able to safely verify what "worked".

Even in engineering, two people can make something that "works", but which one is better? Maybe the worse product had the better marketing, so it sold more, and in the end there are more people praising it, so you'll think it is better.


Yes, "it works" is a rather low bar. Almost all generic products in the market "work", but some are clearly better than others.


We're in agreement.

It's a filter, not a metric: you can say this product worked, as in it did what it said on the tin, but the worse/better comparison doesn't work out.

The other aspect of "at some point you can just build it and it works or doesn't" is that when it's built, you can (ideally) just go read the code and find out how it works. It's ye olde issue of "it would take me almost as long to explain it as to make it..." - if I'm having trouble communicating an idea, I can just go make it, and then show it to you.


Some players end up being non-transitive though.


This is why I never took notes as a child. If I took notes I forgot everything I learned. If I didn't, I remembered it.


I found the opposite to be true for myself. If I don't take notes, I forget soon. But if I take notes, at the time of taking notes, I am processing the information in a way that I can explain myself. So somehow that helps me remember a little longer even if I don't read those notes a second time.


Deepak Chopra comes to mind. Listening to him weave a tale of why something is because of quantum vibrations always makes me feel slightly ill.


Since we are playing with words, a startup idea: the meta-internet, to restore humanity to its previous intellectual glory.


I see this sort of thing on Hacker News all the time. Programmers who are, I'm sure, at the top of their field comment in threads about biology or other subjects outside their field with the silliest, most wrong answers. A thread here about nutrition or anything else outside their realm of expertise is about as useful as reading a Facebook or YouTube comment chain. That doesn't stop them from speaking authoritatively about it. :)


This happens on HN within our field too :). I see stuff hit the front page all the time, only to be ripped to pieces a few hours later when someone who knows what they're doing appears.

Do you remember the "super fast hash algorithm" from a few weeks ago that turned out to be rubbish? What about the "high performance TCP proxy" on the front page right now that's not proxying correctly or with unusual speed?

People seem to upvote on title alone.


>Do you remember the "super fast hash algorithm" from a few weeks ago that turned out to be rubbish? What about the "high performance TCP proxy" on the front page right now that's not proxying correctly or with unusual speed?

Seems like debunking Hacker News stories might make for a fun project.


I suspect your definition of "fun" may align with a very small percentage of the population.


I can confirm this to be the case.


Double points if it can be automated!


Haha! So true. And nutrition in particular seems to be something the tech-literate crowd wants to "hack". It's always fascinating to read threads on HN about dietary supplements or keto-this-or-that or paleo diets or how it's better if you only sleep 2 hours per day total and only drink special shakes made by some dodgy startup, all of this frequently debated with religious reverence.


Nutrition makes a lot of sense, when you consider just how much time the average person spends around food - you're generally reminded of nutrition three times a day.

It's a large aspect of health, money, and comfort (see: going to the toilet), and frankly there isn't very thorough consensus (it's been suggested that the food pyramid is biased by commercial interests of the time), and what's more, most people aren't cooking or paying for their own meals when they learn about it in school, so they don't remember it that well anyway.


Indeed, it makes sense to be interested in nutrition. What's funny is that most people commenting here on HN are out of their depth, yet talk about nutrition, supplements, bizarre shakes, fad diets, keto, calorie restriction, or whatnot with the same pseudoscientific conviction with which non-HN people discuss astrology ("but it really works!"). They "don't know that they don't know" and believe the most bizarre things. That's very relevant to the article and also to the OP's comment at the top of this subthread :)

PS: lest I sound too arrogant, nutrition is not my particular blind spot (I'm aware I don't know much about it), but I'm sure I have other blind spots I'm not aware of!


My wife is a dietitian with a PhD, and I can scarcely stand to read HN comments on nutrition articles. And I'm not the one with the degree!


For some reason, computer nerds think they know it all :-)


  Because they are autodidacts.  The main purpose of higher education
  and making all the smartest kids from one school come together with
  all the smartest kids from other schools, recursively, is to show every
  smart kid everywhere that they are not the smartest kid around, that
  no matter how smart they are, they are not equally smart at everything
  even though they were just that to begin with, and there will therefore
  always be smarter kids, if nothing else, than at something other than
  they are smart at.  If you take a smart kid out of this system, reward
  him with lots of money that he could never make otherwise, reward him
  with control over machines that journalists are morbidly afraid of and
  make the entire population fear second-hand, and prevent him from ever
  meeting smarter people than himself, he will have no recourse but to
  believe that he /is/ smarter than everybody else.  Educate him properly
  and force him to reach the point of intellectual exhaustion and failure
  where there is no other route to success than to ask for help, and he
  will gain a profound respect for other people.  Many programmers act
  like they are morbidly afraid of being discovered to be less smart than
  they think they are, and many of them respond with extreme hostility on
  Usenet precisely because they get a glimpse of their own limitations.
  To people whose entire life has been about being in control, loss of
  control is actually a very good reason to panic.

–– Erik Naggum, 2004


Good ones don't... just like any field...

"Intelligent" people know how much they don't know... "Stupid" or "ignorant" know everything and want you to know it.

Doctors, Mechanics, ... nerds... no one group is immune to this.

https://en.wikipedia.org/wiki/Dunning%E2%80%93Kruger_effect

"The Dunning–Kruger effect is a cognitive bias in which low-ability individuals suffer from illusory superiority, mistakenly assessing their ability as much higher than it really is. Psychologists David Dunning and Justin Kruger attributed this bias to a metacognitive incapacity, on the part of those with low ability, to recognize their ineptitude and evaluate their competence accurately. Their research also suggests corollaries: high-ability individuals may underestimate their relative competence and may erroneously assume that tasks which are easy for them are also easy for others."


I thought the most interesting aspect of these studies is that intelligent people are better at judging the intelligence of others. Those in the bottom quartile could not estimate the intelligence of peers much better than random chance. Those near the top could estimate the intelligence of others with high accuracy.

This has profound implications for hiring selection. Having exceptionally bright founders could cause a... founder effect, resulting in the entire company being much smarter than average.

Anecdotal, but I know of quite a few wildly successful companies where the founder/CEO still participates in all interviews despite having more than a thousand employees. This seems like a random company quirk but, putting this research into practice, it could actually be a large part of why these companies are successful.


With the warning that this is anecdote rather than evidence: doctors are the worst [1] for this! So many of them seem to think they're scientists, when the majority of them are far from it.

[1] In the hyperbolic sense; I wouldn't like to suggest a literal truth here


Doctors are at a bad intersection of customer service and science.

On the one hand, doctors need to sound authoritative to 99.9% of all people, since they are almost always delivering news the patient doesn't want to hear (exercise, eat less junk, take these pills 3 times equally spaced, not 2 with lunch, etc.).

On the other hand, many doctors just aren't that smart. Being a doctor today isn't an automatic sign to me that you are in the top 1% of brainpower. Some of my doctors are; most aren't. In addition, many doctors don't continue to learn over time, and if you stop learning, you eventually grow dumb.


Well, if you listen to doctors, they'll say they apply scientifically proven methods, and some portion of them actually do science. My impression is that doctors like to sound smarter than they often are. Since medicine is more of an art form than anything, most physicians are like artists (albeit highly trained ones). And as with any art, you have the whole spectrum of quality.


Well, with regard to the article, I think experimentation should be cheered rather than dismissed as stupidity. Wheeler isn't actually a good example of this effect.

What is needed is creating environments where people, especially people with the correct answers, are not afraid to voice their sincere opinions without fear of being called or looked at as stupid. Experts today are challenged by people who are incorrect and by people who are correct, and an authority usually tries to shut up both as if the two were the same.

"something out of their realm of expertise on here is about as useful as reading a Facebook or YouTube comment chain."

Casting aside the clear flaw there (which YT, FB chains?), I'd bet on HN against the general public because people here are more likely to be scientifically minded and therefore self-correcting.

"That doesn't stop them from speaking authoritatively about it. :)"

This effect should only really matter when it comes to an individual having authority (decisions or influence) over others.


Nutrition, science news, and economics are the worst subjects for discussion on HN.

I'm cautiously optimistic because I see multiple levels of ignorance that were me 5, 10, 15, 20, 25 years ago. I'm sure someone sees my comments the same way. Even if the average quality of discussion stays at the same low level on HN as it was on Usenet a long time ago, it's not the same people making the same stupid comments over and over again. There may be personal development.

I don't believe that one person keeps posting the "correlation is not causation" comment on every science news discussion for 10 years without good reason. It's just a phase after learning statistics, I hope.


I think, in my ignorance, that this reflects a deeply human desire to understand the world, or at least parts of it. If there is a simpler, less complex concept you can grasp, it will always appeal to some people.

There is also a need for belonging, and FOMO: if you find a nice subculture that appeals to you and is graspable in its complexity, it's much easier to stay informed and connected within it than within a more general topic.

Specialisation has indeed been the only way to understand anything at the state of the art for two centuries or so now, since it's humanly impossible to have a working understanding of the whole of human knowledge.


Luckily most of the people here, if they wanted, could easily get an entry-level education in any field, because they know how to study from books, can get access to the material, etc.


Potential is not reality.


Potential is the requirement for democracy, though.

You can only have democracy if everyone has the chance to educate themselves on the topics that are relevant to them.

If, in your country, the majority never learnt how to do so, then democracy will inevitably fail. Many people in STEM or other sciences have an advantage due to that, but this should be standard for all.


Oh, yes, it's so easy; that's why every programmer acquiring domain-specific knowledge never makes any mistakes, and the software, once feature-complete, always guarantees satisfied customers. We are instant experts: just add hot water and stir for five minutes.


No.

I am saying, the people here could actually make qualified comments, if they wanted to. They could learn about the topics they comment on.

But they choose not to.

Which is a different story from most of humanity, which doesn’t even have a choice, doesn’t have a chance to learn.


I don't see the need for this much snark in response to a relatively innocuous comment.


Yes, I was overshooting, because I had that attitude myself, and it is not innocuous. This hubris ruins lives. Better snarky here than snarky out there, at the end of a project, when it really hurts.


Why would I? Surely the point is that I'm an expert on everything already.



