
>Academia is just as vulnerable to bias and groupthink as anything else.

I don’t equate “academia” with the scientific method; in fact, the two are often opposed to each other. But since that is where you want to take this, I would question that sentence of yours as a false equivalence. I submit that “academia”, insofar as you can think of it as a single institution (of dirty liberal elites or whatever), is definitely vulnerable to bias and groupthink, even to an astonishing degree, depending on your preconceptions. But the idea that it is just as vulnerable to bias and groupthink as literally anything else is not true. Substitute “flat-earthers” for your “anything else” and it becomes very clear that flat-earthers as a group are far more susceptible to bad ideas than “academia” as a group.

Merely pointing out the failings of scientific practitioners, of which there are copious examples, is not an argument for putting the scientific method on equal footing with religion. Appeals to authority are not equivalent to the scientific method, and they are not the only reason one would trust it over the declarations of religions. Perhaps you are one of those people who think that because science refines its theories over time, scientists are “always changing their minds” rather than building more accurate models. Maybe you would say that evolution, for example, is “just a theory”?

One can easily see the kind of bullshit that exists within the scientific community without being so unwise as to think it is therefore just as credible as any other source of information. I could talk your ears off about what annoys me about the research community, but you’re probably fairly well versed in that, since you make it an important part of discrediting them as a whole. Ultimately, what annoys me is precisely the ways in which researchers deviate from the scientific method in order to get published and recognized. Bad incentive systems in academia let papers with no real substance get published; negative results are polished like turds to pretend the authors were looking for a different result the whole time; bad statistical methods are used to obfuscate; no actual reproduction of results is being done across large swaths of many fields; science journalism hypes single-paper results to the detriment of science in the public’s eye. I could keep going. Yet these failings still do not hold a candle to the actively misleading aspects of religion.

>You may be operating under the belief that religious proclamations occur based on the whims of religious leaders, but this is no more true than the claim that academic proclamations occur based on the whims of academic leaders.

Many religious proclamations actually are made up out of whole cloth by religious leaders. Take Joseph Smith, for instance. When someone invents their own religion, it is called a cult in the beginning, but that pejorative eventually fades after the founder dies and the organization evolves and achieves more mainstream success.

However, it isn’t terribly important whether religious proclamations come freshly made up from a single person or whether they evolved from legends handed down from earlier generations. What’s important is their epistemological foundation. “God says so” seems to be the sum of it. The evidence? Emotional reasoning and appeals to just having faith.

>credible religious edicts are works of real and serious scholarship.

I don’t know what “scholarship” is supposed to mean in this sentence. Just because a person studies something does not mean that the studying was done properly or that the subject matter or learned material has any merit whatsoever. I’m not aware of any religious edicts that have revealed any new knowledge to the world and have met their burden of proof. I am aware of a whole lot of edicts that have not.




>Many religious proclamations actually are made up out of whole cloth by religious leaders. Take Joseph Smith, for instance. When someone invents their own religion, it is called a cult in the beginning, but that pejorative eventually fades after the founder dies and the organization evolves and achieves more mainstream success.

It's funny you mention Joseph Smith, because he disproves your case. Whether someone believes his claims or not, he offered something to substantiate them: a lengthy, coherent book with a reasonably consistent internal narrative that purports to underpin his theological innovations.

The same is true of L. Ron Hubbard: Scientology has gathered a following and continued on solely because he provided something substantial to anchor it to.

Thousands of rambling fringe religionists have come and gone since Joseph Smith, and they fizzled out precisely because they had nothing significant to undergird their claims. The small followings these people gather are held together by force of personality, and as soon as the leader is gone or diminished in the group's eyes, the group dissolves.

>However, it isn’t terribly important whether religious proclamations come freshly made up from a single person or whether they evolved from legends handed down from earlier generations. What’s important is their epistemological foundation. “God says so” seems to be the sum of it. The evidence? Emotional reasoning and appeals to just having faith.

To the layman, the evidence for either is equally convincing. Compare Clarke's Third Law: "Any sufficiently advanced technology is indistinguishable from magic."

Joe Q. Public truly has no idea if the claims of a scientific paper have a valid epistemological foundation. One must undergo many years of training to become a subject matter expert and provide a serious or credible critique/endorsement of such works.

Until that point, the reader is at the mercy of his credibility heuristics to determine which subject-matter experts appear to know what they're talking about.

Serious religious works function the same way. The average member of the public cannot go into something like the Book of Mormon and make an extensive critique of it, weigh its archaeological consistency against similar theological documents, etc., just like they can't go through a scientific work and validate that the scientist sampled his/her data wisely, chose the appropriate calculations, accurately summarized the data, and so on.

Once you have the dense tome for people to skim and use to make their own connections, it becomes 100% about projecting an image of trustworthiness and believability.

You can say "Well, they can see my science works", but then the religionist could also say "well, they can see my religion works", and then it's all about impression/perception. For example, one could perform a magic trick and claim to have invented a hat-based rabbit replicator. Provided one had enough bunnies and interactivity to fool the senses, this would be highly persuasive unless someone happened to be competent enough to get into the mechanics of it. "Seeing", or thinking you're seeing, is believing.

People are, in fact, much more excited to experiment and engage with religious content because it addresses an area of their lives that they feel is much more important and accessible (their personal/emotional/familial well-being).

Religion and science are much nearer to each other than you think. Their separation is fairly recent; before the last few centuries, it was typically religious institutions driving innovation, understanding, and discovery.

Religion and science must both be accepted on the basis of trust heuristics (called "faith" in religious contexts) by everyone who is not an expert in the specific specialty, which means that everyone relies on these heuristics 99% of the time, since one can be an expert in only a small portion of subjects.

Because acceptance in both domains operates identically, you can hijack either group using the same method: produce a tome that is surface-plausible, and then change things to match the trust heuristics most commonly deployed throughout the group. This will require some changes in framing and vocabulary, but not in the fundamentals.


I'd like to first try to summarize the thesis of your comment so that you can understand my conception of what I’m arguing against and correct me if I misread you.

I think you're saying that since humans have limited time, attention, and capacity for learning, and since there are so many claims and so much information out there in the world, we all naturally have to rely on what you call trust or credibility heuristics to determine what we will accept as truth and what we will reject. They are heuristics in the sense that they are shortcuts that make up for the fact that we cannot go and learn everything there is to know about every subject that crosses our path. These shortcuts are necessary because we must make decisions every day about what to believe when faced with claims and new information regarding subjects we know precious little about. They often take the form of placing trust in previously accepted authorities on the subject, the opinions of trusted acquaintances, or one's own senses.

I agree with all of that.

Further, I think you're saying that the credibility heuristics people use are all the same: equally effective at separating truth from fiction, and equally vulnerable to being hijacked by con artists with books and large followings. This, therefore, must mean that people who accept the scientific consensus on topics like evolution do so for the same reasons that people think Joseph Smith talked to angels.

I would dispute that.

This goes back to my previous comment, where I said that it is possible to find oneself on more epistemologically solid ground than another person on a given topic. This is possible because some credibility heuristics are measurably better than others, and one can improve one's own heuristics by applying standard critical thinking skills.

So maybe you have an authority you trust on a given topic X. One question to ask yourself is: why do you trust this person on topic X? There are good answers to this question and there are bad answers. A good answer might be that this person has a degree in X and has worked in the field of X for several years. A bad answer might be that this person knows a lot about topic Y, or that this person talks about X a lot and claims to know about X. A well-trained set of credibility heuristics avoids these potential problems; one type of heuristic is far more likely to be hijacked by bad information than the other.

Perhaps you accept proposition P. You might ask yourself why you accept it. A good reason for accepting P would be that proposition Z is an observable fact, and that Z => P is a result claimed by several diverse trusted sources; concluding P is just modus ponens. A bad reason would be that P => Z is claimed by several diverse trusted sources, and that Z is an observable fact; concluding P from that is the fallacy of affirming the consequent.
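
To make the contrast explicit (this formalization is mine, not anything from the thread), the two inference patterns are:

    % Valid: modus ponens
    Z,\ (Z \Rightarrow P)\ \vdash\ P
    % Invalid: affirming the consequent
    Z,\ (P \Rightarrow Z)\ \nvdash\ P

A classic concrete case: let P be "it rained" and Z be "the grass is wet". Rain implies wet grass, but wet grass alone does not establish rain, since a sprinkler produces the same observation.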

A proper set of credibility heuristics would also take into account the weight of the claim, namely a measure of how important and/or extraordinary the claim is. For example, if you were to claim in your next comment that you own a cat, I would accept that claim at face value. The reason I would so easily accept it is that it is both mundane and unimportant: there is no risk to me in accepting the claim if it happened to be false, and it is an entirely ordinary thing to own a cat.

Now if you were to tell me in your next comment that you were abducted by aliens, then I would dismiss the claim. It is an extraordinary claim which therefore requires extraordinary evidence that cannot be provided in this medium, and accepting such a claim would have a huge effect on my worldview, and therefore incurs considerable risk on my part. Any credibility heuristic that fails to account for this variance is going to be more susceptible to being hijacked.
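
One standard way to formalize this weighting (my gloss; nothing in the thread mentions Bayes) is Bayes' rule in odds form, which says the evidence needed scales with how improbable the claim is up front:

    \frac{P(H \mid E)}{P(\neg H \mid E)} =
      \frac{P(H)}{P(\neg H)} \times \frac{P(E \mid H)}{P(E \mid \neg H)}

With purely illustrative numbers: if the prior odds that you own a cat are about 1:2 and your testimony is ten times likelier if the claim is true than if it is false, the posterior odds come out to 5:1, and acceptance is reasonable. If the prior odds of an alien abduction are, say, 1:10^8, that same testimony moves the odds only to 1:10^7, still overwhelmingly against. Hence "extraordinary claims require extraordinary evidence".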

With a basic application of critical thinking, one would understand that a hat-based rabbit replicator is an extraordinary, groundbreaking claim requiring extraordinary evidence, including a full explanation of the mechanism, independent verification, and a consensus from a variety of diverse external sources. One would understand that the established truth of such a claim would revolutionize science almost overnight and would have a huge effect on the way we all live. It would be an extremely difficult task to convince a critically thinking person using merely "enough bunnies and interactivity to fool the senses". We understand that the senses can be fooled in various ways; we go to magic shows specifically for this reason. A good credibility heuristic takes this into account.

There are myriad critical thinking tools that can improve your credibility heuristics without requiring you to become a subject matter expert in every area that affects you. The more you pursue and apply these techniques, the less likely you are to be fooled by false claims dressed up to look plausible. You can avoid falling for conspiracy theories like those of the flat-earthers, the moon landing truthers, and the Holocaust deniers.

Well-tuned credibility heuristics would help you understand that a story appearing "coherent" or "reasonably consistent" does not support the claim that it is a translation of ancient golden plates scried from a magical peepstone. They would also help you avoid the idea that praying about it and having a good feeling afterward is evidence that a ghost is telling you all of the claims are true, or that a lot of people believing a set of claims is by itself evidence for the truth of those claims.

Everyone's credibility heuristics could be improved by understanding that Sturgeon's Law still applies to individual scientific papers hyped by the media, and that independent repeated verification is a necessary filter that takes time and patience. A basic understanding of how the scientific method works, why it works, and how it is generally applied in academia and research institutions (warts and all) is an important part of a healthy set of credibility heuristics.
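
As a toy illustration of that filter (a sketch with made-up parameters, not anything from the thread): suppose only 10% of tested hypotheses are actually true, and a single study has a 5% false-positive rate and 80% power. A lone "significant" result is then wrong surprisingly often, while demanding one independent replication sharply cuts the false positives:

    import random

    random.seed(0)

    N = 100_000        # hypotheses tested (illustrative number)
    BASE_RATE = 0.10   # assumed fraction of hypotheses that are actually true
    ALPHA = 0.05       # single-study false-positive rate
    POWER = 0.80       # chance a study detects a true effect

    def study(is_true):
        """One simulated study: True means it reports a positive result."""
        return random.random() < (POWER if is_true else ALPHA)

    single = {True: 0, False: 0}      # positives from the first study
    replicated = {True: 0, False: 0}  # positives confirmed by a second study

    for _ in range(N):
        is_true = random.random() < BASE_RATE
        if study(is_true):
            single[is_true] += 1
            if study(is_true):        # independent replication attempt
                replicated[is_true] += 1

    print(f"false share, single study: {single[False] / sum(single.values()):.1%}")
    print(f"false share, replicated:   {replicated[False] / sum(replicated.values()):.1%}")

With these assumed numbers, roughly a third of single-study positives are false, falling to a few percent once an independent replication is required.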

Carl Sagan called these types of well-tuned credibility heuristics a "baloney detection kit". He has a whole book about it, The Demon-Haunted World, which I highly recommend.



