Hacker News
Google Rewrites Its Powerful Search Rankings to Bury Fake News (bloomberg.com)
63 points by rayuela on April 25, 2017 | hide | past | favorite | 114 comments



One man's unpopular idea is another man's fake news. I'm not sure I want google making such editorial decisions.

Also:

> We noticed that you're using an ad blocker, which may adversely affect the performance and content on Bloomberg.com

That is VERY fake news right there.


Trump has conflated biased news with fake news. Google is running off the 2016 definition of fake news - remember what it meant way back half a year ago?

It means headlines like "Shocking - senate hearing erupts into fistfight" or "Trump won the popular vote by 1.5 million"

Fake - written by Moldovans for Facebook revenue. Not simply biased.


What you're calling "fake news" has been called "spam" for a long time. If they're talking about spam, why do they need to dress it up under some new label instead of just saying "We're detecting more spam"?

It's because they know that "spam" is not a credible way to censor the voices that they want to censor.

"Fake news" emanated from John Oliver. It was the excuse the establishment offered for the mainstream media's horrendous and transparently detached coverage of the election cycle, assuring the public that a Clinton victory was a near-certainty, engaging in fear mongering by indicating that a Trump victory would be a dystopian nightmare, etc.

Oliver said to his followers, on behalf of the whole broadcast establishment (CNN and HBO are both owned by Time Warner), "It's not us, it's them." It is, after all, probably the only logical thing to say.

In predictable but hilarious fashion, the "fake news" mantra almost immediately backfired on the establishment, as the same masses that elected Trump instantly (and appropriately) applied the label to reporting from the MSM, which had just been proven vibrantly "fake" by the election outcome that they refused to acknowledge as a possibility.

Approximately half of the American public is stubbornly conservative. It greatly irritates the broadcast media that it's no longer simple to control the narrative; their propaganda is now challenged and rejected by online voices. This makes the establishment very unhappy, so they've attempted to label all such challenges as "fake news".


> fear mongering by indicating that a Trump victory would be a dystopian nightmare, etc.

I do, on occasion, feel like I'm living in a dystopian nightmare. I'm certainly wondering about the ethics of having children, knowing that they'd live to see dramatic climate change and all the horrors of it.

Colbert's "truthiness" was eerily prescient, meant in satire, yet followed by Conway's "alternative facts" without irony.


>Colbert's "truthiness" was eerily prescient, meant in satire, yet followed by Conway's "alternative facts" without irony.

"Alternative facts" is an irrelevant quibble over wording. Everyone understood the intention. It's exactly these quibbles over technicalities that everyone thought would end Trump's campaign, but it doesn't work; if anything, it generates more sympathy for the speaker.

Conway was not saying there is no such thing as objective reality (a belief called "relativism", which is much more repugnant to the conservative side of the aisle than the liberal one). Rather, she meant "we have different sources". Come now, that's not really such a crazy thing to express, is it?

The MSM heartily strains at these gnats to try to promote their agenda (not surprisingly, it seems Democratic politicians rarely make such snafus). They are apparently unaware that all they're doing is spreading the very message these strategic gaffes were planted to spread.


> different sources

That's not how I understood it. I find the alternative facts and related screed to be more similar to wearing your team's colors than expressing an opinion.

Some years ago, back when Bush was in office, I tried to argue/discuss politics with someone who held quite different beliefs than mine. We went back and forth for a while as I tried to find some common ground. At one point, I realized he had asserted something that was false, according to several sources. I showed him the relevant Wikipedia article. He laughed and replied, "You got me, I lied." I suddenly realized he had no intention of finding anything to agree on, but simply wanted to "win."


Sorry to be daft, but I'm not really sure what you're getting at. Yes, one can choose to interpret the statements of politicians either charitably or uncharitably. Yes, some people are interested only in tribalism.

How is this related to demanding Conway's statement be interpreted as a credo of relativism instead of a) an unfortunate turn of phrase with accidentally-negative connotations; or b) a gaffe intentionally planted with the expectation that the media would seize upon it and spend 3 days nitpicking a technicality, unaware that they're only broadcasting the intended message to the open-minded listeners in the public, who will disregard the media's hostile spin?

Liberal use of (b) is exactly how and why Trump won. He beat the media at their game by allowing them to think that foaming at the mouth over technicalities of his ambiguous wording was doing him damage, when it was actually helping.


I think there are more ways to interpret Conway and Trump than those you've suggested. It's not relativism per se, but the belief that for some topics, being correct is less important than demonstrating loyalty.

Further, it's possible that Trump and team just say whatever comes to mind without planning. Sometimes life is just dumb luck.


The mainstream media is fake because enough people believed it to be so? Are you serious?

Fake news is made up; it has no evidence to support it and can often be proven false very easily, but it can be (and is) produced in such quantities that it can't be disproven fast enough.

You can believe the MSM is biased or trying to control the narrative but if you choose to live in your own reality where you can simply decide what is true without any corroborating information don't try to pretend that you are vindicated simply by the outcome of an election. Hitler was elected, does that make everything he said true because people believed it?

Would you allow this kind of reasoning in any other areas? Would you accept courts that worked this way, or business decisions made like this? What would happen if science and engineering worked on the same principles?

I would cite sources but I think that would be a waste of time for both of us.


>The mainstream media is fake because enough people believed it to be so? Are you serious?

Uh, no? I didn't say this.

Insofar as they were reporting on the people's will during the 2016 election, the reality they painted was disastrously inaccurate and it was colorfully and blatantly disproven on election night. It's hard to believe that honest people could've gotten it so wrong. The MSM lost huge amounts of credibility, and the "fake news" crusade is their attempt to salvage some by pointing the finger somewhere else.

When the subject of the report is the aggregate state of the people's will, then yes, the people's will, the real state of that variable, is the determining factor in its accuracy.

>Hitler was elected, does that make everything he said true because people believed it?

I have to congratulate you on a seriously hyperextended invocation of Godwin's Law.

The media is supposed to be a neutral entity that gives "just the facts" and allows the reader to make up his own mind. The veracity of any political argument is immaterial to the quality of the reporting.

The MSM all but guaranteed a Clinton victory on election night. They sold it hard. They thought there was no way Trump could win and dedicated much airtime and page real estate to discussing this. The NYTimes Election Map started out with Clinton at a 99% likelihood of victory -- leaving the marginal 1% chance only for plausible deniability.

The MSM was blatantly, absolutely wrong, and you cannot get around that no matter how often you try to shoehorn a long-dead dictator into the dialog. Now they believe they have the authority to label which news is "fake".

>Would you accept courts that worked this way or business decisions made like this?

Courts DO work like media should. An impartial forum gives both sides the opportunity to make their case, cross-examine witnesses, etc. etc. In the end, after a fair hearing, the jury makes their determination. The MSM plays the role of the "court" in this analogy, and it failed disastrously.

>What would happen if science and engineering worked on the same principles?

It'd work fine. If you're reporting on the will of the people, and the will of the people is not anywhere close to what your reports said, you should lose credibility.

Unfortunately, far too often science and engineering don't work this way, because people want to believe certain things instead of believing the true things that the data bears out. This is what we see when people try to insist that it's someone else's fault the MSM was ridiculously incorrect, and that we should allow them to dictate truth arbitrarily so that they're not bitter anymore.


So, with the msm thoroughly discredited for all eternity by the events of early November, should infowars be afforded a higher degree of trust than say, the nytimes, reuters, etc?


The thread is kind of off-topic since we're talking about the overarching propaganda narrative here, not cherry-picked publications. I'm not looking to assign a trust rating to anyone in particular. I'm sure every outlet has strengths and weaknesses.

The core issue is that Google's behavior indicates they no longer view themselves as a neutral, merit-driven internet index willing to accept the public's aggregate credibility sentiment. They now see themselves as a supervisor, an entity whose role is to ensure that the "wrong" things are not given publicity, despite the actual, reflected sentiment of Google users.

There are surely pros and cons to that, but it's a diversion from Google's original mission, and it's reasonable to be skeptical about it.


Google's stated mission is "to organize the world's information and make it universally accessible and useful." By definition, this would mean filtering out disinformation. Your interpretation of it as an "aggregate credibility index" is demonstrably antithetical to the mission of spreading information. If the entire planet placed credibility in "1+1=7", the Google calculator should not be rewritten to reflect global credibility scores.


PageRank counted the number of inbound links to a page, under the theory that the resources people were organically electing to point to were superior to others. This was and is the foundation of Google's search, and it's what allowed them to succeed against massive incumbents like Yahoo.

The concept that the general public can tell what's worthwhile is the fundamental component of Google's success (as well as the theory behind news aggregators like reddit and this very platform). The theory was that most people would intrinsically reject falsehoods like 1+1=7 and that they would intrinsically gravitate towards facts like 1+1=2. It was an investment in the wisdom of crowds, in democracy.
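The link-counting idea described above can be sketched as a toy power iteration. This is an illustrative simplification only; the graph, damping factor, and iteration count are invented for the example, and Google's production ranking is vastly more complex:

```python
# Toy PageRank: rank pages by inbound links, weighted by the rank of
# the linking pages (illustrative sketch, not Google's real system).

def pagerank(links, damping=0.85, iters=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}       # start with uniform rank
    for _ in range(iters):
        new = {p: (1 - damping) / n for p in pages}   # teleport share
        for p, outs in links.items():
            if not outs:                     # dangling page: spread evenly
                for q in pages:
                    new[q] += damping * rank[p] / n
            else:                            # split rank among out-links
                share = damping * rank[p] / len(outs)
                for q in outs:
                    new[q] += share
        rank = new
    return rank

# Hypothetical four-page web: "c" collects the most inbound weight.
graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}
ranks = pagerank(graph)
best = max(ranks, key=ranks.get)
```

The point of the sketch is the "wisdom of crowds" assumption the comment describes: rank flows toward whatever pages other pages choose to point at, with no editorial judgment about the pages' content.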

Perhaps this concept is no longer useful, but top-down editorialism is a departure from what most people perceive Google to be about, whether you agree that it's a good direction or not.


>Google's behavior indicates they no longer view themselves as a neutral, merit-driven internet index willing to accept the public's aggregate credibility sentiment.

Their algorithms could never be described as accepting the public's aggregate credibility sentiment. Only view and citation frequency.

If thousands of people mention/link to an article just to say "what a bunch of dreck, THIS publication is the problem with American discourse" Google would rank that article highly for whatever topic it addresses.

Google is attempting to push its algorithms roughly closer to a merit-driven index that accepts the public's aggregate credibility sentiment.

They have been pushing towards this end little by little for a long time. They already mostly solved the problem where medical questions were answered by ten pseudo-medical websites pushing deadly fake remedies and scamming the sick.

Those results come up much less often now. Look up any common disease-- the blue thing on the right was provided by the Mayo Clinic and a huge team of doctors verifying facts.

That was over a year ago I think.


> I'm not sure I want google making such editorial decisions.

Google has been making these decisions for you since the beginning. It isn't a purely fact-based search engine; it is built on algorithms that Google has chosen, and it is also shaped by the ad revenue Google is trying to generate.

Google will do what it takes to generate more money for itself.


I would like to see a similar tactic used in court to try to get out of a conviction.

Who is really to say what happened on the night in question? Who is to say the witnesses and police saw what they saw? What is the nature of reality and how can one really know anything?

In light of this, the defendant must be acquitted of all charges, because all information is equal and perception is everything; therefore my drunken rampage at the liquor store only happened in the minds of the supposed witnesses, while in my own perception I was at home reading a good book.


There's active research into false memories. Currently the assumption is that witness testimony is somewhere between completely useless and actively harmful to justice.

https://agora.stanford.edu/sjls/Issue%20One/fisher&tversky.h...

> Several studies have been conducted on human memory and on subjects’ propensity to remember erroneously events and details that did not occur. Elizabeth Loftus performed experiments in the mid-seventies demonstrating the effect of a third party’s introducing false facts into memory.4 Subjects were shown a slide of a car at an intersection with either a yield sign or a stop sign. Experimenters asked participants questions, falsely introducing the term "stop sign" into the question instead of referring to the yield sign participants had actually seen. Similarly, experimenters falsely substituted the term "yield sign" in questions directed to participants who had actually seen the stop sign slide. The results indicated that subjects remembered seeing the false image.

So ... yeah. Our memory is pretty useless as far as determining facts goes.


Again, what is the alternative, total chaos? Every system we have is best effort, all you can do is continue to try to improve your means of gathering information and making decisions.

As far as fake news goes, most of these real events are much more verifiable than someone's memory in court. In many cases there is footage, many witnesses (whose testimony can be compared to one another), documents, etc.


Keeping detailed records helps a lot in case of witnesses I think. Trials that don't last several years would be great too.

Maybe avoid questioning people more than once? I don't have a solution or even that much knowledge of how it's already done.

I'd say it's easy to point at cases where it obviously works well and where just witnesses without physical evidence are highly problematic. It's the cases in between that are hard.

Even Sherlock Holmes always gets a confession in the end. Without a confession the case isn't over. Ever noticed that?


This is exactly what already happens in court.

Both sides are given room to make their arguments, even if the argument is "We can't really like, know anything, on a metaphysical level, can we?"

If a jury finds this convincing, you win.

The judge and lawyers will actively work to discourage such a waste of time, but if you're accused of a criminal offense and you insist on presenting this argument in court, it is your right to do so.

In the U.S., we govern by the will of the people. The jury box is an important check on runaway government power (one which is, unfortunately, frequently diluted).


Well, if you don't want Google making decisions about what content is placed at the top... I'm not sure what you think Google does.


It does affect performance: it makes it faster.


The very first line:

> Google isn’t planning to rid fake news from its search results -- but it’s trying to purge it from the top.

They aren't removing anything, just being more picky when it comes to top news, which happens to be what things like the Assistant/Google Home use to reply to questions.

To me, that's probably why they care about this so much: if your kid asks Google Home "are dinosaurs real", replying "no" is a big, big failure. It's honestly better for it to say nothing than an outright lie.


>They aren't removing anything, just being more picky when it comes to top news, which happens to be what things like the Assistant/Google Home use to reply to questions.

It's also what they use for rich snippets, and that's really prone to bad choices. I'm unclear on what, if any, breakthrough they've had to fix this.

A US quarter is worth 50 cents: http://i.imgur.com/1nNreR2.png

Using thin affiliate sites (that just want referral click payments) as sources for good information on reverse mortgages: http://imgur.com/a/mpayp http://imgur.com/a/IZmmJ

Some others: http://imgur.com/a/6EDEP

I found these for another thread here, in the span of 10 minutes or so. They aren't rare outliers.


>To me, that's probably why they care about this so much: if your kid asks Google Home "are dinosaurs real", replying "no" is a big, big failure. It's honestly better for it to say nothing than an outright lie.

I'd rather not have Google make decisions like that for me.


So don't use Google. Just make up any reality/answers you want.


>Just make up any reality/answers you want.

What if I want my kids to understand that dinosaurs aren't real? Why should Google deflect from the truth? You're the one who suggested that.


My bad, didn't detect the sarcasm through the text. These days, you never know...


News is not ideas; news is events. Fake news is fake events: by definition, misinformation.


There's a big difference between "A bomb fell on Afghanistan" and "Trump authorized a bombing in Afghanistan" and "Trump dropped a bomb" and "Trump dropped the biggest baddest bomb he's got"

All are the same factual event reported without interpretation. Just different highlights.

Or for a more classic example:

*I* didn't kill them

I *didn't* kill them

I didn't *kill* them

I didn't kill *them*

Language is funny like that. You can say a lot without deviating from reporting factual things.


Yes, people who haven't thought about bias before have a hard time gathering the imagination to grasp the extent of the problem. It's not only how you report something, but what you choose to report, where you put your emphasis, how you headline or divide it, and so forth.

News reports will frequently spend 5 paragraphs blathering about their preferred position, include a diminutive and skeptical 1 paragraph section that includes a partial quote from someone who represents the other perspective, and then follow up with a 7th paragraph that goes back to the first "expert", whose credibility they spent the first 5 paragraphs establishing, to rebut or minimize the opposite side's argument. And that's what passes for "balanced" reporting.

This propaganda is no longer relevant and that's what's really pissing the old media off. Through the internet, we can crowdsource our news, and their influence is dropping off the cliff at record pace.


That's neither fake nor news. 1) You'd have to have a pretty loose definition of news to include that message as news. And 2) ad blockers obviously affect page content (by, you know, blocking it). I don't have the data to support it, but anecdotally I've definitely seen them affect performance via JavaScript issues that arise.


He didn't mean it was literally news.


In a thread that's mostly discussing how fake news is defined, I thought it relevant to comment on it. To me it came off as though he was calling it fake news and not being sarcastic. If that's not the case, so be it, but it wasn't obvious to me.


We should have a crowdsourced website that backs up claims with facts (and has evidence either way), and then journalists can write stories that would link their claims to it. Or the links be added automatically by visitors.


Google's algorithms already make decisions about what people see, though not based on editorial content. I don't consider it a real difference whatever scale they use; it will always be biased toward something.


I'm curious what liberal bias looks like in algorithm form?

> Google is also setting new rules encouraging its “raters” -- the 10,000-plus staff that assess search results -- to flag web pages that host hoaxes, conspiracy theories and what the company calls “low-quality” content.

Oh that's how, it's going to be trained by people...

Consider how often I've seen Snopes get stories wrong, or be subtly misleading (such as marking something as false because an insignificant part of a statement is incorrect; by that I mean that if it were deleted from the statement the overall meaning wouldn't change, and the general statement as a whole was still actually true).

This is really concerning. I get marking hoaxes and to a lesser extent conspiracy theories. But "low-quality content" and "fake news" in general is not just difficult to identify but open to abuse. And there will likely be no appeal process. It will happen transparently and silently.


I think there's an interesting question here that I haven't really seen discussed: Is it the responsibility of Google/Facebook to filter out "fake news"?

Obviously both of them already do some content filtering for illegal content etc. but where do we draw the line? Personally I think it's unfair to be blaming them, considering they never claimed to filter this type of content. Also, if it is decided that they should be responsible, how do you determine what is and isn't fake? Seems like a dangerous thing to be in charge of.

edit: since it took me so long to finish typing this, everyone is now asking similar questions. whatever


> Is it the responsibility of Google/Facebook to filter out "fake news"?

Their employees certainly think it is.

The reason Google/Facebook/Twitter/Reddit/etc. are so obsessed with fake news at the moment is that their employees believe their product, which was supposed to "change the world for the better!", inadvertently helped Trump get elected. I'm sure more than a few Valley employees have had nervous breakdowns over this.

Now, banning Trump from Facebook or Twitter would be a little bit too obvious, so they're doing the next best thing: making sure, under the guise of "keeping you safe from fake news", that Trump or Trump 2.0 can't leverage their offerings to help his cause.


This is the most likely answer I've heard on the motivations behind news filtering. It really exposes the ethical dilemma of taking a supposedly neutral, carrier-type service and using it to promote a political agenda, regardless of how righteous that agenda is.


I find it more likely that their advertising partners don't want the brands and companies they represent to be associated with fake, possibly libellous and politically inflammatory content.


Yeah they _just_ figured out that this kind of stuff exists on the Internet...


It _just_ started reaching the mainstream and causing serious controversies.


I don't know if it is their responsibility or not, but I don't think it will be effective.

I see the only effective fight against fake news, biased news, sensational news, or whatever you want to call it is education. And I lumped all those different types of news together not because I believe they are the same, or even believed true by the same groups of people, but because each of them poses a threat to our country and each can be treated by effective education.

Many people who know content is filtered by an institution will discredit any info coming from that institution (and for possibly good reasons). To get people to generate trust in our institutions and their reported info, they need to be the one to make the conscious choice to consume or not consume that info. Coddling the info before it gets to people doesn’t create an empowered or educated society.


>I see the only effective fight against fake news, biased news, sensational news, or whatever you want to call it is education.

Ah, but who will fix "fake education"?


At this moment, is there a general consensus or definition as to what "fake news" is, and how, and by whom, that is decided?


No. There's not even a good definition of what "fake" and "news" are as separate terms. Not trying to be contrarian here, but the "news" of the Bill Cosby rape allegations, as disseminated by a shaky bootleg YouTube video of an old Hannibal Buress routine, would have been considered "fake news" by many standards of newsworthiness: http://www.cbsnews.com/news/who-is-hannibal-buress-and-why-d...


Couldn't you just look up the actual cases in that instance? He is known for publicizing details that are a matter of public record.


I don't know how much of the information was easy to find at the time, other than the People Magazine (!) investigation in 2006: http://people.com/crime/bill-cosby-under-fire-peoples-origin...

I mean, obviously, it was Googleable, as Buress said in his routine. And yet the question isn't whether or not it is "fake" -- though clearly many people thought it was fake, or outrageous, which is why it was such a funny bit in Buress's routine. But there's also the question of whether it was news. Because it was a big deal, and then it dropped off the news cycle because nothing big came from it. And then Buress simply reminded people that the cases existed and then it blew up in such a way that it's hard to believe that Bill Cosby, just a few years ago, was pretty much a hero.

The rape claims were such old news that a highly senior CNN journalist wrote Cosby's biography and just left out the rape accusations because he "didn't want to print allegations that I couldn't confirm independently". The Buress incident came about the time that the biography was published, and the biography pretty much died on the shelves:

http://www.newsweek.com/cosby-biography-mark-whitaker-i-was-...

Hell, you could make the case that the famous Boston Globe Pulitzer-winning investigation [0] into the Catholic Church would have been deemed "fake news" at the time. The Globe itself covered accusations of priest abuse a decade earlier and the Church argued that such cases were horrific anomalies, and the Globe editor at the time apparently agreed that there wasn't a systemic scandal. It wasn't until the Globe got a brand new editor that a renewed focus was made on cases that victims' lawyers had revealed years prior.

[0] http://www.pulitzer.org/winners/boston-globe-1


General consensus? No, but it seems that nearly everything emanating from a conservative-leaning source is labeled as such. _Purely a coincidence_, I'm sure.

Who decides it? Well right now traditional media sources are leading the charge, and our benevolent Silicon Valley overlords are working feverishly to help out. Media and tech companies are both filled with people of a left-leaning persuasion. Again, pure coincidence I'm sure, and we all know these people are above injecting their own personal biases into protecting us from fake news, so we're in good hands.


Had the same sentiment in my post but got downvoted and flagged. Called out the thousands of globalists who will use Google's power to censor content they deem unfit for the public to see. This is a systemic problem in Silicon Valley.


Stuff that's from conservative-leaning sources can be accepted just fine, so long as it's anti-Trump. So for example really dubious claims from Louise Mensch, a conservative who thinks Russia is involved in literally everything, are fairly widely accepted. Likewise, people were quite happy to take Glenn Beck at his word as soon as he came out against Trump. It's bizarre.


> Likewise, people were quite happy to take Glenn Beck at his word as soon as he came out against Trump.

What's even funnier is to see people on r/politics say "you know I always thought Beck was an OK guy, just a little misunderstood" after he went nuts about Trump.


It's probably not that hard to detect. Suddenly a new ___domain appears out of nowhere, full of articles using specific language (absolutes, "shocking" terms, etc.) and linked to by various low-reputation IPs, botnets, etc., and you just Panda its SEO and go on with your life.

There are probably other factors here that make this even easier. These sites use fairly shady advertising networks, so that's another factor to tie in. They have English-language articles, but with grammatical mistakes a "real" news outlet wouldn't allow, due to being written by algorithms or ESL writers.

Fake news detection probably isn't as much about the content but of the methods used to spread it. That's my guess. Unless Google has some incredible AI, they'll just weigh it like they do with other sites abusing SEO. Seems like just a refinement of their panda system focused more on 'news' sites than link farms.

The 10,000 staff are just there to handle edge cases and help tweak the system. They're not going to read every Moldovan fake-news scammer outlet. In fact, considering most of this is autogenerated by the tens of thousands, they simply can't keep up.

I kind of see this like spam filtering for the web. Eventually it becomes economically feasible to generate and promote tens of thousands of fake whatevers (it doesn't have to be news), and Google is just trying to keep up. The larger political issues are moot; it doesn't matter what your political bias is. If your search engine keeps feeding you low-information or outright false crap most of the time, you'll think about moving to a different search engine.
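The spam-filter analogy above could be sketched as a toy rule-based scorer. Every signal, weight, and threshold here is invented purely for illustration (nothing about Google's real system is implied), combining the cues the comment mentions: ___domain age, sensational wording, and the reputation of inbound linkers.

```python
# Toy spam-style scorer over hypothetical signals: ___domain age,
# sensational headline wording, and inbound-link reputation.
# All weights and thresholds are made up for illustration.

SENSATIONAL = {"shocking", "unbelievable", "destroyed", "exposed"}

def suspicion_score(___domain_age_days, headline, inbound_rep_scores):
    score = 0.0
    if ___domain_age_days < 90:                 # brand-new ___domain
        score += 2.0
    # +1 for each sensational word in the headline
    for word in headline.lower().split():
        if word.strip("!?.:,") in SENSATIONAL:
            score += 1.0
    if inbound_rep_scores:                    # mostly low-reputation linkers
        avg_rep = sum(inbound_rep_scores) / len(inbound_rep_scores)
        if avg_rep < 0.3:
            score += 2.0
    return score

# A 10-day-old ___domain, a breathless headline, and shady linkers score high;
# an established ___domain with a plain headline scores zero.
flagged = suspicion_score(10, "SHOCKING: senate hearing erupts!", [0.1, 0.2])
clean = suspicion_score(1000, "local council approves budget", [0.9])
```

Real systems would learn such weights from labeled data rather than hand-code them, but the shape of the approach, scoring cheap-to-compute distribution signals instead of judging the content itself, matches the comment's point.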


The definition isn't difficult: the dissemination of news stories that are known by the author to be false, or something, would work well enough.

The problem is identifying the fake news, which isn't something people are ever going to fully agree on. You can use a reputational test, which works well for keeping out the Alex Jones-level stuff, but it's not going to keep out stuff like the mainstream media promulgating stories about al-Qaeda working with Saddam Hussein, or Iraq having chemical weapons.


There is a lot of hand-wringing about the exact definition that I think is mostly unnecessary. Yes, there are a fair few edge cases, but most 'fake news' is pretty easy to spot. Examples include the Pizzagate nonsense, the Bowling Green craziness, anti-vaccine insanity, race-war baiting, Holocaust denial, etc. Those last two probably make up the lion's share of easy-to-spot gibberish, but I'm not counting. Likely 90% of 'fake news' is that simple to catch. It's the last 10% that all the worrying is about, as well it should be. But don't think that the large majority of 'fake news' has any credence whatsoever.


It seems your definition of "fake news" is conspiracy theory. While I agree with you that most if not everything you listed is likely nonsense, it doesn't make them "fake news". There are fake news stories often surrounding these narratives, but the conspiracy theories themselves are not "fake news", as they are not news to begin with.


True, I do think that the endless stream of these conspiracy theories on scummy ad-bait websites, day in and out, is 'fake news'. Just like the chum boxes on the bottom of crummy articles peddling 'doctors' hate him' and the like are also fake news. To me, it is a broad umbrella. Trying to winnow down 'fake news' to: not conspiracy wackos, not ad-bait, not debates on he-said-she-tweeted, etc. is all not productive. To me, they are all 'fake news'. Still, as I said, I think ~90% of 'fake news' is incredibly obvious, and as these things are part of my definition, I think you can see why I believe it is so easy to spot.


Anything not backed by sources? And again, the article doesn't say they are "removing" fake news; Google's just being more picky with their "rich snippets". If something is controversial, then you probably don't want Google straight out saying "X is true" or "Y is true", unless you have strong evidence either way.

In the actual search results, I get to see a dozen links with probably varying answers, but for something like Google Home, which only reads the top one, I'd rather get no answer at all than a potentially wrong one.


Have you noticed how much US political coverage is based solely on anonymous sources, often ones whose political biases and level of knowledge about their claims are not disclosed to the reader?


Sure, and IMO those more complicated questions which don't have a clear answer shouldn't be top snippets. Snippets are things that you can explain in 1-2 sentences. Most of those political issues are far more complicated and require paragraphs to fully understand all the nuances and sides of the story. A simple quick reply from Google Home isn't gonna cut it, so they shouldn't even offer it.

That's the difference: when I search something on my PC I get 10 different results I can read through, vs. when I ask Google and get a single sentence back, or see a snippet at the top with a single sentence.


There is never a general consensus on pretty much anything. Today's news problem is less with verifiable facts and more with interpretations of facts. If I were a politician and someone snapped a picture of me reading Mein Kampf, and then I was in the general vicinity of some nutjob, most opinion hosts are going to forge ahead with a narrative that aligns with their ideology. Pretty much all of the popular shows on CNN, MSNBC, and FOX are filled with these kinds of idiot know-it-all blowhards.


"Fake news" is the last gasp of a dying media establishment who are trying with all their fervor to convince the public of their own necessity. The fact is that in an age where technology has enabled us to communicate instantaneously across the globe and watch events play out ourselves, news media is largely irrelevant.

We don't need an embedded reporter on the other side of the planet anymore, because the people there will be tweeting it out.

The last several major news events for which I had access to cable television immediately devolved into "watch news anchors read Twitter". This was true of the Dallas riots and the Turkey coup last summer. There were several minutes straight of "this is a tweet from someone nearby...", interrupted only by commercial breaks and commentary, not real reporting.

There are several YouTube channels that get more daily viewers than television channels. PewDiePie has over 50 million subscribers. The rising generation grew up accessing any content they wanted on-demand and does not understand a world of scheduled television programming. CNN is quickly becoming grandpa's way to get news.

The traditional news media has been disrupted and outmoded, and "fake news" is their temper tantrum, their attempt to stay relevant. The old broadcast media establishment has lost their influence, and they will not go quietly into the night.

It's sad and ironic that Google's political agenda is causing them to betray their users. We need aggressive market-equalizing reform.


There is quality journalism out there that goes above and beyond "some guy on twitter took a pic of what was happening". If you're married to the concept of outsider journalism, the citizen journalist website bellingcat is pretty good. A lot of well educated and traveled people have contributed. The quality of content is above a tweet.

However, the reason you see news anchors reading tweets is not because there isn't a better source of content than tweets, it's the idea that tweets are what people want to see. Viewer engagement in an interactive age. It's part of Twitter's strategy, as well.


Anything google considers far-right.


Which is just about everyone outside of the Silicon Valley, San Fran, and establishment bubble. Stifling free speech in this way will hopefully bring a cornucopia of antitrust lawsuits against Google.


Information the rich don't want you to see.


Gee, I wonder if the 10,000 "raters" will have a bias. I have no problem with such an idea in concept, but I know the process and what is being driven down will be as opaque as possible.

I wish they would change their search engine to how it worked years ago instead. The search results now are so packed with nonsense that I had to stop using them.


Are there any alternative search engines? I would rather decide what's fake.


I've been using DuckDuckGo [1] as my default search engine for a couple of years.

By adding !g to your DuckDuckGo query, you can easily search Google instead if you're not finding the desired results.

[1] https://duckduckgo.com

Edit: added actual url to DuckDuckGo
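For the curious, the !g bang is just a query rewrite before the request hits DDG. Here's a purely illustrative sketch in Python (the `ddg_url` helper is my own invention, but the `?q=` query-string format is DuckDuckGo's standard one):

```python
from urllib.parse import urlencode

def ddg_url(query, bang=None):
    """Build a DuckDuckGo search URL. Prepending a !bang
    (e.g. !g for Google) tells DDG to forward the search
    to that engine rather than answering it itself."""
    q = f"!{bang} {query}" if bang else query
    return "https://duckduckgo.com/?" + urlencode({"q": q})

print(ddg_url("fake news"))        # a plain DDG search
print(ddg_url("fake news", "g"))   # DDG redirects this one to Google
```

There are hundreds of other bangs (!w for Wikipedia, !gi for Google Images, etc.), all following the same pattern.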


Is there any way for DDG to confirm that it's not engaging in the same type of censorship?

IMO this is really the time for an Ethereum-esque solution to shine. These algos should be adaptable, confirmable, pluggable.


Google is playing a dangerous game. What else is their "powerful search ranking" burying? If I decide to compete with Google tomorrow, would they bury every site that mentions my business?


Yes. You wouldn't get far. Falling out of Google's favor is a disaster for any company that relies on "organic" search traffic. Many have died from this.

I don't think many people really appreciate the extent of influence Google has over individual behavior. Local businesses die when they get pushed out of the "local box". Platform vendors have a huge amount of control over the ecosystem.

We need to update our legal structures to make the internet available for open competition instead of locking it up in Google/FB server farms. Of course, there's no reason to do so as long as Google is playing ball with the elites.


Who decides what is and isn't fake news? Surely an enormous corporation whose leaders are active in only one political party will be a fair and impartial judge.


"Google’s PAC gave 56 percent of its 2016 contributions to Republicans and 44 percent to Democrats." [https://www.nytimes.com/2017/01/12/opinion/silicon-valley-ta...] Whatever the political opinions of leadership, Alphabet as a company is happy to support whoever is in power.


The separation of economic interests and social interests is relevant here. There can be economic alignment with both parties and social alignment with one party.

According to leaked email via WikiLeaks[1], Alphabet's Executive Chairman Eric Schmidt had an interest in advising in the Democratic presidential campaign:

> I met with Eric Schmidt tonight. As David reported, he's ready to fund, advise recruit talent, etc.

There's obvious alignment between improving the quality of news search results and Google's mission to organize the world's information. However, the timing of the initiative, together with the current political climate, suggests this is not an action made in isolation.

When do we begin to question this kind of action? When will it be too late to question it?

[1] https://wikileaks.org/podesta-emails/emailid/36828


I agree, but I draw a different conclusion: executives' social alignment is mostly about signaling; PAC spending tells the true story.


That's worth considering. I took the PAC as pure pragmatism, but I'll think about the converse. Thanks for suggesting.


With respect: after literal decades of the right wing using its stranglehold on talk radio to inculcate some awfully racist, awfully homophobic, and downright untrue things into members of my family (who I love very dearly despite it), I'm pretty okay with the shoe being on the other foot with regards to literal fictionmongers like Alex Jones on the Internet.

I'm sure, if The Market cares that much (and the right wing loves The Market), then there will be a thriving Conservoogle that will rise to provide unskewed results a-plenty where InfoWars and friends can cavort at the top of the SERPs.


> I'm pretty okay with the shoe being on the other foot with regards to literal fictionmongers like Alex Jones on the Internet.

The problem isn't that Infowars won't be ranked highly for news queries, the problem is exactly what GP said and what you glossed over: SV/academia people, who are overwhelmingly left-leaning (by ___location if nothing else), are not impartial arbiters of factuality.

And it's not like anyone is, but only the SV/academia types are getting input here.

That is a problem.

There is no way, none, that this won't be used against factual news with a conservative bent. I'd wager my entire net worth on it.

Eropple, please stop deleting your comments. You've done it twice now, and it makes having a conversation very difficult. Many of us are using a notification service and so your words remain in email inboxes.


Pointing and saying 'but the other guys did it' is not a good reasoning strategy.


Agreed; it's as if "fake news" were a matter of absolutes. Sarcastic parody writing should be buried but could be full of actual facts.

Poorly researched claims could surface if they come from a reputable source.


Who decides what is or isn't true? Only you.

Or I suppose you could trust science and other institutions, occasionally, with appropriate skepticism. If Google explains how they classify things as fake news and you believe them, to a first approximation, then it's reasonable to be happy with this feature.


I imagine they use the same or a similar group to the one that provided the recent fact-checking update to search results, run by the Sanford School of Public Policy.

https://reporterslab.org/fact-checking/


Maybe. But the problem here is that the epistemic closure that makes people fear the addressing of "fake news" (and, to their string-pullers' credit, they were really good at co-opting the phrase to de-legitimize actual news) has already inculcated into those it encloses a primal fear of academia and those very "public policy" scholars.

I don't know how you unwind that, though.


Until Google has a monopoly on disseminating truth, they are still a private corporation.

This means that they decide what is truth within Google properties.


This is a good point. My assumption is that Google's algorithms have less bias than Google's employees. To me this is better than allowing fake news to spread.


If it's from alex jones, it's fake news.


The problem with Infowars or any other fringe source is that they are right often enough. Sure, Obama didn't turn the frogs gay, but there were plenty of other factual stories covered in a factual way before being spun into insanity. But you could argue many MSM outlets also have very variable scruples, depending on the subject.

I.e., Infowars is generally poor quality, but not always false. I only read it for a giggle on occasion, FWIW.


> but there were plenty of other factual stories covered in a factual way

for example?

The reality is that even he has admitted he's just a "performance artist".


Also The Onion.


We seem to be witnessing the death of accuracy in reporting. And maybe a post-truth era on the internet.

"I am sure 3 million illegal aliens voted while global warming is a hoax! Obama paid off all the scientists." - I literally have friends on Facebook who say stuff like that every time I repost articles from HN about rising temperatures or CO2 concentrations.


Will they let fake news providers buy ads still?


Indeed. Buying ads would be an act of redemption, and will bring Legitimacy.


Really skeptical about the false positives/negatives. The ML they are using around "rich snippets" is pretty lousy. I can find lots of bad autogenerated answers in just a few minutes.

Or would this use some approach radically different from, and more accurate[1] than, what's driving rich snippets?

[1] well, as much as you can be accurate for subjective measures


Sometimes, to build immunity, you need to be exposed to the raw, unfinished stuff.

If you manicure everything you will lose the ability to build critical thinking and discern among various opinions.

From this angle, the "curation of fake news" looks very paternalistic, if not even a sign of patriarchy. :)


In light of this new Politico article: The media bubble is real. http://www.politico.com/magazine/story/2017/04/25/media-bubb...

I think this is a bad idea. "Fake news" is already a loaded term; combined with the media bubble, this will just make things exponentially worse.

We need to deal with the media-bubble bias first, to restore faith in real news.


Aren't ads the ultimate fake news? Is Google, the "Ad Company", now also only allowing modest and critical endorsements instead of ads?


If the plan is to remove misleading results, most of the advertisements need to be removed as well.


I wonder what else it will be burying, going by its recent fiasco of demonetizing legitimate YouTube channels that weren't "brand-friendly" (quite different from "extremist").


I don't think that view is popular here, even if it is supported [1]. You should only read and watch videos approved by the intersection of the DNC and Bernie.

1 - https://www.youtube.com/watch?v=RslP2HGBqWI


Sounds good to me.

The use of the adjective "powerful" stood out to me. If Microsoft did the same thing with Bing, would a different adjective be chosen? Would it even be news?


Yeah, it's a sloppy way of saying "market-dominating", ultimately an unnecessary, puff-piece-worthy adjective.


I thought it was unnecessary as well. I assume Bloomberg knows their audience and if that's not something their readers know I guess it makes sense to include it.


Microsoft Expands Its Search Rankings to Include Relevant Results?


They should just rebrand it as the ministry of truth already.


Google… I don't understand the praise of its search engine.

For example, when I search for my app's name, pirate sites come first, and when I search for functionality keywords, 2011 blog posts from apps that have since disappeared off the map appear first.

When I was writing my thesis, Google Scholar offered no help. Engineering Village did everything.

Google puts too much emphasis on blogs and other ephemeral content, which, as we all know, is often written in a careless manner, rarely maintained, or even just completely made up. Google doesn't understand that, and I don't think it's accidental.


Sounds like your SEO game needs some work.


1. It's not a game.

2. No it doesn't, I did everything.

3. It appears first result on the AppStore.


My thoughts exactly. Your search experience with them sounds like mine.


This whole "fake news" problem feels like a giant ruse. Their candidate (Hillary) lost, thanks in large part to the democratization of media on the internet and right-leaning online media outlets, and now they're trying to bury those same sources from the front page. The thousands of liberal/globalist engineers and business people feel so guilty about having helped get Donald Trump elected president that now they're actively censoring websites, all under the guise of "fake news". Sure, there are some blatantly fake news articles, but nothing I don't see on the magazine rack at the supermarket and laugh off. This is much more than that; this is active censorship by the left-wing globalists who control Google.


Hi - can you define what a 'globalist' engineer is please? How would I tell if I was one?


[flagged]


That sounds more like you mixed in humanism, which is not necessarily inherent to globalism. Globalism is usually tied almost explicitly to economics and trade. They are related, but distinct.



