Blue Feed, Red Feed (wsj.com)
91 points by some-guy on May 19, 2016 | 55 comments



They try hard in the FAQ to explain what they built, but I'm sure most people will not actually understand what this is. This is not a Facebook feed; it is not what left- or right-leaning users actually see. These are stories pulled with Facebook's API and then filtered by the WSJ to show only sources that an earlier study had ranked as the most consistently right- or left-leaning sources that people liked. In other words, they are just showing the most slanted news sources they could find.
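For the curious, the filtering they describe boils down to something like this. A minimal sketch in Python (not the WSJ's actual code), with hypothetical source names and alignment scores:

    # Hypothetical alignment scores in the spirit of the study the WSJ
    # cites: -1.0 = consistently liberal, +1.0 = consistently conservative.
    alignment = {"occupy_democrats": -0.9, "the_blaze": 0.9,
                 "ap_news": 0.0, "breitbart": 0.8}

    def partisan_feed(posts, side, threshold=0.7):
        """Keep only posts whose source is strongly aligned with `side`.
        `posts` are dicts with a 'source' key, already pulled from the API."""
        sign = 1 if side == "red" else -1
        return [p for p in posts
                if sign * alignment.get(p["source"], 0.0) >= threshold]

    posts = [{"source": "breitbart", "title": "..."},
             {"source": "ap_news", "title": "..."}]
    print(partisan_feed(posts, "red"))  # only the Breitbart post survives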

Talk about fanning the flames. When I first started reading, I assumed they had created a bunch of dummy accounts that would like various posts based on political viewpoint, and that they were showing the actual feeds Facebook pushed to those accounts. That is not at all what is happening here.


That definitely would have been much more interesting.

Generate a few accounts, consistently cause the algorithm to think you are Red vs. Blue (via Likes, profile tweaks, posts from specific sources like Mother Jones vs. Instapundit/Drudge), and then see how they start to diverge.

Reminds me of the Like-Everything experiment Wired tried a while back. Much more interesting to see how it slowly moved toward what Facebook thought was the local maximum. (http://www.wired.com/2014/08/i-liked-everything-i-saw-on-fac...)

Unfortunately, without knowledge of Facebook's algorithms, it might be a bit difficult. For example, I'm sure a decent chunk depends on your social network and what they are posting and that's much harder to simulate.

EDIT: I still find this illuminating. I am subscribed to two feeds, one Far Left and one Far Right, and it's always fascinating to look at them and see how little they overlap. They pull from completely different sources, and they focus on and nitpick the craziest things about each other. Even when they cover the exact same story, they contextualize it completely differently with insinuating comments and bylines.

So, it's true these aren't organic, real Facebook feeds, and as a result this doesn't get to the heart of the criticism that Facebook generates echo chambers. However, it is a good exercise in showing how this might look and how jarring it can be to compare them.


Everyone seems to be misinterpreting what the WSJ did (which is probably what the WSJ intended). There's absolutely nothing here to be worried or concerned about.

This is the key line:

> These aren't intended to resemble actual individual news feeds.

The WSJ specifically filtered each feed to only include conservative or liberal sources. Facebook could observe a full equal-time rule in each user's feed, thereby constantly exposing users to a variety of viewpoints, and this so-called study would have given the exact same results.


So the WSJ specifically crafted feeds with left- and right-leaning sources to illustrate what, exactly? That political differences exist? I don't think Facebook has much influence over the company you keep in meatspace. Your real-life circle has far greater influence on your political leanings.


I definitely agree that your real-life circle has a large influence, but I can't dismiss how much the articles that people see affect how they talk about things.

Especially with the older generation. Anecdotally, I have seen my mother form very extreme opinions on things simply because "she saw an article on Facebook" and didn't fact-check anything in the article. It was impossible for me to convince her that the article was wrong.


I have engaged in discussions where I post, for instance, actual scientific studies that refute a shared article. (The specific debate was about nitrites in hot dogs.)

You didn't have to read more than the synopsis to see that the article was thoroughly refuted.

The study had roughly zero effect on the beliefs of the person who shared the article. In effect, they already believed something (hot dogs are going to kill their baby), they found an article that agreed with them, and there would be no changing their mind thereafter.

I may just engage with a subset of people who are prone to be like that, but a large portion of my Facebook feed is similar. They just believe there's some puppeteer pulling the strings on everything. We can cure cancer but 'they' don't want to. We can create infinite free solar energy but 'they' don't want us to have it. "They" orchestrated the entrance into wars, "they're" hiding aliens, etc. Some of it is clichéd conspiracy theory, but some of it leans towards "we're in the matrix" levels of conspiracy.

I come to HN to keep grounded. I love it here because of rational discussion & debate among people who seek factual truths.


> The study had roughly zero effect on the beliefs of the person who shared the article.

There's something called the Backfire Effect, wherein presenting facts and evidence actually reinforces people's positions and can make them believe even more strongly in the very thing you're proving incorrect.

[1] http://rationalwiki.org/wiki/Backfire_effect
[2] https://youarenotsosmart.com/2011/06/10/the-backfire-effect/
[3] http://bigthink.com/think-tank/the-backfire-effect-why-facts...


FWIW, there is a phenomenon called "Epistemic Learned Helplessness": most people don't change their minds when given evidence, and this isn't necessarily irrational: http://squid314.livejournal.com/350090.html?thread=3855754

I mean, your example is pretty extreme; I'm not going to say that was rational. But in general the average person doesn't have enough scientific knowledge to dispute a scientific study. In fact, it's actually quite easy to make a case for almost anything by cherry-picking studies.

So people have to rely on taking the consensus of others in their social group, or authorities they trust. And it means they don't actually have an opinion themselves, so you can't argue against them. They are just trusting the opinions of someone else.

This actually isn't a bad heuristic in general, but can lead to crazy beliefs like that. I think we all do this to some degree. I believe in global warming, but I couldn't possibly dispute any arguments against it that you could give me. I'm not going to change my mind though.


A little OT, but as part of its public apology/explanation, the FB news team uploaded a PDF of the RSS feeds that they say they pick from to find stories to include in the News Feed (to supplement "organic" trending stories). They don't say how often they do this, but I imagine the RSS feeds they chose to pay attention to (around 800+) have to be a partial reflection of how they do day-to-day curation of the organically trending stories.

I parsed the PDF and did a scrape of the feeds in this repo: https://github.com/dannguyen/facebook-trending-rss-fetcher

(but didn't do the work of extracting items from each individual XML)
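If you want to poke at the feeds yourself, a minimal sketch using feedparser would look something like this (not the repo's actual code; "feeds.txt" is a hypothetical one-URL-per-line file extracted from the PDF):

    import feedparser  # pip install feedparser

    # Read one feed URL per line from the extracted list.
    with open("feeds.txt") as f:
        urls = [line.strip() for line in f if line.strip()]

    # Pull the current items from each feed.
    for url in urls:
        parsed = feedparser.parse(url)
        for entry in parsed.entries[:5]:  # first few items per feed
            print(entry.get("title", "(no title)"), "-", entry.get("link", ""))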


Wow, what an awesome experiment!

I know that FB shows me articles that their feed algorithm determines to be aligned with things I've clicked on before (so that I'll click again); however, it's pretty shocking to see how much my clicks can affect what sort of articles I see.

It's almost as if every click digs me further and further into a specific bucket that shapes what I see and that I can't really climb out of.


> It's almost as if every click digs me further and further into a specific bucket that shapes what I see and that I can't really climb out of.

As in life, it's important to actively seek out information you disagree with. That's how you learn and broaden your horizons. It's the only way.


That keeps being repeated along with the "filter bubble" trope now and then. I'm not sure I buy it.

Most of the stuff out of my "filter bubble" is just incredibly low quality garbage, whether it agrees or disagrees with what I think.

I can't see how actively going out of my way to read articles about how Kim Kardashian is an interesting intellectual adds much to my horizons.

There's only so much information we can consume. There seems to be this myth that looking now and then at the opposite viewpoint from what you believe frees you from the constraints of bounded rationality. It doesn't, and I haven't seen very conclusive evidence that it helps in any meaningful way.


> Most of the stuff out of my "filter bubble" is just incredibly low quality garbage, whether it agrees or disagrees with what I think.

The thing is... it's entirely probable that most of the people who agree with you do so because of low quality garbage. So you're very much comparing yourself to a large group of other people and concluding that you read things above their level - not very hard. You need to find people on your level on "the other side" and figure out what they're paying attention to.


> The thing is... it's entirely probable that most of the people who agree with you do so because of low quality garbage.

So?

> So you're very much comparing yourself to a large group of other people and concluding that you read things above their level - not very hard.

Not sure I understand. I'm not comparing myself to anybody. I'm simply stating that I can't find a reason to go out of my way to read garbage.

Your last point is absolutely correct, and it probably summarizes why I believe the whole "filter bubble" thing is irrelevant or at least vastly overstated: the criteria that I use when deciding whether to read or not read something are entirely orthogonal to what "side" it comes from.


The first criterion you apply is whether you come across the material in the first place. Most likely, that is subject to a filter bubble. I really doubt your bubble consists of all the good information available.

Your fallacy is assuming that there is nothing outside your bubble that isn't garbage.


I don't think that's a very fair reading of what I'm saying.

First, whether something enters my filter bubble or not is not the same thing as the criteria I use to look for new things, or to decide whether something stays within my filter bubble (say, a website I would add to or remove from my RSS feeds). There's a passive/active dichotomy here.

Second, I don't think I've claimed there was nothing outside my bubble that wasn't garbage. Now, I do assume that 99% of the stuff outside of it is, because that's the nature of the SNR on the internet. Note that all of this is of course entirely subjective, and I'm only talking about the value I derive from it personally.

Certainly, there are things that would be interesting that at a given time live outside of my filter bubble. But actively looking for them is not necessarily the best way to find them, as I might have to expend a lot of effort going through lots of junk before I get there, and other sources within my bubble might percolate that information faster, with less effort from me (say a link post on SlateStarCodex will give me a bunch of links to stuff I would have never found by myself).

Finally, I never said my bubble consists of all good information available, precisely because, as I stated, all that information could not fit in anybody's bubble due to the limited nature of our attention and available time. So that's just a straw man.

The fact is that there's a game being played between what's inside my filter bubble and what's outside, and that game is not zero sum: most likely, actively trying to add more information from other sources will decrease the utility I get from it. It could be a temporary/local minimum (going through a bunch of garbage to find some hidden gem), but it could also be a more stable, lower utility state.


Personally, my bubble tells me untruths about communities and belief systems I'm not part of - I suspect that yours likely does, too, because it seems to be human nature to distill one's "opponents'" beliefs into something easy to attack. Even when the beliefs are abhorrent, and I'm likely to continue seeing them as abhorrent, it's useful to know what people are actually saying, instead of what my bubble says they're saying.

And the only way I can find to do that consistently is to actively seek out sources of information on specific subjects from the "opposing" point of view. The articles my filter bubble sends me are generally going to be emotionally fueled rubbish, because it takes time, effort, and a very good understanding of the subject matter to take apart an argument otherwise, and for most people, their time on the Internet is time when they don't want to be spending mental energy.


How can you click on links for opposing viewpoints if they don't even display them?


Something simulated annealing and other algorithms for escaping local maxima do is always inject some noise; annealing calls this the temperature. So Facebook could ensure that 5-10% of your feed comes from a "region" around where it thinks you stand, then see what you click on.

If you only Like Left, then you keep drifting. But (I assume) they will always include some stuff to the Right of where they think you are as an option. It won't be Far Right, but it will be in that direction.

Obviously, this is a lot of speculation about how Facebook does its feed optimization, but I expect they provide opportunities to get out of your local maximum.
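In code, that temperature knob could look something like the following. A minimal sketch with made-up affinity scores, not anything we actually know about Facebook's ranker:

    import math
    import random

    # Hypothetical affinity scores: how closely each story matches the
    # user's inferred leaning (higher = nearer to where the ranker
    # thinks the user stands).
    affinity = {"far_left_story": 2.0, "left_story": 1.6,
                "center_story": 0.8, "right_story": 0.1}

    def sample_feed(affinity, temperature=0.5, k=3):
        """Softmax-sample k stories without replacement. As temperature
        approaches 0 this becomes greedy (a pure filter bubble); raising
        it mixes in more stories from outside the inferred region."""
        pool = dict(affinity)
        feed = []
        for _ in range(min(k, len(pool))):
            items = list(pool)
            weights = [math.exp(pool[s] / temperature) for s in items]
            pick = random.choices(items, weights=weights)[0]
            feed.append(pick)
            del pool[pick]
        return feed

    print(sample_feed(affinity))  # usually left-heavy, occasionally not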


Google search does the same thing. (Which is one of the reasons to use DuckDuckGo, even for your Googling).


Why is that a good reason to use DuckDuckGo?

The fact that when I Google "python" it gives me exclusively results about programming and nothing about snakes is a feature, not a bug.


Because Google will personalize your results to maximize your CTRs, which are based on previous interests and likes.

DDG doesn't do this, and when you use it to query Google search, it doesn't let Google do this either.


You get stuck in a filter bubble, which may work to either your advantage (Pythonic results), or your disadvantage (you'll miss results that go against your opinions, or you'll get feedback which distorts the value of something). I first saw this when Dave Winer noted that his "Fargo" program had reached the first page of search results for "Fargo". Which it had: For him. Not for anyone else.


I don't typically look to Google for opinions. For controversial issues, I make a point of including diverse viewpoints in my consumption habits (actively subscribing to conservative news sources on Facebook is surprisingly effective, for example).


I can't see how this is a bad thing. I love that Google saves me time and finds more relevant search results for me.


That may be true, but this article doesn't show that. All they did was take articles from sources that are liked by conservatives or liberals on Facebook. They didn't show that Facebook's news feed actually surfaces these stories more than other stories.


The internet filter bubble: there was a book released about this problem back in 2011 (Eli Pariser's The Filter Bubble).


This is fantastic. It's especially interesting to see how differently the exact same stories are reported on each side of the political spectrum.

Although (this coming from a huge Bernie supporter), it seems like both sides are in agreement that they should be attacking Bernie Sanders.

As an aside, the format of "And now you won't believe THIS happened" as a headline/tagline with almost no additional info to get more clicks is absolutely infuriating. I've always made it a point to not click on headlines/links written in that format.


I also clicked on Sanders, and the feel of the articles wasn't all that much different.

It seems the pro-Hillary left is fairly well aligned with the anti-Bernie right.

I have no idea if this updates to display different articles throughout the day; it might have just been my small sample.


I hate how, in the feeds, most of the title or first sentence of each article is cut off, so even when they didn't style it as clickbait, you can't really tell what the stories are about.


When I checked Bernie I saw a decent mix of articles from both sides, which suggested to me another interesting dimension to look into.


I'm not really sure viewpoints like "Pope Francis Likens Jesus to ISIS, says Muslims Must Breed with Europeans" are what I need to be exposed to to have a broader view of the world but this is a cool toy.


I found both sides of the feed repulsive. They are all biased and misrepresenting facts, just towards different ends. It's sad that these stories appear in anyone's feed, let alone that they get so many clicks and shares.


That's true, although this one stuck out to me as particularly egregious, if not outright offensive.

Also, there's a lot of internecine quarrelling in the blue feed about Clinton and Sanders which might be an interesting angle to examine further.


What is fascinating, and horrifying, is that some portion of each feed's audience will simply accept the story after reading nothing but the headline.


I worry about the echo chamber effect. I have friends who are both conservative and liberal, but what I increasingly notice is that each group seems to mostly be friends with their own "type"; i.e. my conservative friends are for the most part only friends with other conservatives, and my liberal friends are for the most part friends with other liberals (there is always the possibility that my friends represent an anomalous sample, but I think that's unlikely. I generally avoid expressing my own political views to someone unless I am extremely close to them, so perhaps this is why I haven't alienated half of my friends yet...)

I think that only showing people news that agrees with their way of thinking leads to the dangerous situation where you end up with a positive feedback cycle of groups self-confirming their own beliefs, and the other side — "them" — is viewed as a deranged bunch incapable of rational thought. While increasing groupthink may lead to higher advertising revenues, it also leads to interpersonal polarization and higher levels of animosity.

There was an interesting NY Times article that came out recently (http://www.nytimes.com/2016/05/08/opinion/sunday/a-confessio...) about how liberals and libertarians are far overrepresented in academia. While that in itself didn't surprise me, the comments on the article did. Here are a few of the NYT picks:

> "It's not that conservatives aren't bright; it's that, for the most part, they are narrow-minded and are sure they have the right answers."

> "Conservatives are entitled to their views, but they also must understand that young adults pay thousands of dollars to obtain a college education in order to learn facts, not fiction."

> "Is intolerance necessarily a bad thing?"

> "No scientist holds groundless beliefs to prove some kind of subservience to a deity."

What is interesting is that I frequently hear conservatives making the same comments — just swap out a few of the words:

> "It's not that liberals aren't bright; it's that, for the most part, they are narrow-minded and are sure they have the right answers."

> "Liberals are entitled to their views, but they also must understand that young adults pay thousands of dollars to obtain a college education in order to learn facts, not fiction."

> "Is intolerance necessarily a bad thing?"

> "How could any intelligent person possibly believe the universe came from nothing?"

What I generally find is that the further left or right someone leans, the more similar their personality becomes to their right/left-wing counterpart. In my experience, the "extreme" people are much more similar to each other than they are to the people in the middle of the political spectrum. I would argue that — born into a different family — many of the zealots would hold their opposing view just as strongly.

One oddity that I haven't yet found an explanation for is that conservatives seem to be less prevalent online than liberals. I'm not quite sure why this is the case.


"One oddity that I haven't yet found an explanation for is that conservatives seem to be less prevalent online than liberals. I'm not quite sure why this is the case."

Your filter bubble, probably. Telling whether that's true in an absolute sense would be difficult.

(Bear in mind that I believe it is not possible to "not have" a filter bubble, so no offense is intended in my first sentence; everyone has a bubble, the only question is the nature of it, not whether it exists, and it is not sensible to want to not be in one, only to ask how you might change it.)


I come from a very conservative circle, and I find that outside of my specific facebook feed (where that group is), the internet at large is a very left-leaning place.

HN is the most neutral place I can think of. I think it might just hide it well, as politics aren't a large topic here. I'm just glad to see that even political conversations here are largely rooted in facts.


HN is really cool in that many people have an idea of what public choice theory is.

There is an old maxim - "all organizations that are not designed expressly to be conservative get more and more left-leaning over time" or the like.


Mr Bug is a philosopher for our times.

I see his ideas spreading everywhere, directly and in derivative form.

I can seriously see him being studied in academia in the 22nd century. Him and Satoshi and a handful of other radicals in our midst who should not be named. The Net is like a flower unfolding, and the people who intellectually influence its beginning are sure to have some of their number become rockstars much later on.


HN is very libertarian, more than neutral - it tends to avoid most discussion of Government and social responsibility simply by saying that neither should exist in any large capacity.


Interestingly, I read an article today (found it on my Facebook feed, posted by a Facebook employee) that pointed out the exact same thing but from the other side's perspective, regarding the conservative meeting at Facebook the other day: http://www.glennbeck.com/2016/05/19/what-disturbed-glenn-abo...


I just read that! It was a very interesting read.


>>One oddity that I haven't yet found an explanation for is that conservatives seem to be less prevalent online than liberals. I'm not quite sure why this is the case.

I live in Texas, and I can tell you that my internet experience is definitely not like that. In my neck of the internet, there are far more conservatives.


>One oddity that I haven't yet found an explanation for is that conservatives seem to be less prevalent online than liberals. I'm not quite sure why this is the case.

Demographics. Rural areas with worse internet access skew conservative. Older folks who can't understand the internet skew conservative.


Being a zealot from one side of the horseshoe or the other is not a reason to think their ideas are incorrect.

Evidently, commonly held ideas today, like democracy, social welfare, or multinational corporations not explicitly backed by the mothership, would have been outrageous daydreams, utter folly to believe in, back in the 16th century.

The problem with centrist positions is that they seem so reasonable because so many of them are reasonable. That makes us too comfortable.


>conservatives seem to be less prevalent online than liberals.

I wonder what metrics you might use to measure that? The number of posters on the largest 100 "message boards" that slant one direction vs. the other? Or is there an independent measure of political leaning, so you could track how many clicks they make on Yahoo? The number of YouTube views for certain videos? Facebook likes of certain stuff? Something else?


> conservatives seem to be less prevalent online than liberals

Expressing conservative views tends to be a social liability among the groups that tend to be active internet users.


This kind of thing has been visible in social media for quite some time now. For example, in Adamic and Glance's “The political blogosphere and the 2004 US Election”, the authors study a network where blogs are connected based on hyperlinks. I made a visualization of their dataset here:

http://ryancompton.net/2014/10/22/stochastic-block-model-bas...
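The core measurement is easy to reproduce on toy data. A minimal sketch with made-up blog names (the real Adamic-Glance dataset works the same way, just with on the order of a thousand nodes):

    import networkx as nx

    # Blogs as nodes (labelled by camp), hyperlinks as directed edges;
    # the telling statistic is how many links stay within a camp
    # versus cross over to the other one.
    camp = {"blog_a": "left", "blog_b": "left",
            "blog_c": "right", "blog_d": "right"}

    G = nx.DiGraph()
    G.add_edges_from([("blog_a", "blog_b"), ("blog_b", "blog_a"),
                      ("blog_c", "blog_d"), ("blog_d", "blog_c"),
                      ("blog_a", "blog_c")])  # one rare cross-camp link

    within = sum(camp[u] == camp[v] for u, v in G.edges())
    across = G.number_of_edges() - within
    print(f"within-camp links: {within}, cross-camp links: {across}")  # 4 vs. 1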


I often see both, maybe because I click on both the "red" and "blue" topics in the "TRENDING" feed.


Takeaway: the far right and the far left are both nutso.


There is a saying that the political spectrum is actually a circle.


Huh, all I'm seeing is yellow journalism.


Note that Ghostery blocks the posts. You might need to whitelist the site. YMMV.


As does uBlock Origin.





