Facebook Is Full of Emotional-Support Groups (theatlantic.com)
121 points by axiomdata316 on Nov 3, 2018 | 77 comments



I joined a group for T2 diabetes at some point because I have a stubborn friend who doesn’t want to receive proper treatment, and I basically wanted to ask about the available options.

The next thing I know, Facebook started showing me ads targeted at diabetics.

Having such an influential company know about your illnesses and sell that to advertisers or healthcare insurance agencies or credit agencies or god knows what else is frankly terrifying.


So I'm curious: have you tried removing details from your Facebook ad preferences?

On the desktop website go to: Settings | Ads | Your Interests. There is a huge ontology of various subjects that have been algorithmically assigned to you. If you remove these, I believe ads targeted to those interests should go away. I periodically just sweep them all clean so I have no interests. I get very few ads, and those that I do get are completely generic. I never see anything that would seem creepily aware of who I am.


You can actually find out why you are seeing a particular ad by clicking on the three dots in the upper right; there is a menu item called “why am I seeing this” which will tell you how the advertiser is targeting you.

It also gives you the ability to hide ads like that in the future.


It’s pretty good that they allow you to do that now, but it can’t retroactively remove those interests from the personal data they’ve already sold about you. So it fixes the smaller issue of what ads Facebook serves, but doesn’t address the larger issue of privacy.


FB doesn’t sell your data though, never has!

What they sell is the ability to target you based on it. But the data never leaves FB's servers!

So if you remove the tags from your settings, advertisers won’t be able to target you with them anymore


This isn’t the whole truth. Facebook doesn’t sell data, but Facebook’s Graph API combined with microtargeting ad groups allows third party applications that the user has authorized to gather and sell sensitive data to anyone they want. In many cases, the data does leave Facebook’s servers as soon as the action is taken.

You could argue this is the user’s own fault, but prior to Cambridge Analytica, there was very little public understanding of what “connecting” an app to Facebook really meant among the general population.

Also, just because Facebook doesn’t sell this data today does not mean they never will.
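As a rough illustration of the mechanism described above (the token and field list are hypothetical, and the modern Graph API has since locked most of this down — friend-data access was removed and sensitive fields now require app review), a connected third-party app holding a user access token would historically have built a profile request along these lines:

```python
from urllib.parse import urlencode

# Hypothetical sketch of an old-style Graph API profile request.
# The version, token, and fields are illustrative, not a working integration.
GRAPH_BASE = "https://graph.facebook.com/v2.0"

def build_profile_request(user_token: str, fields: list[str]) -> str:
    """Build the URL a connected app would fetch once the user authorized it."""
    query = urlencode({"fields": ",".join(fields), "access_token": user_token})
    return f"{GRAPH_BASE}/me?{query}"

url = build_profile_request("HYPOTHETICAL_TOKEN", ["id", "name", "likes"])
print(url)
```

Once the response lands on the third party's servers, nothing technical stops them from storing or reselling it — which is exactly the point: the data does leave Facebook's servers the moment an authorized app fetches it.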


Sure, third parties can request access to the graph API by asking the user for permission.

But Facebook isn't selling that access! Access to the APIs is free.

Now, users just giving any random app access is a whole different can of worms. You have the same issue on Android and iOS with permission dialogs: people worry about Amazon Alexa recording everything they say, but will give any app microphone/camera access that can do just the same if the app creator is ill-intentioned.


Showing you ads for diabetics is not the same thing as selling your information to those advertisers.

The advertisers don't get a list of the people that see their ads.


But the reverse is true. You can upload a list of contacts through your Facebook business manager.


Similar to my experience. I left a high demand religion, and suddenly I'm bombarded with ads to join said religion everywhere because of Facebook's reach. It's disturbing!


That is absolutely terrifying. As far as I know, FB does not allow you to target users by the groups they join (page targeting is allowed).

Which means it is likely that the Facebook algorithm is doing that without even being prompted by the advertiser.

Did you visit a lot of links related to T2 diabetes? Another possibility is that the Facebook pixel could have helped the website target you again.


Facebook doesn’t allow you to target groups directly; however, it allows you to target interests. Interests are auto-generated based on how you use Facebook (and that includes groups).


I support online groups; however, I'd prefer staying anonymous, which simply isn't possible on FB without faking your account.

Reddit provides similar functionality and lets you stay anon. There are subreddits for pretty much any situation: alcohol abuse, drug abuse, kids growing up with narcissistic parents, spouses living in sexless relationships, etc. I did some research into the topic after reading this: https://www.washingtonpost.com/news/the-intersect/wp/2016/01...

Just Google your specific situation and Reddit will probably have a support sub.


Reddit is usually a crapshoot of quality.

Maybe someone will have something for Y Combinator (et al) soon, that's more of a targeted platform for these types of things. Reddit and Facebook both suck at this.


If there isn't a community on Reddit for something, what are the odds people are going to flock to a new platform? The quality issue is because these groups are generally peer-organized and some groups have better organizers. But like, what is technology going to do to make a better Reddit for people with depression, transgender people, people whose partners have cheated on them and people with incredibly rare medical conditions? Those are disparate groups that only have the common thread of needing a support group as far as I can tell.


If a platform is promoted by physician groups or by insurers, it may take off. That would be far better than some of the utter crap on Reddit. A lot is happening in the medical space, with companies like Oscar and appointment scheduling apps. This is just waiting to spring open.


Although not nearly as specific or targeted, Stack Exchange runs a number of sites (interpersonal relationships, parenting, workplace) that usually have much higher quality questions and answers due to their strong moderation. I agree that there’s a void to be filled between Reddit and SE in terms of amount (and quality) of moderation, but both options are much less intrusive than Facebook and allow you to remain anonymous.


From what I found when I was reading up on it, the different support subs seemed to be really good at modding. What are your reasons for claiming this?


Looking at companies like Reddit and Facebook in general: do you really want that vast amount of targeted info in their hands? They can't be trusted. I'd trust a company more adept with HIPAA that has valid PII measures in place.


I'd never seen beyond the login screen of Facebook until a site I run started getting sporadic bursts of traffic from there, so I ventured in to try and find the links. I was stunned and devastated to discover how many thriving little communities there were in this niche area (local archeology). It's everything the open web was supposed to be, but it has been walled off, completely invisible to outsiders, all for the benefit of one corporation and its advertising customers.


Economies of scale, I guess? Easier to host there than to have dedicated hobbyists support the technology needs.


Ning showed some promise in being able to provide that. It took off like a rocket initially. Maybe they benefited from launching in 2005, prior to Facebook reaching any meaningful saturation (consuming all the oxygen in the segment).

When they figured out the financial side of it wasn't going to work (having raised $119m in VC), they essentially destroyed the old service in 2010, suspended the free offering, and fired half the company. I've always wondered if the original concept of Ning could have in fact succeeded, if they hadn't built the company using the typical boom or die, drive the venture capital into the wall model.

I think there's still something big there, if someone can figure out how to do it very, very inexpensively and keep it open. Minimize the commercial abuse, and build in strong spam and trust controls.


This is exactly how I feel. The same thing happened in my area of interest (historic transport).

Facebook provided an easier interface for users adding content and an easier admin system than old-school forums. Everybody drank the kool-aid, but we now have islands of content which are impossible to search and keep track of (unless you dutifully tune in each day).

The whole thing has made me disillusioned about contributing to any sort of online discussion; I certainly have no interest in becoming one of Facebook's unpaid income generators.


I run a facebook breakup support group for men.

About half of the people that request to join the group discover the group themselves on facebook.

For me as a website owner (rapidbreakuprecovery.com) the group has been a convenient way to offer a forum-like functionality without any of the overhead involved with running forum software.

I suspect facebook users have come to expect that these types of groups exist and they're typing things such as "breakup" in the group search field.

There are other breakup support groups that are routinely recommended in the sidebar, as well as a support group specifically for divorced dads. So it seems that Facebook's scale is allowing smaller niche communities to exist where they would not have existed before.

From a usability perspective it's not great. It's easy to set up and it's easy to get members, but I would happily switch to something more discreet that's a better fit for support group dynamics (interesting idea by sodosopa).


If any Facebook employees are reading this: there's a really annoying "feature" that shows a list of suggested friends to add to a private group. I've added people to private groups just by accidentally clicking a single time while scrolling.

Imagine being in an AA group or HIV support group and adding your coworker to it by accident.

Please change this.


It's disappointing to see stuff like this. It means that the decision makers don't dogfood their own features.


Or they do dogfood, but the (flawed) engagement numbers support their next promo case.


Zuckerberg has been joining groups recently so I am sure he'll fix it if he sees it as an issue.


Or that their tiny bubble doesn’t know enough about the real world to be careful with things like this.


Yes. See also: Google


Google Buzz was a much worse disaster for similar reasons.

https://money.cnn.com/galleries/2010/technology/1012/gallery...


Playing devil's advocate, I've found the feature useful for meme pages and university groups. However, I think they should add another tap to the process as it's too easy to send accidental invites on mobile.


Alternative: keep it one tap, but hold it for a couple minutes while an "undo" option is visible. Possibly also a "confirm" option for those in a hurry.
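A minimal sketch of that undo-window idea (all names here are hypothetical, and a real server would use a scheduled job rather than an in-process timer, but the shape is the same: the invite is held for a grace period, during which "undo" cancels it and "confirm" fires it early):

```python
import threading

class DelayedInvite:
    """Hold a group invite for a grace period; the user can undo or confirm early."""

    def __init__(self, send_fn, delay_seconds: float = 120.0):
        self._send = send_fn
        self._fired = False
        self._lock = threading.Lock()
        # The invite only goes out if the timer survives the grace period.
        self._timer = threading.Timer(delay_seconds, self._fire)
        self._timer.start()

    def _fire(self):
        with self._lock:
            if not self._fired:
                self._fired = True
                self._send()

    def undo(self):
        """Cancel the invite if it has not gone out yet."""
        self._timer.cancel()

    def confirm(self):
        """The 'in a hurry' path: send immediately instead of waiting."""
        self._timer.cancel()
        self._fire()

# Changed your mind within the window: nothing is sent.
sent = []
invite = DelayedInvite(lambda: sent.append("alice"), delay_seconds=120)
invite.undo()
```

The lock guards against the edge case where the timer expires at the same instant the user taps confirm, so the invite can never be sent twice.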


This seems like a simple and effective solution. A one-to-five-minute delay on a group invite does not seem like a terrible inconvenience.


Nothing is on Facebook by pure chance. This feature is there because it was designed to be.


Perhaps it even tells you something about your coworkers ...


The whole Facebook friendship stuff is a disaster.


It's really easy to complain about this, but it's very hard to actually come up with a solution.

Do you have any suggestions for how cases like this could be detected?


> it's very hard to actually come up with a solution

Just not have "suggested friends"? I've never gotten anything out of that feature, and I imagine it provides little actual value to anyone since if someone was actually a friend or even an acquaintance, they'd have that person in mind when adding friends on Facebook.


Or even have an ability for an admin of a group to turn off "suggested friends" for that group alone. Ah, but that's not just a new boolean field in the model, a database migration, GraphQL dependency, and an if statement in the template... but also a committee meeting to clear the text used in the admin's interface, translations to all languages used by Facebook (you wouldn't want Facebook to be seen as causing problems for only non-English-speaking HIV support groups, right?), additional overhead to track, potentially stepping on the toes of the design team that maintains the mockups for that admin interface. It would need to be someone's passion project, and even that might not be enough. And perhaps the people at Facebook who know the problem firsthand from being in AA or HIV support groups don't want to draw attention to why they're justifying this feature, for obvious reasons.

This is why we can't have nice things.


I’ve used that feature a few times. It’s very helpful when trying to grow a new group


Just jumping in here, but if detecting accidental triggering of irreversible actions is a challenge, maybe the overall approach is problematic. You could, though, temporarily introduce a confirmation step and see what % of users choose to cancel rather than confirm their invitations or something like that. The chance of a random tap on a face that Facebook (not the user) chose to put on a screen seems to guarantee there will be some accidental invitations no matter what, since accidental taps happen.


Maybe make it a two-step process to add people to a group. Sure, show the list of friends to suggest, but make the "adding friends" page separate, so if you accidentally click, you're still a step away from adding someone.


Or better: allow people to mark groups as more private (let's call it "sensitive"), and only then require a two-step process to add people, plus possibly other changes.


Product people are going to hate that friction; just add an undo-invite button.


That would work until another product person decides to eliminate the friction of "accept invitation" and adds an auto-accept invitation feature.


Dialog: "Are you sure you would like to invite x?" - This solution has existed for a very long time. Not difficult.


While we're at it, we can add a checkbox labeled "Don't show this dialog again".


> It's really easy to complain about this, but it's very hard to actually come up with a solution.

I don't want to be rude, but I'm really not sure what you mean by this. "Don't show suggested friends, at least for private groups" seems simple, obvious, and effective.


An option in the group settings that says "show friend suggestions in the group".

Done.


A confirmation screen after you click "suggest friend?"

Delay the action for five minutes and add an "undo" button for the five minutes.


Not the poster's job. Facebook is a business. If they want a solution from the poster, pay for it.


Possibly FB’s algo thinks they are alcoholics or have HIV and would like the group?

Anyway, it should be easy to block that element client-side.


“For example, according to Lewis, the algorithm might keep showing a post whose question has been answered. And it might deprioritize posts from new members that don’t get much engagement—ensuring they get even less engagement in a form of algorithmic ghosting.”

Essentially, this is the flaw with Facebook's entire system of selecting posts. When I recently posted on Facebook to my 2000 friends, the only people to react were my mother and aunt. I am actually not that much of a loser, but you'd never know it from my Facebook page.

FB has made Instagram the same way, and I've spent days looking at posts my friends made four days ago while not seeing their newest posts (this is especially annoying at conventions). I have a hard time believing that Facebook actually pays hundreds of engineers above-market salaries to realize this sad customer abuse. It's a serious waste of the potential of the Internet.

As for the rest of the article: yes, Facebook has taken over the independent web forum. And yes, their software is inadequate (the usual total lack of features and confusing UI), but their reach is incomparable.


The secondary effect of this is that some people post less to fb. I hate the feeling of seeing a post I made get near-zero engagement, even from close friends, because some algorithm decided it wasn't worthy. The result is I post less, and many of my friends post less, if at all. The result is FB absolutely stinks at keeping me connected to those I care about.


How would you adjust the algorithm? Maybe surface new posts with some degree of stochasticity, like what HN does? Sorting by Hot on Reddit feels pretty similar to FB, and sorting strictly chronologically doesn't scale (just sort by New on HN or Reddit - most posts just aren't very interesting to most people).
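For what it's worth, the stochastic idea can be sketched with the commonly cited HN-style decay formula plus a noise term (the gravity constant, jitter value, and field names here are illustrative, not HN's or FB's actual code):

```python
import random

def decayed_score(points: int, age_hours: float, gravity: float = 1.8) -> float:
    # Classic HN-style time decay: engagement divided by a power of age,
    # so newer posts with some engagement rank highest.
    return (points - 1) / ((age_hours + 2) ** gravity)

def ranked_feed(posts, jitter: float = 0.3, rng=None):
    """Rank posts by decayed score with multiplicative noise, so
    low-engagement posts occasionally surface instead of being
    permanently buried ('algorithmic ghosting')."""
    rng = rng or random.Random()
    def noisy(post):
        return decayed_score(post["points"], post["age_hours"]) * rng.uniform(1 - jitter, 1 + jitter)
    return sorted(posts, key=noisy, reverse=True)

posts = [
    {"id": "popular", "points": 100, "age_hours": 1},
    {"id": "ignored", "points": 1, "age_hours": 1},
]
feed = ranked_feed(posts, rng=random.Random(0))
```

The jitter bounds how far the ordering can deviate from pure score, which is the knob a platform could tune: enough randomness to give fresh posts a chance at engagement, not so much that the feed feels arbitrary.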


It’s complex and difficult to say since the algorithm is secret. I could list flaws or ways I think it should work differently.

I think it’s been tainted by FB’s business models. I will check back when I have more time to write.


I feel like having to use your real name and identity and hope that Facebook doesn't fuck up is a really bad requirement for secret support groups.


Facebook is one of the biggest sites in the world. You could replace 'emotional support' with anything and it'd be full of those groups.


Sure, but then what would "journalists" at theatlantic have to write about?

Journalism today is all about writing nonsense and throwing Facebook, Google, or Trump into the headline for views.


Anecdote, but it sure seems to me that the need for emotional support groups has skyrocketed within my lifetime. I’m sure all the members of these groups are legitimate (I’m not making a “snowflake” argument) but what in the hell is going on?


My theory - too much information, out of sequence, without a good guide present to clear up misunderstandings.

I teach part time. A lot of the anxiety and depression I see in kids seems to be thanks to this constant info overload on the net. Reddit/Youtube/FB all contribute in different ways.

14-year-olds busy reading about the problems 35-year-olds are facing on the net is like dropping off your second grader in a tenth-grade classroom and asking them what they think about the problems on the board. The natural reaction is to feel overwhelmed. Every day they get hit with some other issue someone is depressed or outraged about.

The best solution I have heard is to allow internet access, not on a device but in group settings (classroom/living room, etc.). This way, when they do get hit by stuff above their pay grade, there is someone around to process it with.


Maybe the need didn't skyrocket but the demand did. People used to not have channels to read about their problems, so they were isolated and not talking about them. Now it's easier to get informed anonymously, create support groups, and join them without much social stigma.

But maybe the need did skyrocket; I don't have data to support either hypothesis.


Has the need really skyrocketed, though? Or has the use of them just become more normalized (and perhaps more convenient thanks to the internet), so fewer people feel the need to just "suck it up" and deal with their problems alone?


Yeah, my stereotype of the distant past which I didn't live in - say, up to the 1980's - is that it was considered uncool to talk about your problems to strangers in a constructive way.


The rate of death by suicide has been going down since the 40s.[0] I think it is a huge mistake to interpret rising opportunities for emotional support as some mental issue epidemic. People have had it awful since the dawn of time, and I see it as a very positive sign that we are talking about it openly now.

[0]https://academic.oup.com/ije/article/39/6/1464/736597


My guess is that not only is life much more complex nowadays, but we are continually showered by polished profiles on Instagram et al showing lives that our own do not compare to. This creates feelings of inferiority and missing out, and eventually depression and other emotional problems. Add to that the general pattern toward a society of increasing inequality, and it's possible to see why, in this hypercompetitive context, so many people are struggling to maintain a baseline of self-esteem.


I’d imagine the hollowing-out of local IRL communities is a major contributor. Fifty years ago you would have gotten companionship and (what at the time passed for) emotional support from your church, or the local rotary club, or similar. We’ve witnessed an enormous hollowing-out of local support networks over the last decade or two for a wide variety of reasons; it makes sense that people would seek an alternative wherever they can.


It used to cost a lot to set up a support group, and they were limited to things that were fairly common (i.e. alcohol) because they had to be geographically local.

When the price of communication crashed as people got on the internet, and the internet then became ubiquitous and cheaper, the cost curve shifted: setting up a support group is now limited to a few hours and a Facebook connection. Basic supply and demand would say we should get many more, which is exactly what we got.

In consequence we also get support groups for minor issues, or support groups for issues that are not so common.

I consider both a good thing.

Unfortunately we also got support groups for bad things: pro-ana (and thinspo), and debatable things (suicide pacts).


The Internet. The need hasn't increased; there just isn't any friction anymore in setting them up or joining/participating. There are also no geographic restrictions on participation.

In meatspace, if you want to set up a support group, you'd have to rent a space, participants would have to physically drive there, and your hours would have to coincide with theirs. It's not really feasible for many support groups to form in this environment, especially outside major metros.

With the rise of the Internet, you don't have those restrictions, thus people can reach each other, so they do more reaching out to each other.

Possibly internet support has taken the place of friends and family support though.


I think the need has always existed, it just hasn't been available. The internet has made it much easier to seek out like-minded strangers, and it's become much more socially acceptable to do so, for men especially.


Maybe people just want a group to hang out with, and emotional support group is a way to get that. As to why they are increasing? Because the competition is weakening; other groups are declining.


My guess is that churches used to take that role for a large part.

Now, with the internet, it is much easier to find wider-scale and more specific groups, so people are turning to these instead.


Maybe I'm just too paranoid, but first thing that popped into my head is there are a lot of predators out there looking for weak prey. Maybe I need a support group to help with that :)


There are. Predators use domestic violence, sexual abuse and other abuse-related online support groups to groom vulnerable people. Strict moderation is absolutely necessary in these communities.


This is actually a problem in AA, it's called "13th Stepping."


Secrets are a pre-21st century idea.



