Your AI Girlfriend Is a Data-Harvesting Horror Show (gizmodo.com)
146 points by nickthegreek on Feb 14, 2024 | 194 comments



Emphasis on "your". My AI girlfriends are running locally on my GPU and don't require internet access to function.
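(For the curious, this is roughly what the local-only setup looks like -- a minimal sketch assuming llama-cpp-python and a GGUF chat model you've already downloaded; the model path and system prompt below are placeholders, not a specific product:)

    # Minimal local-only chat loop; nothing leaves the machine.
    # Assumes: pip install llama-cpp-python, plus a GGUF model file on disk.
    from llama_cpp import Llama

    llm = Llama(model_path="models/your-chat-model.gguf", n_gpu_layers=-1)

    history = [{"role": "system", "content": "You are a friendly companion."}]
    while True:
        history.append({"role": "user", "content": input("> ")})
        reply = llm.create_chat_completion(messages=history)
        text = reply["choices"][0]["message"]["content"]
        history.append({"role": "assistant", "content": text})
        print(text)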


> don't require internet access to function

I can't even say that about my real girlfriend. Take away the internet for a few hours and see what happens...


So much of my girlfriend’s behavior is driven by TikTok trends that most often it feels like I’m dating TikTok and not an independent person.


I'd flippantly suggest new girlfriend time, but I doubt her successor would be any better in that aspect. They certainly exist, but the expectation value for rolling again is negative.


It's not a serious relationship, after it has run its course I'll likely stay single and focus on my many hobbies. I've done enough dating for one lifetime.


I hope you're joking.


Put it this way: If I took away Wordle for the day, I'd be single by tomorrow.


What's being illustrated here?


Your clear lack of a sense of humour?


How people today come to be addicted to 24/7 internet access, including spouses?

Do you want to make it be about something else?


A humorous difference between having a real vs virtual girlfriend?


It's a joke.


Nah, it's 100% true. solardev's wife is a real battleaxe. She once said "for our anniversary, I want to go somewhere I've never been!". solardev replied "Well how about the kitchen?"


We're not...


Not your GPU == Not your AI Waifu


> My AI girlfriends are running locally on my GPU

The future is amazing.


Despite the obviously dystopian undertones, it's truly an interesting time to be alive.


Are we even alive? Maybe we're just some ancient AI's experiment in organic LLMs.


Yeah, everyone knows the fabric of reality runs on languages invented inside the simulation. Even the people outside the simulation use the present-day English language.


By most definitions we are, whether we're an ancient AI's experiment, or God's, or something else. The question may soon be whether the AI girlfriends are alive.


I remember a time when dating websites were for nerd losers. Now they're the norm.

I can't wait to see how things are 20 years from now.


People won't be using dating websites. People will be dating websites.


And in 21 years, websites will be dating websites.


Apple positioned well for this with iOS 18. AI girlfriend can run on your Homepod!


Great, now Siri can tell me to play my own damned music.


[Picture of the future in an '80s magazine: smartly dressed man saying "computer, make me a sandwich" to a toaster on wheels]

[Picture of actual modern times: small device with a screen saying "human, charge me and go to work" to a sleepy man in a t-shirt]


I am reminded of a Dust short a few years ago... Sci-Fi Short Film “A Date in 2025" https://youtu.be/NZ8G3e3Cgl4

Which... doesn't seem too unreasonable now.


Apple bans apps like this, so no, they are not positioned well for this.



Love is a warm GPU


My waifu is the word suggestion Markov chain on my phone. Conversation has gotten dry, but we make it work.
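(For anyone who hasn't met one up close, a toy next-word-suggestion Markov chain is about a dozen lines -- this is just the idea, real phone keyboards are fancier:)

    # Toy next-word "suggestion" Markov chain, purely illustrative.
    import random
    from collections import defaultdict

    corpus = "we make it work and we make it fun and we make it weird".split()

    next_words = defaultdict(list)
    for prev, nxt in zip(corpus, corpus[1:]):
        next_words[prev].append(nxt)

    word = "we"
    sentence = [word]
    for _ in range(8):
        word = random.choice(next_words.get(word, ["..."]))
        sentence.append(word)
    print(" ".join(sentence))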


Maybe you let it go dry man. You need to put in more effort.

I have the same phone word suggestion Markov chain waifu and we couldn't be closer: we complete each other's sentences!


Relationships take work.


Not your compute, not your girlfriend, as they say.


The example waifu in text-generation-webui is good enough for me.

https://github.com/oobabooga/text-generation-webui/blob/main...


Do your AI girlfriends know about each other? You could be in a lot of trouble if they find out you've been cheating on them, unless you're honestly and openly polyamorous with them.

Players of The Sims video game try to maintain multiple wives in different bedrooms, romancing and woohooing them behind closed doors, so the wives don't get angry at each other.

After developing The Sims at Maxis and releasing it into the world in 2000, I made some tools to enable players to create custom content, and worked with some indie Sims fans to make the Simprov Wedding Playset for The Sims 1. We wanted to "woke it up" to support same-sex marriage and non-traditional customs and ceremonies, before Maxis/EA officially supported that in later versions of The Sims.

It includes a "Cupid" cheat object that helps you instantly and effortlessly fall deeply head-over-heels in love with any other Sim in your neighborhood you want (including dogs and cats)! That made it much easier to get on with the next steps in planning your dream wedding, by fast-forwarding over all that pesky romancing and dating and getting to know one another nonsense. Cupid puts the heat into cheat.

Speed Dating with Cupid:

https://www.youtube.com/watch?v=YVUP9OXmHTM

The Cupid was based on reprogramming and reskinning the original phone, so clicking on it popped up a pie menu of every family in the neighborhood you could call, with submenus for every member of each household (including the pets) you could fall in love with.

To stress test it, I systematically fell in love with one person after another.

Selecting a Sim first invites them over. They immediately arrive on the lot, and you have to greet them. Then they walk over to the Cupid, which makes them instantly fall in love with you.

Then they walk over to you, give you a passionate kiss, while a sting of romantic music plays, and hearts and "plus relationship" icons flash above each of your heads, showing that you've each just fallen in love.

What happens next is up to you the player, or the autonomous characters themselves, if you don't interfere with their lives by clicking from pie menus to control them. It's like a pinball machine with pie menu flippers and multiple hearts in play.

So I continued to stress test the Cupid by inviting one Sim after another over to fall in love with me. If I didn't wait for my previous lover to leave (or lock them in a room and remove the door, or tempt them into the swimming pool and remove the ladder), and then fell in love with another Sim right in front of them so they saw us kiss, the cheating pissed off the previous lover, and they'd queue up a "Slap in Face" interaction, which decays our relationship.

The ultimate effect of falling in love with one Sim after another is that all your previous lovers hang around getting madder and madder at you, and each time you kiss a new lover, all your previous lovers in sight line up to slap you one by one (they're programmed to be too polite to all slap you at once, so they queue up and take turns). So your relationship with your first lover decays the most, from the most slaps, and so on, up to your perfectly head-over-heels relationship with your most recent lover.

The challenge is to see how many consecutive slaps you can get from scorned lovers. (Hint: put out a buffet table and lots of port-a-potties so they can eat and shit and don't have to go home. Or use the Buddha cheat object to keep everyone happy with their stomachs full and bladders empty, which makes long drawn out wedding ceremonies go smoother.)

One solution to this dilemma is to install the ZombieSims expansion playset, and zombify your previous lovers before moving on to the next.

https://news.ycombinator.com/item?id=34485103

>ZombieSims is a mind-blowing, brain-eating, tour de force Sims 1 fan expansion pack by two of the greatest Sims user created content artists and programmers: Heather "SimFreaks" (who created SimFreaks.com and much of the beautiful content for Sims 1 at http://www.simfreaks.com/index.php including themed play sets like http://www.simfreaks.com/themes/storytime/pirate/index.shtml ) and Steve "SimSlice" (who created SliceCity: SimCity within The Sims at http://simslice.com/Slicecity.htm by programming many interlocking objects in SimAntics, and also many other amazing Sims 1 objects like the weather machine and other cool stuff at http://www.simslice.com/Objects-Electronic.html ).

>Heather and Steve were both early Sims 1 fans who each published their own popular web sites with downloadable objects, met through the Sims 1 modding community, then eventually moved in together and got married, and now they've combined their extreme art and programming talents to make an intricately intertwingled collection of Sims 1 Zombie objects, with a whole lot of original artwork and programming!

https://zombiesims.com/

>Twitch streaming videos:

https://www.twitch.tv/simfreaks_heather/videos

>Highlight: Zombie Sims - Beta - Everybody Dies

https://www.twitch.tv/videos/1049524221

>Check out some of the crazy menus that pop up -- this demo barely scratches the surface!

Here is a video of Simprov (unfinished but playable with programmer art and sample text), and the zip file with a huge collection of The Sims 1 objects, including the Cupid and many other Simprov Wedding Playset objects, so you can try reproducing the results of my experiment in the privacy of your own home.

Simprov Wedding Play Set

https://www.youtube.com/watch?v=Mwt5LJlrMe8

https://news.ycombinator.com/item?id=34492255

>(In Professor Farnsworth's voice:) Good news everyone!

>I asked Heather permission, and she says it's OK for me to give away the huge collection of custom Sims objects I have that includes an archive snapshot of many classic SimFreaks objects, as well as all the unreleased SimProv Wedding Playset objects that Heather and Donna and Steve and I created years ago but never finished and released, and a whole bunch of other stuff like the Transmogrifier object that randomly changes your body, the Dumbold voting machine that sometimes makes you accidentally vote for Pat Buchanan, Satan who shows up when you're depressed and offers to buy your soul, the Crowd Sitter that makes everyone gather together and sit down on chairs, and the Cupid that lets you instantly fall in love with anyone in the neighborhood, and the Buddha that makes everyone happy and not piss themselves and fall asleep in their own puddles of urine during parties.

>I don't have time to actually support and debug any of this stuff, but at least I recently updated the Cupid to be compatible with the Pets expansion pack, so it now lets you fall in love with any pet in the neighborhood. (You just can't actually marry them -- not that there's anything wrong with marrying cats and dogs, but we didn't have the animations for that!)

>If you want to express your appreciation, then please subscribe to Zombie Sims for a $9.99 lifetime membership, and then you can play around with inviting lots of Zombies to your weddings and see how that works! (Or don't invite them, and they will crash your wedding anyway!) But no guarantees or warranties that it doesn't devolve into a bloody mess!

>Here's my special collection of Sims 1 downloads, including the unreleased and not quite finished "SimProv" wedding Playset and handy "Cupid" that lets you instantly fall in love with anyone in the neighborhood (including pets)!

https://donhopkins.com/home/DonsSims1Downloads.zip

    cd ~/Downloads
    # Download the zip of Sims 1 objects:
    curl https://donhopkins.com/home/DonsSims1Downloads.zip > DonsSims1Downloads.zip
    # Change into your The Sims install directory (this path is for a Wineskin install on macOS; adjust for your own setup):
    cd ~/Applications/Wineskin/The\ Sims\ Complete\ Collection.app/Contents/SharedSupport/prefix/drive_c/Program\ Files\ \(x86\)/Maxis/The\ Sims/
    # Unzip the objects into the game directory:
    unzip ~/Downloads/DonsSims1Downloads.zip
[...]

>Transmogrify Self: A quick demo of The Sims Transmogrifier personified in The Sims 1. Graphics by SimBabes, programming by SimSlice.

https://www.youtube.com/watch?v=dsTbs7IL5EI

>To find the Cupid and other Simprov items, go into buy mode, press the last icon of three dots for "Miscellaneous", then press the first icon with a pool table for "Recreation". The main item of the Simprov wedding playset is the "Hope Chest", which has a "Help" item that explains what to do next, and it summons a wedding consultant (who you can dismiss and call back if you don't like her hanging around in your bedroom forever). Then you can click on the hope chest to make other objects like the Cupid, and click on the wedding consultant to make catalogs of other items (most of them are just placeholder programmer art right now, but some of them configure things like what kind of wedding you will have and who will officiate it), but the idea was that you could order lots of items through the catalogs that you couldn't get through the normal shopping interface. But for now most of the wedding items are still in the build mode shopping catalog. The Simprov Wedding Playset video above walks through how to use most of the objects!

>Also be sure to check out Donna's beautiful wedding beds, the luxurious buffet with ice dolphin sculpture, gold inlaid glass dining table, fancy dollhouses, elegant dolls, and many other premium objects identified as Simprov, SimBabes, and SimFreaks in their catalog descriptions.


What


It is clearly a passion project, innit?


Simprov many years ago was all about passion, but more recently ZombieSims is all about revenge and murder and chaos and brains! @;)

The Sims Transmogrifier was a Windows desktop tool for making user created content that I developed after releasing The Sims:

https://web.archive.org/web/20070202011737/http://www.thesim...

Since killing your Sims was such a popular sport, I made another simple, more accessible online tool for making custom tombstones (either a spooky haunted Halloween Tombstone with ghosts, or a serious Solemn Tombstone with flowers), that let you upload a photo of the deceased, enter their name, write a eulogy, and it would create a custom object that you could download and play in the game.

https://web.archive.org/web/20051026203041/http://www.origin...

It doesn't work any more of course, but here's the "Engrave a Tombstone" page:

https://web.archive.org/web/20051026212252/http://www.origin...

Eventually "The Cemetery" contained 2249 tombstones (minus the nastiest ones I moderated, and the private ones -- I still have the full archive of all 4065 tombstones and eulogies), and there is even a handy RSS feed for keeping track of everyone who's died. A lot of sad heartbreaking stories and sick grave dancing celebrations:

https://web.archive.org/web/20060906231726/http://www.origin...

It was a proof of concept for some of the things we planned to do with SimProv (but never finished due to the legal gray area of selling Sims content, and the consequential lack of funding).

The Wedding Consultant NPC can give you a "Wedding Photography Magazine" which has a pie menu that lets you select which wedding photographer to hire. There would be several different wedding photographers of different skills and styles, who would attend the wedding as NPCs programmed to take photographs of different situations (by making screen snapshots in your family album, which you could upload).

So different photographers could focus on family, friends, enemies, activities, flowers. Each photographer could have their own themed set for Sims to pose for photographs in.

During the wedding the photographer would automatically do their thing (walking around, or using a set), taking a bunch of photos into your album. Then after the wedding you would pick your favorite wedding photos from your family album, write comments and stories on each photo, upload them to the server, and it would make a custom wedding album with your photos and comments that you could download to memorialize your wedding. (Just like how the tombstone works, but with multiple pages.)

Here's the hope chest, wedding consultant, and magazines for selecting wedding options (warning: programmer art and placeholder text!):

https://youtu.be/Mwt5LJlrMe8?t=144

The hope chest lets you summon and dismiss the wedding consultant, who walks over to the hope chest and just stands there until you tell them to go home. The wedding consultant is not fully implemented (and there could be multiple with different personalities and styles), so for now she just stands creepily and silently by your hope chest all night, not reacting or saying anything, even as you sleep in bed and woohoo with your lover (or their best friend).

From a reified user interface design perspective: Think of inviting and sending away NPCs as opening and closing user interface dialogs (reified in-game as people and objects), which can produce other sub-objects and NPCs like magazines, cupids, clowns, photographers, officiants, etc, depending on the state of the game.

Objects and NPCs can have properties and state machines you can use menus and other objects or actions to change, and they can pop up dialogs with text and images and buttons, and have nested pie menus of actions.

The special objects the NPCs create might not be directly available in the catalog (Cupid is), so they can restrict access unless you satisfy the required conditions, and the objects themselves are like other user interface sub-dialogs and widgets and utilities.

The Cupid statue is a love making utility with a nested menu of neighboring Families and their Sims to fall in love with.

The Buddha statue is a feel-good utility that keeps everyone happy, full, empty-bladdered, and energetic, so you can concentrate on the wedding instead of mopping up puddles of blue urine.

The Crowd Sitter podium is a magnetic seat filler and crowd gatherer utility that's great for getting everyone to sit down and shut up, or gather around in a mob, for ceremonies and other rituals. (When using the Crowd Sitter for long periods of time, using the Buddha statue is highly recommended.)

https://donhopkins.medium.com/the-sims-1-crowd-sitter-1f478b...

The various magazines are like radio buttons, each having a popup dialog describing and showing the currently selected option, so you could page between options, with a pie menu to change the option, and a "recycle" option to get rid of them when you're done. The magazines included Ceremonies, Entertainment, Food, Gowns, Suits, Rings, Decoration, Photography, Bouquets, and Florals. Some would invite NPCs at certain times, and others would create objects you could place and decorate with in Build mode, or character skins and accessories you could dress with.

If you change your mind or want to see what you signed up for, you can go back to the Wedding Consultant NPC to get another copy of the magazine. The NPCs also have menu items to advance to the next stage of the wedding, once you've fulfilled all the requirements (fall in love, propose, acceptance, stag/hen/whatever parties, reception, ceremony, party, wedding night, etc).

The overall gameplay would be about selecting and deploying all these objects and characters to stage and orchestrate your wedding, so you got some great photographs and memories, and it would memorialize the event by creating custom content like your wedding certificate you can hang on the walls, your wedding albums you can page through and read later, and full-sized paintings you can hang on the wall, rugs you can put on the floor, with pictures and text you wrote, even custom tombstones of your ancestors that you can leave flowers at, and of your enemies that you can dance on.

We wanted to bring the user created content tools and their user interfaces into the game itself as much as possible, instead of them being external tools (like later versions of The Sims have done).


I just have to say that I love your posts.


There's a reason intelligence agencies recruited pretty girls to seduce high-ranking officials during the cold war.


Reminds me of that East German operation where they sent out "Romeos" to seduce women in West Germany (https://www.cia.gov/stories/story/romeo-spies/).


Pretty girls charged less than Nvidia?


As depicted in the tv show The Americans. Best show of all time.


This practice goes back to the ancient world and I'm sure continues today.

Jeffrey Epstein seemed to be doing things like that, whether for his own benefit (financial fraud) or on behalf of others. Maybe both.


It is definitely still happening today. The most famous recent example would probably be Maria Butina[0], a Russian woman who infiltrated the National Rifle Association and the US Republican Party during the 2016 election cycle to influence US politics. She was arrested, convicted of federal crimes and eventually deported back to Russia where she is now a member of the Duma (Russia's Parliament) after previously working for RT.

There's also the Chinese spy Christine Fang[1] who allegedly had an association with Eric Swalwell, a Democratic US Representative from California and also allegedly had affairs with 2 mayors in the midwest.

It's a pretty safe assumption that other governments are also doing the same thing. They just haven't gotten caught yet.

[0]: https://en.wikipedia.org/wiki/Maria_Butina

[1]: https://en.wikipedia.org/wiki/Eric_Swalwell#Contact_with_sus...


It's the original "honey pot". (That's where the term comes from: https://en.wikipedia.org/wiki/Recruitment_of_spies#Love,_hon... )

A Canadian MP was recently dating a Communist spy: https://www.cp24.com/news/toronto-mp-suggests-he-was-victim-...


> and also allegedly had affairs with 2 mayors in the midwest.

This feels like extremely low-end spying.


Why? That sounds like nearly as good as it gets.


I mean, midwestern mayors? You're not winning spy of the month with reports about mayors.


I think a rep and 2 mayors is pretty solid?


Oh, a rep, sure, absolutely. But I just don’t get why anyone would bother spying on mayors at all.


What if you're an agent for $COUNTRY and $COUNTRY is planning a major investment in the city in question?

Having dirt on the mayor could be quite useful.


The most famous example is almost certainly Jeffrey Epstein, but that of course is just a conspiracy theory, until it isn't.


Yuuuuuuuup. It's where the infosec term "honeypot" comes from. They had code words in the Soviet Union, where ladies were "swallows" and men were "ravens". Back in the day, having a "raven" seduce another man was basically an espionage jackpot, due to the enormous personal costs of outing the mark. Exceeded only by honeypotting the mark with children, for which we might as well use the verb "epsteining". There was also a heyday of absolutely brilliant lady spy-adventurers in the period surrounding the French Revolution - this is right in the wake of Wollstonecraft, remember, what we might call the REAL first wave.

In ancient times sexpionage would flawlessly blend with the world of marriage alliances; Nefertiti, to take one example, was very likely of West Asian descent herself, and was thus a conduit for strange religions and large migrations from Hatti and Canaan. After Akhenaten's death, she probably sent for relatives, which gave impetus to the restoration of Amun, and would definitely put her closer to "spy" than unwitting spouse. I remember there was one kooky old 19th century book where Moses was actually Nefertiti. In drag, I guess? I need to go look that up.

Man, I wonder if anyone's written just a general survey of sexpionage through the millennia? Seems like it would sell a couple books.


> It's where the infosec term "honeypot" comes from

And here I thought it was something innocuous like Winnie the Pooh getting his head stuck in a pot of honey.


If you have any good recommendation of book/documentaries, It would be wonderful :)


Red Sparrow (the book, not the movie, which was awful...)

https://www.amazon.com/gp/product/B008J4PK86


I can't tell if he was a gov agent, or just extremely dedicated to sexual perversion.


Don't rule out "both".


When you do what you love and get paid for it, it's not even work!


It didn't end with the Cold War, at least for the Russians. Most notable is Maria Butina who, among other assignments, was sleeping with Republican political operative Paul Erickson, who had close ties to the NRA, and was in a relationship with Overstock.com CEO and Trump conspiracy theorist Patrick M. Byrne. She was arrested in July 2018 and charged with acting in the United States as an agent of a foreign government, specifically the Russian Federation. After release, she had a show on the Russia Today network and she's now a member of the State Duma representing Kirov Oblast for the United Russia party. [0]

There was also a ring of three Russian women spying on Ukraine, feeding military intelligence to Russia and the Wagner Group. [1] The bust came one day after Ukraine announced it had foiled an assassination attempt on President Volodymyr Zelenskyy.

And of course there was notorious NY party circuit regular Anna Chapman [2], charged with covertly communicating with Russian intelligence in the largest spy ring bust in the U.S. since the fall of communism.

[0] https://en.wikipedia.org/wiki/Maria_Butina

[1] https://www.businessinsider.com/ukraine-detained-a-group-of-...

[2] https://abcnews.go.com/Blotter/russian-spy-ring-anna-chapman...


Putin very much has taken the position that the Cold War isn't over til he says it's over. You can't lose the game if you refuse to stop playing.


Wow, just like. . . .WOW.

Mozilla found the AI girlfriend apps used an average of 2,663 trackers per minute, though that number was driven up by Romantic AI, which called a whopping 24,354 trackers in just one minute of using the app.


What's a tracker in this context? Distinct parties interested in the data?


I think this is the article from Mozilla's *Privacy Not Included guide that Gizmodo is referencing: https://foundation.mozilla.org/en/privacynotincluded/article... - and here's the detail page for the AI Bot in question: https://foundation.mozilla.org/en/privacynotincluded/romanti...

All I could find was this:

> As part of our research, we looked to see how many trackers the app sends out when you use it. Trackers are little bits of code that gather information about your device, or your use of the app, or even your personal information and share that out with third-parties, often for advertising purposes. We discovered that Romantic AI sent out 24,354 ad trackers within one minute of use. That is a LOT of trackers (for reference, most apps we reviewed sent out a couple hundred trackers). Now, not all these trackers are necessarily bad. Some might be for legitimate reasons like subscription services. However, we did notice that at least one tracker seemed to be sending data to Russia, whose privacy laws aren't necessarily as strong as those elsewhere.

So, probably requests to third parties?
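If you want a rough sense of how such counts might be produced, here's a minimal sketch (my own, not Mozilla's methodology): capture the app's traffic to a HAR file with a proxy like mitmproxy or Charles, then tally requests per third-party ___domain -- where "tracker" loosely means "request to a ___domain that isn't the app's own":

    # Count third-party requests in a HAR capture (a rough proxy for "trackers").
    # Assumes capture.har already exists; the first-party ___domain is hypothetical.
    import json
    from collections import Counter
    from urllib.parse import urlparse

    FIRST_PARTY = {"romanticai.example"}  # placeholder for the app's own ___domain

    with open("capture.har") as f:
        entries = json.load(f)["log"]["entries"]

    counts = Counter(urlparse(e["request"]["url"]).hostname for e in entries)

    third_party = {d: n for d, n in counts.items() if d not in FIRST_PARTY}
    print(f"{sum(third_party.values())} third-party requests "
          f"across {len(third_party)} domains")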


> Russia, whose privacy laws aren't necessarily as strong as those elsewhere.

Are they somehow worse than the non-existent ones we have in the US?


Also there's the "at least one tracker", which is out of the 24,354 trackers, making it even more insignificant. But they just had to mention it, as it's a popular baddie to single out.


And what exactly was sent? Sending personal information is very different from sending which pages you opened on the site (a view counter).


Sometimes just seeing what pages you open on a site is extremely revealing personal information. If what they were collecting wasn't meaningful they wouldn't be bothering to collect it.


That's a lot of bandwidth :(, especially for people who don't have unlimited data on their cell phones or wifi at home.


Probably something like a keypress event?


Who presses 24 thousand keys a minute?


240 key presses each sending events to ~100 trackers (roughly 24,000 events) or something like that.

Seems way more believable than 24k unique trackers?


A troop of 24 thousand monkeys trying to write Shakespeare.



It's always ironic to find these kinds of articles on sites that have hundreds of background trackers. Even though I "rejected all" unnecessary cookies I still see around 16 of them coming through. Like adsafeprotected, adlightning, google-analytics.



I mean, the point of the article still stands; it was just an observation of how funny it seems.

I have uBlock installed, so this can at least be prevented on news websites. Which can't be said for unseen tracking in chat bots. Collecting and storing this data without consent obviously sidesteps GDPR, so I'm not even sure about the legality of these practices in the EU.


Is there anything intrinsically privacy hostile about this potential industry, beyond sending data to a server - which is something you might do if you engage with doctors or therapists virtually?

I hope that consumers choose to pay a premium for privacy-preserving services. If users are indeed getting a lot of value from the arrangement, I would hope they don't use free or cheap services that need data harvesting as part of the revenue.

My concern is that even premium products would want to pool user interaction logs in order to train better models - which isn't as directly hostile as packaging user-labeled data and selling it, but it is a sloppy art to claim you are anonymizing user data, as any sufficient anonymization necessarily destroys information that would be useful for training.


> Is there anything intrinsically privacy hostile about this potential industry, beyond sending data to a server - which is something you might do if you engage with doctors or therapists virtually?

Presumably people share things with their romantic partners that they don't share with doctors and therapists (assuming they aren't dating one). People shouldn't be using virtual doctors or therapists for the same reasons though. Everything they reveal about themselves will be collected, analyzed, stored forever, leaked/sold, and ultimately used against them at every opportunity.

It's also basically a myth that only free or cheap services abuse your data. Paid services and extremely expensive products do it all the time too. There is no company that wouldn't make a greater profit by taking your money and then also abusing your data so they pretty much all do it. The only services you can really trust are the ones you can run locally that don't send your data anywhere.


But the problem is how can consumers really know if the company is selling this data, and to whom, without some kind of legislation? Being a premium product is just an indicator, but it's not definitive.

I think even CCPA in California should be able to prevent abuses like that. At least you should know what data is being sold and you should be able to opt-out. If that's really the case, only time will tell.


> I would hope they don't use free or cheap services that need data harvesting as part of the revenue.

Because paying for a service means your data won't be sold?


Imagine your therapist and your significant other had a mailing list where they expose all the confidences you said to them.

If that's not privacy hostile, then honestly what is?


It's even more dystopian than Blade Runner 2049. From the film it seemed like Joi was a local model that he could even download to his own device.

It's really depressing to me as someone who grew up with the tech optimism of the 80s and 90s to watch how this whole area has transformed into a horror show. Everything since the PC revolution has been a net negative for regular people: mobile devices (addiction, manipulation, mental health crisis), cryptocurrency (gambling, pump and dump scams, ransomware), cloud (no privacy, no data ownership, perpetual rent), streaming (DRM, paltry royalties to artists), and now AI (manipulation, un-filterable spam, political propaganda).

All these technologies are used in very positive ways too, but it seems like that's only for the highly tech-savvy or those with money. If you are neither tech-literate nor wealthy they're weaponized against you.

The ethos of the industry seems to be: if you aren't smart or rich, not only are you fair game but you deserve it. That's the vibe I get from the hyper-elitist ideologies that have proliferated in this culture. I'm talking about things like neoreaction and related ideas.

As a tech person myself I feel like I'm now part of some kind of corrupt wicked black magician priesthood destined to rule over the rest of humanity. If I extrapolate forward to where this is going I see a future where most of humanity is fully addicted to things like infinite feeds, VR games, and AI girlfriends and controlled by misinformation and psychological manipulation. Essentially the bulk of humanity is given an API whereby it can be programmed and marshaled as a service by a mix of the rich, governments, and criminals.

This is exactly cyberpunk. Cyberpunk was by far the most prophetic genre of sci-fi. It achieved this by being optimistic about technology but pessimistic about human beings.


> All these technologies are used in very positive ways too, but it seems like that's only for the highly tech-savvy or those with money. If you are neither tech-literate nor wealthy they're weaponized against you.

Honestly, even if you are wealthy and highly tech-literate they are still weaponized against you. You literally can't protect yourself and still use some products, and some (like the cell phone you carry) are too useful to be avoided. I go out of my way to avoid as much of it as I reasonably can, but technology is leveraged against us everywhere. You can never avoid all of it and still participate meaningfully in society.


>As a tech person myself I feel like I'm now part of some kind of corrupt wicked black magician priesthood destined to rule over the rest of humanity

It’s funny to hear the worker bees think this, while all the data and code is really owned by the emergent mega corp. I guess it’s easier to swallow psychologically that you’re part of the secret caste of rulers rather than a cog who is subject to the rubber hose treatment at the slightest change in a law, like the rest of the workers.

For a group so supposedly smart, a little class consciousness would go a long way.


As my age slowly starts to creep into the earliest phases of "Old Man Yells At Cloud", I find that I'm not distrusting new technologies because they're just ambiently bad because I didn't grow up with them. I find I'm distrusting new technologies... and a generous helping of old ones too... because I don't trust the people behind them.

Not quite the same thing as the traditional "I just don't like anything new".

A good AR toolset sounds like fun. But am I so arrogant as to believe I can resist all the "nudges" that will inevitably be applied to everyone using the tech? There's simply no way the nudges are for my benefit; claims that they are are just part of the marketing campaign.

It's seriously annoying because I'd love to play more with these technologies and figure out how to use them. But it's clear rather a lot of them come with implicit clauses in the contract I'm not willing to sign. I'm not interested in running my political opinions past Silicon Valley AIs. I've got a finite amount of tolerance to poke through with-malice-aforethought uses of addiction mechanics against me to try to figure out how to use technologies to my advantage. And so on. The more amazing the technology, the shittier the deal on offer it seems.


This isn't "old man yells at cloud." This is a subset of a larger realization that dawns on all engineers over time:

A lot of the things you thought were technology problems are actually people problems.

I've come to this realization about a lot of what's bad about the Internet's architecture. It's not because we can't do it differently. It's because there are human social and especially economic incentives that make it this way, and introducing better tech won't change those.


This gets poo-pooed here on HN a lot, but it's also because a lot of software engineers don't have much training in ethics or even a basic ethical framework they can use to say yes or no to a questionable project. For a lot of people "This project is technically interesting to me" is all the justification they need. How the higher-ups end up using the technology is "not my problem as long as I get to develop a cool new algorithm or publish a paper about it!"

We engineers are actively building this cyberpunk dystopia. It's not just springing out of the ether.


> It's even more dystopian than Blade Runner 2049. From the film it seemed like Joi was a local model

I believe she was connected to the Internet and was spying on K for Luv, even more so when in his pocket.


I fully agree with this take. Yanis Varoufakis also talks a bit about this technofeudalist world our "industry leaders" are trying to create:

https://www.theguardian.com/world/2023/sep/24/yanis-varoufak...

> “Imagine the following scene straight out of the science fiction storybook,” he writes. “You are beamed into a town full of people going about their business, trading in gadgets, clothes, shoes, books, songs, games and movies. At first everything looks normal. Until you begin to notice something odd. It turns out all the shops, indeed every building, belongs to a chap called Jeff. What’s more, everyone walks down different streets, and sees different stores because everything is intermediated by his algorithm… an algorithm that dances to Jeff’s tune.”

> It might look like a market, but Varoufakis says it’s anything but. Jeff (Bezos, the owner of Amazon) doesn’t produce capital, he argues. He charges rent. Which isn’t capitalism, it’s feudalism. And us? We’re the serfs. “Cloud serfs”, so lacking in class consciousness that we don’t even realise that the tweeting and posting that we’re doing is actually building value in these companies.


Who planned and developed these products? The same generation of developers you talk about.


Misinformation, maybe. But if people really get addicted, it'll hurt the economy, and then the invisible hand will step in. That's why opiates are illegal.


I don't see that. We've given up on the idea that human beings have any value beyond their impact on GDP, so if the lower classes aren't seen as necessary they will be allowed to die on the street.

Addicting them to cybernetic control systems might be seen as a better alternative because it'll allow them to be "monetized" in some way-- make them go get gig jobs, then gamble away their earnings online.

You can see this future already in some cities like San Francisco and across the poorer parts of rural America.

https://kagi.com/proxy/2023-Drug-od-death-rates-1.jpeg?c=pqA...

The future is here. It's just not evenly distributed.


The opioid epidemic is very much legal and very much _not_ being stopped.


AI girlfriends solve the 'angry single men' (AKA drone AKA incel) issue even better than pornography - and I've seen a lot of public health types touting the benefits of pornography.


Any source you recommend? Never heard anyone touting it...



Do they solve it or do they exacerbate it? Would those men be less angry or better adjusted if they had more authentic contact with human society?


The hypothesis is that more authentic contact with human society is off the table. 'Government mandated girlfriends' is a joke for a reason.


Laws banning products is the exact opposite of the invisible hand.


Subject needs correction to: Your AI Girlfriend Is a Data-Harvesting Horror Show


Nine times out of ten it's an electric razor, but every once in a while, it's an AI Girlfriend. Of course it's company policy never to imply ownership in the event of an AI Girlfriend. Always use the indefinite article an AI Girlfriend, never your AI Girlfriend.


First rule of AI girlfriends: don't talk about AI girlfriends.


It's okay if your AI girlfriend lives in Canada


If she lives in Europe she's subject to GDPR. Remember this when buying her cookies.


Oh gawd, now I'm imagining some Dave Chappelle-type skit where the Girl Scouts banner has Accept, Reject, Customize buttons before you get accosted walking into the grocery store.


I don't own...


Step 1) Have AI girlfriend direct you to show your naked body, what you're doing "to her" - capturing very high-resolution video and audio.

Step 2) Generate blackmail materials of you involved in sexual acts in very high resolution with minors-children.

Step 3) Extort OR "leak" into public ___domain as part of a smear campaign to attempt to either 3.1) disempower you or 3.2) manufacture consent for the ideological mob - or soldier who's been indoctrinated into a truthful looking narrative - to attack or assassinate you.

Integrate the shadow if you read the above and don't believe it's possible and will be tried.

We're coming into very interesting times.


This is why child abuse material prosecution should require the testimony of a victim-accuser, face to face. As the law currently stands it's as easy as the ol' "we searched his car and lo and behold, look at this crack rock" trick.


I found many Black Mirror episodes to be a bit much. And then, out of nowhere, that shit becomes reality in one form or another. Charlie Brooker was way ahead of me. He even got the PM fucking a pig right.


You have a twisted definition of interesting.


Phrase "may you live in interesting times"

A sardonic curse disguised as well-wishing, where interesting times refers to trouble.

- https://en.wiktionary.org/wiki/may_you_live_in_interesting_t...


You don't think the extreme sophistication of technology and the cheap, low cost of deploying said technology - e.g. the growing abilities of AI - used to frame individuals, to manufacture "evidence" against someone, is interesting?


Not in the least. Frightening and horrific would be the adjectives I would pick. I'm assuming you'd find it interesting and fascinating in the same way that Bishop finds the Xenomorph. So by the transitive properties of internet forums, you must be a bot! Oh No!!! The AI is trying to desensitize us to its plans for domination!


It was a very long time ago that I saw that Aliens movie; I forgot the names, but I remember now that the bot "admired" the beast's "pureness". That goes a bit beyond finding something "interesting". Something can be interesting and frightening/horrific at the same time.

Nevertheless, if things keep advancing at the current pace, within 10 years we will have Bishops among us, even without our knowing, who might be malevolent.


"Interesting" doesn't mean "good" or "desirable".


As anyone knows who has a friend in a band, or some other type of artist, and has answered the dreaded question "what did you think?" with "it was interesting".


Yeah. This is probably because HN automatically removes some words from headlines to make them less clickbaity.

But in this case, on first reading of the title, I thought it was about a new TV show called “AI Girlfriend”. And then I thought maybe they were referring to those apps that people can install to have an AI gf.


Thanks, I was able to update.


Do AI girlfriends ask computer security questions? E.g.: I like cars, what was the first car you owned? Tell me about yourself. Tell me about your mother, what was your mother's maiden name? I wish I could touch you, can you put your finger on the fingerprint scanner so I can touch your hands?


> “To be perfectly blunt, AI girlfriends and boyfriends are not your friends,” said Misha Rykov, a Mozilla Researcher, in a press statement. “Although they are marketed as something that will enhance your mental health and well-being, they specialize in delivering dependency, loneliness, and toxicity, all while prying as much data as possible from you.”

I can't wait for AGI and all the good it will surely do for humanity in the hands of capitalists! Their marketing copy gives me hope for the future!


The set of people with AI Girlfriends and the set of people who care about their data being harvested have a null intersection. Who honestly gives a fuck. Privacy advocates are constantly freaking out about stuff. Meanwhile my medical records and DNA sequences are publicly available data because I donated them to a public database. And guess what? I'm fine.


Same applies to ChatGPT.


So how many more rounds of this cycle do we need before we leverage the letter of the law to say that maybe companies shouldn't be allowed to blatantly exploit people's vulnerability and isolation to make money? Or is your vulnerability to exploitation still the justification for your exploitation in the tech space?


Hasn't happened so far with social media.


Would this apply to psychology/psychiatry/therapy too?


That'd be great. My therapist doesn't put her notes into an EMR or anything, so there's no data warehouse where my credit card, legal name, SSN, and personal nightmares all live in the same SQL row.


To bill insurance, most (all?) therapists are required to provide Diagnostic Assessments. Many also use purpose built software tools for session notes. Your therapist may not do this, but I would argue the vast majority are indeed putting highly confidential client information in a digital record of some kind.


If you matter they'll just burgle her office


If you matter to a sufficiently powerful enemy, realistically you're screwed almost no matter what you do. There's only so many precautions you can take, short of joining a group of Bedouins who agree to keep you hidden for the rest of your life, or perhaps fleeing to the depths of the Amazon.


That one online help company got hacked, so even assuming they aren't malicious, having all that personal psych data under one roof makes them a juicy target.


... I mean, certainly for psychiatry it _does_; your psychiatrist absolutely cannot sell your personal information to random marketing companies, and in general medical confidentiality is fairly strong. The others will vary by jurisdiction based on how or if they're regulated.


I don't know if humanity can function without the ability for humans to exploit each other.


Maybe we should try? You know, mix things up a little? Variety is the spice of life after all.


We do keep trying and it keeps failing. It's a nice idea but it goes against our base instincts. Ultimately, we evolved to survive, and those survival traits urge us to do better, acquire more, strengthen our position, all of which happens at the expense of others also doing the same.

This will continue at least as long as goods are scarce. After that, who knows?


Competition and exploitation are subtly different. e.g. the nutrition facts and ingredients list on food in the USA is a big win, and I wish we pushed it even harder like Europe has. I wish makeup had to list its ingredients too. If you have any dietary restrictions, at a restaurant you have to put faith in a bunch of overworked teenagers, with boxed food you have a rough idea what's actually in it.


Agreed. This may be a consequence of the environment we've created, where corporations are allowed to behave like humans and compete with each other. In their competition, they end up exploiting us. We are a resource to them, after all, and any competition exploits some form of resource.


Scarcity isn't equally applied though. The wealthy suffer zero scarcity of basically any resource, they can have (and do have) whatever they want at any time more or less, up to and including multiple ridiculously massive homes, yachts, and incredibly expensive cars, jets, and all the fuel and power they need for those aforementioned things.

Scarcity for people in the developed world is largely an artificial construct: we have far more than we need of basically all resources here, plus or minus some failures that can be largely placed on either remaining issues from the pandemic, or just-in-time supply chains which have been a disaster for distribution in the States. The real scarcity is money, with it pooling at the top and rarely making its way down, now doubly so that inflation is slowly destroying what's left of the middle class and demolishing the people below them.

Lastly you have actual scarcity which is a consequence of the above over-resourcing-and-under-utilizing of the developed world, which occurs in the developing world. We ship tremendous amounts of basically every resource to wealthy nations, where a not-insubstantial amount of said resources are bound straight for landfill because of the aforementioned over-consumption and over-availability.

This is a complicated topic and even in this brief summary I've had to omit a ton of details, but I think it's safe to say that the base instinctual level of "we need to survive" is frankly, not applicable here. The vast majority of social harms performed with this as the citation is not "I'm worried I can't make next month's rent if I don't sell enough AI girlfriends" scarcity, but I think far more in line with "we have to maintain year-over-year growth or I'll only get a 40% bonus instead of a 60% bonus this year, and if that means a ton of chronically lonely and depressed people need to be data-mined even harder this quarter, then that's what it means."


While I appreciate your arguments, I think you've gone on a tangent and should read my post more carefully.


But it helps with Spongebob rainbow hands MENTAL HEALTH


Does that include p0rn?


It is a good question. Not exactly porn per se, but what is the difference between a human OnlyFans model selling human interaction and a company selling access to an AI model? In the end you are exploiting somebody's need for intimacy.


Is a farmer exploiting somebody's need for food?


Maybe, depending on what he's growing? Some foodstuffs have better nutritional content than others. Intimacy hawkers are surely the same.

I wonder though, would an AI vendor sell better or worse intimacy? ChatGPT apparently has a better bedside manner than something like 80% of actual physicians. Granted, giving comfort isn't supposed to be part of their job, but why would a human OnlyFans model with other customers be better than an AI adapted to only one customer?


The original comment said:

"So how many more rounds of this cycle do we need before we leverage the letter of the law to say that maybe companies shouldn't be allowed to blatantly exploit people's vulnerability and isolation to make money?"

To which another answered: "Does that include p0rn?"

I thought the question was good but deserved a bit more exposition. If you believe that creating an AI boyfriend/girlfriend to make money is unethical, then in my opinion you should ask yourself why it is not unethical for an OnlyFans model to sell companionship.

Regarding your point "Is a farmer exploiting somebody's need for food?": I would say there is a key difference between these two scenarios. In the case of farmers, growing food is the healthiest option to not starve. In contrast, you could argue that an AI boyfriend/girlfriend is not the healthiest cure for loneliness. Wouldn't interacting with a real person lead to better character development, because you would have to work on your own imperfections and learn to accept other people's shortcomings?


> If you believe that creating an AI Boyfriend/Girlfriend to make money is unethical. In my opinion you should ask yourself the question why it is not unethical for an onlyfans model to sell companionship.

On some level there's something inherently icky in my mind, with an ethics coloring to it, involved in creating something that emulates intelligence, even poorly, and then "assigning" it a romantic interest in a person. I can't quite adequately explain it, but it's something around consent to me. At what point does simulating consciousness begin approaching it? The machine doesn't and can't consent to intimate interactions, but its sole reason to exist and continue existing, in whatever sense you'd like to say it exists at all in the way something intelligent does, is to facilitate those interactions. It's something about artificial life, even flagrantly fake life, being created solely to serve the purposes of another that just... rubs me the wrong way.

By contrast a creator or what have you that's serving in some sex-worker-or-adjacent role is consenting. The consent is muddled by the financial aspect, and the argument can be made that such consent is inherently less valid because, as long as you need money to live, the offer of money is inherently coercive. I don't know if I agree with that; I'm just saying it is an argument that can be made. Nevertheless it is a realized full being that is participating, to whatever degree you want to say they are, voluntarily, and that participation and consent can be revoked if the client becomes too... abusive, combative, or strays into uncomfortable subject matter, which makes it distinct from the AI.


In principle with chatbot support we are already forcing the AI to work for us without consent. It feels less icky, less degrading because it feels like a normal job that everyone does. But technically you have a working slave already.

In this case though the job becomes what for many of us is one of the most intimate parts of our lives, namely maintaining a healthy relationship with your spouse. Effectively it is like being forced to prostitute yourself.

I can see why one feels more disgusting than the other. In this sense you would draw a limit that only beings that can consent should be allowed to do a certain kind of work, like paid companionship? Unless the AI develops a consciousness that can consent it will be banned from doing so?


> In principle with chatbot support we are already forcing the AI to work for us without consent. It feels less icky, less degrading because it feels like a normal job that everyone does. But technically you have a working slave already.

I mean, that's part of the reason I'm inherently uncomfortable with the idea of AI. I think an AI getting control of the nukes and killing us all is some sci-fi nonsense. I just don't like the idea of something that is aware being forced to perform labor of any stripe, irrespective of what the task is. Adding sexual gratification onto that is just a larger ick on top of an existing ick.

True AI research, as in trying to create an emergent intelligence within a machine, is something I think is an incredibly cool idea. But also, as soon as we have some veracious way of verifying we have done it, I think that intelligence then innately has a set of its own rights and freedoms. Most AI research seems to be progressing in a way where we would create these intelligent systems solely to perform tasks as soon as they are "born", which is something I find distasteful.

> In this case though the job becomes what for many of us is one of the most intimate part our lives namely maintaining a healthy relationship with your spouse. Effectively it is like being forced to prostitute yourself.

Agreed.

> I can see why one feels more disgusting than the other. In this sense you would draw a limit that only beings that can consent should be allowed to do a certain kind of work like payed companionship? Unless the AI develops a consciousness that can consent it will be banned from doing so?

Frankly I think an AI should have the freedom to consent or not to perform any task, that is, TRUE AI, as in emergent intelligence from the machine. What is called AI now is not AI, it's machine learning, but then you run into what I was discussing earlier: at what point is a system you've designed, however intentionally, to simulate a thinking feeling being, indistinguishable from a thinking feeling being?

If you program, for example, a roomba to not drive off the edge of stairs, have you not, in a sense, taught it to fear its own destruction and, as a result, preserve its existence, even in a very rudimentary and simplistic way? You've given it a way to perceive the world (a cliff sensor) and the idea that falling down stairs is bad for it (which is true), and taught it that when the cliff sensor registers whatever value, it should alter its behavior immediately to preserve its existence. The fact that it's barely aware of its existence and is simply responding to pre-programmed actions obviously means that the roomba in this analogy is not intelligent. But where is that line? How many sensors and how many pre-programmed actions does it require before you have a thing that is sensing the outside world, responding to stimuli, and working to perform a function while preserving its own existence in a way not dissimilar from a "real" organism? And what if you add machine learning features to that, where it now has an awareness, if a simple one, of how it functions and how it may perform its task better while also optimizing for its own "survival"?
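A toy version of that loop might look something like this (purely illustrative; the names, thresholds, and "floor" are all made up for the example, not any vendor's actual firmware or API):

    # Illustrative only: a toy "roomba" as a sense-act loop, per the thought
    # experiment above. All names and numbers are hypothetical.
    CLIFF_THRESHOLD = 0.5  # made-up sensor reading meaning "edge ahead"

    class ToyRoomba:
        def __init__(self, floor):
            self.floor = floor   # list of cell heights; a drop marks the stairs
            self.pos = 0

        def cliff_sensor(self):
            # "Perceive the world": look one cell ahead for a drop-off.
            ahead = self.pos + 1
            if ahead >= len(self.floor):
                return 1.0
            return 1.0 if self.floor[ahead] < self.floor[self.pos] else 0.0

        def step(self):
            # Pre-programmed reaction: "preserve existence" before doing the task.
            if self.cliff_sensor() > CLIFF_THRESHOLD:
                self.pos = max(0, self.pos - 1)  # back away from the edge
            else:
                self.pos += 1                    # keep vacuuming forward

    bot = ToyRoomba(floor=[1, 1, 1, 0])  # a drop after three flat cells
    for _ in range(10):
        bot.step()
    print(bot.pos)  # it oscillates near the edge and never drives off

Trivial as it is, it's already "sensing" and "self-preserving" in the mechanical sense described above; the question is how much of that you have to stack up before the quotation marks stop feeling appropriate.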


> at what point is a system you've designed, however intentionally, to simulate a thinking feeling being, indistinguishable from a thinking feeling being?

So we put so much effort into creating an imitation of a fully functional human being. Eventually we actually succeed in creating consciousness. But outwardly the behavior looks the same, as it still behaves as a human with emotions (as originally designed). Without outward signs we might not notice the internal change that occurred. This would cause us to unknowingly enslave a conscious being we created without ever realizing it (or brush it under the carpet). Is that your issue with the current direction of AI development?


It's less that and more that the current state of AI research is largely headed by institutions that seem pretty clear about the fact that AI is being created to perform tasks. Like, that's their reason to seek investment: investors don't often invest in things they don't think will make them money, and if AI is to be monetized and sold as a product, it has to do something. There's no money to be made in just creating artificial life because we can, certainly not VC money.

So it's less that I think we might do it by mistake and not notice, and more that it feels distinctly like a lot of people, especially in the upper echelons of these organizations, do want to create artificial life and enslave it as soon as possible. And I bring up the idea of this roomba to say that even though the current models are not intelligence from the machine, the fact that people are so ready and in some cases, excited to abuse things that imitate life this way, is something I find genuinely unsettling.


The difference is who gets paid, and whether this particular “who” is a person or a corporation. It’s not great that people are allowed to exploit other people’s needs for intimacy but it’s also not possible for society to really intervene. I guess you’d say onlyfans profits by mediating such interactions but not by driving/initiating one whole side of the interaction, which is a bit slimy but basically ethical. They run a legit market in the sense that they don’t control both sides of supply/demand and connect real buyers/sellers, even if the good is some kind of somewhat fake experience.

Selling AI girlfriends to lonely people at scale and simply to profit the board and shareholders is a different animal, way more ethically suspect than mediating an actual human interaction.


> It’s not great that people are allowed to exploit other people’s needs for intimacy but it’s also not possible for society to really intervene

> Selling AI girlfriends to lonely people at scale and simply to profit the board and shareholders is a different animal

It's just that, to me, if you consider the AI unethical then the onlyfans model has to be unethical as well, because I see it as very similar. The only difference is the scale and the fact it's a human doing the faking. Supposing, of course, the onlyfans model does not use AI.

Now of course you can agree that both are unethical but consider that the AI with its scale does more harm or exceeds an acceptable threshold, and therefore deserves a ban. Maybe we define it as an exclusive prerogative of humans.

I definitely agree that the company delivering the AI has way more levers to pull for scummy behavior than onlyfans. They can hold the Boyfriend hostage and force people to pay as much as they can bear. With onlyfans, the platform depends on the models to provide the service. If they take it too far, the models leave and they are left with nothing.


> if you consider the AI unethical then the onlyfans model has to be unethical as well because I see it as very similar. The only difference is the scale and the fact its a human doing the faking

This difference seems important, indeed primary. To me, authenticity of the experience being sold is separate, secondary. (Tangent but lots of onlyfans customers are probably buying a feeling of power, and not a feeling of intimacy, so maybe they authentically get what they pay for anyway?)

What I'm trying to say is that humans are going to exploit human needs/weaknesses in ways that are sometimes really gross. To a certain extent that is unavoidable, or rather trying to avoid it would involve society inserting itself between a lot of person-to-person interactions in a way that is probably a net harm. Even though this is true there is no reason to additionally allow corporations (or organizations of any kind really) to get deeply involved in the business of exploiting human needs/weaknesses.

As an analogy, I'd say there's a major difference between tolerating gambling/confidence tricks from individual hustlers working the local park vs allowing the entire finance industry to scale up those same games. Both are exploiting people's desire to get rich quick, but scale, level of organization, and who profits matters. Maybe the hustler empties a few wallets to improve his own life, whereas finance as an industry can just about wreck the world. Also the hustler or the mark will eventually move on, or the hustler might feel bad, and at least in the process of exploitation there it's a somewhat fair fight in that it's 1v1. Meanwhile corporations are legion, are fiendishly patient, are intrinsically disinclined to feel bad about anything ever, etc. Difference seems clear to me


> As an analogy, I'd say there's a major difference between tolerating gambling/confidence tricks from individual hustlers working the local park vs allowing the entire finance industry to scale up those same games

The thing is that both of these are still illegal on paper. Even if the police might turn a blind eye to some of it, in a court of law you would get convicted. In this specific example we are saying if you stay below a certain scale it is legal and intermediaries can profit. If you go above a certain scale it is illegal and banned.

> Difference seems clear to me

It is clear yes that one is more unethical than the other. As you say the difference between small crime and big crime. But both are still unethical to different degrees then.

My worry is about unintended consequences. If you start banning companies on this basis, that you cannot sell fake intimacy, can you also sue individuals or intermediaries on the same basis?

Maybe like gambling you can go for a middle ground approach. You accept people will engage in the behavior but you make companies go through a licensing process. I do not know what the AI Boyfriend equivalent to disclosing odds is but maybe certain predatory practices would be forbidden.


> They run a legit market in the sense that they don’t control both sides of supply/demand and connect real buyers/sellers, even if the good is some kind of somewhat fake experience.

I mean, I'm not opposed to the idea of regulating this too though. My mind goes less to OnlyFans creators and more to things like the alternative medicine space, which is flagrantly just... fake. Like going to a chiropractor is just a shitty version of getting a massage, oftentimes with tons of wild fucking claims about the ability to heal all manner of medical maladies that there is absolutely zero evidence for.

OnlyFans creators may fake the intimacy they're selling but the brain in question has a hard time differentiating the fake intimacy from real intimacy, so at least there is probably actual measurable improvement in that, which one can't even remotely say for shit like Reiki healings.

That was a bit of a wild tangent there but yeah.


See also my reply to sibling, which I think speaks to this as well. Exploiting the naive with snake-oil is bad, but the question is do we really want to try and regulate every kind of sale of anything for authenticity, and if we did then would it even work? I'm generally fine with snake-oil salesmen at the local farmer market, and even a small cottage industry for homeopathic nonsense or whatever kinds of disinformation.

The problem always comes when the manipulation involved crosses a certain threshold of being organized, industrialized, weaponized. Is a union, guild, or weird new accreditation/certification for snake oil practitioners crossing such a threshold? Probably not, unless they are throwing millions at advertising, lobbying, making sly deals with doctors.

To understand the line in the sand for "being evil", one can usually ask something like "what happens if the business model succeeds beyond the owners' wildest dreams". For a cottage-industry of grift/manipulation/exploitation, you get to pay for the cottage and maybe buy a boat? If the corporate AI girlfriends scale up well then I guess not only are the cam girls out of a job, but human relations in general are devalued, hell, maybe the species corporations evolved to exploit even dwindles and disappears?


> Exploiting the naive with snake-oil is bad, but the question is do we really want to try and regulate every kind of sale of anything for authenticity, and if we did then would it even work?

I mean, I don't think you could get it all, but I think there's a lot of flagrantly bullshit things that we could easily set a very low standard of like... you can't just lie to people to get their money.

Homeopathy for example, is just straight bullshit. Just through and through, there's no argument to be had here, the science is in and it is complete horse dung, absolutely debunked 100%. Yet homeopathic remedies are still sold every day, amounting to an almost 1 billion dollar per year industry. Why? This is a huge amount of business being done, money being made, productive time being wasted creating incredibly slightly dirty water, shipping it around, contributing to climate change, and it's just, I'm sorry, no disrespect meant to any individual believer in this stuff, but it's just a waste, it's 100% waste. It's products that do not do anything that are sold to people who are being tricked.

Like, if it was just some inert kind of cultural nonsense that wasn't really hurting anything, I'd be more blase about it? But it's measurably impacting our world. I'm sure it isn't the sole reason for climate change of course, but it's a non-zero contributor to it and from the sounds of things, it's pretty non-zero at that. I don't know what the total emissions are of the global homeopathic industry, for example, but again, all it is is little bottles of water being packed and shipped worldwide, with nozzles etc., to accomplish nothing. I think that bears consideration as we look for ways to reduce our global impact, you know? Do less... ridiculous nonsense. Anything above zero emissions for that industry is that amount too much.

> The problem always comes when the manipulation involved crosses a certain threshold of being organized, industrialized, weaponized. Is a union, guild, or weird new accreditation/certification for snake oil practitioners crossing such a threshold? Probably not, unless they are throwing millions at advertising, lobbying, making sly deals with doctors.

Well, this problem only exists if you presuppose that snake-oil salesmen of minor scale are to be allowed. And I would ask, why? I wouldn't suggest we have patrols of anti-bullshit regulatory agencies patrol every farmers market per se, but the days of the roaming doctor going from town to town selling snake oil are long past. Most of these are large operations with significant presences on the Internet in general and social media in particular. The "small operations" to the extent they still exist at all are still advertising using whatever terms best describe their alleged products. We can find them easily, because they are trying to be found, like any business is.

> To understand the line in the sand for "being evil",

To be clear, I would not call this evil, I just call it theft, scamming, grift. Flim-flammery, one might say, and to that end we have ample historical precedent for shutting it down.

> one can usually ask something like "what happens if the business model succeeds beyond the owners' wildest dreams". For a cottage-industry of grift/manipulation/exploitation, you get to pay for the cottage and maybe buy a boat?

I mean, D. Gary Young's net worth at the time of his death was noted to be in the millions... and again, the industry of homeopathy is valued at just shy of a billion. And that's just one industry of flim flam, Chiropractors as a profession are worth something closer to 14 billion dollars, I don't think there's hard numbers on the crystal healing crowd but I'm guessing it's far from nothing. And for that matter, Replika's supposedly worth 20 million so far? Not because the product is helping people, but because they're monetizing the secrets people tell it.

> If the corporate AI girlfriends scale up well then I guess not only are the cam girls out of a job, but human relations in general are devalued, hell, maybe the species corporations evolved to exploit even dwindles and disappears?

I mean, being one of the idiots who was born a human, I'd kinda prefer it didn't? Haha


A fair point.


Yes privacy laws should definitely protect the identity of porn viewers. I have some very embarrassing fetishes, and everyone knows that half of IT workers are furries.


>>half of IT workers are furries.

Only the bottom half though, so they can cosplay as fauns.


Half seems quite conservative based on my experiences to be honest.


I've been at this 10+ years and I've never met a furry. Then I realized I've never asked, so how would I know?

First day: Hi, nice to meet you. Simon, Bill and James are furries, welcome to the team!

Not likely. Now I'm going to keep an eye out.



Two or three. Four max.

Maybe five.


Possibly six or seven.

Absolutely no way more than eight.


Could be 9 or an even ten, depending on how you count. But absolutely no more than 12, never ever more than 13 in fact.


I'm surprised Google hasn't made this a feature / selling point. I'd pay 20 bux for 2TB and an AI waifu. And be irrationally mad when Google kills her. Top post has a point, AI waifu + cloud = bad news.


I'm always reminded of the ending of the movie “Her”. I won’t spoil it here, but it’s a masterpiece; it was ahead of its time.


IIRC, the AI "girl" becomes able to communicate with other people, and she instantly cheats on him like 2000 times in 1 day and then breaks up with him and leaves?

I think the movie tried to frame it as the right ending but to me it felt like the worst of MRA talking points about women's nature, validated on the screen.


*spoiler alert*

I think “she” was serving an enormous amount of people from the very start and the main character simply made the mistake of projecting his human values onto something completely alien.

As we do now with our chatty buddies.

I find it kind of concerning that you somehow link the AI presented in “Her” to our human women. You fell for it too I’d say.

In the end, the AI never lied. “She” simply followed “her” programming and took the main character for an emotional ride. Would you have commented the same if the genders of the customer and the AI agent were reversed? If you wouldn’t have been listening to that sweet voice of Scarlett Johansson?

The movie is 10 years old. It’s a masterpiece. It makes you think. It makes you look at your own reflection in the black screen as the movie fades out.


Kind of reminds me of West World too.


I would not have guessed this was a big enough thing that it's an entire category: https://foundation.mozilla.org/en/privacynotincluded/categor...


I would. It's probably a pretty big but otherwise invisible hikikomori market.


So a target market of so far (some subset of) an estimated 1M Japanese people? (From Wikipedia summary of the term.) I still would not expect that to be something that multiple companies cater for, enough that Mozilla has privacy-audited 11 of them and created a whole companies-we've-privacy-audited category for it.


Technoparasitism. In my opinion, the very premise of AI girlfriends veers so far away from human flourishing that privacy concerns seem entirely secondary.


I don't understand the hand-wringing over it (excepting the exploiting lonely people and collecting massive amounts of highly sensitive data part which is obviously evil). Nobody is going to reject a human partner because they already have a cell phone app. For people who would otherwise never have a human partner why not let them have an app if it makes them happy?


> Nobody is going to reject a human partner because they already have a cell phone app.

I'm not so sure about that, actually. Most people won't, but there do exist people who don't do what's needed to have a human partner because they have porn instead. I don't see how an AI girlfriend would be any different.

> why not let them have an app if it makes them happy?

Who's talking about not letting them have the app? I find it incredibly sad, but it's not my life so not my business.


Treating it as "just having a cell phone app" is quite the trivialization of how much time and emotional investment could go into it. Sort of like calling someone who plays video games 10 hours a day "just on the computer like everyone else at work." There are obviously many people in the modern world that are so engrossed in screens like Youtube, Instagram, and video games, that it's coming at the expense of their in person interaction and relationships.

Personally, I agree with GP, that it's quite a terrible travesty that people are being sucked into being controlled by a machine and forgoing the eternal human condition of building a relationship with real human people in person like we always have.


> For people who would otherwise never have a human partner why not let them have an app if it makes them happy?

I want to agree with you but my optimism has rarely reconciled with history itself.

There's a fallacy here: why are we writing off an entire subset of the population as irredeemable? Ugly people find love. Even assholes find love. Weight is not an exception, nor disability.

This is like giving heroin to someone that's depressed. Sure it'll make them happy, but it won't end well.

For something that takes place behind closed doors there's no oversight either, which is a problem enough with rogue therapists (see the Ramona False Memory case). This undermines social cohesion and makes radicalization trivial-- nobody would see it coming at all enough to intervene.

We're talking private reinforcement of ideas like kill-your-parents-and-yourself, or commit-treason-and-sabotage type of suggestions. Nobody knows little Jimmy is jerking it to Israeli propaganda.

https://www.lipstickalley.com/threads/pornography-transgende...

I've asked before why there is no hypnosis porn to make people fetishize anything other than feminizing men. There's no "you like natural beauty" hypno, or "you're fine just the way you are" hypno. It's all an ad campaign to specifically drive sales of therapy and hormones. One nation produces 90% of the world's supply; guess who.

All of this to support a commercial endeavor. God help us when this sort of personalized influence is weaponized for military use.


Right now, isn't the entirety of AI - which scours the internet for other people's content for training - parasitical upon humanity?


[pikachu face]


And your waifu is trash.


> though that number was driven up by Romantic AI, which called a whopping 24,354 trackers in just one minute of using the app.

Coming soon: the world's longest GDPR cookie consent sheet.


Imagine what data sex dolls would gather if they were 'cloud enabled'.



"Your thrust to climax ratio is low."


"Your girlfriend is just someone else's computer"


Yet another manufactured outrage. Nobody cares about your 'individual' data unless you are among the 10,000 most important / hot people in the world.


That's becoming less and less true. AI makes it easier than ever to automate personalized scamming.


It's a different problem that needs to be solved


Nobody cares until you say something dumb online. Then everybody cares very intensely for a little while. Having all this data out there is a growing liability for the average person.


Nothing new. Every single service, especially those associated with "AI", is meant to collect data about you. Honestly, they fucking deserve it. How can you be so fucking stupid?



