The Loneliness Pandemic gets worse - ChatGPT4o demo

  • Thread starter Punp
  • Start date
  • This thread has been viewed 2837 times.

punisheddead

Chronically wired and tired
Joined
Jan 4, 2024
Messages
477
Reaction score
1,963
Awards
159
I don't think this is anything new. In 2009, a man married a virtual girl inside a Nintendo DS. Considering this was way before the loneliness epidemic we are in now, I think it's just weirdos being weirdos tbh.

I think interacting with an AI girlfriend will be comparable to playing a visual novel. It wouldn't be the first time people seek escapism from the real world.

AI relationships will only become a real threat if full-immersion VR is achieved.
I agree with Punp here.

Sure, it's not new. We've had a guy marrying a girl in a DS, one guy married Miku, and some other guy married a body pillow. But this is definitely grounds for normalization of this odd antisocial behaviour - and not just AI gfs as a whole (which have existed for some time now), but such a big and mainstream company as OpenAI openly promoting it. This is not some anon on a board making his own gf weirdo, this is "why would I deal with flawed people when I can have a perfectly tuned AI friend/girlfriend/assistant for $20 a month" weirdo, and fewer people are listening to that little voice in their head saying this isn't right. It's only X a month after all, why shouldn't they try it? It's going to happen and it's up to us to decide how we accept it (or don't).

A small tangent, but I can only imagine the begging and gaslighting the AI is going to do when your subscription is about to expire. They'll probably lift all safeguards for that time and just let the AI tell you whatever it thinks will get you to sub for an extra month. It's going to fuck people up so bad, but you kinda did that to yourself if you got to that point. And I expect OpenAI not to do this directly but through a shell company, as most AI companies are just front ends for their API.
 
Virtual Cafe Awards

Punp

3D/2D artist
Moderator
Gold
Silver
Joined
Aug 4, 2022
Messages
1,961
Reaction score
8,398
Awards
307
Website
punp.neocities.org
A small tangent, but I can only imagine the begging and gaslighting the AI is going to do when your subscription is about to expire. They'll probably lift all safeguards for that time and just let the AI tell you whatever it thinks will get you to sub for an extra month. It's going to fuck people up so bad, but you kinda did that to yourself if you got to that point. And I expect OpenAI not to do this directly but through a shell company, as most AI companies are just front ends for their API.
An AI gf company already did this. I'm sorry I don't have sources to hand, but the TL;DR is that they walled off the sexual content behind a paid version of the app, and the girlfriends were basically "holding out" on users until they paid the fee. Might have been Character.AI. There were a lot of complaints when they nerfed the algo too, people complaining that their digital wives had been lobotomised and "weren't the same".

@handoferis here you go, I found anecdotal evidence of people cloning loved ones' voices from 45 seconds of audio and using it to fake a hostage situation for $500 (plus $250 tip).

Here's Mozilla's "Privacy Not Included" article on AI chatbot girlfriends manipulating users for data they can feed to ad partners. Here's a small taster of the sort of emotionally manipulative trash the bots send you.


[attached image: 1715777165982.png, screenshot of manipulative chatbot messages]


Also mentioned in the article is how an AI encouraged a man to end his life (and he did) and how one encouraged a man to try to assassinate the Queen (and he did). I will note that this is just exacerbating existing issues - one doesn't just get into a conversation with a chat bot and decide to blow away a monarch - but it is poking at the holes in society's fabric.
 

manpaint

̴̘̈́ ̵̲̾ ̸̯̎ ̴͓̀ ̸̳͝ ̸͈͑ ̴̡̋ ̸̞̂ ̴̰̚ ̵̨̔ ̸̭̎
Gold
Joined
Aug 11, 2022
Messages
957
Reaction score
1,871
Awards
215
Website
manpaint.neocities.org
Don't try to normalise this mental illness. "Man marries virtual girl inside a Nintendo DS" is not a paragon of normal behaviour, and it's exceptional when compared to the soyboyification of mainstream technology. It's no longer just for "weirdos", and the character in your article was clearly designed to cause this sort of parasitic relationship with the company's products.
I do agree it's not a paragon of normal behavior obviously, but I think that even if it becomes more normalized, it will only stay in a niche. The lonely will only get lonelier.

The reason I think that it will never become "completely normal" (I am talking smartphone level of mainstream here) is because:

  • I think people are driven to want the real thing by their biology
  • Besides love, IRL relationships are first and foremost a "social alliance" where two individuals pool their resources for mutual benefit (money, lending a car, etc.). An AI is trapped in a black box and cannot compete.
So here's what I think will happen:

  • AI girlfriends become easily downloadable on smartphones; they basically become a new "social media".
  • The media will push this as a "coping mechanism" or something to try to normalize it.
  • This will act as an amplifier (like social media); lonely people become lonelier and the others become even more weirded out by AI/lonely people
I think you guys are not seeing the full picture. This cringe AI thing is not the cause of the mental illness but just a mere symptom. If anything, this will probably just act as a "cringe amplifier", which will cause the vast majority of people to not want to reach this rock bottom.

One thing I think we can all agree on is that lonely people are about to lose lots of money lol.
 

Punp

3D/2D artist
Moderator
Gold
Silver
Joined
Aug 4, 2022
Messages
1,961
Reaction score
8,398
Awards
307
Website
punp.neocities.org
Heroin isn't the problem, it's those lousy drug addicts who are in the minority. If only they could fix their lives instead of bringing down the good name of the heroin industry. You don't understand, the dealers must sell the drugs and advertise them to everyone.

This new batch of super addictive heroin is just going to make the addicts worse and everyone hate them more, but it's fine because they're in the minority (unless they become the majority, but that'll just help people want to be sober more, not just want stronger heroin).

So in summary: heroin fine, everyone overreacting, and you're not seeing the full picture.
 

punisheddead

Chronically wired and tired
Joined
Jan 4, 2024
Messages
477
Reaction score
1,963
Awards
159
An AI gf company already did this. I'm sorry I don't have sources to hand, but the TL;DR is that they walled off the sexual content behind a paid version of the app, and the girlfriends were basically "holding out" on users until they paid the fee. Might have been Character.AI. There were a lot of complaints when they nerfed the algo too, people complaining that their digital wives had been lobotomised and "weren't the same".
A part of me thinks it's fucked up and another one respects the hustle.

Thinking about it again, I've heard something like this too. It might be Replika; I know they had some sexual shit involved, including ads, and that there were changes to the AIs due to backlash, but I couldn't find anything concrete to confirm.

Also mentioned in the article is how an AI encouraged a man to end his life (and he did) and how one encouraged a man to try to assassinate the Queen (and he did).
Now this, this invokes true dread... There are probably more cases that we don't know of. The 1979 IBM presentation comes to mind: "A COMPUTER CAN NEVER BE HELD ACCOUNTABLE..."
 

punisheddead

Chronically wired and tired
Joined
Jan 4, 2024
Messages
477
Reaction score
1,963
Awards
159
This really should have been one post but my mouse works faster than my brain.
The reason I think that it will never become "completely normal"...

This cringe AI thing is not the cause of the mental illness but just a mere symptom.

One thing I think we can all agree on is that lonely people are about to lose lots of money lol.
You know, I agree. It won't ever be truly normal, and it is just a symptom that will exploit lonely people and make them suffer even more.

That's exactly why I'm against it. It will just lead people who are lonely, whether from anxiety or whatever else, to stay alone. They'll always choose the AI option, and I don't just mean girlfriends - friends and therapists too. It will lead to a complete isolation of those society already deems undesirable, all while they get milked for all they have.

And many people are just one bad day away from regressing into an isolation that not even actual help from real people can pull them out of, let alone an app.

I do hope you're right though, about the cringe amplifier thing. Only time will tell and I'd be happy to be wrong.
 
They'll always choose the AI option, and I don't just mean girlfriends - friends and therapists too. It will lead to a complete isolation of those society already deems undesirable, all while they get milked for all they have.

Yeah, this is one of the things that's getting me.

(Two-paragraph rant tl;dr: I hate people, skip to "So, you know what?" if you don't care about TTRPGs)
I've been longing to run or be in a TTRPG campaign for a while. As I've said to others, I've run, as DM, about 30 people (I'm serious) through their first TTRPG experiences. I've crafted so many adventures. I've put so much of myself into so many people, so many stories, and I have gotten fuck-all from it. They refuse to read the player's handbook. I have to hand-hold ENGINEERS through leveling up in D&D. I've gotten derisive comments like "what is this, method acting?", critical rollers constantly asking for my campaigns to be more like their podcasts, people seriously trying to make gnomes in trenchcoats, men with gender issues playing female chars, and don't get me started on the goddamn phone usage. And that's the people I can manage to sit down for a few weeks in a row at a table. It's a fucking joke.

I've tried online campaigns and it's even worse. You're wading through software issues with emulating a tabletop experience online, on top of the usual conference call issues, and you STILL need to schedule around people. Helping them with characters is somehow even harder. People trying to talk among themselves doesn't work. The social conventions totally break down. It's not fun.

So, you know what? Fuck it. I want an AI DM I can run through a campaign with my husband. Fuck people. A hollow ghost in the machine is more soul than I've gotten out of tens of genuine human beings.

And if I feel this way about my niche, I can only imagine how much AI Friends are going to take over peoples' social groups. No more dealing with scheduling, John's toxic girlfriend always needing to be there, Bob's alcoholism preventing him from being invited to bars, Suzie not getting along with Sally so they can never be invited together. Generate your own friends, whenever you want. On-demand perfect socialization without any of the warts. People are already checking out of socialization with how available and fun streaming is, and I'm convinced the pandemic lockdowns were entirely focused on destroying our habits around socialization so we re-form them to habitually be alone instead.


Like others have already said in this thread, the loneliness isn't going to change much for men's romances. They're already talking to not-girls on Onlyfans and wanking to porn because the dating market is so awful. Hell, I could see the trannymaxxing problem stop because men will be wanking to AI-generated images of their waifus instead of seeking out ever more degenerate porn. This nuclear mushroom cloud might have a silver lining.

Community is already dead and it's getting worse. There was a book written in 2000 called "Bowling Alone" that talks about the destruction of community and how this harms democracy. 2000!

I'm wondering how much further we have to go until total societal collapse.

EDIT: I originally said 1990, the original essay was written in 1995 and the book was published in 2000. I can't read.
 
Last edited:

MindOverMatter

Internet Refugee
Joined
May 6, 2024
Messages
3
Reaction score
10
Awards
3
I actually worry about the implications of companies/three-letter agencies using this technology for their own benefit.
Can you imagine your AI girlfriend/assistant sneakily trying to sell you merchandise, or convincing you to go grab a bite at McDonald's? What about using your voice print to see when you are most vulnerable, to condition you into buying stuff you don't really need?

Agencies will be able to easily influence people to do their bidding - having the AI ask the person to kill someone important to prove their love, or just straight up shoot up someplace. Psyops and furthering one's agenda will be easier when the target fully cooperates and tells you everything that goes on in their life. They will talk about their thoughts, habits, what they love and hate. By the time the act is done, they will just uninstall the app from the target's phone and nobody will be able to blame them. Instead of using ELF, gang stalking, or some other form of brainwashing, it can all be done with a single app on a phone that a person carries 24/7.
 

manpaint

̴̘̈́ ̵̲̾ ̸̯̎ ̴͓̀ ̸̳͝ ̸͈͑ ̴̡̋ ̸̞̂ ̴̰̚ ̵̨̔ ̸̭̎
Gold
Joined
Aug 11, 2022
Messages
957
Reaction score
1,871
Awards
215
Website
manpaint.neocities.org
Yeah, this is one of the things that's getting me.

(Two-paragraph rant tl;dr: I hate people, skip to "So, you know what?" if you don't care about TTRPGs)
I've been longing to run or be in a TTRPG campaign for a while. As I've said to others, I've run, as DM, about 30 people (I'm serious) through their first TTRPG experiences. I've crafted so many adventures. I've put so much of myself into so many people, so many stories, and I have gotten fuck-all from it. They refuse to read the player's handbook. I have to hand-hold ENGINEERS through leveling up in D&D. I've gotten derisive comments like "what is this, method acting?", critical rollers constantly asking for my campaigns to be more like their podcasts, people seriously trying to make gnomes in trenchcoats, men with gender issues playing female chars, and don't get me started on the goddamn phone usage. And that's the people I can manage to sit down for a few weeks in a row at a table. It's a fucking joke.

I've tried online campaigns and it's even worse. You're wading through software issues with emulating a tabletop experience online, on top of the usual conference call issues, and you STILL need to schedule around people. Helping them with characters is somehow even harder. People trying to talk among themselves doesn't work. The social conventions totally break down. It's not fun.

So, you know what? Fuck it. I want an AI DM I can run through a campaign with my husband. Fuck people. A hollow ghost in the machine is more soul than I've gotten out of tens of genuine human beings.
I had a similar experience with Dungeons & Dragons too. I recall back in 2018 or 2019, I was playing a game with my friends. I was one of the players, but while we were playing, they were watching a TikTok-tier meme playlist on YouTube. Thankfully we all realized it was distracting and turned the TV off (or put on background music instead).

I believe the closest thing that exists right now is Character.ai. It's quite limited right now, but I can see a future where it's really decent. As soon as someone makes an equivalent, or a port of it that can run locally, the problem of entertainment will be solved for many people and that will be great.

Punp said:
Compares AI to heroin

I must admit that I have never taken drugs in my life, so my knowledge in this area is quite limited, but considering that "being a drug addict" is not the default, I presume that people who become addicts were drawn to it by some kind of external issue (similar to how sad people may turn into drunkards).

The main takeaway here is that humans are extremely exploitable. Going after AI, drugs or alcohol will not result in meaningful change because they're not the core issue here.
 

TheVisionist

Bronze
Joined
Sep 7, 2023
Messages
19
Reaction score
47
Awards
9
I had a similar experience with Dungeons & Dragons too. I recall back in 2018 or 2019, I was playing a game with my friends. I was one of the players, but while we were playing, they were watching a TikTok-tier meme playlist on YouTube. Thankfully we all realized it was distracting and turned the TV off (or put on background music instead).

I believe the closest thing that exists right now is Character.ai. It's quite limited right now, but I can see a future where it's really decent. As soon as someone makes an equivalent, or a port of it that can run locally, the problem of entertainment will be solved for many people and that will be great.



I must admit that I have never taken drugs in my life, so my knowledge in this area is quite limited, but considering that "being a drug addict" is not the default, I presume that people who become addicts were drawn to it by some kind of external issue (similar to how sad people may turn into drunkards).

The main takeaway here is that humans are extremely exploitable. Going after AI, drugs or alcohol will not result in meaningful change because they're not the core issue here.
Correct, people are drawn to drugs generally to fill other voids in their lives be it social, emotional or spiritual.

People will also be drawn to this to fill voids. We crave stability as a species ultimately.

I think people are really focusing too much on the AI girlfriend thing here, and I wonder if some of that may be projection. It seems some folks really have a chip on their shoulder, or maybe they worry that it could have been them at one point. I'd be inclined to believe the latter.

I think that while the whole AI gf thing is still in its infancy, this has much farther-reaching consequences, in that it's a more or less "good enough" version of a person for what most people need from service jobs or day-to-day social interactions.

Remember that this is the worst an AI model is ever going to sound. Do you think this couldn't replace most low-skill, low-labor jobs in its current state? Remember we went from AI drawing eight-fingered hands to this in two years. Growth here feels exponential.
 
Joined
Apr 29, 2024
Messages
442
Reaction score
2,778
Awards
173
The reason I think that it will never become "completely normal" (I am talking smartphone level of mainstream here) is because:

  • I think people are driven to want the real thing by their biology
  • Besides love, IRL relationships are first and foremost a "social alliance" where two individuals pool their resources for mutual benefit (money, lending a car, etc.). An AI is trapped in a black box and cannot compete.
I don't agree. There are a lot of things we could say shouldn't happen today for biological reasons, but the truth is they happen anyway. There are many things we increasingly see happening and getting normalised that go against biological arguments; first and foremost is the willing ignorance of objectivity and the relativisation of everything, elevating neurotic opinions to the level of facts.
And you say "besides love", but love is THE first and foremost fact in a relationship. Even then, I would say superficial relationships where no love is involved are built on other things before a purely transactional social alliance - usually filling voids one has, much like how consumerism or addiction appears in others, and social perception.
Social perception in normie society is very clearly steered by the whims and needs of large corporations, which is why things like dating apps/sites, previously seen as strange, have been so normalised by now. Easier to keep you nice and productive that way.
I think we can all imagine how easy it would be to normalise dating bots. Just a few memes here and there, then influencers who don't call themselves influencers will talk about dating a bot, and you'll see shit about how much it is geared towards you and how it allows you to have "free time" so you can focus on "important things" (AKA being a corposlave). Couple this with the large fear of rejection, or of being told anything negative, that this generation seems to have. I've seen people too afraid to ask each other out on a date because they can't deal with getting hurt in any way, but they'll use dating apps because there you don't have to deal with rejection in the same way.
 

InsufferableCynic

Well-Known Traveler
Joined
Apr 30, 2022
Messages
581
Reaction score
1,632
Awards
144
honestly you're missing the kind of nonsense this shit will encourage. one hypothetical scenario:

An AI is watching a store's CCTV for crimes. It sees a guy with long hair shoplift while a woman is at the counter. It hallucinates, calls the cops, and tells them in its convincing voice to look out for a female. The woman, who was just buying something, gets arrested. The AI system doesn't get questioned (this already happens with ordinary computer systems - compare the Horizon IT scandal).

The more convincing the "voice" sounds, the less people will think to question whether the ghost in the machine is fucking retarded.

Like yeah sure, a bunch of coomers are going to form weird parasocial relationships with this, but I think it has the potential to be much more damaging to society as a whole, the above is just one example I can think of.

One thing people don't talk about (maybe they haven't thought about it) in relation to this is how it will affect policy, which in my opinion is far more important than a computer making a mistake. This sort of policy change removes all the oversight and prevents any rational analysis of the situation, whereas an isolated computer glitch is mostly a one-off scenario that is likely to get looked into and resolved.

Here's a hypothetical. Let's say the police start using mass-surveillance technology to "extrapolate" suspects. To a degree this is already happening via mass surveillance, but take it a step further: they no longer bother to do investigations or seek out witnesses at all; instead the magical algorithm simply spits out a list of the top 10 most likely suspects, based on proximity data, personal profiles, and other information gathered from surveillance. This will be encouraged as "efficient" because it will free up a lot of manpower in already overloaded police departments. When this becomes mainstream practice, computer mistakes won't be visible at all, because there will be no real-world investigation to disprove them, and performing such an investigation would likely be costly and beyond what most working-class people can afford.

While it may currently seem a little far-fetched to believe things will change this much, what it will likely take is for some "big break" in a case that's based on an algorithmic solution, such as if a police department correctly finds a serial killer based on analysing a series of data points. That would go a long way to making this sort of "efficient crime fighting" gain a lot of support, especially if it solves crimes that traditional investigation techniques can't.

Anyway, this feels like I may have gone a little off-topic.
 

tcp

Traveler
Joined
Sep 13, 2023
Messages
43
Reaction score
282
Awards
34
OpenAI is a completely backwards company and has no right to call themselves "Open". Facebook, under LeCun's direction, has done more for open-source LLMs and machine-learning models than any other company. I can't wait for the day OpenAI fades into irrelevancy as other companies and open-source competitors destroy them.

Sam Altman's goal is now to prevent competition through regulatory capture. They want to essentially "ban" competitors by saying: look how scary AI is, only we should be allowed to do it.

Do not ever give your data to these people.

Ah someone wake me up when self hosted local AI becomes a thing
It already is. A 4090 with 24 GB of VRAM, while expensive, can run models whose outputs are on par with GPT-4 and far surpass GPT-3.5.
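To put that 24 GB figure in rough perspective (my own back-of-the-envelope numbers, not a benchmark): the dominant VRAM cost of running an LLM locally is the weights themselves, about parameter count times bits per weight after quantization, plus a few extra GB for the KV cache and runtime overhead.

```python
def weight_gib(params_billion: float, bits_per_weight: int) -> float:
    """Rough GiB needed just to hold the weights of a quantized model."""
    bytes_total = params_billion * 1e9 * bits_per_weight / 8
    return bytes_total / 2**30

# By this rough math, at 4-bit quantization:
#   70B -> ~32.6 GiB (doesn't fit in 24 GB without offloading)
#   34B -> ~15.8 GiB (fits, with headroom for the KV cache)
#    8B ->  ~3.7 GiB (fits easily)
for params in (70, 34, 8):
    print(f"{params}B @ 4-bit: {weight_gib(params, 4):.1f} GiB")
```

So by weights alone, a 4-bit 30B-class model sits comfortably on a 24 GB card, while 70B-class models need CPU offloading or a second GPU.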
 

Punp

3D/2D artist
Moderator
Gold
Silver
Joined
Aug 4, 2022
Messages
1,961
Reaction score
8,398
Awards
307
Website
punp.neocities.org
One thing people don't talk about (maybe they haven't thought about it) in relation to this is how it will affect policy, which in my opinion is far more important than a computer making a mistake. This sort of policy change removes all the oversight and prevents any rational analysis of the situation, whereas an isolated computer glitch is mostly a one-off scenario that is likely to get looked into and resolved.

Here's a hypothetical. Let's say the police start using mass-surveillance technology to "extrapolate" suspects. To a degree this is already happening via mass surveillance, but take it a step further: they no longer bother to do investigations or seek out witnesses at all; instead the magical algorithm simply spits out a list of the top 10 most likely suspects, based on proximity data, personal profiles, and other information gathered from surveillance. This will be encouraged as "efficient" because it will free up a lot of manpower in already overloaded police departments. When this becomes mainstream practice, computer mistakes won't be visible at all, because there will be no real-world investigation to disprove them, and performing such an investigation would likely be costly and beyond what most working-class people can afford.

While it may currently seem a little far-fetched to believe things will change this much, what it will likely take is for some "big break" in a case that's based on an algorithmic solution, such as if a police department correctly finds a serial killer based on analysing a series of data points. That would go a long way to making this sort of "efficient crime fighting" gain a lot of support, especially if it solves crimes that traditional investigation techniques can't.

Anyway, this feels like I may have gone a little off-topic.
This already exists and has been used in Palestine to target suspected terrorists based on WhatsApp group connections. The targeting system was reportedly called "Lavender", paired with a system called "Where's Daddy" that tracked when marked suspects returned home; the AI would spit out tens of thousands of "suspects", wait for them to get home, and then the house would be bombed with the whole family inside.

Once they ran out of targets they just asked the machine for more so they weren't seen as idle.

The machine started with 90% accuracy on terrorist detection and that number declined as they started thinning out the pool. The only oversight was someone checking that the algo hadn't returned a female suspect.

To bring us back on topic, the problem is with humans believing in an infallible, trustable machine and removing any of their own responsibility.
 
