More thoughts (and science) on the Dead Internet

  • Thread starter Merek
  • Start date
  • This thread has been viewed 7374 times.

After IP's excellent original thread had a chance to rattle around in my brain for a bit, I recently encountered a very interesting blog post detailing a scientific study whose relevance will become clear.

From the post:

Researchers at Indiana University Bloomington have built simulations of social networks, in which each node is an imaginary person with a random set of connections to other nodes. To model the real world, each simulated person has a limited attention span — they can only process a certain number of incoming messages. The messages are assumed to vary randomly in quality, and like in the real world, each person can choose to rebroadcast a message, or not. In the simulation, the decision whether to rebroadcast is random, rather than being driven by "virality" or cognitive bias, so the simulation is an optimistic one.


It turns out that message propagation follows a power law: the probability of a meme being shared a given number of times is roughly proportional to an inverse power of that number. A consequence is that as the number of messages in the network rises, the quality of the ones that propagate falls.

There's been a lot of concern as of late about "algorithms," and rightfully so. But while it's clear that all the large tech companies are engaged in tampering with the flow of information, this study demonstrates that that isn't the central problem. Read that last paragraph again, and then this one from the article itself:

Bots...greatly reduce the quality of information in a social network. In one computer simulation, OSoMe researchers included bots (modeled as agents that tweet only memes of zero quality and retweet only one another) in the social network. They found that...when the percentage of bot infiltration exceeds 1 percent, poor-quality information propagates throughout the network. In real social networks, just a few early upvotes by bots can make a fake news item become viral.
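For the curious, the mechanism the two excerpts describe is easy to play with at home. Here's a minimal Python sketch of that kind of model — to be clear, this is my own toy version with made-up parameters, not the OSoMe code: agents have a fixed-size feed (the attention limit) and, when picked, either post a new meme of random quality or rebroadcast a random meme from their feed; a `bot_fraction` knob adds agents that only ever post zero-quality memes.

```python
import random
from collections import Counter

def simulate(n_agents=100, attention=10, steps=20000, p_new=0.3,
             bot_fraction=0.0, seed=42):
    """Toy attention-limited meme diffusion. Agents 0..n_bots-1 are bots
    that only post quality-0 memes; everyone else either posts a new meme
    (prob p_new, random quality) or rebroadcasts one picked at random
    from their feed. Feeds hold at most `attention` items."""
    rng = random.Random(seed)
    n_bots = int(n_agents * bot_fraction)
    followers = [[rng.randrange(n_agents) for _ in range(5)]
                 for _ in range(n_agents)]   # who sees each agent's posts
    feeds = [[] for _ in range(n_agents)]
    quality, shares, next_id = {}, Counter(), 0

    for _ in range(steps):
        a = rng.randrange(n_agents)
        if a < n_bots or not feeds[a] or rng.random() < p_new:
            meme, next_id = next_id, next_id + 1     # post a new meme
            quality[meme] = 0.0 if a < n_bots else rng.random()
        else:
            meme = rng.choice(feeds[a])              # rebroadcast from the feed
            shares[meme] += 1
        for f in followers[a]:                       # deliver to followers
            feeds[f].append(meme)
            if len(feeds[f]) > attention:
                feeds[f].pop(0)                      # oldest item falls out of attention
    return quality, shares
```

Even with `bot_fraction=0` and purely random choices, the share counts come out heavy-tailed (a handful of memes dominate), and raising the bot fraction drags down the average quality of what spreads — which is the study's point in miniature.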

There's constant hand-wringing over "misinformation" (including in the article, with its obnoxiously leftist slant). But this study shows pretty clearly that the root problem isn't algorithms or propaganda per se.

It's information overload.

As people get exposed to more and more info, their ability to engage with it goes down, and so they prioritize low-quality info (much of it not even political) because that's all they have the mental bandwidth for. This process can be accelerated by bots, but they're only catalyzing a reaction that's already taking place, like gasoline on a fire. People who weren't there will have a hard time believing it, but >redditcostanzayeahrightsmirk was originally a place where you could have deep, thoughtful, text-only discussions with people. Then it devolved into a place to share cat pictures. Why? IT GOT TOO BIG. And low-quality shit rose to the top.

Consider what this means. Maybe the Dead Internet isn't a conscious plot or sinister AI. Maybe it's an emergent property, something that happens naturally in an information-sharing network when people are forced to deal with more info than they can handle. Maybe the Internet feels dead and hollow because the sheer SIZE is sucking the oxygen out of the room, creating an environment that's toxic to high-quality, thoughtful content. The traditional framing of Eternal September is that communities that get too big fail because large communities acquire members faster than they can internalize culture. But maybe this is overthinking it. Maybe the problem is just the size itself. Force people to process too much info, and you get garbage.

And with this in mind, maybe the solution is less information OVERALL. Taking the study's methods as inspiration, this could be done with just a random number generator: if you have a feed of 1000 items, make a system to pick 10 or 20 at random. Trim it down. Cut back the firehose of content. And interestingly, this is a solution that both sides might agree to, as long as it was truly random. Wouldn't even have to trust Silicon Valley: you could implement this across various sites with a Greasemonkey script.
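As a proof of concept, the whole "trim the firehose" idea fits in a couple of lines. A real userscript would do this to the site's DOM in JavaScript, but the logic is identical; the function and parameter names below are my own invention:

```python
import random

def trim_feed(items, keep=15, seed=None):
    """Uniformly sample `keep` items from a feed. No ranking, no
    engagement signals, just a random draw -- so there's nothing
    for anyone to game."""
    rng = random.Random(seed)
    return rng.sample(items, min(keep, len(items)))

feed = [f"post-{i}" for i in range(1000)]
print(len(trim_feed(feed)))   # 15
```

The `min()` guard just means feeds smaller than the cutoff pass through whole.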

Of course most people will never do this on their own: it has to happen at the site level to make a big impact, and that's unlikely because SV likes ad money above all else. But it does shift the discussion from debates over what information should be allowed to simply helping people cope with the volume of information OVERALL.

In short: What happens when social media crams more info into people's heads than they can handle?

This:
[GIF]
 

Plasmawiz

Resident biologist
Joined
Aug 21, 2021
Messages
33
Reaction score
58
Awards
22
It's something I've been thinking about for a few years as well. I agree that the "dead internet" is an emergent property of how people react to the expanding landscape; in some ways it was inevitable. For those here who still read books: I find the current state of social media to be like having a box of 100 books mailed to your door, all of which you have to skim through before the next batch arrives. It's just impossible to get anything meaningful out of anything you read that way.

In my last year of HS (2017), I gradually had to come to terms with the fact that my brain could only handle so much information before my quality of thought started to deteriorate. This was a year when I was studying for my finals, and of course the beginning of what I like to call the post-2016 fallout period, when social networks ramped up the aggressiveness of their algorithms.
I got into the habit of regularly vetting what I followed and what services I used based on how much value they gave me, simply because if I didn't, I risked the very real possibility of overload (I'm a sperg, that shit can fuck me up). These days I can go through all my internet "feeds" in an hour or two and get on with my day. As for video content, I can just spread out whatever the people I follow upload throughout the week with no issues.

In some ways it's an issue of digital hygiene; there's a reason you can tell how someone behaves online just by looking at how many people they follow. IMO, keep your follow count on any service to ~100 at a maximum; any more and it risks becoming unmanageable.

EDIT:
Should also elaborate that I reject the algorithmically sorted "feeds" that modern sites have. I only have feeds of people I follow, sorted by upload date. AI driven feeds are designed to ensnare you and induce overload.
 
Virtual Cafe Awards

Crypto

Traveler
Joined
Jun 1, 2021
Messages
44
Reaction score
99
Awards
26
Website
tilde.club
I feel like this is a natural progression for the internet. Engagement and focus are all that matters to these companies, so naturally, if you could create an AI or bot to increase the engagement of actual people, you would, so you can get those sweet ad clicks.

Compound that with tons of websites doing this, or even unleashing bots on their competitors' platforms to sow discord, and we have a zombie internet full of astroturfing campaigns. Obviously social media is the easiest place to see it, and IMO >redditcostanzayeahrightsmirk is ground zero for a lot of the astroturfing going on nowadays.

To fix it we would need a system similar to Web 3.0: some way to either make it expensive to be disingenuous, or make it easy to identify fake accounts and fake people, whether by tying identity to existing services that are legit or, better yet, using a blockchain-based system that users can verify themselves.

I think a place to start would be breaking away from social media; places like this are almost an oasis in the desert of low-effort, shitty posts. The only way to take back control would be moving to different websites or using open-source software. The problem is that different websites can just become corrupt, the same way the current social media empires became corrupted. Ideally, the future of the internet should be decentralized and verifiable.

I think it will come eventually. We could even see a fragmenting of the internet in two: one half with services and websites where you're the product, which are considered "legit", and another where decentralization and open-source software reign supreme, and regular internet users are discouraged from using those systems.

I'm not sure. I have hope for the future, but the present is definitely scary.
 
Virtual Cafe Awards

Noct

Internet Refugee
Joined
Oct 2, 2021
Messages
7
Reaction score
22
Awards
3
I don't think it's an either-or thing. In fact, I would say that you and IP are talking about two pieces of the same puzzle. If you tried to introduce the kinds of bots we're seeing today into the old internet, they would immediately be called out. In order for them to work, you need an environment that's being overloaded with low-quality information. Once that has been achieved, subversion becomes significantly easier. This is an emergent property, but it's one that malicious actors have pounced on.
 
Virtual Cafe Awards

niuenso

Traveler
Joined
Oct 10, 2021
Messages
47
Reaction score
87
Awards
21
Website
niuenso.com
To both @Plasmawiz and @Merek

What do you think is the antidote to this emerging issue? I'm asking because I'm genuinely interested. I'm personally very concerned about the centralisation of the web and the fact that people are not only losing interest in having personal sites, personal corners of the web, and independent places like this forum where they can share and engage in conversations, but are also losing the skills required.

More and more people are happy to follow the flow and the flow pushes towards more social media, more garbage.
But is there anything that can be done?

@Crypto :

> i think a place to start would be breaking away from social media

How do you do that, though? I mean, from a personal standpoint it's easy, but how do you convince a large number of people to move away from social media? And also, if you convince them to move away, where should they go? They can't all go to the same place, otherwise we're back at square one.
 
Virtual Cafe Awards

Taleisin

Lab-coat Illuminatus
Bronze
Joined
Nov 8, 2021
Messages
636
Reaction score
3,317
Awards
213
I have some relevant stuff about networks, psychology and the dynamics of some of the issues arising from DI information control.
I really suggest you read through these if you want to go deeper on this subject. They're surface-level, but they're angles you won't necessarily have encountered before, so they should give you a clearer perspective on this.

https://pubmed.ncbi.nlm.nih.gov/34044349/ (social network structure correlates to brain network structure)

View: https://youtu.be/oYp5XuGYqqY
(Donald Hoffman- do we see reality as it is?)

I'm happy to discuss this more if you find it relevant to your personal thought-stream. I'm a neuro student, btw.
 
Virtual Cafe Awards

exotika

Internet Refugee
Joined
Nov 7, 2021
Messages
23
Reaction score
68
Awards
13
Yep, noise is all over the place. Information overload reinforces the heuristics behind the acceptance of low-quality content. Add to that the fact that many forms of participation are systematically suppressed, so typical activity pretty much concentrates on sharing and copying trends. The internet is vast, but it's getting homogeneous.

I also don't think there's a global plot behind the current state of affairs; it's just very tempting to look for some ultimate cause that is easy to put into words and provokes a strong affective reaction. It may seem intuitive, since there's an evolutionary bias towards detecting intentionality in random events. Incidentally, such vivid, emotionally engaging explanations fit the system and get coverage.
 
Virtual Cafe Awards

E R I L A Z

Young Wehrwolf
Joined
May 5, 2021
Messages
89
Reaction score
229
Awards
46
There's an age old saying that I feel like I say more and more every year:
"All things in moderation. Even too much of a good thing can become bad."
"Everything in moderation, even moderation!"

I really enjoy your thoughts about how information overload could be the cause of all this horseshit we see now. I guess there's some inherent desire to see order in the chaos of life. But, having been on the net for as long as I have, I can't help but draw connections between the deterioration of the quality of the web and the interests of powerful people who have no one's best interests at heart. I also refuse to believe that most people have lost their way just because of the way of the world.
 
Virtual Cafe Awards

uG85fWMlXXOh

I think we're seeing a combination of factors at play here: On one hand, the internet is naturally prone to this now because of how much low-quality information is floating around (partly because people choose to get their opinions from bought and paid for influencers, rather than anyone with any actual knowledge on a topic), and as a result of this, certain powers are using the situation to inject their own narratives via bots.

So if you're an oppressive government, for instance, and you are accused of some heinous human rights violations, your best bet is to mobilize an army of bots to post whatever your damage-control narrative is, then share it around as much as possible. Even better, if you can buy some influencers, then you can have bots and real people corroborating each other, which makes things feel a lot more natural to onlookers. Eventually the information will trickle down to the masses, and your narrative will be solidified in the common thinking. Your opposition might do the same thing, obviously, and then you have two narratives fighting each other - which is exactly what we are seeing right now with the so-called "culture war".

I have no doubt all the big popular breadtube channels are getting their information from low-effort bots mobilized by corporations and governments to push whatever agenda. These groups don't even have to target the specific channels - the channels will be willingly looking for information that conforms to their existing worldview, so all the bots have to do is spam it enough times and they will find it and latch on to it. While some of these people will be bought outright, I feel many of them are simply duped by information overload (especially when it all points in the same direction); then that garbage gets fed out to their followers and forms public opinion. At the same time, the rightoids go looking for the opposite information, where the bots from other interests are spamming the opposite narrative, and they fall for the same thing. To me, this seems to be why both sides of politics always consider the other side stupid - to them, their facts are overwhelming and obvious, and the other side's facts are unbelievable and fake.

Eventually it gets so bad that these influencers become incapable of independent thought, which is why we see "free speech warriors" calling for censorship and communists defending big corporations. They don't think for themselves anymore; they just go where the "reliable information sources" (aka the bots that were initially set up to guide them to a specific talking point) tell them to go. People often apply the NPC meme to mindless consumers of content, but I feel like it more often fits the content creators themselves.

The worst part about this is that it's largely consensual (because people are looking for information that supports their existing world view), and will literally never end as long as vested interests have the ability to deploy bots.