A while back, I was reading a book called Fall by Neal Stephenson. Full disclosure: I haven't finished this book. It's long, and its focus shifted away from what I really found interesting in it. That's probably my own failing, but it didn't hold my attention past about the halfway point. This post, however, is about the part of the book that I did find interesting.
It’s kind of a strange, sprawling, not-so-distant future sci-fi. Its focus is split between (roughly) two distinct stories. In the main story, an older man named Dodge dies, and his brain/consciousness is mapped and uploaded onto a server. Many years later, a young programmer figures out how to “wake up” Dodge’s consciousness. This is the part of the book that I lost interest in around the midway point.
The second focus of the book is on the outside world, and the various ways in which humans are dealing with (mis)information, false media narratives, news bias, and social media silos. I really wanted the book to be about this. As an example, near the beginning of the book, a fake nuclear attack on a small town outside of Moab, Utah was staged via a widespread media blast coordinated with an internet and phone line shutdown for that section of the US. There were well-made press conferences, fake nuclear blast videos that looked like they came from airplanes, and widespread uncertainty throughout everyone’s social media and video feeds.
In other words, something that could EASILY happen in a situation like ours. Not-so-distant future, remember? Some of the characters end up figuring out that the hoax is happening (one of them is the CTO of a major social media company, so he feels a responsibility to uncover the real story and stop the spread of the misinformation campaign). Suffice it to say, this one moment has a massive impact on the US over the course of the next couple of decades. There are "Moab truthers" (people who believe a nuclear attack happened, when it did not). A new form of the internet is born in which people are given private internet IDs so they can be easily identified.
The story skips ahead, and there has been some kind of secession in the US: there is now the regular USA, and what looks to be a sovereign country called "Ameristan." Further, you find that everyone has a different, self-selected version of their access to the net, including news channels, articles, entertainment, etc. Some people just have algorithms choosing their information feeds for them. Others (the wealthy) tend to have personal news/media curators combing through news and media and deciding whether each item goes into the person's net feed or not. This leads to massive polarization, because these feeds bend and shape how each person sees the world: what it is, what it's not, who's in control, and what is true of the past and present.
The thing that struck me the most about this depiction is just how close it is to our own reality.
I’m not necessarily worried about secession or a civil war right now. But this whole notion of people splitting their feeds, breaking off into other forms of social media so they aren’t censored, and choosing the kinds of information they get (or being radicalized by information chosen for them) seems especially… prescient right now.
Since election day last week, I have seen multiple people claim that they are getting off of Facebook (not necessarily a bad thing, IMO). But instead of doing that to create a sense of calm or peace, they are simply moving to platforms where their information is not censored: Mastodon, Parler, Gab, etc. I don't know if this is necessarily a bad thing, but doesn't it come with a cost that we have a difficult time foreseeing? Facebook, because of its massive popularity, is a platform that pretty much everyone can use. Therefore, we have a bigger opportunity to encounter those with differing viewpoints.
Don’t get me wrong. Facebook and Twitter have their own (MAJOR) problems. Their algorithms select for the most outrageous posts, videos, and images. When, as Tristan Harris says, the economics of the social media companies we use revolve around selling our attention, we will inevitably lose any autonomy we might have over what we consume — and ultimately what we think and how we see the world.
But when we start siloing ourselves within a particular social network with only those who agree with us, doesn't this present an opportunity for radicalism to spread unchecked in these groups? A form of groupthink can emerge, and this can also produce cult-like behavior, where real, lasting harm is done to individuals.
I don't know what the proper solution is here. Neither do the tech whistleblowers we see in movies like The Social Dilemma. They see the problem, but (again, as Tristan Harris says) it's like saying climate change is a problem. That's true, but any possible solutions here are necessarily complex, because the entire internet, and therefore the entire digital economy, is inexorably interwoven with the attention economy embodied by the behemoth tech companies of Google/Facebook/Amazon/Apple/Twitter.