by Akshata Atre
I “quit” social media a while ago. I mean, I didn’t quit entirely. It’s actually kind of impossible to do that if you’re a college student who wants to at least maintain the appearance of having a social life. But I did scramble all my Instagram passwords, block Twitter, and only use Snapchat for maybe two months back in freshman year of college. I’m still, unfortunately, bound to my Facebook account, though. I just can’t find a way to not have one and remain even somewhat in the loop. I did delete the apps (the main one and Messenger), unfollow all the people I’m friends with, and remove a ton of my page “likes.” Yet, even with all those measures in place, I still get that dopamine rush when I do log in and see that I have a notification, even if it turns out to just be someone’s birthday. Something about those red bubbles with tiny white numbers in them involuntarily triggers that response in me, and I really, really hate it.
The most frustrating thing about the whole situation is that, of course, those red bubbles are designed to elicit that exact emotional reaction. That’s the whole point of them. They’re meant to get you to log onto the site or open the app and scroll. And scroll. And scroll. And for what?
Not to sell you products, but to sell you. Your attention.
And that’s a distinction that’s made clear in the new Netflix documentary The Social Dilemma. In the film, a group of former Silicon Valley tech executives and other experts dive into the inner workings of Facebook, Instagram, Pinterest, and the like. One of the key takeaways is that the algorithms that run these platforms are a) programmed to produce an outcome that is directly linked to profit; b) doing everything they can to keep you using the app or site longer in order to achieve that profit goal; and c) no longer fully understood even by the people who made them.
It was that second point that really stuck with me throughout the film, particularly when it was dramatized in one of the several scripted storylines. In this storyline, the algorithm is portrayed by three men behind a screen who are feverishly trying to hold the main character’s attention in order to win “bids” from various companies that want to put advertisements in front of him. We see how they use increasingly personal push notifications to draw him back into the app. Then, once they get him back, they start pushing extremist political content into his feed, sucking him in even further.
This story, although depicted in kind of a cheesy way, is not far from the truth. We’ve all heard about how terrorists use social media to recruit people, and even how YouTube’s algorithm pushes increasingly extremist content the further down the rabbit hole you go. The latter is, of course, a great example of how this drive to increase engagement (which is essentially the point of an algorithm like YouTube’s) can lead people to content that is largely opinion-based and nonfactual.
This essentially means that we no longer share a common understanding of what truth is, because we’re not all getting the same information when we open Google. The algorithm literally prevents that from happening: if we see the truth, we’re not always going to like it, and the algorithm doesn’t want that. It wants us to like what we see. So it shows us things we already agree with, whether or not they’re true, playing into our confirmation biases and keeping us scrolling, scrolling, scrolling, all the while collecting information that allows advertisers to target us so directly it’s scary. And the longer we scroll, the further we’re split into right and left, the lonelier we get, the sadder we get, the more anxious we get, and the further we get from reality and truth itself.
So back to the first point about algorithms: why are social media companies allowing their algorithms to push this kind of content and manipulate us this way? Profit. Their whole business model is based on finding ways to get your attention and then selling that attention, your attention, to advertisers, who are essentially funding these platforms. But it’s really more than just your attention; as Jaron Lanier puts it in the film, “it’s the gradual, slight, imperceptible change in your own behavior and perception that is the product.” Think Instagram influencers, extremely targeted ads, and the like. You don’t even realize that’s what’s happening half the time. And that is worth SO MUCH MONEY.
Money is what this all comes back to at the end of the film. These companies are making so much money that they have no reason to stop these out-of-control algorithms, to stop collecting our data, to stop selling us. A corporation (contrary to the infamous Citizens United ruling) is not a person. It has no conscience. It’s not going to make ethical decisions for the sake of being ethical. The only way the machine will stop is if it’s forced to by a monetary penalty or incentive. Regulations, taxes, restructuring the stock market so it’s no longer a short-term, quarter-over-quarter growth system: these are all solutions presented by the people interviewed in the film. And these options (excepting the third) have, in fact, worked in other industries.
Until those larger systems change, tech companies aren’t going to make any changes themselves. And those systems won’t change unless people demand that they do. As many terrible things as there are going on right now, I really believe that this is an important issue to address. Because so long as we as a society can’t even agree on what is actually happening, what the truth is, we are never going to reach a solution on any of the many pressing issues we’re facing. You can’t build a car if half your designers think they should be building an airplane and the other half think they should be designing a submarine. And you can’t address climate change, discrimination, or economic inequality if people don’t even agree that they’re real.
Thankfully, the interviewees do offer a variety of actionable solutions at the end of the film, which I won’t list out here because I highly recommend you watch it yourself. But what I would like to share are a couple of stanzas from Bastille’s latest song, What You Gonna Do???, which came out in July of this year. Their words are as poignant and timely as ever.
"Shake, rattle and roll / You got control / Got my attention / Make me tap and scroll / You got control / Got my attention
Listening, you got us listening / So what you gonna do with it? / You got us listening
So what you gonna do? / Now, what you gonna do with it? / Make me paranoid / Love me, hate me, fill the void / What you gonna do with it? / So who am I? You decide / Inside out, you read my mind / What you gonna do with it?"
And since we can’t all get in the faces of tech CEOs to ask them this question, watch The Social Dilemma (and maybe also read The Circle and reread 1984), and then do what you can to stop being a product and to find the unfiltered truth.