Several friends recommended that I watch the Netflix documentary “The Social Dilemma.” It shows, through interviews with Silicon Valley tech leaders, documentary investigation, and a narrative drama, how social media manipulates users to stay logged on longer and sign in more often. For instance, Google creates a granular profile of each user, then uses a personality algorithm to offer personalized content based on what that individual is most likely to click on.

The film captures how technology that was initially designed to entertain and benefit people by connecting them with others turned into something nefarious. It’s a real-life example of something designed with good intentions ending up causing harm.

One of the most disturbing aspects of social media, as highlighted in the movie, is its effect on teenagers. There’s an alarming correlation between the use of platforms like Facebook, Snapchat, and Instagram and the rise in depression, anxiety, and suicide among teens. Experts link the decline in mental health to the pressure teens face to be constantly online, post photos that project a certain image, and get others to like their content.

A report by Common Sense Media says teens spend more than seven hours a day on their phones, not including time for schoolwork. Much of that time is spent on social media. This can cut into their sleep schedules and negatively affect their behavior, especially if they’re subjected to online bullying or unrealistic views of other people’s lives that leave them feeling left out.

Mobile devices are now ubiquitous, which means being online is always just a click away. Teens and adults staring at their phones in day-to-day life in Shakopee, in pretty much any setting, is now commonplace. Big Tech knows this and wants to monetize every click a person makes on social media.

Another part of “The Social Dilemma” that interested me is how algorithms direct people to certain content based on their personal profile. It’s why, according to an internal Facebook report, 64% of people who joined extremist groups on Facebook did so because algorithms steered them there.

To me, this is when social media gets scary. Companies like Facebook are not only supporting extremist groups, they’re actively profiting from them. It’s not a stretch to think content from these extremist groups ends up creeping into non-extremist social media circles, which impacts the rest of us, even here in Shakopee.

I say this because I often see people in my social network, who aren’t connected to each other, posting the same extreme political content. This is happening with increased frequency as we get closer to the election. I call the posts extreme because of the language or imagery that’s often vulgar or violent. In many cases, the posts are factually wrong, but they strike a chord with people who want to believe they’re true and aren’t willing to verify before reposting.

Extreme content, of course, leads to more division and animosity. For example, I believe social media is at least partially responsible for the politicizing of wearing masks. I highly doubt many people would equate wearing a mask inside businesses during a pandemic with the government trying to take away a freedom, unless they saw the idea on social media and identified with the poster’s politics.

When people talk about social media influencing elections, this is the type of activity they’re referring to — posts that encourage people to think, and then vote, a certain way, without regard for facts.

I see a lot of people blaming the news for many of our societal problems and saying that if we turned off the news, many of those problems would go away. To be fair, some, if not most, news outlets do skew their coverage to drive ratings. But if we really want many problems to go away and have people start thinking for themselves, we’d put our phones down and scale back our use of social media.

Brett Martin is a community columnist who’s been a Shakopee resident for over 15 years.