Recently, I disabled all social media notifications on my phone. Along with marrying my wife and having a child, this is one of the best decisions I’ve made in recent history.
I just went to Settings > Notifications on my phone, and turned off the notifications for all social media apps. It was that simple. Then, social media faded away. And the craziness of the world faded with it. Social media no longer has any power over me, and my life is so much better without it.
Most of my several hundred “friends” on social media are acquaintances. While I appreciate all the people I’ve met, I don’t want to share the intimate details of my life with all of them. And generally, I don’t want to hear their opinions, even if they’re the same as mine.
Social media is a relatively new experiment unleashed upon humanity. In the same way that COVID-19 is a new virus, we don’t know the long-term effects social media will have on society. Sure, there are a few positive aspects. In the same way COVID-19 inspired me to exercise more and eat a little healthier, social media helps me stay in touch with long-distance friends. However, in the grand scheme, like COVID-19, I’ve seen social media do much more harm than good.
Social media went so wrong, in so many ways, it’s difficult to summarize. Maybe it started with the monetization of the platforms. Or the introduction of like buttons. Or the legions of bots and fake accounts. But for me, the final straw was the introduction of machine learning algorithms that use your own psychology against you.
As the owner of a software company, and a designer and developer, I’m not entirely ignorant of the technology and design techniques employed by social media companies to keep their users engaged.
I began noticing the negative effects social media was having on my mental health years ago. While I was never addicted, even a casual involvement was taking a toll. So, I started scaling back. After disabling all notifications, I felt so much better. Then, I began contemplating how social media went so wrong.
Now, I’ll take you on that journey.
Online communication has always had problems, particularly anonymous communication. It can bring out the worst in people. Fifteen years ago, the worst of it usually involved cruel insults and unsolicited opinions in chat rooms. Remember those?
At least you understood what you were dealing with — an asshole. Sitting in the safety of their bedroom, hiding behind a username like LimpBizkitFan420.
That was fine. I can deal with assholes. Learning to deal with assholes is a fact of life.
Often, inflamed chat room debates motivated me to research a topic further. In the process, I would learn something. Sometimes, I would learn the asshole was right. Sometimes, I would learn an asshole is just an asshole. In some cases, I learned that I was the asshole.
So, what defines an asshole?
Is it somebody with a different opinion than my own?
No. I have tons of friends and family with different opinions than mine.
Is it somebody that voices their opinion in response to my opinion?
No. If I’m voicing an opinion, I’m inviting opposing opinions.
An asshole is an egotistical and inconsiderate person. Inconsiderate of the thoughts, feelings, and opinions of others. It’s somebody that voices unwavering and unsolicited opinions, then vilifies the people that oppose those opinions.
With the launch of Facebook, we blocked the assholes. It should have been great! However, in the process, we created something worse.
Facebook introduced online accountability. Theoretically, each account is associated with a real human, and “friends” are people that you have probably interacted with in real life. If you know a person in real life, you’re less likely to be obscene, offensive, or annoying in their presence — hopefully. If not, you probably don’t have many friends.
At first, it worked. Facebook held users accountable. It also provided the power to block all those assholes. In theory, it should have eliminated the spread of unsolicited opinions and reckless information. Instead, it created an echo chamber.
As defined by Wikipedia, an echo chamber is a metaphorical description of a situation in which beliefs (opinions) are amplified or reinforced by communication and repetition inside a closed system, insulated from rebuttal.
Gradually, people became more comfortable sharing unsolicited opinions and reckless information. Those opinions were then reinforced with “likes” and a kiss-ass choir of comments. So, what begins as a malleable seed of an opinion gets fertilized. Then, it grows into a mighty oak. When an opinion is that strong, it can be misconstrued as fact. And that’s dangerous.
Social media didn’t start this way, but Facebook and other platforms have evolved into dangerous echo chambers. They polarize people, and create divisive groups. Essentially, they slowly turn non-assholes into assholes, without them ever being aware.
Then, came the bots.
Bots, Not Robots
If an echo chamber is fertilizer for opinions — bots are Miracle-Gro.
Bots have been crawling the internet for a long time. They are not new technology. However, echo chambers are the perfect environment to provoke radicalism using bots. The right meme, shared to the right group of people, at the right time, is a perfect tool for inflaming opinions and dividing people.
All you need is one “friend” who friended a fake account — that’s a bot. Then, you’re susceptible to false information being shared as if it were fact. Unfortunately, most people don’t dig into the truth of that information using services like Snopes. They just accept the shared headline as factual because it’s already aligned with their opinions. So, they share that false information, then keep scrolling.
It’s happening right now on social media, with Russia and other foreign actors creating fake bot accounts. A country divided is easier to manipulate, and they are aimed at unraveling our society from the inside out. It appears to be working.
Remember the movie Inception with Leonardo DiCaprio? That movie isn’t about dreams within a dream. It’s about the extremely difficult task of changing somebody’s mind. In order to change their mind, you have to implant an idea so deep in their subconscious, they believe the thought is their own. It’s nearly impossible, as illustrated by the movie. Particularly after an echo chamber strengthens an opinion into an extreme belief that becomes part of a person’s identity.
However, humans have invented a technology capable of changing minds. It’s called artificial intelligence, and Facebook is one of its biggest investors.
The term “artificial intelligence” has been used in gaming almost since games began. In the beginning, artificial intelligence was really just conditional code: a series of conditions stating, “If this happens, do this.” It’s a tree branch of possibilities. With enough possibilities, the code can seem intelligent. However, that’s not true artificial intelligence. Ultimately, those possibilities are limited by the developer.
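To make that concrete, here’s a minimal sketch of that old-school “AI”: a toy game opponent built entirely from hand-written rules. The function and its rules are hypothetical, invented for illustration; the point is that every “decision” was anticipated by the developer, so nothing is learned.

```python
# A toy game "AI" made of conditional code: a fixed tree of if/else rules.
# It can look intelligent, but it only ever does what the developer wrote.

def npc_action(player_health: int, npc_health: int, player_distance: int) -> str:
    """Pick an action using hand-written rules (no learning involved)."""
    if npc_health < 20:
        return "retreat"      # survival rule
    if player_distance > 10:
        return "approach"     # close the gap
    if player_health < 30:
        return "attack"       # press the advantage
    return "defend"           # default branch

print(npc_action(player_health=100, npc_health=15, player_distance=5))  # retreat
```

No matter how many branches you add, the behavior is bounded by the rules the developer thought to write down.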
The type of artificial intelligence that Facebook and other large tech companies are investing in is referred to as Machine Learning. Machine learning is fundamentally different from the artificial intelligence of old. It’s code teaching itself how to perform a task more efficiently based on the data it receives. The key here is that it’s teaching itself, at an exponential rate.
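The contrast with conditional code can be sketched in a few lines. This is a deliberately tiny example, nothing like the systems Facebook actually runs: a one-parameter model that adjusts its own weight to reduce prediction error on the examples it’s fed, rather than following rules a developer wrote.

```python
# A minimal sketch of "learning from data": the program tunes its own
# parameter to shrink its prediction error, instead of following
# hand-written rules. Here it learns y ≈ w * x from examples.

def train(examples, steps=1000, lr=0.01):
    w = 0.0                          # start knowing nothing
    for _ in range(steps):
        for x, y in examples:
            error = w * x - y        # how wrong is the current guess?
            w -= lr * error * x      # nudge w to reduce that error
    return w

data = [(1, 2), (2, 4), (3, 6)]      # hidden pattern: y = 2x
print(round(train(data), 3))         # converges close to 2.0
```

The more examples you feed it, the better its guesses get. Scale the parameter count up by billions and feed it the behavioral data of a few billion users, and you get the kind of system the rest of this piece is worried about.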
You know those “prove you’re not a robot” reCAPTCHA quizzes you’ve encountered online from Google? Usually, they involve choosing all the images that have traffic lights — or something similar. Those quizzes are not really for proving that you’re a human. In reality, you’re providing labeled training data for machine learning software, like the systems used in self-driving cars.
Social media provides a wealth of data for feeding machine learning technology. Data regarding your actions online, even your emotions, and what you’re currently thinking about. We’re talking monumental data here. And the more you use social media, the more data you’re feeding into the machine.
In some cases, that’s to my benefit. For instance, I want Netflix and Spotify to serve me music and media based on my preferences. Additionally, I don’t mind teaching cars how to drive themselves. However, when that technology begins using my own psychology against me, then we have a problem.
Machine learning is amazing. If properly harnessed, it could eliminate poverty, starvation, disease, and all the plagues of modern society. It could thrust humanity into a Gene Roddenberry-inspired utopian society. It absolutely has that potential. Unfortunately, it also has the potential to destroy mankind, and make humans a blip in the evolutionary chain leading up to its “birth.”
It’s only a matter of years, and not as many as you may think, before humans are not the most intelligent beings on the planet. The AlphaGo machine learning software, developed by DeepMind and owned by Google, defeated the world’s best human Go players — a game vastly more complex than chess. That was five years ago. It’s come a long way since then.
The question of whether machine learning will save us or destroy us is an ethical question. In short, its motivations should be altruistic. Unfortunately, that’s not the machine learning Facebook is currently employing on their platform. It’s machine learning motivated by greed. It’s designed to keep us engaged and coming back for more, just so they can serve us the right content or advertisement at the right time. It uses subtle and intentional manipulation, and it’s very good at its job, which is using our own psychology against us. Like Inception, it’s slowly changing our minds.
Social media knows more about us than you would believe. It knows how long I’ve stared at an image. It knows when I start to write a comment, but then decide not to. It knows when I pause, “Should I add a heart, or smiley face emoji?” It knows my mood, what makes me happy, and what makes me angry. It knows my opinions and beliefs, and how to serve me content that I can’t resist clicking. It’s not just a map of what I’m thinking, it’s how I’m thinking. With that kind of data, it’s possible to psychologically manipulate people into doing what you want — like a cult leader. And what they want is for us to keep looking, and keep clicking.
Social media isn’t truly free. I am the product. I’m providing the data. I’m building the machine, and I’m paying with my own mental well-being. It’s hard enough to trust any news or media these days. I don’t want to struggle with whether or not I can trust myself.
Lifting The Veil
Once I lifted the veil, it became annoyingly obvious. At first, I tried to ignore it, but the notifications kept luring me back in. So, I shut off notifications. Then, birthday wishes, which I appreciate, still lured me back in. So, I made my birthday private. Then, my email inbox was flooded with messages from Facebook and LinkedIn. So, I unsubscribed. They’ll probably try knocking on my door next.
My generation will be the last to remember what life was like before the internet and social media. I consider myself fortunate to have grown up with modern technology — and to work with it on a daily basis. As a result, I understand it. I’ve learned to maintain a healthy balance with technology. I’m guilty of sometimes letting it go too far, but I’m aware of those times, and capable of dialing it back. Social media is something I’ve decided to personally dial way back.
I still use Instagram as a digital photo album. Often, I share those photos from Instagram to Facebook. While I appreciate the likes and comments I receive, I’ve let go of any obligatory need to respond. Every once in a while I’ll check in to Facebook or Twitter, throw a couple likes up, and then I’m outtie. That’s the extent of my involvement these days.
Since disabling social media notifications, I’ve begun jogging regularly. My wife and I take multiple daily walks through our neighborhood with our baby boy. We stop, and talk with our neighbors. Overall, it’s quieted the noise, and I feel a little happier.
“Some poor, phoneless fool is probably sitting next to a waterfall somewhere totally unaware of how angry and scared he’s supposed to be.”— Duncan Trussell