Nowadays, to abstain from social media is to be seen as some hippie-nomad-luddite type who won’t embrace new technology. I reject this. Instead, I believe that changing your relationship with social media might just be the best decision you can make for your mental health.
The capabilities of artificial intelligence are snowballing. Neural networks, loosely modelled on the human brain, are quickly outperforming humans at many tasks. In 2016, an AI system beat one of the world’s best players at Go, a game with more possible positions than there are atoms in the observable universe.
Looking ahead, these deep learning algorithms will be supercharged by efficiency gains and innovations in quantum computing, according to MIT Technology Review. Alongside these improvements, ever more sophisticated algorithms will be woven into almost everything you do on your phone. I argue that this integration is increasingly dangerous.
Since Facebook, like its competitors, monetises your engagement, its algorithms spoon-feed you content that keeps you glued to your screen as you gluttonously crave more. This means you often leave a social media session feeling drained in the same way you might after hitting the bottom of a jar of Nutella. In this sense, the feed you are served isn’t designed to make you feel good; it is designed to make you keep scrolling regardless of how it makes you feel.
Since rivals like YouTube, Twitter and TikTok all compete for your engagement, this dopamine abuse is baked into each app’s recipe, and it becomes more acute as the algorithms better understand our wants, our annoyances and our fears.
Sometimes I’m spooked by how eerily personal these algorithms can feel, like TikTok’s For You page. Have you ever felt that it knows how you’re feeling before you’ve noticed it yourself? If so, this isn’t a coincidence. With every swipe we allow intimate algorithmic analysis of our headspace. It’s no surprise that we lose hours to it and wonder where the time went.
Indeed, if you’re spending countless hours every day locked into this virtual trance, have you ever stopped to ask whether you actually want to be there? Just because cookies taste good doesn’t mean we should keep eating them until we’re sick; we ought to look after our mental health in the same way. Cal Newport makes this case in his critically acclaimed book ‘Deep Work’. His discussion of Attention Restoration Theory argues that the instant gratification of social media severely depletes your ‘directed attention reserves’, making it difficult to learn, work or even relax.
I am also deeply worried about what this near-symbiotic relationship means for children. Most adults grew up without social media, but today’s children are plugged into the matrix as soon as their hands can hold a phone. Facebook whistleblower Frances Haugen told the US Senate that the company is aware of the harm social media does to young people, but that shareholder pressure prevents the drastic reform required. Indeed, research has linked heavy social media use among thirteen-year-old girls to a higher risk of suicide in emerging adulthood. How can children be expected to regulate their screen time when they’ve known no other reality? If we don’t owe it to ourselves to act, we owe it to them.
Some would dismiss my case as scaremongering, arguing that the harms of social media are minimal and that resisting technological change risks holding back crucial human advancement. To that, I would concede that social media truly does have the capacity to transform human connection for the better. But the evidence shows that when we allow companies with eye-wateringly large profits to design these algorithms, the outcomes serve not the wellbeing of users but the pockets of their shareholders.
Just as we mandated seatbelts despite resistance from Ford, and just as we banned indoor smoking despite the protests of the tobacco companies, we must hold these platforms to the same standard: if they are to profit from us, they must consider our wellbeing. Demanding responsible and transparent algorithms is not unreasonable; it is necessary.