Originally published on Patreon.
When the winter comes, many days grow cold and anxious for me. My eyes glaze over the world with fuzzy vision and little interest, and I find myself a paranoid, chilly ball of sad stress. It’s been this way for as long as I can remember, and when my environment and I just can’t get along, I’ve always found myself losing myself in computer screens, television screens, phone screens, healthy and sometimes unhealthy habits, temporary obsessions, and whatever else I could melt into long enough to forget there’s a world around me. The doctors call it dissociation, or depersonalization, and that shit mixed with anxiety is the fucking worst.
In the past, I’ve been able to curb those moments where I lean too far into the other side with simple techniques, most of which I learned through therapy or in books on mental health. Or, for several years, I’d just go outside and smoke a cigarette. Something about those few moments of hazy peace is divine. Nonetheless, since I stopped smoking cigarettes a few years ago (alhamdulillah), I no longer have those grounding moments outside. I’ve tried other (healthy) things to scratch that itch, and none of them come close to the real thing. That is, until I finally gave in at the beginning of the COVID-19 quarantine and downloaded TikTok.
I began using the app casually to stay abreast of social media trends, since I worked as a social media account manager at the time. But over the last few months, as anxiety and winter blues return me to constant “zoning out” moments, I can get lost scrolling in that app for a long time. I’m not glued to the app for several hours on end, nor am I spending copious amounts of time and energy learning new viral dances like the youth are, but I am picking up my phone when I feel myself zoning out and getting lost in an endless stream of bite-sized, vastly varied, mostly useless, candylike videos that make me feel a full range of emotions in just seconds.
A popular and growing genre on the app is the “political education” video, where users use the video format to educate viewers on everything from current struggles and news items to historical events. Truth be told, it is simply a poor platform to attempt this on, even if some claim otherwise. Consider the inevitable, complete lack of context in such small, short videos; the poorly designed, poorly functioning comments section; the intentionally addictive “infinite scroll” layout, which makes reading descriptions or clicking links an afterthought and has no natural “end” for our brains to process; the hyper-tailored rabbit hole of an algorithm, which so easily leads users down violent pathways; and so forth. The machinery of TikTok, like Twitter and other popular content-driven apps before it, is concretely counterproductive to generative political education.
“You know how some people don’t really like bell hooks because of the fact that she doesn’t tend to cite anyone — she just kind of makes a lot of points and doesn’t back it up? That’s TikTok as political education,” says writer and my friend Najma Shariff, who I reached out to for thoughts on the matter. “There’s a lot of people stringing words together as ideas that they think make sense, that they’ve regurgitated from many iterations of online social justice and political education. First you think about Tumblr, then you have Twitter.”
Shariff says that there is very little citation practice on the app, and whatever citation exists there does so in a vacuum of decontextualization, leading to reductionist and/or inflammatory arguments. This isn’t much different from how political discussion and education exist on Twitter, a spectacle-based social media app with very similar engagement mechanisms to TikTok’s; the main difference is that TikTok is ultimately a platform purely for video content.
“The difference between Twitter and Tumblr political education, and TikTok, is that [on Twitter and Tumblr] you can actually link to things, and you have to cite sources to a degree. And people can fact-check you a lot quicker than they can on TikTok,” Shariff explains. “On TikTok, you have the ability to be a tyrant. You can just turn off your comments if enough people agree with you and like your video, or if you have people dissenting in the comments, you can just turn that off. It’s very easy to manipulate people on TikTok, which doesn’t help with political education because we’re supposed to be engaged in discussion together, not just say our piece and leave.”
All content, political or otherwise, must compete within a new beast of an algorithm that researchers are still trying to fully understand. This means that regardless of the content’s theme, it must stir enough fast outrage, quick intrigue, or rapid interest to be promoted to users’ “For You” pages and gain tons of views, which in turn creates a feedback loop within the algorithm. This is what tells the app to continue spreading the creator’s content, and in turn signals what type of content to suggest to individual users. Once I watched and liked a few highly viewed videos featuring ballroom content, for example, the algorithm quickly determined I was probably Black and gay and suggested content to me accordingly. It wasn’t wrong.
With such a sharp-tuned algorithm, one which inevitably assumes the user’s race, gender, political leaning, hobbies and interests, then endlessly suggests content based on those things while constantly sharpening its understanding of you, users can build a platform on the app pretty quickly. Manipulating the algorithm and clickbait viewing trends can turn a user from 100 followers to 10,000 overnight, and unlike other social media platforms which use some sort of “friends” or “mutual followers” list to empower users to assess someone, TikTok basically promotes strangers to prominent roles on the For You Page for the sake of virality.
The small but mighty app is owned by ByteDance, a Chinese multinational corporation with multiple massive internet/digital ventures and a valuation hovering around $400 billion. Parent company aside, TikTok itself is valued around $50 billion, and with roughly $2 billion in revenue in 2021, it’s on track to be the most downloaded and highest-grossing app this year by a longshot. Oh yeah, and the app had over 1 billion active users across the globe in 2021, up tremendously from 55 million global users in 2018. During the pandemic TikTok exploded, generating ridiculous profits for the app as well as a burgeoning class of (mostly white) “TikTok influencers” who make millions from their content and product placement partnerships on the platform.
Keep in mind as you read the truly massive (and quickly-amassed) numbers above that: some estimates say around $330 billion could end world hunger; we’re living through one of the most violent, rapid, and largest transfers of wealth in world history (no matter what they say about a “Great Resignation” and however they want to phrase it or walk around it, that’s what we’re experiencing); the predatory IMF claims around $50 billion should be ‘invested globally to end the pandemic’; Amazon, the environmentally catastrophic sweatshop package-delivery service that has its hyper-exploited employees literally peeing in bottles while it illegally crushes unionization efforts, is worth trillions thanks to the pandemic; TikTok and ByteDance devour the national economies of a good portion of the world; people globally are dying each day due to lack of access to basic yet necessary medicine like insulin, and just this year I had to ration my asthma medications in the midst of a respiratory crisis because I couldn’t afford to refill prescriptions.
In some regards, the short videos have the ability to effortlessly expose the mundane ills of capitalism. On the other hand, the premise of these exposés is often to make viewers laugh or make the creator profit, thus overshadowing the actual ills and exploitations they seek to illuminate; or, in other instances, the videos are simply so decontextualized and seemingly ‘random’ that viewers find themselves saying “that’s horrible” before swiping up.
The influencerization of exploitation and of the gig economy reigns supreme on the app, in between videos of millennials making jokes out of their workplace exploitation, or at times even making videos directly from their workplace. In one video which I now cannot find (another problem of the app is that it’s intentionally impossible to find older videos), a young Black man sets his phone up at a table and simply sits down; he’s wearing an Amazon employee shirt inside an Amazon warehouse. The words at the top of the video read something like “let’s see how long it takes my boss to come tell me to get back to work,” and within just a few seconds of him sitting down, you can hear a woman in a familiarly managerial tone tell him to “get back to work” because he’s “already taken” his bathroom break. Some of the comments on the video were from people laughing at the circumstances, others were appalled, and others were actually upset with him for daring to sit down.
In another video, a young Black woman describes how she and her best friend decided to get a part-time job at an Amazon warehouse to “make a little money on the side.” She records her first day, and narrating over the video, she explains that they were given almost no training before being thrown onto the warehouse floor. Her narration is quite funny, but it’s also insightful: on the very first day she already felt harshly overworked, and on her lunch break she ultimately decides that she’s driving home and never going back. The main takeaway from the video should be how completely awful it is to work for the company that literally wouldn’t let workers leave during deadly tornadoes, but instead that takeaway is drowned out by the quick comedy of it all.
TikTok is also overrun with corporate commercials, product placements, and general propaganda that plays on loop with the rest of the content. It’s impossible to scroll for 5 minutes without getting tons of sponsored ads, like the ones (literally) made to trick you into thinking they’re just “regular” TikToks, not advertisements. If you use the app you’ve probably seen the ridiculously annoying barrage of commercials telling you “why working for DoorDash is amazing because you are your own boss.” Or perhaps you have seen paid influencers sipping Bang energy drinks with the label always facing the camera; Bang energy drinks, by the way, carry three times more caffeine than a cup of coffee and may lead to strokes.
As previously mentioned, the algorithm becomes attuned to users based on several key markers, and at times it’s so closely aligned with aspects of your personality or interests that it’s shocking. Not to mention, once the algorithm realized I was Black and gay, the advertisements followed suit. Now, the sponsored videos I receive mostly feature Black individuals and are for exploitative “buy now, pay later” services, poorly paying gig economy jobs, or the consistent stream of individuals looking directly into the camera telling me how they can fix my credit or teach me how to land a “six figure tech job.”
In terms of propaganda, there appears to be a bottomless supply of copaganda videos, u.s. military personnel performing popular TikTok dances, TLGBQ individuals in full military uniforms with the “sob story” background music over some “patriotic” garbage. Propaganda videos for apartheid “Birthright Israel” trips play incessantly, filled with Zionist lies and talking points. Capitalist content creators are regularly praised for running sweatshops and call centers, and for being landlords; because they lipsync over popular TikTok sounds and engage in popular video trends, no one ever calls them out. A dumb “TikTok For Biden” account made headlines, and dozens more just like it pop up every day. Large news outlets like ABC News, Yahoo News, AP, NBC News, CBS News, etc., all pump out never-ending streams of content related to u.s. politicians and political events, maintaining their imperialist lines all the same on the app.
Individual users decrying “conspiracy theories” push the narrative that “crazy conspiracy-believing Trump supporters who got red-pilled in the bad corner of the internet” are responsible for our current civic and political divisions, not centuries of imperialist pillaging and racist capitalism. “Arguments” about vaccinations – whether folks should get them, whether children should get them, etc. – all turn into individual morality meters and rarely develop into actually useful information. The popular ability to lipsync the ‘sounds’ of other videos (which originally gave rise to TikTok’s popularity) sonically rebrands war criminals, racists, bigots, and other problematic characters alike, as people use their voices to make ‘funny’ content with little regard for the original audio’s context. One such example is a recent stupid TikTok by white singer Grimes, which went viral and has spurred thousands of videos, with virtually no one discussing the fact that she’s married to one of the richest men on earth, whose wealth was built from apartheid jewels and who actively engages in exploitation.
Other aspects of the app are just as questionable, and require further investigation. One such example is what some are calling the “TikTok to OnlyFans Pipeline,” which represents the growing trend of OnlyFans creators and other adult content creators using the platform to promote their work, with such content inevitably landing in the hands of minors. Users on OnlyFans have boasted about how much TikTok has boosted their careers, and like other users, they’re able to carefully manipulate the algorithm for it to reach as many people as possible.
I don’t blame the sex workers – TikTok is a powerful marketing tool that perhaps they should be able to use to promote their content. The problem is that over a fourth of TikTok’s global users are under age 18, and this is in fact the fastest growing age group on the app. TikTok allows for a certain kind of influencerization (and glamorization) of every aspect of every industry, and OnlyFans workers are no different. The more ‘sexy’ content you engage with, like thirst traps, the more the algorithm gives you. And with OnlyFans already recently having scandals related to children on its platform, as well as TikTok’s recent thirst-trap-to-sex-cult story, this is one example of an area where we have to figure out a way to protect children and everyone involved.
Earlier this year China made headlines for new mandates which would limit children’s usage of social media and video games. In effect, the mandates limit users under age 14 to 40 minutes a day on Douyin, the Chinese version of TikTok, and bar people under age 18 from playing video games except on weekends and holidays. While news of these decisions made controversial waves across the west, the actual data around digital usage suggests this decision may be for the best. Roughly 93% of China’s youth are online, officials say near-sightedness among children is on the rise, and psychologists have stated very clearly that users – especially children – can become addicted to the dopamine fix they receive from the short videos.
One study from the beginning of 2021, which looked at the psychological consequences of prolonged time on TikTok, openly states that we know little about the “psychological mechanisms related to TikTok use” and that work “investigating potential detrimental aspects” of using the app is scarce. Another psychologist discusses the ways that TikTok may rupture young users’ sense of “self” and “environment,” push an inclination towards addiction, impact self-esteem, and ultimately have potential societal repercussions.
The insistence on virality and always-increasing view counts also means, to put it mildly, acting a fucking fool online for views and shock value. Teens indulge in tons of dangerous games and activities, egged on by the strangers on the other side of the screen typing in the comments – eating Tide Pods, stretching condoms over their heads, snorting random things, even choking themselves until they pass out. Of course, this aspect is not unique to TikTok – dangerous trends crop up on YouTube and elsewhere all the time – but TikTok’s extremely young user base leaves it exceptionally exposed to harm. Can we really be mad at China for attempting to follow the data early on and curb a problem which the u.s. is set on ignoring?
In the last year, a phenomenon of TikTok “tics” similar to those of Tourette syndrome has come to light, leaving mental health professionals and researchers both worried and puzzled. Many users with Tourette syndrome, especially young individuals, have taken to TikTok to share the experiences of their disability with the world, often in an attempt to break the stigmas and misconceptions surrounding the disorder. The #Tourettes hashtag has gained over 5 billion views and hundreds of thousands of videos, and it’s a rapidly growing genre on the app.
Impressionable youth, particularly young women and girls in this case, are developing tic-like behaviors such as head rolling, snapping, involuntary speaking, etc. after watching large amounts of videos by TikTok creators with Tourette syndrome. Doctors say that this development likely signifies an externalization of anxiety and depression already present in the youth, but that it is manifesting in tic-like behaviors because mental health can be a social experience. German researchers in 2019 named the phenomenon “mass social media-induced illness” after a similar event took place related to YouTube content.
Clearly, there is a need for pause and concern.
If we accept that psychological and mental wellbeing can, in fact, be a social experience, and we learn to “communicate our suffering” in digital spaces, then we must look at TikTok and social media in general as a potentially dangerous place for impressionable youth. The dangers of overusing TikTok may still be unfolding, with research lagging behind, but what is already clear is that it presents the same ills of a racist, capitalist system as any other corporate-run social media app. The only difference is that this one is designed to make you addicted to it.
This reflection barely manages to scratch the surface of TikTok, as my intention was to do a general overview/case study as opposed to a single-angle, hyper-specific look. The point of this long essay is not to simply shout at everyone’s favorite time-passing video app, which I find myself still using even knowing all the various ways it’s dangerous. I simply want us to become acquainted with the act of stepping back and honestly assessing the digital world around us, which has come to overtake so much of our time. Our personal data, time, energy, and likely our futures belong to these digital spaces, so how we choose to engage with them (or not engage with them) must now be a central tenet of movements and organization-building going forward.
Moreover, we must be receptive to the idea of coming to a cohesive, necessary, and revolutionary digital ethics as “the left”, either within our specific political homes or altogether, that can help guide our endeavors online. What is a “principled” way to interact online when one has a disagreement with someone else? How do we assess the mental and social impacts of apps like TikTok, which feed on our society’s obsession with spectacle and clickbait content? When do we decide that a space has no radical potential and must be abandoned, and how do we make an argument for continued use of a platform? How do the UI design mechanisms of an app like TikTok cater to infighting, division, and the downfall of our movements in digital spaces? What do apps like TikTok or Twitter materially have to offer practices like citation, political education, movement building, comradely disagreement, political discussion, fundraising, and so forth?
These are some of the questions I want to explore in my future work, and questions which I implore us all to deeply reflect on as we scroll on blue screens. TikTok may be the fastest growing social media app on the market, but that doesn’t mean that we have to contribute to the negative aspects of the app. Nor does that mean that we shouldn’t understand the app and its limits as critically as possible, in the same way we’ve come to critically understand Twitter and its limits, and other social media platforms all the same.
This is why I entered the essay with a personal opening: my eyes, too, drift for days on end at times, endlessly looking at videos that give me a rush of dopamine that’s never enough.