TikTok: The Death of Attention Spans
January 17, 2023
Thirty-eight minutes after downloading TikTok, I had my first experience with government-funded US military propaganda, and it began with liking a single video. The video in question featured a pair of martial artists sparring, one man and one woman. I “liked” the yellow-tinted, poorly shot video. On closer inspection, the caption in the corner featured a single phrase that foreshadowed the path I was led down in the hours after downloading the app: “Equal rights, equal fights.”
After liking that specific post, I was sent down a pipeline of military extremism. Over the next three hours, no matter how many times I tried to like cooking videos or something more harmless, I kept getting videos about violence and guns. Over time the gun videos shifted from honest (if a bit eccentric) gun owners to military extremism, showing off the Navy and the Army and begging me to join their ranks. Following the military videos was a string of political TikTok posts, and thus the algorithm began spreading hate, homophobia, and general misinformation.
TikTok’s algorithm tracks users’ interactions with its videos. It measures how long you watch a video, checks whether you interact with it in any way by liking or commenting, and chooses the next video based on your apparent interest in the current one. Over time this effect compounds until you are viewing videos wildly different from what you were watching an hour ago, further cementing what type of video you will be served later.
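TikTok’s actual recommendation system is proprietary, but the feedback loop described above can be sketched in a few lines. Everything here is invented for illustration: the topic categories, the scoring formula, and the weights are assumptions, not TikTok’s real mechanics.

```python
import random

# Hypothetical per-topic interest scores; the categories and
# numbers are made up purely to illustrate the feedback loop.
interests = {"cooking": 1.0, "martial_arts": 1.0, "guns": 1.0}

def record_engagement(topic, watch_seconds, liked):
    """Boost a topic's score based on watch time and likes."""
    interests[topic] += watch_seconds / 10 + (5 if liked else 0)

def next_video_topic():
    """Pick the next video's topic, weighted by accumulated interest."""
    topics = list(interests)
    weights = [interests[t] for t in topics]
    return random.choices(topics, weights=weights, k=1)[0]

# A single watched-and-liked video immediately skews the feed:
record_engagement("martial_arts", watch_seconds=30, liked=True)
print(sorted(interests.items(), key=lambda kv: -kv[1])[0][0])
# → martial_arts
```

The point of the sketch is the compounding: every video the higher-scored topic wins makes further engagement with that topic more likely, which raises its score again, which is the "ramping up" the article describes.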
The longer you scroll, the better an idea the app has of who you are and what you like, to the extent that it probably knows you better than you know yourself. For example, after a few days of semi-constant use, the algorithm settled into my interests, and after three days I couldn’t consciously tell I was using the app at all. With most social media, you don’t really know how much you used it until it’s gone. When I had the app, I found myself opening it often, especially whenever I was bored or needed to relax for a bit. The white noise of personalized content made me stop thinking about what was going on in my personal life, but offered nothing of actual substance that I could enjoy.
According to a survey conducted by the Terrier Times of 51 students across all four grades, 78.4% of surveyed students (40 out of 51) had TikTok. Among those students, the amount of time spent on the app varies significantly, evenly split between less than one hour, 1-2 hours, 2-3 hours, and more than 3 hours per day. It appears that time spent on TikTok varies widely because of how (and where) people use the app.
When asked why they used TikTok, 90% of the students who had the app said they use it for comedy, with music and music recommendations as the runner-up (52.5%). This implies that typical TikTok use is short and sweet. A wide portion of the TikTok audience opens the app when they have five to ten minutes to kill, something to make them laugh as they wait on the bus or at lunch, and nothing more than background noise. This also highlights a deeper, more invasive issue with TikTok, and it lies in the form those comedy videos usually take.
The most interesting occurrence of algorithmic exploitation comes with adult cartoons. Shows like Family Guy or Rick and Morty have always had large mainstream appeal, but TikTok has optimized their cultural mainstay into something much more parasitic. The most common video format on TikTok is a screen split in half: the top half is a scene from Family Guy or Rick and Morty, cut down to a minute, posted over a bottom half that is most often Subway Surfers or another mobile game.
This isn’t the first time these audio-and-visual mash-up videos have been created; they got their start on YouTube years earlier with another trend. Reddit text-to-speech videos have been a mainstay of content wherever you look on the internet. They began as a way to share stories people found on Reddit, but since they were featured primarily on YouTube, they needed a visual component to work. Matching the cheaply made style of the text-to-speech narration, the video often consisted of simple computer-game footage as eye candy, to supply something visually interesting. As the trend morphed over time, internet users seemed to connect with the combined audio and visual stimulation far more than with text alone on screen or gameplay alone. In many ways, this was the internet’s kill-two-birds-with-one-stone play to enhance user engagement, and as the trend moved over to TikTok, the outcome only served to prey on users’ shrinking attention spans.
Taken out of context, these “Family Guy Funny Bits,” chopped up to less than a minute and plastered over the gameplay of a nonsensical mobile game (often with clips reused between videos), are a bit strange, but they shouldn’t warrant outright panic. It’s what this trend has morphed into that should worry any onlooker. Since the format worked for Reddit text-to-speech, Family Guy, and Rick and Morty, where does it go next? The answer is stand-up comedians, who often get the same treatment as the Family Guy clips: chopped up and plastered above a game.
From there the floodgates opened, and now almost every other video seen on TikTok is a mobile game playing below clips of, well… whatever the TikTok creator wants. The most detrimental of these videos come in the form of influencers and commentators like Andrew Tate, Ben Shapiro, and Jordan Peterson. Because they are divisive internet personalities, TikTok avoids sharing their accounts and videos (Andrew Tate has been “banned” from the app), but when their clips are given the same treatment as the comedians and Family Guy, they bypass TikTok’s ban. Much like the other forms of comedy presented this way, the controversial messages are made more palatable by the visual stimulation of the games, and many people internalize them subconsciously because of the general white noise the app engages its users with.
The intentionally vague nature of these minute-long clips bleeds into another large share of survey responses: “to keep up with current trends/events” and “news” held a whopping 67.5% of responses from the students who used TikTok. Much like the context-free mobile-gameplay videos, news on TikTok is sensationalist and vague. When someone actually qualified to talk about a subject covers it with care, the result is a 70-episode playlist, not a minute-long clip.
TikTok is first and foremost a product. It can’t be looked at as an unbiased form of communication, because its goal is to get users on the app and keep them there. Many problems with TikTok come from misunderstanding this main point: users interact with the app believing they are seeing an unfiltered perspective of the internet, which leads them to forget the negative impact TikTok has programmed into its algorithm, and in turn to become products of TikTok’s silent radicalization.