Lucie Gray-Miller ‘22
If you are between the ages of 10 and 23, then I am sure you have heard of the well-known application TikTok. Launched in 2018, TikTok quickly found its place atop the App Store's social media category. The application's self-proclaimed mission is "to inspire creativity and bring joy." Nowadays, it seems like everyone from Howie Mandel to "The Washington Post" has a TikTok account. Even child predators have TikTok accounts. Yes, you read that sentence correctly, for the presence of child predators on TikTok is not a new phenomenon. I vividly recall the user @TheBudday (pronounced bidet) gaining notoriety for a video of him dancing to a song by Falling in Reverse; several young girls duetted this video. I was puzzled at first, but I soon discovered that he had been sending sexual messages to underage girls while fully aware that they were minors. This is only one example of a predator using TikTok as a means of selecting victims. Popular accounts such as @JeyJey, @BenjiKrol, and more have been exposed for sexually harassing and/or grooming children. Grooming, in this sense, refers to someone building an emotional connection with a young person in order to manipulate, exploit, and abuse them. TikTok's past efforts to combat this predation include requiring a minimum age of 12 to join the app and changing settings so that users can only message people who follow them back. In all honesty, TikTok still has a long way to go.
For starters, TikTok should develop a warning system for harassment. If an account receives more than three claims of harassment or predatory behavior, then TikTok should suspend that account until an investigation can determine whether any truth lies in the claims. Although an investigation may take a while, this tactic could keep the account from harassing others in the meantime. Verifying age is nearly impossible, since children can always lie about theirs. Despite their dishonesty, these children are still vulnerable targets for online predators, and TikTok should protect them by any means necessary. Some countries, such as India, have banned TikTok entirely to combat this problem (and many others), but that approach is quite extreme. They may have solved the problem with which I am most concerned, but in doing so they created another: cutting everyone off from the platform altogether.
It is hard to regulate safety on an application like TikTok without infringing upon freedom of speech. Many TikTok trends, especially dances, can be suggestive; when children decide to participate, these already-suggestive trends start to dance along the line of child pornography. Though necessary, prohibiting children from viewing and recreating these videos is difficult. TikTok has tried replacing some explicit songs with their clean versions, but there is nothing the app does or can do about the dances themselves. So, where do we go from here?
Social media teaches kids a lot, and it opens many doors for communication. Even surprisingly young children see how impactful it is in society. As they get caught up in the excitement of it all, they may disregard online safety and accidentally open windows of opportunity for sexual abuse and hyper-sexualization. There is no need to ban the application completely, but make no mistake: children's online safety should be addressed expeditiously.
The content of this article, as with every article posted on The Exchanged, does not represent the views of the staff of The Exchanged, the National Cathedral School, St. Albans School, the Protestant Episcopal Foundation, or any employee thereof. Opinions written are those of the writer and the writer alone.