Putting the Fun in Dysfunctional

A critique of TikTok and its immoral practices and prejudices

Khadijah A Malone
3 min read · Dec 29, 2020
https://time.com/4475627/is-technology-capable-of-being-racist/

It is important to mention that while I am an avid user of TikTok and enjoy my time spent on the app, I have quite a few concerns, both from my perspective as a black woman and from my viewpoint as a user experience designer.

Though I am an avid user of TikTok, I am not proud to say so. I am able to admit when there are things that need to be improved and when a company has made ethically wrong decisions. Aside from the company's long history of race-related scandals and ethical concerns, there are also problems with how the app itself is organized. It is worth spelling out the alarming transgressions TikTok has committed in the short four years since its initial launch: a racially biased algorithm that censors videos made by black content creators, frequently removing content made by creators of color under the false pretense of a “violation of community guidelines”; blocking videos tied to the Black Lives Matter movement while simultaneously allowing content that perpetuates stereotypes, uses slurs, and supports terrorist groups; and moderation rules that suppressed the circulation of content made by users deemed chubby, disabled, ugly, old, or visibly poor, in an attempt to make TikTok look more attractive and appealing to new users.

There is also a history of TikTok discriminating against users on the basis of race, gender, and LGBTQ+ identity. It was brought to light that “A list of flagged users obtained by Netzpolitik included people with and without disabilities, whose bios included hashtags like #fatwoman and #disabled or had rainbow flags and other LGBTQ identifiers”. Over the past few years, TikTok has rightfully faced criticism and backlash for this, yet it has made little effort to change its ways beyond hosting a weekend event for black TikTokers, forming a diversity collective, and featuring “Black Lives Matter” as a header on the search page for a couple of days in response to backlash over its censorship of black voices.

This article is titled “Putting the Fun in Dysfunctional” because that’s precisely what TikTok is at its core: dysfunctional. This problem is not limited to TikTok; many other social media applications, Twitter among them, have faced their own discrimination scandals. Racial bias in software development is not a new concept: “machine learning algorithms, especially facial recognition systems, are prime examples of racially-biased systems. In 2015, a software engineer noticed that Google Photos was categorizing his black friends as gorillas. Google addressed this issue by simply removing gorillas from the dataset”. Largely the same programs, algorithms, and techniques power the majority of social media platforms, which is why this kind of bias is not uncommon.
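To make the mechanism concrete, here is a minimal, hypothetical sketch of how this happens; the group labels, numbers, and ranking rule are invented for illustration and are not taken from TikTok or Google. It shows how a recommendation system that learns only from skewed historical engagement data ends up reproducing that skew.

```python
# Hypothetical illustration: a toy "for you" ranker that learns only from
# historical engagement. If one group's posts were under-shown in the past,
# their total measured engagement is lower, so the ranker keeps burying them.

historical_posts = [
    # (creator_group, times_shown, times_liked) -- invented numbers
    ("group_a", 1000, 120),   # widely distributed in the past
    ("group_a", 900, 110),
    ("group_b", 100, 14),     # previously suppressed, so few impressions
    ("group_b", 80, 11),
]

def learned_group_score(posts):
    """Average likes per post by group -- a naive 'engagement' signal."""
    totals = {}
    for group, _shown, liked in posts:
        s, n = totals.get(group, (0, 0))
        totals[group] = (s + liked, n + 1)
    return {g: s / n for g, (s, n) in totals.items()}

scores = learned_group_score(historical_posts)
ranking = sorted(scores, key=scores.get, reverse=True)

print(scores)    # group_a: 115 likes/post, group_b: 12.5 likes/post
print(ranking)   # ['group_a', 'group_b'] -- past suppression becomes future suppression
```

Measured per impression, group_b’s content in this toy dataset actually performs slightly better (about 14% likes versus 12%), but because the naive metric inherits the skew of past distribution, the ranker keeps burying it. That, in miniature, is how historical suppression gets baked into supposedly neutral algorithms.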

As product designers, engineers, and even as users, we need to become more aware of these biases and hold these companies accountable. People already experience alarming rates of self-esteem and body-image issues from social media use; one survey reported that “60% of people using social media reported that it has impacted their self-esteem in a negative way”. They should not also have to face discrimination from the application itself. We need to remember that apps and programs are built by people, and the biases that people and cultures hold transfer over to the applications they create and use. If, societally, white faces are seen as more attractive and the people creating and coding an app share that belief, then what they build will also favor those they think fit the mold. This is a harmful practice that actively works against marginalized groups without their knowledge. It is systemic: the racism and prejudice are literally built into the makeup of these applications. One way to combat this is to hire more diverse staff who can offer insight into the problems that arise, bring a different perspective, and pinpoint possible issues early on, while still in the development phase.
