With the rise of digital platforms like social media, we find ourselves spending more and more time flipping back and forth between social apps and scrolling tirelessly through endless feeds. As someone who has worked in digital marketing for the better part of the past decade, specializing in SEO and other algorithm manipulation and optimization tactics, I have accumulated many questions and observations about digital culture and rhetoric over the years. However, I keep coming back to one specific term, one that at this point in my research is not commonly used, or really used at all for that matter– digital eugenics.
According to Oxford Languages, eugenics is defined as, “the study of how to arrange reproduction within a human population to increase the occurrence of heritable characteristics regarded as desirable.” So, what exactly do I mean by digital eugenics and why does it matter? As I mentioned, I have yet to find any existing definitions for this particular term, but this is how I have been thinking about it over the years:
The homogenization of individual thoughts, actions, style, rhetoric, etc., accelerated and amplified by algorithmic influence. Like ‘groupthink’ or ‘the bandwagon effect’, but more intimately connected to and deeply entangled in our individual and collective consciousness because of our connection to and dependence on technology.
Some examples of this can be found in YouTube’s suggested-video sidebar, Instagram’s Explore page, or TikTok’s #fyp (For You Page)– where diversity of ideas, perspectives, and content gets lost entirely in the undertow of these platforms’ powerful and complex algorithms.
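The feedback loop described above can be illustrated with a toy simulation. This is my own sketch, not any platform’s actual recommendation algorithm: a feed that recommends content in proportion to past engagement, so whatever gets shown earns more engagement and is shown still more often, a “rich get richer” dynamic that concentrates attention on a small slice of the available content.

```python
import random

random.seed(42)  # fixed seed so the run is reproducible

# Hypothetical toy model: NUM_ITEMS pieces of content start out equal,
# and each recommendation is drawn in proportion to accumulated engagement.
NUM_ITEMS = 100
NUM_ROUNDS = 10_000

engagement = [1] * NUM_ITEMS  # every item begins with the same weight

for _ in range(NUM_ROUNDS):
    # Recommend one item, weighted by its engagement so far.
    shown = random.choices(range(NUM_ITEMS), weights=engagement)[0]
    # Being shown earns the item more engagement, reinforcing the loop.
    engagement[shown] += 1

# Measure how concentrated attention has become.
top10_share = sum(sorted(engagement, reverse=True)[:10]) / sum(engagement)
print(f"Share of engagement held by the top 10 of {NUM_ITEMS} items: {top10_share:.0%}")
```

Even though no item is intrinsically better than any other, the top handful typically ends up with a far larger share of attention than the 10% a uniform feed would give them, which is the “undertow” effect in miniature. Real platforms add many more signals, but the reinforcing structure is the same.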
As social creatures, we naturally engage in mimicry, mirroring, and other means of coalescence; these are ordinary psychological impulses rooted in our need for validation, acceptance, and community. Through my research, I came upon a tangential term– cultural homogenization, an offshoot of the term cultural globalization. Both speak to this blending and integration of cultures, tied to our ever-increasing interconnectedness. Cultural sociologist John Tomlinson, in his book Cultural Imperialism: A Critical Introduction, speaks to the inevitability of a “cultural synchronization” as part of our evolution into a modernized global landscape. Everything has its tradeoffs, and he and I both agree that this “synchronization” is not necessarily a bad thing. My concern is less about what feels like the fatedness of a large-scale social and cultural merger, and more about its being thrust upon us through strategic technological influence.
Ross Cunliffe, an MA student of Literature and Digital Culture at University Centre Doncaster & University Hull in the U.K., is the only person I have been able to find who has not only talked about this subject (albeit four years ago) but has researched it– at least to some extent. Though Cunliffe channels his research through a rhizomatic lens, he highlights similar concerns about the potential damage of algorithmic interference, both individually and culturally. In his project he writes:
“What we are focusing on is a shift towards a ‘new type of human’—cultivated and engendered by the incision of the digital into the natural world. What traits are auspicious for integration into this emerging new world (for which I believe there is no ambiguity toward our inevitable destination)? Who is permitted entry? Are the youth being molded by forces outside of the usual worldly institutions (such as schools and parents) in an unprecedentedly auxiliary manner? What are the governing forces (concealed or not) in the digital world?”
These questions are ones that I have been asking myself both personally and professionally. I think it is imperative to be conscious of the content we consume and how we integrate it into our lives, both consciously and subconsciously. However, living in a world where we are constantly inundated with an overwhelming amount of information and have unlimited access to entertainment and other forms of escapism online, it can be incredibly challenging to maintain that level of ongoing self-awareness.
One can think of this idea of digital eugenics as part of the inevitable cultural “synchronization” that John Tomlinson describes. By contrast, this algorithmic influence may also operate at the opposite end of the spectrum– in separating or othering. In thinking about the intricacies of “cleaning” in data aggregation and analysis, as Catherine D’Ignazio and Lauren F. Klein discuss in chapter seven of their book Data Feminism, I find an apparent irony in algorithms’ ability to push us together and keep us apart simultaneously:
“… we find it important to acknowledge that these ideas of cleanliness and control contain troubling traces of a movement from a prior era: eugenics… While the most odious aspect of these ideas have been largely (and thankfully) stripped away, certain core principles– like a generalized belief in the benefit of control and cleanliness– remain… the ideas underlying the belief that data should always be clean and controlled have tainted historical roots. As scientists, we cannot forget these roots, even as the ideas themselves have been tidied up over time” (131).
In a similar sentiment, in chapter two, D’Ignazio and Klein highlight how data-predictive technologies such as algorithms have a history of creating and perpetuating oppression and inequality, particularly within marginalized communities. Speaking about risk assessment algorithms, they write, “Both [risk assessment algorithms] use aggregate data about social groups to make decisions about individuals… Furthermore, both use past data to predict future behavior– and to constrain it….” They continue, “This is one of the challenges of using data about people as an input into a system: the data is never ‘raw.’ Data are always a product of unequal social relations– relations affected by centuries of history” (55).
After reading these two chapters, I was particularly struck by their call to action regarding the necessity of not forgetting the “roots” even after they have been “tidied.” Although the For You and Explore pages mentioned earlier are filled with seemingly harmless 15-second dance videos or pictures of food, I cannot help but think of the more sinister consequences of this loss of diversity. I am not claiming that the tech industry is purposely creating these social networks to engage in modern-day eugenics; however, I do find the term digital eugenics potent and relevant. As Cunliffe asks in his research, “what we are focusing on is a shift towards a ‘new type of human’… What traits are auspicious for integration into this emerging new world… who is permitted?” I believe D’Ignazio and Klein are asking similar questions, and truthfully, I wish more of us were. I do see immense value in these digital spaces; they hold so much capacity for good. However, though “digital eugenics” may not be the best or the ultimate term to describe this phenomenon, I believe it is one that we must actively pay attention to.
Bibliography:
Cunliffe, Ross. “Digital Utopia and Digital Eugenics.” Humanities Commons, 9 Oct. 2020, https://hcommons.org/groups/digital-utopia-and-digital-genetics/.
D’Ignazio, Catherine, and Lauren F. Klein. Data Feminism. The MIT Press, 2020.
Genschow, Oliver, et al. “Mimicking and Anticipating Others’ Actions Is Linked to Social Information Processing.” PLoS ONE, vol. 13, no. 3, Mar. 2018. PubMed Central, doi:10.1371/journal.pone.0193743.
Tomlinson, John. Cultural Imperialism: A Critical Introduction. Johns Hopkins University Press, 1991.