“All of pop culture — be it movies, music or TV — is now engaged in a delicious orgy of self-consumption.”
Peter McKnight — “Is pop culture eating itself to death?”
The Globe and Mail, July 31, 2000
The promise of artificial intelligence (1) is that it will usher in a technological revolution that will transform industries, economies, and societies. However, as AI begins to permeate our lives, a darker side is emerging that threatens to undermine human culture.
Culture, the collective expression of human creativity, values, and identity, is a living, breathing entity that is constantly evolving. The experiences, beliefs, and imaginations of individuals and communities shape it. Yet, as algorithmically-generated media becomes increasingly prevalent, there is a growing concern that our cultural landscape is at risk of being homogenized and ultimately cannibalized by algorithms that lack the capacity for true creativity and learning.
Toronto-based lawyer and ethicist Peter McKnight, you were prescient!
In his July 2000 article, “Is pop culture eating itself to death?”, Peter McKnight examined the trend of self-reference in popular culture and the harm it could cause. With his observation in mind, we should ask whether our present-day enthusiasm for generative technologies such as ChatGPT and MidJourney could inadvertently amplify and accelerate this self-referential trend. Taken to its extreme conclusion, might self-referential AI technologies and their static learning models drive our shared culture to the point of homogeneity, essentially creating a hermetically sealed set of possibilities that can never yield novel outcomes?
The large language models (LLMs) and related generative models that power popular tools like ChatGPT and MidJourney are trained on massive amounts of data, encompassing a wide range of cultural artifacts, including literature, scientific articles, and internet content. These models learn to generate responses or create new content by identifying patterns in the data they’ve been trained on. This inherently involves a degree of self-reference, as the model’s “original” output is, in fact, a synthesis of the inputs it has processed.
Self-reference begets more self-reference
This characteristic could exacerbate the self-referential tendencies in pop culture in several ways. AI algorithms are increasingly used to generate content, from news articles to movie scripts, music, and even fine art. If these models are increasingly trained on self-referential content, they will produce more of the same, creating a self-reinforcing feedback loop in which self-reference begets more self-reference.
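The spiral described above can be sketched as a toy simulation. To be clear, this is not a real training pipeline: each “generation” simply resamples from the previous generation’s output, and the squared-weight sampling is a hypothetical stand-in for a model’s tendency to favor its most common patterns. Even so, the diversity of the “culture” collapses quickly:

```python
import random
from collections import Counter

def generate(corpus, n_outputs, rng):
    """Toy 'model': samples from the empirical distribution of its
    training corpus, favoring the most common items (squaring the
    counts is a crude stand-in for mode-seeking bias)."""
    counts = Counter(corpus)
    items = list(counts)
    weights = [counts[i] ** 2 for i in items]
    return rng.choices(items, weights=weights, k=n_outputs)

rng = random.Random(42)
corpus = list(range(100))  # 100 distinct "cultural artifacts"
diversity = []
for generation in range(10):
    diversity.append(len(set(corpus)))  # how many artifacts survive
    corpus = generate(corpus, 1000, rng)  # retrain on its own output

print(diversity)
```

Because every generation can only draw on artifacts the previous generation produced, the count of surviving artifacts never goes up; once anything is amplified at the expense of the rest, it only goes down.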
The new and improved filter bubble
The use of AI in content recommendation systems could also contribute to this issue. Platforms like YouTube, Spotify, and Netflix use algorithms to suggest content to users based on their past preferences. While this helps users discover new content, it also risks creating “filter bubbles”, where users are primarily exposed to more of what they already know and like, stifling exposure to novel ideas.
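The filter-bubble dynamic can be illustrated with a similarly minimal sketch. The `recommend` function and its `exploit` parameter are invented for illustration, not drawn from any real platform: with high probability the recommender serves more of the user’s dominant genre, and the history it feeds on becomes ever more lopsided.

```python
import random

def recommend(history, catalog, rng, exploit=0.9):
    """Toy recommender: with probability `exploit`, suggest the genre
    the user already consumes most; otherwise pick at random."""
    if history and rng.random() < exploit:
        return max(set(history), key=history.count)
    return rng.choice(catalog)

rng = random.Random(0)
catalog = ["jazz", "punk", "folk", "techno", "rap"]
history = ["jazz"]  # one initial preference seeds the bubble
for _ in range(200):
    history.append(recommend(history, catalog, rng))

share = history.count("jazz") / len(history)
print(f"share of listening that is jazz: {share:.0%}")
```

The occasional random pick is the only force pushing back; dial `exploit` toward 1.0 and the bubble seals shut entirely.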
An algorithm won’t tape a banana to the wall and call it art (for better or worse)
An algorithm’s inherent lack of creative intuition will also exacerbate the issue. While generative AI tools can mimic the styles and patterns they have been trained on, they don’t possess a genuine understanding of the training material or the creative ability of humans. Algorithmically-generated content could contribute to a culture that’s rich in self-reference but poor in novel ideas. What happens when AI-generated content becomes the norm rather than the exception? What happens when the algorithms that create our music, art, and literature are no longer learning from us but instead regurgitating the same patterns repeatedly?
Infinite cultural loops
The danger lies in the potential for AI to create a feedback loop of cultural homogenization. As algorithmically-generated content becomes more prevalent, it risks crowding out human-created works, reducing the diversity of voices and perspectives essential to cultural evolution. The result would likely be a cultural landscape that is increasingly monotonous, repetitive, and devoid of the novel ideas and experimentation that drive cultural progress.
The Sugarhill Gang will light the way forward
The Sugarhill Gang’s “Rapper’s Delight,” released in 1979, was arguably the first commercial music recording built on samples of other music. The recording is also credited with introducing the world outside the five boroughs of NYC to hip-hop. Rather than eating its own cultural tail, hip-hop and the sub-genres it spawned are considered by many to be the only area of genuine innovation in modern music. Rather than simply imitating the music they were listening to, the Sugarhill Gang and their contemporaries leveraged samples to build a more complex musical vocabulary to create with. Might we sample (sorry) the Sugarhill Gang’s playbook here?
Can we make efforts to train and utilize AI in ways that promote creativity, diversity, and novelty, rather than falling into the trap of endless self-reference? If we are aware of the self-referential spirals of algorithms and can be intentional in our use of these tools, might we preserve human creativity? Essentially, we must ground ourselves in the role of tool users instead of being used by the tool.
A diverse and representative set of tool users, tool makers, and training models, one that represents all stakeholders, is among the key elements of the way forward.
We are at the leading edge of an AI-infused world, yet we must not lose sight of the value of human creativity, diversity, and intuition in shaping our cultural fabric. The obvious call to action is not to abandon (or pause) AI, but rather to actively guide its development and application, ensuring it enriches rather than erodes human culture. We are all implicit stakeholders and architects of this technology (it’s made of us), and we must demand the ability to steer it toward a future where it is a tool for cultural expansion, not contraction.
Following the lead of the Sugarhill Gang, we can use AI as one of many tools in our creative toolbox in the service of a more novel, diverse, and culturally-rich future.
- Kate Crawford, in her excellent book “Atlas of AI,” offers the following definition of AI, after declaring AI is neither artificial nor intelligent: “artificial intelligence is both embodied and material, made from natural resources, fuel, human labor, infrastructures, logistics, histories, and classifications. AI systems are not autonomous, rational, or able to discern anything without extensive, computationally intensive training with large datasets or predefined rules and rewards. In fact, artificial intelligence as we know it depends entirely on a much wider set of political and social structures. And due to the capital required to build AI at scale and the ways of seeing that it optimizes for, AI systems are ultimately designed to serve existing dominant interests. In this sense, artificial intelligence is a registry of power.”
- You should really treat yourself and go watch the Rapper’s Delight video
- Copious amounts of AI were used in the writing and illustration of this post
- A version of this post appeared on Medium