The Algorithmic Lens: Exploring Bias, Creativity, and Racial Capitalism in AI

Week 7: Algorithms, machine learning, and racial capitalism

Use an AI story or image generator to write a story or create an image. Discuss and analyse the story or image and explain the values and biases manifested in it, using concepts and theories from today’s session.

Algorithms and AI

Noble (2018) argues that the data economy is constituted by applications driven by mathematics based on human choices. It is a symbiotic process: the system provides information to users and is in turn partly fed by them. For example, users’ search histories become part of the data used by the algorithm, and web search behavior often reflects users’ opinions. This process is therefore inherently flawed: technological design encodes human biases and misunderstandings into the software systems that increasingly govern our lives, leading to problems such as algorithmic racism.

To explore this further, I used artificial intelligence (AI) to write some stories. AI-generated stories are shaped by the data used to train the algorithms, which inherently reflects the sociocultural context and values at the time of collection.

The AI wrote this story from my prompt: “Two young girls and boys start their new jobs.” The resulting story reinforces society’s traditional expectations of gender.

The introduction of the female character, Lily, focuses on her appearance. Here is how the AI described her:

  • Contagious smile and natural appearance.
  • Her keen eye for detail and her willingness to lend a hand whenever needed made her a valuable asset to the team.

The name Lily is closely associated with the symbolism of the flower in Western culture and is often seen as a symbol of femininity. These symbolic traits align with traditional social expectations of women: society usually expects girls to be elegant, gentle, and approachable, qualities consistent with the image the name “Lily” evokes. The AI’s choice of this name may therefore have been influenced by stereotypes that position women as emotional communicators, collaborators, and maintainers of team harmony, rather than as challengers or leaders in complex business decisions. Such gender role expectations tend to assign value to women based on their appearance, likeability, and interpersonal skills, while ignoring or downplaying their intellectual and professional abilities in the workplace.

The introduction of the male character, Ethan, focuses on his professional abilities. Here is how the AI described him:

  • Analytical thinking
  • Creative problem-solving abilities
  • The ability to think outside the box

The name Ethan means “strong” or “determined,” reinforcing its association with masculinity and with society’s expectation that men should be reliable, stable, and capable of taking on leadership responsibilities. For example, Ethan is described as adept at devising innovative, unconventional solutions to difficult situations, reflecting the stereotype that men have stronger leadership and technical skills and should therefore lead thinking and decision-making processes. Ethan is also assigned the more challenging projects, implying that men are the ones expected to handle decisions and demanding tasks. This reinforces traditional gender role expectations that men should hold decision-making power, leadership, and technical skill, further shaping society’s view of men’s roles.

Such gender stereotypes may limit women’s advancement in the workplace, while men may face greater pressure to conform to the standards of “leader” and “innovator.” By linking gender with particular professional behaviors and abilities, these stereotypes quietly deepen society’s expectations about the division of labor between men and women at work.

Although AI has no subjective awareness, it still builds characters and plots from patterns in past data. AI models are trained on large amounts of data, and these data often contain gender stereotypes that are prevalent across many cultures, so AI-generated content reflects the cultural backgrounds and biases embedded in its training material. As Halavais’s critique of search engines as a window into our desires suggests, this phenomenon may further influence social values (Noble, 2018, p. 25).

Reference:

  • Noble, S. U. (2018) ‘A society, searching’, in Algorithms of Oppression: How Search Engines Reinforce Racism. New York: New York University Press.
