Algorithms of Oppression
This video also speaks to the biases of algorithms and technology, and how this is not a private struggle but a public one that communities, and the people who make these technologies, need to think about. She speaks to the hypersexualization of women of color in the media, and how this stereotype continues to be perpetuated even today. She says that to combat this we must no longer be neutral on these issues, must reject being color blind, must curate the web with a more guiding hand, and must continue to educate, especially the people who make these programs, so they can think about the implications of their work.
Race after Technology
The first thing that really struck me here, not necessarily as surprising but as heartbreaking, was how easy it is for something that could be deemed helpful to be turned into a way to enact racism or racial bias. The app Citizen makes sense to me in some ways, but its potential for abuse is just so high. It reminds me of something we’ve spoken about before, which is that all tech will carry human bias, because it is made by us. For instance, facial recognition software might handle white faces better than Black faces because the white person programming it didn’t think about it. Technology cannot be perfect because we are not perfect. I really appreciated that she spoke to how race neutrality, or being “color blind,” can be dangerous and harmful: it ignores the differences that matter and need to be addressed because it treats everyone as if they are the same. Professor Benjamin gave a few examples of ways that people are fighting these imbalances, but what are some other ways that we as artists might tackle these issues?
Laboring Infrastructure
Nakamura discusses the relationship between women of color’s labor and technological advancement: for example, the phones that give us access to the internet, which are made dirt cheap in China. She talks about how these conditions can be seen across the board as a feature rather than a bug of the system; the cost of advancement is the suffering of others. She speaks to how VR can create empathy for these kinds of suffering, because once someone feels a reality, it’s hard to disbelieve it anymore. But we need to use VR in this capacity by giving marginalized groups the resources to create in VR, instead of allowing it to remain colonized in the way it currently is. I think this is true not just for VR but for almost all forms of creative, empathy-building storytelling. Film needs more women, people of color, and people with disabilities. Video games need more women, people of color, and people with disabilities. All of these things rely on technology and stories, but they are mostly being made by one group instead of a broad range of groups.
In some ways this talk makes me think of Horizon Zero Dawn’s plot, in which humans create things we cannot fix that inevitably destroy the world. Maybe our advancements aren’t doing that currently, but they are creating inequalities at an alarming rate.
A lot of my work tries to evoke empathy, but sometimes it borders on pity. So my question may be a little further from the talk itself, but at what point does storytelling that aims for empathy push too far? Is there a “too far” when it comes to creating a new reality away from our white-centric world? What are some ways that we can create, tell these stories, and critique society in a way that enacts change?