Technology and Race

Lisa Nakamura – Laboring Infrastructures (talk, 30m)

I really enjoyed this talk. The idea that VR systems are ultimately manipulating human emotion is really interesting. There's one section where Nakamura talks about how people are using VR to replicate the experiences of the marginalized in an effort to "fix racism." The idea is that if people can find an emotional connection to the oppressed, we can have faith in humanity again. But faith in humanity can't be restored with mere empathy. Empathy here is being used as an excuse not to feel bad about the oppression that continues to happen. The attitude of "yes, there's oppression still going on, but I feel bad about it, so it's not that bad" is self-centered. It excuses us from handling these issues in an institutional manner that would actually make change and progress. By showing that we feel empathy, we dismiss ourselves as the ones to be blamed, when in reality the people who can experience the VR system are probably among the most privileged, with the ability to help those who aren't. Nakamura brings up that putting ourselves in the "shoes" of someone else can be oppressive in itself when that person might not even have full rights to their own body in the first place.

It made me think of an article I read in Interaction II (linked below). It brings up efforts to tie ethnography and VR together so that people have global access to communal documentation. But it makes me question not just VR, but the way we think about documentation when it comes to marginalized communities. What is the foundation for documenting, and why is it important to take these experiences from them and share them around the world? It's a question of whether we have the right to take this from them, and of how these "experiences" are subject to instilled bias.

Ruha Benjamin – Race After Technology (talk, 20m)

Racial norms shape how we understand tech and society. The Citizen app reflects its creators' biases. Instead of stopping crime, we're guided to avoid it. So, what does this say about how we choose to handle racism and crime?

Celebrities promoting wrong apps? Jay-Z?

This entire thing about creating algorithms to find a suspect or criminal is crazy to me. Using facial recognition across LinkedIn and other platforms brings to light how much data can be taken from us without our intention. This way of systematically organizing potential suspects from previous data collection seems too risky. It trusts a computer to navigate future criminalization and curate crime ratings that can be racially directed.

Safiya Noble – Algorithms of Oppression (talk, 45m)

Google "accidents" that create racial bias and stereotypes in Google Search. "Three black boys" vs. "three white boys."

WHY is the answer always technical? Why do we always look to computers for answers? Why do answers have to disconnect emotionally? Why is emotion bad and why are we encouraged to think outside of morals/emotions?

Technology is not flat and not neutral; it is man-made.
