Technology and Race Response

Algorithms of Oppression– Safiya Noble
In this talk, I found it very interesting when Noble talked about how Kabir Ali searched "3 black teenagers" vs. "3 white teenagers" on Google. The search results for the black teenagers were mugshots, while the results for the white teenagers were only Getty stock photos (nothing to do with criminality). This is a problem with Google's algorithms, which they never formally apologized for; they only tweaked the algorithms. I found a bunch of articles questioning whether Google was "racist." I'm curious to know how these algorithms were first put in place. I don't think Google was intentionally trying to be "racist," but these algorithms stemmed from somewhere. This reminds me of my senior-year English class in high school, where we talked a lot about social media, politics, and social issues. We focused a lot on the BLM movement, and one thing we learned about was how, when a young black teen was shown on the news, outlets would pick images that portrayed them as "criminals" or "troublemakers" rather than choosing more wholesome photos such as their graduation or pictures with their families. I think these algorithms are a representation of society, and they influence the things we see. If algorithms are biased, how much more will that influence and affect the people exposed to these patterns on the internet and in real life?

What have you searched on Google that returned very biased and unrepresentative results? I searched "3 asian teenagers" and the images do not reflect Asian teenagers at all. How can we fix this algorithm?

Race After Technology– Ruha Benjamin
I thought Citizen was a great app for staying updated on the latest crimes in the area, and users can also upload incidents to the app themselves. I think it is very unfortunate that we need apps like these to report on the "BBQ Beckys" in our neighborhoods. Benjamin also discussed the dimensions of the New Jim Code: engineered inequality, default discrimination, coded exposure, and techno-benevolence. She brings up an abolitionist approach to the New Jim Code, the "Digital Defense Playbook," which deals with pervasive data collection and data-driven systems. Its aim is to develop "power, not paranoia" around our data bodies. I find it very interesting how communities are fighting injustice through technology. I feel like, in some ways, it can be more effective.

What are some ways you see technology fighting racial injustice?

Laboring Infrastructures– Lisa Nakamura
I think Pathos Labs' VR work is a very interesting concept: they create empathy-based VR experiences to disrupt interpersonal oppression, discrimination, and misperceptions. What I liked is what the founder of Pathos Labs, Romain Vak, said: "The best and worst thing is that nobody in the field of VR has any clue what's going on. What that means is that there are no rules yet, but it also means that a lot of eggs are in a basket that is difficult to predict. There is a lot of speculation and theorizing, but I don't really think anyone knows where the heck the industry is going." I think it's fair to say that no one in the tech industry can fully predict what will happen in the future. Although they can look at past data, there is no way to really see what direction technology will go. Facebook can predict only as far as its collected data allows, but it doesn't know where its future lies 10 years from now.

Perhaps because VR is still very new: where do you think the VR industry will be in 10 years?
