Technology and Race Responses

Lisa Nakamura’s Laboring Infrastructures: Nakamura opens her lecture with the fact that women of color are paid low wages to manufacture the technology we use. It wasn’t until 2018 that big tech companies began facing real scrutiny and stopped being seen as “the good guys.” Facebook being fined for privacy violations wasn’t enough to push it toward doing the right thing. Ethics in AI seems to have been given more attention than empathy in VR. Tech companies tout VR as an “empathy machine” able to ‘fix’ racism, sexism, etc. by having mostly rich, straight white males become refugees, people of color, sexual harassment survivors, transgender and/or disabled people (a.k.a. the undercommons) in VR. Nakamura calls this toxic re-embodiment because the user inhabits the body of someone who doesn’t fully own themself. Examples Nakamura includes are My Beautiful Home from Pathos, which invaded people’s privacy while filming; One Dark Night (a work about what happened to Trayvon Martin), which reproduces racial violence in the name of reducing it; and the BBC losing its mind over a VR reveal that the player embodies a black woman getting a haircut (if the player had been revealed as white, there would probably be no story). Nakamura brings up identity tourism and describes it using Adrienne Shaw’s terminology: identifying with instead of as. Overall, the undercommons don’t have the luxury of taking off a VR headset and leaving their personal struggles behind. The reality is that VR is being used to optimize privileged people too busy to do the real thing (engage with people who are suffering), who therefore need a ten-minute VR experience to automate their feelings.

  • Do you think Mark Zuckerberg is empathetic, or a failure at empathy? -I think giving a VR tour of Puerto Rico after the flooding there, just to talk about new Facebook features, is insensitive.
  • Do you think VR’s ‘empathy’ (or is it sympathy?) is enough to fix the inequalities it exacerbates, or do we really need rules/regulations?
  • What are your thoughts on ‘feeling good about feeling bad’? Can white people enjoying slave literature be seen as similar to white people enjoying these VR experiences?
  • If the ‘undercommons’ were given the chance to make their own VR, what might that look like?

Memorable quotes: 1) “If somebody is going to put you in the shoes of somebody, that means that they have stolen those persons’ shoes” 2) “VR is the totem of the overdeveloped world”

Ruha Benjamin’s Race After Technology: Benjamin says racism is productive -not an accident, not outdated, and not a glitch. Sociologists will say race is socially constructed, but they don’t typically say that racism constructs. When it comes to technology, our social inputs matter. Benjamin describes imagination as a battlefield, and the fact is that many are forced to live in someone else’s imagination. Benjamin mentions that the Citizen app was first used to report crime but is now used to avoid it (the Starbucks racial bias incident comes to mind). I love that later in the lecture the White-Collar Early Warning System counters this racial profiling by pointing to a much larger crime that’s more readily ignored (white male corporate execs embezzling millions/billions). Knowing that Google has used AI to make drone strikes more effective and that Microsoft works with ICE disappoints me, but it doesn’t surprise me. I’m also not surprised that racial bias in a medical algorithm favors white patients over sicker black patients. Of course it’s not focused specifically on race, or need, but on cost; and those lower costs can be correlated with systemic racism. I am familiar with Michelle Alexander’s book The New Jim Crow, but I wish I had been told more about the New Jim Code, which covers engineered inequality and default discrimination. I also wish we didn’t need Appolition (an app that raises bail money for incarcerated black people), but I’m glad it exists. I should check out Data for Black Lives and the Digital Defense Playbook.

*After the talk there is mention of a soap dispenser that wouldn’t dispense onto dark skin, which reminded me of the HP motion-tracker video where the camera only worked for those with light skin.

Safiya Noble’s Algorithms of Oppression: Algorithms are biased too and can’t make better decisions for us! Google searches push misogyny and racism to the forefront. I didn’t know that in 2015, during Obama’s presidency, searching Google Maps for the N-word brought up the White House, but I do remember plenty of other appalling racist crap surfacing during his presidency. It has been incredibly disheartening to witness (during the 2016 presidential election and after Trump) how easily white supremacists have manipulated the internet to their advantage. Hate groups thrive in an echo chamber. I’m not surprised that googling “three black teenagers” produces a lot of mug shots while “three white teenagers” produces a bunch of phony stock images. Of course Google would try to ‘correct’ its algorithm to save face, but bumping up a result about a white teenager in court for a hate crime still feels like an affront. I’m also not surprised that ‘professional hairstyles for work’ are not only racialized but assumed to pertain to women. It’s unfortunate that economic bias in algorithms has been discussed more prominently than racial or gender bias. It’s no coincidence that searching the internet for ‘(black/Asian/Latina) girls’ gets you porn, which ultimately infantilizes women and eroticizes any and all women of color. It’s apparent that old media is re-emerging in new media.
