I was thinking of doing something related to our relationship with AI. Things like Nintendogs or visual novels, or our reaction to Boston Dynamics kicking their robots. Things that make us feel things for inanimate, unfeeling AI.
Technology and Race
Lisa Nakamura – Laboring Infrastructures (talk, 30m)
This lecture talks about empathy and VR, and the opportunity VR has to become a medium that allows people to feel more empathetic toward different cultures and events. One of the things that I heard and can’t imagine doing is listening to the audio and witnessing the tragedy of Trayvon Martin. However, having the ability to bear witness to such tragic events could help a lot of people grow more empathetic toward a different community. She also states that one of the downsides is sacrificing privacy.
Would you be comfortable being filmed for a VR experience in your own home?
Safiya Noble – Algorithms of Oppression
This lecture talks about how search algorithms display bias against people of color. The talk also mentions how around 60-70% of people believe that these search results contain factual data, and how this can lead to prejudice due to misinformation. She also mentions how no one cared about these biases when they affected women of color, but now that they have affected a presidential election, everyone cares.
How can we sort out fake news from real news? Why do some topics raise more controversy than others?
Ruha Benjamin – Race After Technology (talk, 20m)
Something I found strange was when she was talking about how racism is productive. It reminded me of the video we watched in Ben’s class where the facial recognition software wasn’t able to detect an African American man. Technology should be designed so that everyone can use it effectively.
What are some possibilities of eliminating racism in the tech world?
Technology + Race
Safiya Noble
We are on a quest of “curating human information needs.”
Why don’t we practice and promote cross-disciplinary involvement and discussion, as in the example of this institution?
These capitalistic institutions, such as UIUC, keep us separate. I remember us talking in Interactions I about how difficult it is to double major across two programs that are not from the same “sides” of campus, such as New Media alongside Pre-Med. Trying to schedule classes around curriculums that are not designed to be flexible between different majors is made even more complicated when the website’s algorithm denies you registration access in some cases, and is programmed with restrictions that cannot be fixed by anyone but the programmers. And who wants to program a new site if they don’t understand the implications of the old one? Computer engineers, and STEM majors in general, are trained to think critically in terms of “X, Y, Z” ideologies and theories, not socially, politically, or communally.
Ruha Benjamin
The idea that Ruha keeps unfolding is that nothing, especially technological programming, is made without intention. She said that even programs portrayed as compassionate and proactive can be the most dangerous, because they hide the larger inputs’ desires for intentions that are not based around an abolitionist commitment.
There is this idea of putting out all knowledge ever, “MORE MORE MORE, capitalistic and democratic tendencies of having everything out in the open,” yet there are politics within that which restrict and withhold information. This ideology that more is better and putting it all out there is a homogeneous way of thinking about everyone involved (or of not thinking about everyone involved, by assuming that all those who participate have shared equity rather than coming from different social environments).
“We can’t lose sight of the larger structures that continue to fuel the problem,” and we especially cannot ignore the programmers and CEOs fueling the fire. We can try doing this by breaking down how we are taught to think in very systematic ways, and by teaching others that you cannot create algorithms for culture and society if you do not know anything about those things.
Lisa Nakamura
Lisa’s presentation was interesting because it talked about how open-ended virtual reality is; there are “no eggs in the basket,” or rather there are, but they have not been studied to their full extent. She talked about how VR is the “harbinger” of the third industrial revolution, and how it redresses the problem of the second industrial revolution, the “immiseration of humans as machines taking our jobs (…) by making available the last kind of work that machines can’t do, create the right kinds of emotions that humans need.”
She talked a lot about emotional labor, arguing that VR “automates the labor of feeling pain and sadness on behalf of another.” This emotional labor that VR is able to hold puts into place feelings of compassion and empathy that create an “alibi” for the “material conditions of labour for racialized and gendered people” that have always been present. In the case of the Trayvon Martin VR piece One Dark Night, which gives viewers the experience of witnessing how Trayvon was killed that night, Lisa emphasizes that we need to think of VR as more than just a source of empathy, shock, or compassion. VR needs to “invite you to be with you, instead of as, in a virtual space/experience.”
NOTES:
First talk (45m)


Technology is not flat; it is the construction of human beings. What are those human beings putting in? What are their experiences?
Critical race theory
Things that are actionable
Hyper-sexualization is not a product or an observation of the NATURE of these black women
These ideologies are tied to old media practices
Hundred year history
Counter-narrative of what it’s been portrayed as
In these stereotypes of black women as Jezebel
Of course, there has to be a mass justification for the reproduction of the slave labor force
Part of how that mass justification of the labor force comes into existence: characterizing black women as sexual, women who like sex and who want to give babies to the “labor force”
Racist capitalistic stereotypes used as economic subjugation of black people and women
When the enslavement of black labor force became illegal
How that justification was imagined and instilled
Hyperlinks that have capital underneath them
They are well trafficked images
———


Imagination is a contested (to gain power) field of action




People that create tech companies aimed at helping social causes, like Jay-Z with Promise, who don’t have an abolitionist commitment (..)
Seen as empathetic to the cause because his decarceration startup addresses the problem of pre-trial detention, but the app tracks those individuals’ GPS data (the GPS company is in business with them), trapping them further in the prison-industrial complex’s surveillance system.
Promise exemplifies the New Jim Code
It’s insidious because it’s packaged as betterment
AI makes drone strikes more effective
ICE contract – Microsoft workers basically said: we will not work alongside people who support the development of warfare and surveillance.
Workers’ efforts to sway the industry – we can’t wait for them to change the system.
Move past professionalism, individualism, and reformism to contribute to radical labor organizing


Racism is not the “white boogeyman” that everyone thinks is hiding behind the screen
She is trying to distinguish that racism can feed systematic oppression through ulterior motives
(Are they trying to be racist?***notes)
You cannot design something without intention…
Someone designed the thing to have intentions, and perhaps by not being aware of the political and social environments they are entering by creating these technologies, they make for things like… (weapons of warfare!)
*It’s about large and small inputs that cater to the metainterface entrapment of surveillance capitalism.
It’s not a singular person out to get someone through an app or computer screen that is “malicious”; it’s the thoughts, patterns, and predictions of the person intending this algorithm to have certain behaviors, based on desires or gains, and/or ignorance, of race, gender, class, economy, professionalism, capital, politics, etc. etc.
We can’t lose sight of the larger structures that continue to fuel the problem




…because of the ways it can be MISUSED and TURNED against them


Don’t leave it up to the technicality nerds to let us know what is ethical

“Thin description”
Bios?



“Whether exposure of their practices is necessarily the most prudent way of going about that” – student
_____

Invite you to be with you, instead of as, in a virtual space

Emotional labor
The third industrial revolution
Norbert Wiener; Fred Turner
Facebook and Oculus Quest
VR as a harbinger of the third industrial age
Redress problem of second Industrial Age
Immiseration of humans as machines take our jobs, and to create the (…) by making available the last kind of work that machines can’t do, create the right kind of emotion/feelings.
Empathy and compassion:
Valuable and fundamental, as perceived by (Lauren Berlant)
VR takes the place of the progress of rights and resources
Refugees, women of color, and the disabled (not white men) are proxied in VR to find human recognition
Norbert Wiener
Claims: compassion is something you can make; you can do that.
Automates the labor of feeling pain and sadness on behalf of another.
Empathy into the realm of non-human virtual witnessing and connection, or non-virtual witnessing and connection



Technology and Race
Safiya Noble
Noble touches on how our ideas of sexism and racism bleed into the technology world in ways we can hardly imagine, namely through search engines. Because the programming for this software is often built from user inputs, search results for any given topic will bring up what is most often searched for. In the case of race, searches for beautiful people will often yield images of white people, while searches for criminals will yield images of black people. Autocompletes for searches about women will show a number of sexist remarks, such as “women can’t drive,” etc. Despite how small and insignificant things like this seem, they reinforce prejudiced attitudes on a grand yet subtle scale. Personally, I am unsurprised by this, but I find it annoying how no one seems interested in considering this an issue, much less in attempting to solve it.
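To make the mechanism concrete, here is a minimal sketch in Python (a hypothetical query log I made up, not Google’s actual system) of an autocomplete that ranks suggestions purely by popularity:

```python
from collections import Counter

# Hypothetical query log, skewed toward sexist phrasings (made-up data).
query_log = [
    "women can't drive",
    "women can't drive",
    "women can't park",
    "women can vote",
]
counts = Counter(query_log)

def autocomplete(prefix: str, k: int = 3) -> list[str]:
    """Return the k most frequent logged queries starting with `prefix`."""
    matches = [(q, n) for q, n in counts.items() if q.startswith(prefix)]
    matches.sort(key=lambda pair: -pair[1])  # most popular first
    return [q for q, _ in matches[:k]]

print(autocomplete("women can"))
# -> ["women can't drive", "women can't park", "women can vote"]
```

Nothing in this toy ranker is “trying” to be sexist; it simply amplifies the majority of its inputs, which is exactly Noble’s point about systems built from user behavior.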
Ruha Benjamin
Benjamin goes into this idea of discriminatory design, or (usually software) design that puts marginalized peoples at a disadvantage in regards to how we read data and how our machines learn from us. The problem with machine learning is that it will correlate racial traits with varying levels of credit, trustworthiness, etc. Given that race is largely a social construct, the failure of tech companies to address these issues of technological inequality feeds into a sort of “Jim Code.” I feel like this talk really opens up what Noble was saying earlier, and provides an argument about why digital bigotry impacts society beyond just making people feel bad.
Lisa Nakamura
Nakamura starts by talking about the failings of technology in recent years to have the consumer’s best interests in mind, and suggests that the way to resolve this is via VR. VR has this ability to physically put you in the position of others in an immersive way, which some believe will increase overall empathy. I don’t think I agree with this very much. I’m reminded of Jane Elliott’s eye color experiments – wherein she took a class full of third-graders in the 60s and taught them about discrimination by expressing prejudice toward either brown-eyed kids or blue-eyed kids. While this was highly effective, she did a similar study with college students and found they weren’t nearly as receptive, especially the white participants. I feel like VR or not, the older you get, the harder it’s gonna be to get out of your worldview and empathize.
Technology and Race
Safiya Noble – Algorithms of Oppression
This talk is about how algorithms oppress certain groups of people. Actually, I think algorithms are oppressing all of us, because they use features to measure people, and people shouldn’t be tagged. Due to big data and algorithms, “we have the tendency [to believe] that computers make better decisions than human beings.” Women are coded as girls, and women of color are more likely to be molded as pornographic.
Ruha Benjamin – Race After Technology
This video talked about technical issues related to race. People of different colors and races face different interfaces. I like how she described imagination as a battlefield. And she gave us some examples, like the police app: the purpose of the app is to report crime, but in reality it became a tool for avoiding it. That what people see about how dangerous an area is depends on the number of imagined crimes rather than real crimes seems ironic to me.
Lisa Nakamura- Laboring infrastructures
This video is mostly about VR. She talked about people in the tech field who have tried to use VR to create empathy, mostly by putting people into another person’s view and life, such as refugees and other people who are having a hard time. This process is described by Romain Vak as a process of hacking your own body. And there are lots of people who are not able to turn that imagination into empathy, but into something else.
Technology and Race
Algorithms of Oppression
This critical approach to systems is the only way we avoid the worst of the likes of Blade Runner and The Matrix. Decisive policy did the job of oppressing minorities on top of existing social ideology; moving forward, this seems to be the work of mega-monopolies wielding algorithms even they themselves don’t understand. There’s a lot to be said about how this critical approach didn’t get widespread attention until it affected an election, and not when these systems so readily and heavily affected black and brown women.
Race After Technology
This idea of technology inheriting the ideas of its creators sounds a lot like racist parents having racist kids. It’s super odd how this is a bizarre concept to these developers. They’re used to syntax errors, bugs in the engine, hardware malfunctions, data loss, etc. But for a time it was inconceivable that their system might not be outputting correctly, and that it goes as far as harming specific people who didn’t fit into their original scope of the user/participant pool. This doesn’t speak to the code’s flaws; it speaks to the creator’s mistakes or complete ignorance.
Laboring Infrastructures
Capitalist ideas of infinite growth are starting to sound more like a bodybuilder selling you on the idea of anabolic steroids.
Bro, the gains. THE FUCKING GAINS BRO!
Genuinely, this subsection of humanity existing as a hidden and unpaid working class is getting really, really old. As old as time at this point, and I refuse to believe that with all these resources and technologies it’s impossible to create a future without what’s essentially slavery or slavery-like conditions. I’m not saying capitalism inherently begets slavery; someone smarter than me will say it.
Safiya Noble – Algorithms of Oppression (talk, 45m)
This video was very insightful and actually sent me on a mission to try and find out the biases of Google’s search engine. I thought it was very surprising that not only for minorities, but also for women, there were ridiculously inappropriate websites coming up for certain searches. Since this video is from 2017, I’m sure there have been a lot of changes that Google made to their search engine, especially when searching phrases about race or gender. I understand there are a lot of things about Google searches that can’t be patched; what you search is what you search. But there are a lot of instances of deliberate racism and oppression in some searches. When looking at criminals, professional hairstyles, and porn websites, it is not a coincidence that these searches are discriminatory.
Ruha Benjamin – Race After Technology (talk, 20m)
The project in St. Paul, Minnesota that tried to identify youth that are “at risk” ties back to the software we talked about that rates students in college. I feel like this heavily discriminates against certain people, and sometimes can do more harm than help. There shouldn’t be an algorithm, or a company, telling you who is “at risk” and who isn’t.
Lisa Nakamura – Laboring Infrastructures (talk, 30m)
This talk takes a different view on racism and technology. Lisa starts the talk by discussing how we wouldn’t have any of our novel technology without Asian women making it for little money. I find it interesting that she brings up the company Pathos, whose website description says they disrupt oppression and discrimination. Right before that, Lisa shows an excerpt from their founder saying that nobody seems to know anything about where VR is going and that there are no rules yet. Kind of ironic.
Technology and Race
Safiya Noble – Algorithms of Oppression
In this article, Noble presents the underlying issue of Google’s search algorithm, which contains elements of sexist and racist results when certain keywords are looked up. A prime example is the derogatory search results that appear when looking up women of color (most pertaining to their sexualization as well as the overall pornographic fetishization of that group). She also brings forth another example of the difference in search results when looking up “three white teenagers” vs. “three black teenagers,” and how Google’s response to that backlash (as well as many others) has been discreet and somewhat unapologetic.
How can we as users of this search engine combat this type of oppression? Bring awareness of it to the masses?
Ruha Benjamin – Race After Technology
Benjamin presents the ever-present issue that the technology we have created is somewhat racist despite being perceived as inherently objective. This is due to the fact that there are biases present in the creators, which in turn translate into their creations. She presents the concept of the Jim Code (derived from the Jim Crow laws), where racist methods of oppression exist in the code that many of us end up using, and gives multiple examples of technology that embodies it. When oppression thrives, retaliation by the oppressed rises with it, and so she talked about the racial justice movement groups that have emerged to combat this phenomenon.
How can technology in the future be barred from the biases of their creators in order to create something more objective?
Lisa Nakamura – Laboring Infrastructures
VR is not necessarily new in the industry; however, relative to other existing technologies it is still in its early stages of development, meaning no one really knows what powers it truly holds and how it will affect our future. With that, many different kinds of experiments and tests have been done on the capabilities of making a virtual space that distorts the user’s perception of reality, and with that comes artificially constructed empathy. Even so, there’s an underlying issue in constructing these virtual realities to provide users an experience they would not have had on their own: many creators have taken it upon themselves to simulate oppressive situations of the marginalized, without really taking into account the consequences.
What can be done to prevent these constructions of reality from going “too far”?
Junior Project Ideas
I’m in a bunch of other media classes right now and the topics always seem to intertwine, so I’ve suddenly been thrown into constantly thinking about media and its place in our world. I’m interested in topics of surveillance and convenience – which is tied directly to our phones. I even think now about what I say out loud because “do I really want my phone to know that about me?”
Yet, going past our relationships with phones (and surveillance), I’ve also recently been very interested in the idea of social media and ‘likes’. In another media class of mine, we watched an episode of Black Mirror called “Nosedive” – if you haven’t seen it, you should. It shows a future where you rate the people around you (between 1 & 5 stars) and everyone is constantly trying to get a higher rating. Their rating affects their lifestyle – what jobs they can have, what neighborhoods they can live in, etc. We watched this in discussion of the idea of emotional labor. The characters in Nosedive were practicing their laughs and how to be charming to everyone because that would give them a better chance at being rated higher by their peers, but then no one is truly showing each other their true selves.
I bring this up because this could become our future (although, hopefully not now that there’s a visual of what could happen). It makes me wonder about how far our society will go since social media is so important to us and is manipulating us; could we reach this point or something similar in the future?
Beyond this idea, I’m also interested in the concept of emotional labor because I feel like this heavily applies to YouTubers – and I watch a lot of YouTubers (I watch much more YouTube than TV). In fact, I watch a lot of the ‘bigger’ YouTubers and will watch all of their content just because I’ve grown to like them over the years. But I think there’s this idea of people being genuine on YouTube specifically because they can choose their own content and it’s not like a TV show or series that has an extra layer of professionalism to it. However, we all have to know that people aren’t always this cheery and their lives can’t always be this glamorous – these are human beings too.
So, my main interests are in surveillance, the future of social media manipulation, and emotional labor. Hopefully, I’ll have a chance to talk about all three in my project because they all tie in well together with the help of Nosedive (which I could use as a reference).
So, my working thesis for now will be this: How does emotional labor influence what and how we see each other online from the big social media influencers to our own friends and acquaintances?
(I may need to narrow that down a bit more – let me know what you think!)
Technology and Race
Safiya Noble – Algorithms of Oppression (talk, 45m)
Safiya talks about how algorithms have secretly been built in a more biased fashion. This has been proven time and time again – as Safiya discusses – through simple things like Google searches. The example that really stood out to me was that of the “three black teenagers” and the “three white teenagers.” The first search was almost entirely mug shots of people of color, but the second search was mostly made up of candid photos of white people playing sports or something. The example of the “professional hairstyles” was equally as strong. What was even crazier to me was that Google tried to cover it up by changing the algorithm by the next day.
Why do you think Google allows things like this to happen? Why would they not try to make searches related to race (maybe more specifically image searches) more equally represented?
Ruha Benjamin – Race After Technology (talk, 20m)
Ruha argues that racism is productive – and not always in the negative sense, though it most often is. She too talks about how algorithms can be racist, looking at a specific example where an algorithm chose white patients over black ones even though the algorithm wasn’t inherently ‘racist.’ Instead, the algorithm looked at cost to predict healthcare needs, but black patients on average “incur fewer costs for a variety of reasons.”
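To see how that happens without race ever appearing in the code, here is a minimal sketch in Python (hypothetical numbers, not the actual study’s data): two groups have identical health needs, but one group spends less on care, so ranking patients by cost quietly ranks them by group.

```python
import random

random.seed(0)

def make_patient(group: str) -> dict:
    """Hypothetical patient: `need` is true health need, `cost` is past spending."""
    need = random.uniform(0, 1)
    access = 1.0 if group == "A" else 0.6  # assumed access gap: B spends less per unit of need
    return {"group": group, "need": need, "cost": need * access}

patients = [make_patient(g) for g in ("A", "B") for _ in range(1000)]

# The algorithm never sees group membership: it ranks by cost alone
# and enrolls the top 20% in the extra-care program.
patients.sort(key=lambda p: -p["cost"])
enrolled = patients[: len(patients) // 5]

share_b = sum(p["group"] == "B" for p in enrolled) / len(enrolled)
print(f"Group B share of extra care: {share_b:.0%}")  # far below 50%
```

The ranking step is perfectly “neutral”; the bias lives entirely in the proxy it ranks on.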
Do you think it’s possible for algorithms to be created without racial bias – I ask this because we have an example of an algorithm that didn’t even take race into account, but still ended up making race-based decisions. Do you think things like this will continue to occur in the future through easy fixes? Or will we probably encounter this problem for a long time?
Lisa Nakamura – Laboring Infrastructures (talk, 30m)
Lisa talks a lot about VR and empathy – and how VR is not empathetic. It often puts a white audience into a different racial experience, usually as a way to bring awareness to something going on in our world. Yet, while this audience might be affected by the VR experience, it doesn’t always mean that anything will happen because of it. She also talks about the idea of being in someone else’s shoes; she says that being in someone else’s shoes means you’ve taken those shoes.
I’ve been dealing with VR in my Interaction II class: do you feel that VR in general is something that should be pursued and used in this way? Or are there other effective ways to get our points across?