BFA Project (In Progress)

Those in my Interaction II class will recognize this project. I still want to tweak it a bit over the weekend, though. Karin informed me that anything code-related has to be ‘sandboxed’ for the show, so I’m hoping Ben will talk more about this on Wednesday.

Title: There’s No Escaping COVID-19

Artist Statement: It seems that the COVID-19 pandemic (as a topic) is almost entirely unavoidable in the news. I think to some degree the news reflects our own psyche, which is constantly being bombarded with both information and misinformation regarding this virus. But what do you think? Is it possible to search for something and get results that have nothing to do with what is happening right now surrounding COVID-19? Find out with this search bar that navigates through news articles!
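
For the curious, here is a rough sketch of the kind of query the search bar runs behind the scenes. It assumes a generic news API; the endpoint, API key, and field names are placeholders rather than the actual service the project uses.

```python
# Rough sketch of the search-bar logic, assuming a generic news API.
# The endpoint, API key, and field names are placeholders, not the
# project's actual service.
import requests

COVID_TERMS = {"covid", "covid-19", "coronavirus", "pandemic", "quarantine"}

def search_news(query, api_key):
    """Fetch articles matching the user's query from a (hypothetical) news API."""
    resp = requests.get(
        "https://example-news-api.com/v2/everything",  # placeholder endpoint
        params={"q": query, "apiKey": api_key},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json().get("articles", [])

def mentions_covid(article):
    """Check whether an article's headline or summary touches on the pandemic."""
    text = (article.get("title", "") + " " + article.get("description", "")).lower()
    return any(term in text for term in COVID_TERMS)

if __name__ == "__main__":
    articles = search_news("baking bread", api_key="YOUR_KEY")
    covid_related = [a for a in articles if mentions_covid(a)]
    print(f"{len(covid_related)} of {len(articles)} results mention COVID-19")
```

That last ratio is essentially the question the piece poses: no matter what you search for, how much of what comes back is about the pandemic?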

Most of my attention has been on making my own website (sierrabenson.com), which includes my social media links. I’m wondering, since we are technically allowed to include ten things, whether I should add my sketches for the project I ended up ditching, or whether I should stay on theme and include my COVID-19 zine (both of which are visible on my website).

BFA Show Update (of the Update)

I’m feeling less of a drive to go through with my original idea… Maybe I will pick it up after graduation, but for now I’ve been thinking about making a website that embeds the API project I recently did for Karin’s Interaction II class. Obviously I will be making some tweaks to the code, but I think the piece shows how this pandemic (as a topic) is almost entirely unavoidable in the news, which to some degree reflects our own psyche, constantly bombarded with both information and misinformation regarding this virus.

BFA Show Update

I still want to go through with my original idea, but I want to tweak it slightly so that it becomes an “I thought getting into a car accident was the crappiest thing that could happen to me in 2020, but clearly the universe had something else up its sleeve” kind of video.

I also think it would be great to delve into the stages of grief I felt in both instances. In a way, going through the stages of grief after losing a car helped prepare me for the stages of grief I went through after realizing my senior year was not going to go as planned because we need to social distance (thanks, coronavirus, ugh). It is peculiar, and upsetting, how in both cases finances become a major concern; one is admittedly more personal while the other is more global.

I want to address the absurdity and selfishness of those who are in panic mode, but I also want to express the silver linings. Ultimately, I hope a bit of humor in this project can help heal some of the trauma we’re all experiencing right now.

AI/Predictive Analytics/Recommendation Algorithms Responses

James Bridle’s Something is wrong on the internet: I think I had a Dell laptop at 13 years old and used Facebook all the time… I remember when there were no ads on YouTube and Facebook had only ‘like’ and ‘become a fan’. Not that long ago I actually looked back through my Facebook timeline with cringe and awe. I have definitely brought up my concerns about content aimed at kids on YouTube in the recent past. It never before occurred to me how easy it is to go from a verified page to a non-verified page with autoplay on. What the heck is going on with these ‘finger family’ videos!? I don’t recall the corruption of Peppa Pig, but I do remember Spiderman Elsa… I agree that it’s not about what teenagers can or can’t handle, nor about trolls; it’s about very young, impressionable minds (babies/toddlers) being traumatized by content that targets them on the internet.

Memorable Quotes: 1) “I don’t even have kids and right now I just want to burn the whole thing down.” 2) “It’s not about trolls, but about a kind of violence inherent in the combination of digital systems and capitalist incentives.” 3) “This is a deeply dark time, in which the structures we have built to sustain ourselves are being used against us — all of us — in systematic and automated ways.”

Rachel Metz’s There’s a new obstacle to landing a job after college: Getting approved by AI: This is insane! I’ve never heard of HireVue (an AI gatekeeper for entry-level jobs)… Although I am horrified, I can also see how this could be convenient for employers who have to choose from a large applicant pool to fill one position. The problem is that I highly doubt a computer can detect “empathy” or a “willingness to learn,” and I think raising a laptop camera to be eye-level is ridiculous. When it comes to an AI testing for confident or negative language I’d probably strike out on both fronts… but that shouldn’t mean I’m a bad employee! I didn’t know that there was an Electronic Privacy Information Center (EPIC). I think EPIC asking the FTC to investigate HireVue’s algorithm is futile; then again, the FTC did look into TikTok.
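
Just to illustrate why the language-scoring part worries me, here is a toy sketch of how crude a ‘confident vs. negative language’ check could be. This is purely hypothetical; it is not HireVue’s actual (proprietary) model, and the word lists are made up.

```python
# Toy keyword-based "confidence" score -- purely illustrative,
# NOT HireVue's actual (proprietary) algorithm. Word lists are made up.
CONFIDENT_WORDS = {"definitely", "confident", "led", "achieved", "delivered"}
NEGATIVE_WORDS = {"maybe", "unfortunately", "can't", "worried", "problem"}

def language_score(transcript):
    """Count confident words minus negative words in an interview transcript."""
    words = transcript.lower().split()
    confident = sum(w in CONFIDENT_WORDS for w in words)
    negative = sum(w in NEGATIVE_WORDS for w in words)
    return confident - negative

# A nervous but perfectly capable candidate scores badly:
print(language_score("i'm worried i can't do this, but maybe i'll definitely try"))  # -2
```

A nervous, honest answer tanks the score while buzzword-stuffing inflates it, which is exactly why I doubt this kind of thing can detect “empathy” or a “willingness to learn.”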

Jia Tolentino’s How TikTok Holds Our Attention: Am I the only one who thinks it’s shady that nobody affiliated with TikTok whom Tolentino got in touch with seemed to know anything about TikTok?! Also, users not seeing a whole lot of the Hong Kong protests on this one particular platform is super sketchy… I actually did hear about what’s happening to the Uighurs not too long ago… It is strange to me that all it takes to have a chance at a record deal is a sound bite that gets popular on the app (not a full-length song), but I do like Old Town Road. I mean, how can you not? It was such a huge phenomenon! That said, I will never understand why young girls flock toward Jacob Sartorius or Jake and Logan Paul. This guy, Zhang Yiming, is giving me Chris Wiley vibes… How do these guys do these things so young?!

Rachel Thomas’s The problem with metrics is a big problem for AI: I don’t think I’ve had an algorithm grade my essays, but I do think some of my school papers have been run through an algorithm to test for plagiarism. This article reminds me a lot of Jill Walker Rettberg’s and Cathy O’Neil’s TED talks. Of course length and sophisticated words are all it takes to game an essay-grading algorithm… It’s depressing that the actual content doesn’t matter all that much. Unfortunately, I think we’re living in a time where people don’t truly understand statistics. I watch stuff on YouTube all the time, but that doesn’t always cause me to feel happy; and even if some of it did, correlation does not equal causation. I have seen some white supremacist channels come down, but yeah, YouTube/Google has a problem with letting that crap infiltrate (I’m being reminded of ‘Sad by Design’ now).

Eric Meyer’s Inadvertent Algorithmic Cruelty: I’m so heartbroken! I personally try to avoid these ‘look back’ algorithms like the plague… but I am a sucker for Spotify end-of-the-year playlists. Of course Facebook doesn’t really care about asking any of us for permission. It already doesn’t ask us how much of our privacy we want… I think we all know by now that Mark Zuckerberg’s empathy level is non-existent.

Big Data/Algorithms/Algorithmic Transparency Responses

Frank Pasquale’s Black Box Society: I get paranoid thinking about how the government/financial institutions want to remain secretive while still having the ability to track/surveil the general public. As an American, I know all too well about our government lying or not telling the whole truth about our past wars; and I’m sure they would prefer the public to not know anything at all. I also know that Super PACs (i.e. CEOs of corporations) can fund ads, with no limit, for or against political campaigns, which must have a huge algorithmic impact. Honestly, I can’t hear about ‘nondisclosure agreements’ without instantly thinking of Mike Bloomberg and Donald Trump. Pasquale uses the term ‘black box’ to refer to how we are constantly being recorded, but in a secretive manner by a secretive entity. What is described in this book is very reminiscent of Cambridge Analytica and surveillance capitalism (a term coined by Zuboff). I think this short clip from The Simpsons Movie (which came out before Snowden) humorously depicts the ‘black box’ society we live in. Although I know about obfuscation, I didn’t know about real v. legal secrecy (one refers to locking a door or making a password, the other is about keeping sensitive information a secret, e.g. not giving out Social Security numbers). I guess the Equifax data breach would be a great example of how legal secrecy can be violated.

  • Is transparency the answer to uncovering government/financial/corporate institutions’ secrets?
  • If so, how might we obtain more transparency?
  • If not, is there another way? Should we just live with the unknown secrets?

Cathy O’Neil’s The era of blind faith in big data must end: O’Neil first asks, “What if algorithms are wrong?” and then follows up with, “To make an algorithm you need data and a definition of success.” According to O’Neil, everyone uses algorithms, with or without code. One example of an algorithm used without code is the process of making a meal. Whoever is making the meal has to factor in the time, resources, and energy required to make it (the data). The definition of success could be everyone eating enough vegetables; but if you asked the children, their definition of success might be eating enough sugar (which probably isn’t good for them in the long run). Ultimately, algorithms are opinions, not objective science. It’s sad that teachers lost their jobs over inconsistent data; it seems hard enough just to be a grade school teacher in the U.S. Codifying sexism, racism, and homophobia is definitely a major concern of mine. I think the game Portal is a great example of how algorithms can go wrong: the main robot in Portal, GLaDOS, practically embodies an algorithm running tests (in the name of science) whose definition of success ultimately results in people’s deaths.
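
Just to make O’Neil’s “data plus a definition of success” framing concrete, here is a minimal sketch of the family-dinner example written as if it were literal code (the meals, numbers, and scoring are all made up for illustration):

```python
# O'Neil's point, sketched as literal code: an "algorithm" is just data
# plus someone's definition of success. All values here are made up.
meals = [
    {"name": "stir fry", "veggies": 3, "sugar": 0, "prep_minutes": 30},
    {"name": "mac and cheese", "veggies": 0, "sugar": 1, "prep_minutes": 15},
    {"name": "ice cream for dinner", "veggies": 0, "sugar": 5, "prep_minutes": 2},
]

def parent_success(meal):
    """The cook's opinion: more vegetables, less time in the kitchen."""
    return meal["veggies"] * 2 - meal["prep_minutes"] / 30

def kid_success(meal):
    """The kids' opinion: sugar is the whole point."""
    return meal["sugar"]

print(max(meals, key=parent_success)["name"])  # stir fry
print(max(meals, key=kid_success)["name"])     # ice cream for dinner
```

Same data, different definition of success, completely different ‘best’ meal; swap in a school district’s or an employer’s definition of success and the stakes get a lot higher.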

  • What are other examples of algorithms that don’t operate on code? Could getting a degree/diploma be considered another example? (With the data being the classes/hours it takes and success being graduating.)
  • What kind of opinions do you think social media algorithms have?

Virginia Eubanks’ Automating Inequality: Eubanks starts the lecture by stating that we are building a digital version of an 1820s poorhouse. Apparently in the 1820s there was an economic depression, and of course the concern was people being ‘dependent’ on the government, not people living in poverty. Unfortunately, two centuries later, this same sentiment seems to still be around. I didn’t know ‘county farm’ is coded language for where a poorhouse once was… Anyway, Eubanks gives a couple of modern-day examples of this phenomenon. In Indiana, applying for food stamps (a.k.a. SNAP) turned into a digital process, which ended up leaving a lot of eligible people without benefits. Occasionally those who went through the process of signing up would get notified of an error, but there were no specifics as to what the error was. The story of a woman with the last name Young was heartbreaking! She couldn’t go through the welfare process while in the hospital with cancer, so of course she lost her coverage and died before the court case reached a verdict. According to Eubanks, the middle class can pay for private benefits, so there is not as much data on them. A system that only has data on the lower class is obviously going to conflate parents living in poverty with poor parenting. For child protective services, this lack of data on middle-class families can result in false negatives (i.e. not seeing harm when there may be some), and for lower-class families it can result in false positives (i.e. seeing harm when there is none). Not to mention that there is a racial bias when it comes to child welfare; this bias is most apparent when communities call the authorities on black/biracial families (what Ruha Benjamin referred to when talking about the Citizen app and BBQ Becky). Eubanks says that California’s VI-SPDAT collects data on the homeless for housing opportunities, but it’s possible for the LAPD to abuse that data. Also, a guy who goes by Uncle Gary, living in Skid Row, filled out the VI-SPDAT multiple times but was not seen as vulnerable enough for housing. It just goes to show that welfare systems are not made better by status-quo algorithms. I think this lecture had a stronger connection to the Computer Says No skit since healthcare is brought up. I can’t imagine a 5-year-old needing a hip replacement; tonsils sound more like it. I swear there are too many customer service surveys these days…

Janet Vertesi’s My Experiment Opting Out of Big Data…: Opting out of personal data collection isn’t easy when it makes you appear to be a rude family member, an inconsiderate friend, and/or a bad citizen. Vertesi talks about her experience trying to keep big data from detecting her pregnancy… I can’t imagine downloading something probably used for the dark web (Tor) just to visit BabyCenter.com in private, or setting up a separate Amazon account and buying gift cards in cash to purchase things online. I feel bad for the uncle who PMed her a congratulations, but yeah, I see Facebook as trying to calm the “Big Brother is watching you” anxiety by calling it a ‘private’ message. I swear I’ve seen ads on my Facebook Messenger too! I have a friend who’s very protective of their privacy, so I only message them through Signal. They also use DuckDuckGo and Firefox. I should probably get into that habit, but I don’t think I can give up my Gmail and Google Drive.

  • Do you think it’s even possible to live a life completely ‘off-the-grid’ today? Why or why not?
  • How many conveniences/relationships would you have to give up in order to do so?

Technology and Race Responses

Lisa Nakamura’s Laboring Infrastructures: Nakamura starts her lecture with the fact that women of color are working for low wages to make the technology we use. It wasn’t until 2018 that big tech companies started being questioned and stopped being seen as “the good guys.” Facebook getting fined $5 million for privacy violations isn’t enough to push them toward doing the right thing. Ethics in AI seems to have been given more attention than empathy in VR. Tech companies tout VR as an empathy machine that’s able to ‘fix’ racism, sexism, etc. by having mostly rich, straight, white males become refugees, people of color, sexual harassment survivors, transgender and/or disabled people (a.k.a. the undercommons) in VR. Nakamura calls this toxic re-embodiment because they are inhabiting the body of someone who doesn’t own themself. Examples Nakamura includes are My Beautiful Home from Pathos, which invaded people’s privacy while filming; One Dark Night (a work about what happened to Trayvon Martin), which reproduces racial violence in the name of reducing it; and the BBC losing their minds over a VR revelation that the player embodies a black woman getting a haircut (if the player were revealed to be white there’d probably be no story). Nakamura brings up identity tourism and describes it using Adrienne Shaw’s terminology: to identify with instead of as. Overall, the undercommons don’t have the luxury of taking off a VR headset and living without their own personal struggles. The reality is that VR is being used to optimize for privileged people who are too busy to do the real thing (engage with people who are suffering) and who therefore must have a ten-minute VR experience to automate their feelings.

  • Do you think Mark Zuckerberg is an empathetic guy or a failure at it? (I think having a VR tour of Puerto Rico after they’ve been through a flood just to talk about new Facebook features is insensitive.)
  • Do you think VR’s ‘empathy’ (or is it sympathy?) is enough to fix exacerbating inequalities, or do we really need rules/regulations?
  • What are your thoughts on ‘feeling good about feeling bad’? Can white people enjoying slave literature be seen as similar to white people enjoying VR?
  • If the ‘undercommons’ were given the chance to make their own VR, what might that look like?

Memorable quotes: 1) “If somebody is going to put you in the shoes of somebody, that means that they have stolen those persons’ shoes” 2) “VR is the totem of the overdeveloped world”

Ruha Benjamin’s Race After Technology: Benjamin says racism is productive, not an accident, something outdated, or a glitch. Sociologists will say race is socially constructed, but they don’t typically say racism constructs. When it comes to technology, our social inputs matter. Benjamin describes imagination as a battlefield, and the fact is that many are forced to live in another’s imagination. Benjamin mentions the Citizen app was first used to report crime but is now used to avoid it (the Starbucks racial bias incident comes to mind). I love that later in the lecture the White-Collar Early Warning System counteracts this racial profiling by pointing to a much larger crime that’s more readily ignored (white male corporate execs embezzling millions/billions). Knowing that Google is using AI to make drone strikes more effective and Microsoft is working with ICE disappoints me, but it doesn’t surprise me. I’m also not surprised to learn that racial bias in a medical algorithm favors white patients over sicker black patients. Of course the algorithm isn’t focused specifically on race, or need, but on cost; and those lower costs can be correlated with systemic racism. I am familiar with Michelle Alexander’s book The New Jim Crow, but I wish I was told more about The New Jim Code, which covers engineered inequality and default discrimination. I also wish we didn’t need Appolition (an app that gives bail money to incarcerated black people), but I’m glad it exists. I should check out Data for Black Lives and The Digital Defense Playbook.

*After the talk there is mention of a soap dispenser not dispensing onto dark skin, which reminded me of this HP motion tracker video where the tracking only works for those with light skin.

Safiya Noble’s Algorithms of Oppression: Algorithms are biased too and can’t make better decisions for us! Google searches bring misogyny and racism to the forefront. I didn’t know that during Obama’s presidency in 2015, if you searched Google Maps for the N-word, it gave you the White House, but I do remember a lot of other appalling racist crap coming out during his presidency. It has been incredibly disheartening to witness (during the 2016 presidential election and after Trump) how easy it’s been for white supremacists to manipulate the internet to their advantage. Hate groups thrive in an echo chamber. I’m not surprised that googling “three black teenagers” produces a lot of mug shots while “three white teenagers” produces a bunch of phony stock images. Of course Google would try to ‘correct’ their algorithm to save face, but bumping up a white teenager in court for a hate crime still feels like an affront. I’m also not surprised that professional hairstyles for work are not only racialized but pertain to women. It’s unfortunate that economic bias has been talked about more prominently in regard to algorithms than racial or gender bias. It’s no coincidence that when searching for ‘(black/Asian/Latina) girls’ on the internet you get porn, which ultimately infantilizes women and eroticizes any/all women of color. It’s apparent that old media is re-emerging in new media.

The Social Photo/Media Metrics Responses

Nathan Jurgenson’s The Social Photo: Jurgenson describes social photography as the photos people take/share of their everyday lives (i.e. selfies, food pics, etc.) in order to communicate. It started with point-and-shoots but has since shifted to smartphone cameras… I swear I have witnessed ‘amateurs’ turning more and more professional on social media… After reading this, I now know the sentiment behind ‘pics or it didn’t happen’ goes back to Emile Zola in 1901. Ah, 1800s photography; it takes me back to high school when I took an old-school/dark-room/black-and-white film photography class. I’ve never heard of Hipstamatic, but obviously I know about Instagram. And yes, ‘vintage’ things kept making a comeback in the 2010s. Maybe that will continue in the 2020s… I’m kind of reeling from the fact that I’ve never thought twice about why we use the terms/metaphors ‘file’ or ‘folder’ on the computer… The idea that the gentrification of inner cities demonstrates middle/upper-middle-class white people’s desperate search for authenticity outside of the fakeness of the suburbs/Disney/McDonald’s troubles me.

  • Can you be nostalgic for a time you never lived during or a place you’ve never been?
  • Do you think the fascination with vintage looks is caused by nostalgia or a need for authenticity? Both? Neither?
  • Do you see yourself as taking photos of experiences rather than objects?

Jill Walker Rettberg’s What can’t we measure in a quantified world?: Walker Rettberg begins her talk by explaining her own activity in Fitbit graphs and then segues into describing the more recent phenomenon of being able to ‘measure’ yourself (via geolocation or residual data on phones/‘wearables’). She considers automation to be a human dream, since it seems anything humans do can now be measured by machines. For children, it’s tracking their schooling based on how much discipline they receive and/or the time they spend on homework (reminiscent of the SpotterEDU attendance app). For babies, it’s monitoring their milk intake, weight, and/or sleep (with an ankle bracelet?). I do firmly believe that humans desperately try to find patterns where there are none. Walker Rettberg debunks the idea of Dataism, the belief that data is always true, by pointing out that the steps her Fitbit counts are not definitive, because the device might be moving without her feet moving. This talk reminded me of my boyfriend, who downloaded a sleep app and has been showing me his graphs. Of course a dog and I are also sleeping in the same bed, so the data is probably finicky at best. Side note: I did like seeing Johanna Drucker’s subjective visualization, which acknowledged that gender is not rigidly binary.
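
To see why the counted ‘steps’ aren’t the same thing as actual steps, here is a toy sketch of a threshold-based step counter. The acceleration readings are invented and this is not how a Fitbit actually works; it just shows the flavor of the problem.

```python
# Toy step counter in the spirit of Walker Rettberg's point: the device only
# sees acceleration spikes, not feet. All readings below are made up.
def count_steps(accel, threshold=1.2):
    """Count upward crossings of a threshold in acceleration magnitude."""
    steps = 0
    for prev, cur in zip(accel, accel[1:]):
        if prev < threshold <= cur:
            steps += 1
    return steps

walking = [1.0, 1.5, 1.0, 1.6, 1.0, 1.4, 1.0]     # three real strides
waving_arm = [1.0, 1.3, 1.0, 1.5, 1.0, 1.3, 1.0]  # zero actual steps taken
print(count_steps(walking), count_steps(waving_arm))  # 3 3 -- the sensor can't tell
```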

  • Do you think social media has made you a post-industrious citizen?
  • Would you use a sex tracker which measures bed movements and noise? Why? I guess if you wanted to obfuscate it you could just play a bunch of porn…

What do Metrics Want? How Quantification Prescribes Social Interaction on FB: A future where quantity matters more than quality depresses me… Recently I have been a lot more passive than active on Facebook; I don’t write posts, like, or comment all that much. I think high numbers and the ‘more, more, MORE’ mentality overwhelm me (I should really be using the Facebook Demetricator). I mostly go online to see what others are doing; it’s kind of like living through others’ posts (which may not be super healthy, but I can’t help that I’m a hardcore escapist). I can’t argue that capitalism thrives on us wanting more. I’m not surprised Facebook will use metrics to manipulate people into clicking an ad. Clearly this article is slightly dated since it only talks about likes; I’m curious how Facebook weighs the emotional reactions. If I’m sad or angry about a post, are they less inclined to show me similar posts? I never thought the number of friends you have on Facebook could correlate with social capital, but yeah, these days it feels like there’s more of an emphasis on social networking than ever before. I don’t like the thought of being ‘homogenized’ by a system. I do think it’s funny that someone would have anxiety over liking something unpopular.

Interface Criticism/Tactical Media/Software Art Responses

Geert Lovink’s Sad by Design: There’s a techno-utopian self-help movement called Cosmism? I didn’t know what Douglas Rushkoff was talking about at the start of the podcast, so I found this article. I guess Stanford and Silicon Valley picked up on the Russian cosmists and came to the conclusion that machines can do things better than humans (a.k.a. humans are the problem and tech is the solution). According to Rushkoff, technologists and corporate capitalism enable each other with their own anti-human agendas (this harks back to Zuboff). Rushkoff is trying to fight back against these agendas; hence the podcast’s name, ‘Team Human’.

The guest on this podcast, Geert Lovink, wrote a book titled Sad by Design, which critiques ‘platform capitalism’. I think platform capitalism refers to large tech corporations (such as Google, Facebook, Amazon, Uber, etc.) creating platforms that profit off of young people being addicted to their online services. This becomes like an abusive relationship… Rushkoff brings up Snapchat’s streak feature (I vaguely know of it, but I don’t actually have a Snapchat). Selfies are also brought up as being expressive yet voiceless… I don’t quite understand what Lovink is saying regarding today’s social movements, because I feel as though both Black Lives Matter and the Me Too movement have gained a lot of traction thanks to social media, and both are organizing. Not to mention Greta Thunberg, a 17-year-old, is the face of climate change activism. I guess the need for these social movements does make one sad, though.

  • Do you think the smartphone will eventually become looked down upon as outdated/uncool?
  • Should men be designing for a female audience? What are the limitations?
  • We know the youth are being surveilled more than ever before, but are their voices also being drowned out?

Wendy Chun’s Programmed Visions: Chun uses a lot of metaphors in this chapter… I’m going to try my best to decipher one. To start, new media is described as the elephant from a parable involving blind men. Basically, six blind men each describe a part of an elephant without realizing that the whole thing is an elephant. Apparently those in the field of new media were doing this when talking about specific content/tech, and have since tried to look at the big picture, which is referred to as software. This seems incredibly hard to do, since software is defined by Chun as “a visibly invisible or invisibly visible essence.” The quote, “Its separation of interface from algorithm, of software from hardware — makes it a powerful metaphor for everything we believe is invisible yet generates visible effects,” is something I can envision because I have written code myself. In other words, the code isn’t immediately present, but the webpage is.

Matthew Fuller’s How to be a Geek: I feel as though the quote, “To be a geek is, in one way or another, to be over-enthused, over-informed, over excited, over-detailed,” fits me to a tee. I also identify with the statement, “Geeks may often mute themselves, try and pass as underwhelming. This is probably an adequate survival strategy in many circumstances.” I think I always present myself as nothing special and assume other people know as much as or more than me. Ever since I arrived at UIUC I’ve been a straight-A/Dean’s List student, and I haven’t really cared to talk about it all that much. No really, I have a serious problem where I can’t internalize my own accomplishments. Though I guess my imposter syndrome contrasts with the geeks Fuller is describing, because these geeks can’t seem to internalize any negative consequences caused either directly or indirectly by their accomplishments (ahem, Wiley).

Soren Pold’s New ways of hiding: towards metainterface realism: I’m facepalming so hard right now; I thought it was pronounced metain-terface instead of meta-interface… Lol! I’m kind of curious about Galloway’s “speculative realism,” Walter Benjamin’s concept of “tendency,” and Donna Haraway’s “feminist objectivity,” but too lazy to start deep diving into each. I’m definitely guilty of using an ad blocker (once again, Zuboff and Cambridge Analytica come to mind), and I think I’ve heard about Safebook at some point. Algorithms Allowed sounds really interesting; it is weird/hypocritical for the U.S. to have an embargo against certain countries while U.S.-based companies (like Facebook and Google) interact with them. Though I guess this is really being used to secretly surveil them… I love that James Bridle captured video only to remove people in motion and leave the homeless visible, because society often treats homeless people as invisible. “Grammar(s) of action” gets brought up a lot in this article, and I couldn’t figure out what it meant from context clues, so I searched the name Agre. As for Quickening, I guess I can empathize as a young woman living in America who witnessed the wave of abortion bans take place here. Of course women are going to go where they can decide their own health/circumstances, but it is insidious for there to be heavy surveillance/video evidence of them at this seemingly open border. Imagine if the U.S. states with these abortion bans started tracking women and incriminating them for crossing state lines for an abortion… I can’t imagine signing a terms-of-service contract for free coffee; then again, I do it all the time for “free” things on the internet.

BFA Show Idea

I loved doing a rotoscope animation last year (it was stressful and tedious at times, but I felt like the final result was worth it). So, I want to do a rotoscope animation with some voice-over about my car anxieties.

  • I’m 23 years old and I’ve been in two car accidents now: one as a pedestrian, the other as a driver
  • They both seemed to have happened at roughly the same time each year (before Valentine’s Day)
  • The money involved when it comes to car accidents is overwhelming; I often wonder if I settled for less as a pedestrian
  • The day immediately after an accident my senses are heightened (e.g. not jaywalking even if there are no cars around, or being a backseat driver) and my body aches (frayed nerves)
  • It’s official: my car is a total loss, and the airbag didn’t even deploy. I’m on the hunt for a new one

Rotoscope animation of the past, vector drawings turned into an animation, and the car accident documentary.