Project Rough Draft Update

The Final Project I settled on is a home vlog covering every day since quarantine began.

Here is a sample video from my project, unedited. I was considering shortening it, because the video was shot before we cleaned our house up. My family doesn’t like that, but I think it makes the project better to show the house progressively get cleaner as the days go by. I’ve actually used up my 128GB SD card, some footage is stored on my phone camera, and I still have some on my 8GB card… so I might have to cut some footage for time’s sake rather than for artistic reasons. Hopefully, after I get some work out of the way, I can finish this…

I’m thinking of including both videos and pictures.

Title: My Life in Quarantine

A representative image, 1080×1080 pixels (if you could only use one image, this is the one).

I think this is a good depiction of the whole quarantine situation. While most of my footage was taken at home, I think this image is the most representative.

Updated Project

For my Junior Project, I’ve decided to create a vlog of my everyday life since quarantine struck. I just recently made a video of myself making dinner with my family.

I plan to go into more detail with voiceovers on the videos, but editing is slowing me down. Unedited, the videos are fine as-is, just long. I plan for it to be a bit of mixed media: some could be TikToks, phone/Snapchat videos, or just camera videos. I was originally going to upload a video each day, but that hasn’t really happened, since some videos are actually pretty long and require editing, so we’ll have to see.

AI / Predictive Analytics / Recommendation Algorithms (11 Mar):

James Bridle – Something is wrong on the internet (Medium)

You know, this is actually a rabbit hole I found myself extremely interested in!! (I almost want to change my junior topic to talk more about it or something similar… but I’ll email you.) More and more, these “kids’ channels” on YouTube have grown shadier and shadier. They have so many subscribers, yet I fail to believe that MILLIONS of children or their parents would watch these shows (quite creepy), let alone know how to subscribe to those channels. I do believe bots or some sort of AI is involved in exploiting the YouTube algorithm. Watching these videos is honestly nightmare fuel: while there is nothing decidedly gross, disgusting, or inappropriate, it’s that feeling where something just isn’t right… I’m sure you all have heard of ElsaGate (if not, I suggest you look it up). YouTube needs to step up and take control of these videos. They shouldn’t be suggested; they shouldn’t even exist.

Rachel Metz – There’s a new obstacle to landing a job after college: Getting approved by AI (CNN)

It’s interesting; it kinda reminds me of the TED Talk we watched, where the computer may be biased against women because in the past only men have held the job. The difference between a human and a computer is that people trust computers blindly, while humans can adapt to unique situations. It’s even worse when applicants don’t even know what the algorithm is looking for (I’m not sure the employers even know). I understand that companies are growing and applicants are plentiful, but I feel it’s unfair to choose among them based on an algorithm.
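
To make that concrete, here is a toy sketch of my own (not from the article; the data and numbers are made up) showing how a screening model that simply learns from historical hiring outcomes reproduces the old bias:

```python
# Toy illustration: a "screening model" that only learns from who was
# hired before will score applicants by the old bias, not by merit.
from collections import Counter

# Hypothetical history: (gender, was_hired). Past hiring skewed male.
history = ([("m", True)] * 80 + [("m", False)] * 20 +
           [("f", True)] * 20 + [("f", False)] * 80)

# "Training": estimate P(hired | gender) from the past.
hired = Counter(g for g, h in history if h)
total = Counter(g for g, _ in history)
score = {g: hired[g] / total[g] for g in total}

# Two equally qualified applicants get very different scores,
# purely because of who was hired before.
print(score)  # {'m': 0.8, 'f': 0.2}
```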

Jia Tolentino – How TikTok Holds our Attention (New Yorker) (read or listen)

I remember when I started liking a lot of the “relationship” TikToks, the app would continue to show me more of what I wanted. It’s so interesting to me how these creators can all of a sudden get so big, so quick, which I guess shouldn’t be surprising given the age we live in. The feed is on a constant timer: even when I watched the example TikTok, if I stayed on the screen for more than five seconds it would refresh with new TikToks to click and enjoy, and it would keep doing this the longer I stayed. I got TikTok because I thought it would be like Vine (and in some ways it is), though most Vine stars have moved on to different things now.
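
A tiny sketch of my own (purely hypothetical; not TikTok’s actual system) of how that feedback could work: every like boosts a category’s weight, so a single interest quickly takes over the feed.

```python
# Hypothetical personalization loop: engagement raises a category's
# chance of being shown again, so one interest snowballs.
import random

random.seed(0)
weights = {"relationship": 1.0, "comedy": 1.0, "dance": 1.0}

for _ in range(30):
    categories = list(weights)
    shown = random.choices(categories,
                           weights=[weights[c] for c in categories])[0]
    if shown == "relationship":   # the only category this user likes
        weights[shown] *= 1.5     # a like boosts future exposure

print(weights)  # "relationship" has grown far beyond the others
```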

Eric Meyer – Inadvertent Algorithmic Cruelty (article)

Facebook Memories are pretty hard to look at sometimes. Life happens quickly, and the code doesn’t really know any better. I know from some of my friends that it can also affect trans people who no longer identify with their past selves, or resurface pictures of ex-partners or past friendships that are no longer a part of your life.

Rachel Thomas – The problem with metrics is a big problem for AI (article)

The problem with metrics is that there will always be a factor that hasn’t been considered. For a lot of these AIs, the information is taken at face value. It also feels like whatever people click on encourages more people to click on it, even if it’s badly written or full of false information; the article’s example was Russia Today’s take on the Mueller report, which was heavily recommended and garnered a lot of attention and clicks. It’s almost like that phrase where the rich get richer. The clicks get clickier? Metrics that look for something in particular also promote cheating: for example, the teachers who don’t cheat may be penalized by the system while those who game it get ahead.
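
Here is a toy simulation of my own (not from the article) of that “clicks get clickier” loop: if recommendations surface articles in proportion to their past clicks, a small early lead snowballs regardless of quality.

```python
# Preferential attachment in miniature: each new reader clicks an
# article with probability proportional to its existing click count.
import random

random.seed(1)
clicks = [1, 1, 1, 1, 1]  # five articles start out equal

for _ in range(10_000):
    chosen = random.choices(range(len(clicks)), weights=clicks)[0]
    clicks[chosen] += 1

# A couple of articles end up with most of the clicks, decided by
# early luck rather than accuracy or quality.
print(clicks)
```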

Big Data / Algorithms / Algorithmic Transparency (4 Mar):

Frank Pasquale – The Black Box Society, chapter 1 (pp. 1-11)

  • I really enjoyed the metaphor used in the very beginning: the light is the powerful source he has to work around, and that kind of power is compared to big data. The fact that nobody knows much about big data is exactly why it is so powerful. The chapter argues that this “knowledge problem” exists for a reason, and asks “to whose benefit?” With pharmaceuticals, he mentions that companies are allowed to hide the dangers of a drug. Everything we do online is recorded: our credit, our phone location. So where does that information go? For how long? Held by whom?
  • How can we get that control back? Do we need to keep relying on whistleblowers? Will companies ever do anything when they know they have us eating out of the palm of their hand?

Cathy O’Neil – The era of blind faith in big data must end (TED Talk, 13m)

  • Algorithm = data + a definition of success. She mentions how teachers, along with their students’ scores, were fed into an algorithm used to shame them, yet nobody outside could access that algorithm. How is that fair, when the algorithm could be inherently biased or lack the full context? I also thought about her example that a hiring algorithm trained on Fox News’s past successes would filter out women, who have not been the face of success there, and most likely people of color as well; it’s the same kind of systemic bias we see today in policing. She suggests auditing algorithms by checking a) data integrity, b) the definition of success, c) accuracy, and d) long-term effects (a minimal version of the accuracy check is sketched after these notes).
  • How can we allow the public to gain access to these algorithms?
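
To make the accuracy check concrete, here is a minimal sketch of my own (the data and group names are hypothetical, not from the talk) of measuring accuracy per group instead of overall, since one overall number can hide a model that fails one group far more often:

```python
# Per-group accuracy: one small piece of an algorithm audit.
def per_group_accuracy(records):
    """records: list of (group, predicted, actual) tuples."""
    stats = {}
    for group, predicted, actual in records:
        right, total = stats.get(group, (0, 0))
        stats[group] = (right + (predicted == actual), total + 1)
    return {g: right / total for g, (right, total) in stats.items()}

# Hypothetical teacher-rating predictions from two schools.
records = [
    ("school_A", "good", "good"), ("school_A", "bad", "bad"),
    ("school_B", "bad", "good"), ("school_B", "good", "good"),
]
# Overall accuracy is 75%, but school_B is only at 50%.
print(per_group_accuracy(records))  # {'school_A': 1.0, 'school_B': 0.5}
```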

Virginia Eubanks – Automating Inequality (talk, 45m)

  • It’s interesting how, throughout the years, even as we change the algorithm, we still end up with racial bias. First it was containment (1819), then investigation (1873), and then digital surveillance -> prediction (1973), which we still use today to keep POC, specifically Black people, in their place. Even with good intentions, these systems can still have bad outcomes.
  • Have algorithms ever prevented white men from doing anything?

Janet Vertesi – My Experiment Opting Out of Big Data… (Time, short article)

  • This kind of experiment is really interesting for seeing just how long you can keep a secret from big data: the internet, other people’s comments, pictures, credit cards, cookies, phone tracking, messages, etc. What I thought was a really powerful comment was that “No one should have to act like a criminal just to have some privacy from marketers and tech giants”. We shouldn’t have to hand over our personal information just to avoid being treated like criminals.
  • Will big data just get even bigger? Is there no way to minimize it?

Walliams and Lucas – The Computer Says No (comedy skit, 2m)

  • The reason this can even become a comedy skit is that this kind of thing is so relatable and happens so often. We don’t question the system even when the system is clearly wrong; instead of fact-checking it, we’re more likely to just (as Cathy O’Neil put it) put our blind faith in it. Yes, this is just a sketch, but life imitates art, and there is a real accuracy to what is being portrayed.
  • How can we convince the general public to be more cautious in their trust?

Ideas

I was thinking of doing something related to our relationship with AI: things like Nintendogs or visual novels, or our reaction to Boston Dynamics kicking their robots. Things that make us feel things for inanimate, unfeeling AI.

Tech and Race

Algorithms of Oppression: To be quite honest, I’m not surprised at all. I remember hearing of her work before, I believe from your Interactions I class, and it’s honestly quite moving how a search engine we would assume to be neutral offers up so many racist and misogynistic suggestions: how “three black teens” returns criminal mugshots while “three white teens” returns innocent stock photos, or what women cannot/shouldn’t/should/need to do, or the hypersexualization of Asian/Latina girls. Google is merely a reflection of our own society and what we search for. Are we really that surprised, when we live in a country that voted a racist misogynist into the presidency?

How can we change a search engine to become neutral when the data is dominated by the users? Is it Google’s fault, or ours?

Race After Technology: While Jim Crow is gone, the New Jim Code lives on through racist robots. It’s interesting that while this technology was never born racist, it became racist because it lacked the context of the systemic racism already in place. This also made me think of the HP(?) computer that could not identify a Black face, most likely because the people who made it tested it on themselves and did not consider POC (which is why a diverse work environment is so important). The Stop the Cradle to Prison Algorithm campaign reminded me a lot of the other topic we addressed, where colleges also tracked, collected, and shared students’ GPS data to see which students were at risk in class. I found the white-collar early warning system hilarious, because this is the type of crime we don’t really see reported on, complete with facial recognition of the likely perpetrator (the average white male).

In what ways can we help Black youth succeed? How can police learn empathy and build a safer relationship with these kids?

Laboring Infrastructures: I remember Lisa Nakamura! She introduces the idea of empathy in VR, which I am still fascinated by. We are so accustomed to our own ways of life that it might be hard to be put in the shoes of someone entirely different, and many people don’t care about what happens to other people as long as it’s not happening to them. Having VR give the public a chance to experience Trayvon Martin’s last moments, or even just the life of an average Black woman or a transgender wheelchair user, still offers that unique experience of being able to feel what they feel.

I feel, however, that this idea might have to be disguised in some way, because the people whose minds we genuinely want to change will not give it a shot because of the title.

Social Interaction, Social Photography, and Social Media Metrics (12 Feb):

Nathan Jurgenson – The Social Photo (book, pp. 1-15)

I remember when I first got Instagram, I’d add all my friends and cycle through all the different filters to see which one looked best. I can definitely attest to these old, vintage, nostalgic styles. I think it’s interesting because people my age, born in the late ’90s, grew up with older camera technology that took less accurate, pixelated images, yet now that we’re older we yearn to go back to those childhood days and put these filters on as if we were transported back.

Jill Walker Rettberg – “What can’t we measure in a quantified world?” (talk, 20m)

It’s crazy that in such a digitized world, almost everything we do can be tracked or measured. This idea of life-logging or self-quantifying seems both immensely interesting and scary at the same time. We were already talking about third parties taking our data; how would we know these apps aren’t reusing the information we give them? That’s another topic, though. I was surprised that someone even created an app to measure sex, something people consider so intimate and emotional; data and logic seem like the complete opposite and, frankly, a mood killer. Speaking of tracking, I downloaded an AI texting app that keeps track of the texts I send over my phone. It counts the number of emojis, question marks, and words used in order to guess what kind of person you’re talking to, and it will tell you to check up on someone if they’re acting weird. While I have always thought that not everything could be measured, I might have just been convinced otherwise. Sure, some data is unreliable, but we learn from it, fix the flaws, and continue measuring.
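
I can only guess at that app’s internals, but the counting itself is simple. Here is a rough sketch (entirely my own guess at the kind of feature-counting it might do):

```python
# Count simple "texting style" features across messages; an app could
# compare these counts over time to flag when someone's style changes.
import unicodedata

def message_stats(messages):
    stats = {"words": 0, "question_marks": 0, "emoji": 0}
    for msg in messages:
        stats["words"] += len(msg.split())
        stats["question_marks"] += msg.count("?")
        # crude emoji check: Unicode category "So" (symbol, other)
        stats["emoji"] += sum(unicodedata.category(ch) == "So" for ch in msg)
    return stats

print(message_stats(["are you ok??", "yeah 🙂", "what happened?"]))
# {'words': 7, 'question_marks': 3, 'emoji': 1}
```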

Ben Grosser – What do Metrics Want? How Quantification Prescribes Social Interaction on FB (article)

I won’t lie: there are times I’ve been on Reddit when, if something has a ton of upvotes, I will upvote it without thinking too much about it, and vice versa for posts with a ton of downvotes. While I would normally go with my gut, when I see those numbers my instinctual reaction is to follow the majority. I also felt this way with mutual friends on Facebook: if I didn’t know who someone was, I might accept them solely based on the fact that we had 300 mutual friends (I don’t do that now, for my own safety, and will ask others about them before I accept). Even outside of social media, we are judged numerically on our self-worth, through grades, net worth, and even age! If somebody is younger than me, I normally feel more confident talking to them, since I feel a bit wiser. I wonder if our idea of good scores in academia correlates with our desire for high scores on social media as well. The more metrics we give of ourselves, the more they can be used against us.

Interface Criticism / Tactical Media / Software Art (5 Feb):

Wendy Chun – Programmed Visions (book, pp. 1-2, and optionally pp. 3-10)

I found the elephant metaphor really interesting: although we may each be given a part of the same thing, we can still come away with different ideas of what it is. One thing I didn’t like about the reading was the wording. It’s unnecessarily wordy and repeats a lot of words to sound sophisticated, which just makes it more confusing to follow. Even so, I do know that it’s focused on software, the invisible/visible, the known/unknown, and metaphors.


Matthew Fuller – How to be a Geek (book, pp. 12-14, and optionally pp. 63-71)

I don’t usually reflect on the definition of “geek”. When I think of the word, I think smart, awkward, computers, etc. The way Fuller describes a geek puts a much more complex spin on it. He also talks about the complexities of software and how you can’t really place it into one category, because there is too much that goes into it. My favorite quote was that “contemporary technology is not simply an extension of a man–a purely mechanical effect”, suggesting that technology is just another way we humans operate.

Geert Lovink – Sad by Design (podcast w/ Douglas Rushkoff, 60m)

It’s funny: when my brain is not being stimulated, I feel bored, so I start to scroll on social media. There’s nothing really that interesting there, just temporary entertainment, and when I finish scrolling, I’m sad that I spent all that time on something that wasted it. Seeing how much more wonderfully and successfully other people are doing makes me sad, and it’s ironic that seeing other people just as sad as me makes me happy, knowing I’m not alone. Even online celebrities are seen breaking down, despite all their fans and fame, because people are always expecting more from them. Something as small as a Snapstreak can make you upset if you forget to send a snap back one day, which is a way the app strategically gets users to come back. These apps make you unhappy so you try to do better and become happy, only to be unhappy again. It’s this weird emotional manipulation that keeps you trapped in the design of social media.

Søren Pold – New ways of hiding: towards metainterface realism (article)

Similar to what we talked about last week, this also touches on surveillance and profiling through data, and on how the average person doesn’t even understand what is going on or how they are being tracked. I feel that it is one thing to tell the public what is happening to them and another to completely hide what is going on. People don’t read the terms and conditions, and if you don’t accept them, you can’t use the product. People are so willing to give things up for convenience; this is pure manipulation through capitalism. I found your Safebook and the autonomous trap humorous in the way that, even without words or explanation, we know how they operate. We understand the language of Facebook and the language of the road so well that we don’t even need instructions to interact with them.

Surveillance / Privacy / Resistance (29 Jan):

Shoshana Zuboff – The Age of Surveillance Capitalism (video documentary, 50m)

I’m familiar with Zuboff’s work; I’ve actually read a lot of her writing on the Information Panopticon and the discovery of behavioral surplus. Big companies explain that they’re using the data you give them to improve their services, which is true, but they also use that data to analyze and profile their users, recognize their behaviors, and essentially exploit them. We are so attached to the idea of convenience that we feel giving up this data is worth the exchange.

Helen Nissenbaum – Mapping Interventions – Digital Democracies Conference (talk, 30m)

This felt a lot more positive than the rest, because she actually poses solutions to what seems like a never-ending uphill battle over our privacy. Tools like TrackMeNot and Cryptogram are both really useful for protecting your privacy and keeping your data out of websites’ hands, and AdNauseam also seemed pretty interesting. I use Adblock Plus, which is similar, except that I leave it off on certain sites like YouTube (in an effort to support the YouTubers who create this content for me).
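
As I understand it, TrackMeNot works by obfuscation rather than encryption: it buries your real searches in a stream of machine-generated decoy queries so a profiler can’t tell which ones are yours. A toy version of that idea (my own sketch, not the extension’s actual code):

```python
# Obfuscation by decoys: mix a real query into random cover traffic.
import random

DECOYS = ["weather tomorrow", "pasta recipe", "movie times",
          "local news", "how to knit"]

def obfuscated_queries(real_query, n_decoys=4):
    batch = random.sample(DECOYS, n_decoys) + [real_query]
    random.shuffle(batch)  # the real query hides somewhere in the batch
    return batch

print(obfuscated_queries("flu symptoms"))
```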

Carole Cadwalladr – ’I made Steve Bannon’s psychological warfare tool’ (Guardian)

I’ve heard of the whole Cambridge Analytica thing, but I never really took a deeper look into it. The idea of harvesting data from MILLIONS of profiles in an attempt to target political ads and rig an election is horrifying and dystopian. To think that the ads we receive are targeted based on everything we do online and offline… The surveillance being used on netizens is astonishingly invasive, and what makes it worse is that it’s invisible.

Stuart A Thompson and Charlie Warzel – One Nation, Tracked (NY Times)

A quote that really captivated me was the idea that “describing location data as anonymous is a completely false claim”. Who else would go to your address and your workplace at the exact times that you do? We are constantly being tracked, and although users may have “consented” to this, how likely is it that the privacy policy they read made it clear this was happening? The article also makes a really important point I hadn’t thought about: victims of abuse and people in the LGBT community. The idea that this information could be bought and sold makes it even scarier for those who are potential targets.
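
The re-identification the article describes takes almost no cleverness. A toy sketch of my own (with made-up data): given “anonymous” pings of (device, place), the home/work pair alone can single out one device.

```python
# "Anonymous" location pings: no names, just device IDs and places.
pings = [
    ("device_1", "123 Oak St"), ("device_1", "Acme Corp HQ"),
    ("device_2", "123 Oak St"), ("device_2", "City Library"),
    ("device_3", "456 Elm St"), ("device_3", "Acme Corp HQ"),
]

home, work = "123 Oak St", "Acme Corp HQ"

places = {}
for device, place in pings:
    places.setdefault(device, set()).add(place)

# Only one device visits both your home and your workplace: you.
matches = [d for d, p in places.items() if {home, work} <= p]
print(matches)  # ['device_1']
```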

Drew Harwell – Colleges are turning phones into surveillance machines (Washington Post)

I really hate this idea of attendance tracking. If you’re in class, you’re in class; why should phone surveillance be involved? Even with iClickers, I remember once leaving mine in my dorm. When I told the professor about the situation, they responded that I would be marked absent if I didn’t have the iClicker with me. It baffled me that, despite physically being in the classroom and participating, I would be marked down for not carrying the device that indicates whether or not I was present. A quote I loved, from Erin Rose Glass, a librarian at UCSD: “We’re reinforcing this sense of powerlessness … when we could be asking harder questions, like: Why are we creating institutions where students don’t want to show up?” We are growing into adults. We should not be monitored, and we should not grow up with the idea that being surveilled is a common practice. We have our own independent, private lives.

Jenny Davis – A clear case for resisting student tracking (Cyborgology)

I had a vague idea that students on the school Wi-Fi were being tracked, but only on the websites we view; I never thought it went much further than that, though clearly I hadn’t given it much thought. It’s interesting to think that students could be monitored for “attendance” and “mental health issues”, as if schools really cared about the mental well-being of students. If that’s what they’re concerned about, maybe they should improve their health programs rather than find new ways to control the students they hope to teach. The aggregate data could also give a distorted picture of what a student is actually like: as the article mentions, if a student has a full-time job and misses class, or if the movements of a minority student differ from what the system considers “average”, the system can easily misjudge them.

Question: Convincing a group of people to take action on privacy is already hard enough. How would we convince the masses to adopt privacy applications that would let people take better control of their data?