Frank Pasquale’s The Black Box Society: I get paranoid thinking about how government and financial institutions want to remain secretive while still having the ability to track and surveil the general public. As an American, I know all too well about our government lying, or not telling the whole truth, about our past wars, and I’m sure it would prefer the public not know anything at all. I also know that Super PACs (often funded by corporations and their executives) can spend unlimited amounts on ads for or against political campaigns, which must have a huge impact on the targeting algorithms that decide who sees them. Honestly, I can’t hear about ‘nondisclosure agreements’ without instantly thinking of Mike Bloomberg and Donald Trump. Pasquale uses the term ‘black box’ to refer to how we are constantly being recorded, but in a secretive manner by secretive entities. What is described in this book is very reminiscent of Cambridge Analytica and surveillance capitalism (a term coined by Zuboff). I think this short clip from The Simpsons Movie (which came out before the Snowden revelations) humorously depicts the ‘black box’ society we live in. Although I knew about obfuscation, I didn’t know about real v. legal secrecy: real secrecy is actually hiding something (locking a door, setting a password), while legal secrecy is an obligation to keep sensitive information confidential (e.g., not giving out Social Security numbers). I guess the Equifax data breach would be a great example of how legal secrecy can be violated.
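Just to make the ‘real secrecy’ half of that distinction concrete for myself, here’s a tiny Python sketch (my own toy example, not anything from Pasquale): storing a password only as a salted hash is real secrecy, because the original value is technically locked away, whereas legal secrecy would just be a rule saying nobody is allowed to share the value.

```python
import hashlib
import hmac
import os

def store_password(password: str) -> tuple[bytes, bytes]:
    """'Real' secrecy: keep only a salted hash, so the original can't be read back."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def check_password(attempt: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", attempt.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)

# 'Legal' secrecy, by contrast, would just be a policy: "don't share this value."
salt, digest = store_password("correct horse battery staple")
print(check_password("correct horse battery staple", salt, digest))  # True
print(check_password("wrong guess", salt, digest))                   # False
```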
- Is transparency the answer to uncovering government/financial/corporate institutions’ secrets?
- If so, how might we obtain more transparency?
- If not, is there another way? Should we just live with the unknown secrets?
Cathy O’Neil’s The era of blind faith in big data must end: O’Neil first asks, “What if algorithms are wrong?” and then follows up with, “To make an algorithm you need data and a definition of success.” According to O’Neil, everyone uses algorithms, with or without code. One example of an algorithm used without code is the process of making a meal. Whoever is cooking has to factor in the time, resources, and energy required (the data). The definition of success could be eating enough vegetables; but if the children were in charge, eating enough sugar could be their definition of success (which probably isn’t good for them in the long run). Ultimately, algorithms are opinions, not objective science. It’s sad that teachers lost their jobs over inconsistent data; it seems hard enough just to be a grade school teacher in the U.S. …Codifying sexism, racism, and homophobia is definitely a major concern of mine. I think the game Portal is a great example of how algorithms can go wrong: the main robot in Portal, GLaDOS, practically embodies an algorithm running tests (in the name of science) that ultimately lead to people’s deaths.
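To convince myself of the “data plus a definition of success” framing, I sketched it in a few lines of Python (the meals and scores are completely made up): the data stays the same, and the answer changes entirely depending on whose definition of success gets encoded.

```python
# Same data, two different "definitions of success" -> two different dinners.
meals = [
    {"name": "stir-fry",  "vegetables": 3, "sugar": 0, "minutes": 30},
    {"name": "spaghetti", "vegetables": 1, "sugar": 1, "minutes": 25},
    {"name": "pancakes",  "vegetables": 0, "sugar": 3, "minutes": 20},
]

def plan_dinner(meals, success):
    """Pick the meal that scores highest under whatever 'success' we chose to encode."""
    return max(meals, key=success)

parent_success = lambda meal: meal["vegetables"]  # the cook's opinion: vegetables count
kid_success = lambda meal: meal["sugar"]          # the kids' opinion: sugar counts

print(plan_dinner(meals, parent_success)["name"])  # stir-fry
print(plan_dinner(meals, kid_success)["name"])     # pancakes
```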
- What are other examples of algorithms that don’t operate on code? Could getting a degree/diploma be considered another example? (With the data being the classes/hours it takes and success being graduating.)
- What kind of opinions do you think social media algorithms have?
Virginia Eubanks’ Automating Inequality: Eubanks starts the lecture by stating that we are building a digital version of an 1820s poorhouse. Apparently there was an economic depression in the 1820s, and of course the concern was about people being ‘dependent’ on the government, not about people living in poverty. Unfortunately, two centuries later, that same sentiment still seems to be around. I didn’t know ‘county farm’ is coded language for where a poorhouse once stood… Anyway, Eubanks gives a couple of modern-day examples of this phenomenon. In Indiana, applying for food stamps (a.k.a. SNAP) was turned into a digital process, which ended up leaving a lot of eligible people without benefits. Occasionally those who went through the process of signing up would be notified of an error, but with no specifics as to what the error was. The story of a woman with the last name Young was heartbreaking! She couldn’t go through the welfare process while in the hospital with cancer, so of course she lost her coverage and died before her case reached a verdict. According to Eubanks, the middle class can pay for services privately, so there is not as much data on them. A system that only has data on the lower class is obviously going to conflate parents living in poverty with poor parenting. For child protective services, this lack of data on middle-class families can result in false negatives (i.e., not seeing harm when there may be some), while for lower-class families it can result in false positives (i.e., seeing harm when there is none). Not to mention that there is racial bias when it comes to child welfare; this bias is most apparent when communities call the authorities on Black/biracial families (what Ruha Benjamin was referring to when talking about the Citizen app and ‘BBQ Becky’). Eubanks says that California’s VI-SPDAT collects data on the homeless to prioritize housing opportunities, but it’s possible for the LAPD to abuse that data. Also, a man who goes by Uncle Gary, living on Skid Row, filled out the VI-SPDAT multiple times but was not scored as vulnerable enough for housing. It just goes to show that status-quo algorithms don’t make welfare systems any better. I think this lecture had a stronger connection to the Computer Says No skit, since healthcare is brought up; I can’t imagine a 5-year-old needing a hip replacement, tonsils sound more like it. I swear there are too many customer service surveys these days…
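The false-positive/false-negative asymmetry clicked for me once I wrote it out as a toy example (this is my own hypothetical scoring rule, not Eubanks’ description of any county’s actual model): if the score can only “see” public-system records, ordinary benefit use looks like risk, and families who pay privately look like no risk at all.

```python
# Hypothetical risk score that only counts records from public systems.
families = [
    {"name": "A", "uses_public_benefits": True,  "actual_harm": False},  # poor, no harm
    {"name": "B", "uses_public_benefits": True,  "actual_harm": True},   # poor, harm
    {"name": "C", "uses_public_benefits": False, "actual_harm": True},   # middle class, harm
    {"name": "D", "uses_public_benefits": False, "actual_harm": False},  # middle class, no harm
]

def risk_score(family):
    # Benefit use generates public records, so using benefits itself looks like "risk".
    return 1.0 if family["uses_public_benefits"] else 0.0

for f in families:
    flagged = risk_score(f) >= 0.5
    if flagged and not f["actual_harm"]:
        print(f"Family {f['name']}: false positive (flagged, but no harm)")
    elif not flagged and f["actual_harm"]:
        print(f"Family {f['name']}: false negative (harm missed entirely)")
    else:
        print(f"Family {f['name']}: score happens to match reality")
```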
Janet Vertesi’s My Experiment Opting Out of Big Data… Opting out of personal data collection isn’t easy when it makes you look like a rude family member, an inconsiderate friend, and/or a bad citizen. Vertesi talks about her experience trying to keep big data from detecting her pregnancy… I can’t imagine downloading something mostly associated with the dark web (Tor) just to visit BabyCenter.com in private, or setting up a separate Amazon account and buying gift cards with cash to purchase things online. I feel bad for the uncle who PMed her his congratulations, but yeah, I see Facebook as trying to calm the “Big Brother is watching you” anxiety by calling it a ‘private’ message. I swear I’ve seen ads in my Facebook Messenger too! I have a friend who’s very protective of their privacy, so I only message them through Signal. They also use DuckDuckGo and Firefox. I should probably get into that habit, but I don’t think I can give up my Gmail and Google Drive.
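Out of curiosity about what “going through Tor” actually does, here’s a small sketch (assuming the Tor service is running locally on its default port 9050, and that the requests library is installed with SOCKS support): the same request goes out directly and then through the Tor proxy, and the site on the other end should see a different address each time.

```python
# Compare the address the outside world sees with and without Tor's local SOCKS proxy.
# Assumes Tor is running on its default port 9050 and `pip install "requests[socks]"`.
# api.ipify.org is just a public "what's my IP" service used for illustration.
import requests

TOR_PROXY = {
    "http": "socks5h://127.0.0.1:9050",   # socks5h: DNS lookups also go through Tor
    "https": "socks5h://127.0.0.1:9050",
}

direct_ip = requests.get("https://api.ipify.org", timeout=10).text
tor_ip = requests.get("https://api.ipify.org", proxies=TOR_PROXY, timeout=30).text

print("Without Tor:", direct_ip)
print("Through Tor:", tor_ip)  # should be a Tor exit node's address, not yours
```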
- Do you think it’s even possible to live completely ‘off the grid’ today? Why or why not?
- How many conveniences/relationships would you have to give up in order to do so?