Big Data, Algorithms, Algorithmic Transparency

Pasquale – Black Box Society

Pasquale describes “the black box” – a metaphor that cuts two ways: ideally, the box obscures your data and provides privacy, but more often it is the mechanism by which companies or the government conceal how much of your data they actually have. The secrecy really protects the interests of those who hold user data rather than the users themselves. Releasing that data is often not even on the table, because the non-user parties argue it is hidden for our own good – to protect privacy, enforce peace, etc. Honestly, I thought this article built off of the other sources this week: we’ve become reliant on, and complacent about, what we perceive to be objective gatekeeping of our data, but in reality we have no clue who has access to our information. Privacy becomes a myth, and only now are individuals beginning to sound the alarm.

O’Neil – TED Talk

O’Neil is concerned about the broader social repercussions of relying on algorithms that are trained solely on archival data, without controlling for the systemic oppression already baked into that data. Just because an algorithm accurately predicts past outcomes doesn’t mean it is correctly accounting for human bias. A lot of this comes from company execs being too lazy (or cheap) to spend the money to intentionally adjust their data to be more equitable, and from the public’s comfortable assumption that math is objective. If my years in high school taught me anything, it’s that math is a bad subject for bad people who have bad taste in literally everything. The same (roughly) goes for data scientists. They need to be held accountable for their actions.
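
To make O’Neil’s point concrete for myself, here’s a toy Python sketch (entirely made-up numbers and a deliberately naive “model” of my own, not anything from the talk): if the historical hiring decisions held group B to a higher bar than group A, a model fit to those decisions learns that bar right back, and it still looks perfectly “accurate” against the old labels.

```python
# Toy sketch with made-up numbers: a model fit to biased historical hiring
# decisions reproduces that bias while scoring perfectly against the old labels.
import random

random.seed(0)

def past_hiring_decision(skill, group):
    # Historical (biased) rule: group "B" was held to a higher skill bar than "A".
    bar = 0.5 if group == "A" else 0.8
    return skill > bar

# Archival training data: skill is distributed identically across both groups.
applicants = [(random.random(), random.choice(["A", "B"])) for _ in range(10_000)]
labels = [past_hiring_decision(skill, group) for skill, group in applicants]

def fit_thresholds(applicants, labels):
    # Naive "training": memorize the lowest skill that was ever hired in each group.
    thresholds = {}
    for target in ("A", "B"):
        hired_skills = [s for (s, g), hired in zip(applicants, labels) if g == target and hired]
        thresholds[target] = min(hired_skills)
    return thresholds

model = fit_thresholds(applicants, labels)
print(model)  # roughly {'A': 0.5, 'B': 0.8} -- the historical bias is now the "objective" rule

# Accuracy against the archival labels is essentially perfect,
# which says nothing about whether the underlying decisions were fair.
predictions = [skill >= model[group] for skill, group in applicants]
accuracy = sum(p == y for p, y in zip(predictions, labels)) / len(labels)
print(f"accuracy vs. historical labels: {accuracy:.3f}")
```

The model’s near-perfect “accuracy” here just means it faithfully copies the old bias, which is exactly what O’Neil is worried about when success is defined by the past.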

Eubanks – Automating Inequality

The cycle of poverty in the US was already convoluted and hard to break (which is what makes it a cycle). Cuts to welfare programs had already eroded the poor’s chances of living a decent life, but the arrival of the digital age has made certain that these inequalities persist and worsen. Eubanks brings up the example of Medicaid and Medicare early on to illustrate this: the human caseworkers who used to look through and vet these decisions were replaced by software that wasn’t properly overseen, leading to millions getting their coverage cut and an inordinate number of preventable deaths. This pattern of essentially screwing over the poor to save on administrative costs has a dangerous history. I think it’s especially relevant now in the context of a national conversation, since we have not one but two presidential candidates promising progressive reform that specifically targets this. The quieter we are on this issue, the less likely it is to get resolved.

Vertesi – TIME article

Vertesi talks about how hard it is to really have a private life on the internet, because even interactions outside of your direct control are monitored. There’s the example of the girl whose dad found out she was pregnant only after Target did, and Vertesi’s own account of being unable to conceal her pregnancy from advertisers because of an innocent email from her uncle. Private isn’t private as far as advertisers are concerned, because we are the commodity being sold. Personally, I find it frustrating to explain this concept to older people who don’t work in tech, because my generation grew up with a camera pointed at us. Overall it’s just irritating how little regard is given to our personal information, and how the people with the power to regulate it (again, often older Senators/Representatives) don’t even see a big problem with it in the first place.

Williams and Lucas

This one was hilarious; I can actually remember scenarios where I’ve been stuck explaining something to an administrator while they keep referring to some sort of mix-up in the computer system. It’s funny in a comedic context, but a little concerning when you realize the real-world implications of a lazy person at the desk. There isn’t too much to say here; it’s just deeply relatable.
