Big Data / Algorithms / Algorithmic Transparency

Janet Vertesi – My Experiment Opting Out of Big Data…  (Time, short article)

I think it’s interesting to see how Vertesi was actually at a financial disadvantage when hiding her pregnancy. Loyalty reward cards were off the table when she bought supplies. She couldn’t set up a baby shower gift list. Her husband even had to buy multiple gift cards to purchase a stroller without leaving a trace of their card information. In this way, living with privacy carries institutional consequences for a citizen. Vertesi even points out how she felt like her reputation as a person, friend, and family member was being compromised. This adds to the pressure the US puts on its citizens. Because of the social media element that has surfaced in recent decades, recording your identity online is socially expected of you. Social media platforms set us up to self-document our lives for big data collectors, whether we know it or not. It’s set up to feel like a personal choice (and in a way it is, because social media is communicative and collaborative), but it’s only when we stop conforming that we face repercussions.

This whole idea makes me think about what it would look like to do this experiment long term. Is having privacy bad? Why is privacy so interconnected with secrecy? What about the people who don’t have access to online platforms, credit cards, etc.? What about the people for whom these ways of documenting aren’t even accessible? How does this play into institutional aid and acknowledgment of these communities?

Virginia Eubanks – Automating Inequality (talk, 45m)

Why base the future on the past? And how do we integrate basic human rights into our society? How are we investigating citizens, and how are we categorizing/organizing them into a paper society? How is a person’s integrity compromised through the loss of their privacy?

This keynote made me think about the effects of a data-collecting society. It also raises the question of how I personally am being documented and the way that documentation changes my life. How would this compare for someone with a different identity? And how comfortable am I with this?

Walliams and Lucas – The Computer Says No (comedy skit, 2m)

This reminds me of last week’s articles, which brought up issues of technology’s reliability. I think this skit does a great job of pointing out how easily people conform to what technology says. There’s not a hint of doubt in the receptionist’s mind that the computer data might be wrong. It’s gotten to the point where the computer has higher credibility than the patients, who are the most direct source of information. This builds distrust from human to human as trust in tech rises. Although technology is made by humans, the visual element of the machinery and the manual input required from the user to get a response make its answer feel somehow otherworldly (elevated?). It’s suddenly disconnected from humans and their inaccuracy. People can constantly be wrong, so it’s really easy to rely on machinery to tell the truth. But how is this working against us as a collective society? If we can’t even trust the direct source because it’s human, how are we disconnecting ourselves from having loyalty, trust, and commitment to one another? On a larger scale, how is big data replacing our voices in places as important as elections, the census, and social/government profiling?

Cathy O’Neil – The era of blind faith in big data must end (TED Talk, 13m)

Cathy O’Neil develops a lot of the questions I had above in her TED talk. Algorithms are biased in how they define success; each one has its own version of what success is. The people who create these systems are biased. It’s all just a mindset, but adding numbers and equations makes it feel like a quantifiable value. O’Neil brings up how numbers relate to math and how math is intimidating, which ultimately discourages us from questioning algorithms. This then sets up a blind faith in big data out of fear of being wrong.
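To make this concrete for myself, here’s a minimal sketch (my own hypothetical example, not anything from O’Neil’s talk) of how a scoring algorithm is really just an opinion about success written as numbers. The proxies and weights below are invented; the point is that the weights are choices, not facts.

```python
# A hypothetical "teacher effectiveness" score. The weights are not
# objective facts -- they are the designer's opinion about what
# success means, encoded as numbers.

def effectiveness_score(teacher, weights):
    """Combine a few measurable proxies into a single score."""
    return (
        weights["test_score_gain"] * teacher["test_score_gain"]
        + weights["attendance_rate"] * teacher["attendance_rate"]
        + weights["parent_complaints"] * teacher["parent_complaints"]
    )

teacher = {"test_score_gain": 0.2, "attendance_rate": 0.95, "parent_complaints": 3}

# Designer A believes test scores are everything.
designer_a = {"test_score_gain": 10.0, "attendance_rate": 1.0, "parent_complaints": -0.1}
# Designer B penalizes complaints heavily instead.
designer_b = {"test_score_gain": 2.0, "attendance_rate": 1.0, "parent_complaints": -2.0}

print(effectiveness_score(teacher, designer_a))  # 2.65  -> "good" teacher
print(effectiveness_score(teacher, designer_b))  # -4.65 -> "bad" teacher
```

Same teacher, same data, opposite verdicts. The math didn’t discover what “doing well” means; it just made one person’s definition of it harder to argue with.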

Trusting these systems so blindly leads to big repercussions. Teachers can lose their jobs because an algorithm says they are not doing well. Suddenly these data collectors are heavily relied on, and if there is any doubt, it’s shut down automatically. Not even letting people know about the algorithm pushes the rhetoric that the mass public is incapable of understanding it. To deny people access to the system is to deny them the chance to be educated about it. Assuming they’re incapable of understanding it assumes that they can’t ever understand the truth. This further disconnects people from tech and elevates big data to a higher position than the average citizen.

Random thoughts:

“Silent but deadly”, “weapons of math destruction” → private companies sell to governments, private power, “there’s a lot of money to be made in unfairness”

Standardized testing = lazy, biased, mathematical, regressive, automates the status quo

DNA testing: 23andMe

How do we support the ones being scrutinized? How do we protect the people who are hidden, lowered, and oppressed?

Frank Pasquale – The Black Box Society – chapter 1 (pp 1–11)

Pasquale distinguishes between “real” secrecy, legal secrecy, and obfuscation:

Real secrecy: establishes a barrier between hidden content and unauthorized access to it.
Legal secrecy: obliges those privy to certain information to keep it secret; a bank employee is obliged both by statutory authority and by terms of employment not to reveal customers’ balances to his buddies.
Obfuscation: deliberate attempts at concealment when secrecy has been compromised.

This idea that secrecy is so private and unbreakable is interesting. I keep thinking back to the visual of locking your front door every day. Person A can lock their door every day and use multiple chains/locks/doors. They can lock their car, lock their phone, and set up a password for their laptop. But at the end of the day, our hardware is just hardware. Person B can show up and, with enough intention and supplies, break into the house, phone, laptop, car, etc. This habitual routine of locking our stuff offers only a very limited degree of protection. If anything, it’s a false sense of protection that makes us think we are safe, that we have had some agency to protect ourselves, and that we have ownership of our machinery. But at the end of the day, it’s breakable. So when I’m thinking about these large companies that hold so much data from the people, that oppress communities through biased data collection, there must be a way to counter that, right? Surely they have multiple layers of protection beyond the ones they give the public (power, law, government, money, etc.), but at the end of the day it’s machinery. Is it possible to have absolute privacy? And what does privacy mean? How hidden does information have to be to be considered “private”? Does that even matter? Is it more important to look at the consequences instead?
