The black box society
The article uses the “black box” as a metaphor for the current secrecy problem: it is both a recording device and a system whose workings are mysterious. “Knowledge is power. To scrutinize others while avoiding scrutiny oneself is one of the most important forms of power.” The law aggressively protects secrecy in commerce while staying largely silent on personal privacy.
The decline in personal privacy might be worthwhile if it were matched by comparable transparency from corporations and governments. But sadly, it has not been. The commerce and technology industries keep the box closed using three strategies: “real” secrecy, legal secrecy, and obfuscation.
“Transparency is not just an end in itself, but an interim step on the road to intelligibility.”
Authorities increasingly rely on algorithms, which raises questions we need to ask. Are they fair? To what extent can we trust these automated decisions over decisions grounded in human reflection? Meanwhile, the distinction between state and market is fading.
The era of blind faith in big data must end
This TED talk asks what happens when algorithms are wrong. An algorithm can go wrong in two ways: in its data and in its definition of success. “Algorithms are opinions embedded in code.” A marketing trick tries to make people believe that algorithms are objective and scientific, and to intimidate them with algorithms. Even with good intentions, algorithms can have deeply destructive effects. Algorithms cannot make things fair, because they repeat our past practices and our patterns; they automate the status quo. In some ways, algorithms end up hurting minorities.
“Data laundering”: a process by which technologists hide ugly truths inside black-box algorithms and call them objective.
We should check our data for integrity, interrogate our definition of success, audit our accuracy, and give long-term effects the attention they deserve.
For data scientists: we should not be the arbiters of truth. We should be translators of the ethical discussions that happen in larger society.
For non-data scientists: this is not a math test; it is a political fight. We need to demand accountability from our algorithmic overlords.
Automating inequality
These systems are more evolution than revolution: they rationalize and recreate existing politics. They promise to address bias, but in fact they merely hide it.
They create an “empathy override,” dulling the emotional weight of the decisions being made. The talk then gives examples of people losing eligibility for medical services because of algorithmic decisions. The limitations of the data itself are a major concern, and the systems create feedback loops.
My experiment opting out of big data made me look like a criminal
The author describes her experience of trying to opt out of big data and finding herself treated like a criminal. Banks flagged her as suspicious because she withdrew large amounts of cash, and she had to choose her words carefully whenever she communicated with other people online. Her conclusion, “No one should have to act like a criminal just to have some privacy from marketers and tech giants,” is absolutely right, yet that is exactly what is happening now.