The book The Black Box Society by Frank Pasquale explains how companies use our data to determine almost everything about our lives, especially our finances. These companies track us through our phones, computers, and other devices, sell our data, and use it to judge what kind of people we are: whether we're reliable enough to make payments or qualify for a loan. But in doing this, they discriminate against people who don't fit neatly into the algorithm.
How do we fight against these companies selling our data and using it for their own agendas?
In Cathy O'Neil's TED Talk, she further explains how these algorithms are biased and how they can discriminate against minorities, women, and poor people. Just because an algorithm runs on a computer doesn't mean it isn't biased; it's biased because of the requirements people build into it.
How do we find out who is making these programs, and how do we create regulations to stop them from using our data for nefarious purposes?
Janet Vertesi's article about how not using big data made her look like a criminal kind of reminds me of when you (Ben Grosser) made ScareMail, the email program that adds in words meant to trigger government trackers.
Should people opt out of big data, and if so, how? Will it have an impact on these companies?
The skit was really funny because it showed how heavily people rely on computers: even when the computers are wrong, people still get the short end of the stick and take the blame. It's also interesting how common these mistakes are, yet people still suffer the negative effects.
Why do we put people at fault when technology and algorithms are wrong, even though we know the technology is to blame?
The talk given by Virginia Eubanks, called "Automating Inequality," describes how these data algorithms reinforce discrimination. People are put into these systems, and if they don't fit the requirements, the systems can still screw them over even when nothing was their fault in the first place. This ties into both the book and the TED Talk, which describe how these algorithms are biased, especially against women, minorities, and poor people.
How should we protect the people who are negatively affected by these algorithms?