https://sssummerchao.web.illinois.edu/wp/2020/05/11/advanced-seminar-junior-research-project/
Project update
I am making a speech-to-emoji reader. For now it isn't translating voice input into emoji very accurately… I'm still trying to find a way for it to return more faithful translations.
https://editor.p5js.org/summerchao/present/Gv_DmSIWv
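Since the translation accuracy is the open problem, here is a minimal sketch of the word-to-emoji mapping approach, assuming the p5.speech library is loaded alongside p5.js. The emoji table and structure are placeholders for illustration, not the code in the sketch linked above.

```javascript
// Minimal speech-to-emoji sketch (assumes p5.speech.js is loaded with p5.js).
// The word-to-emoji table is a placeholder, not the linked project's mapping.
let speechRec;
let output = "";

const emojiMap = {
  happy: "😊",
  sad: "😢",
  love: "❤️",
  cat: "🐱",
  pizza: "🍕",
};

function setup() {
  createCanvas(400, 200);
  textSize(40);
  speechRec = new p5.SpeechRec("en-US", gotSpeech); // callback fires on each final result
  speechRec.continuous = true;      // keep listening
  speechRec.interimResults = false; // only use final results
  speechRec.start();
}

function gotSpeech() {
  if (!speechRec.resultValue) return;
  // Replace each recognized word with an emoji if we have one; keep the word otherwise.
  const words = speechRec.resultString.toLowerCase().split(/\s+/);
  output = words.map((w) => emojiMap[w] || w).join(" ");
}

function draw() {
  background(240);
  text(output, 20, height / 2);
}
```

One way to get more accurate translations might be to match against a larger synonym list (or word stems) instead of exact words, since the recognizer often returns phrasing that doesn't hit the table directly.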
Project Update!
My original idea was to explore the influence of emoji on our communication. Now I've found another interesting concept- dark patterns, the tricks used in websites and apps that make you buy or sign up for things you didn't mean to.
I don't know how I am going to execute this idea yet, but I still hope my project can be something interactive. I am thinking of maybe making an introductory webpage for it, or some kind of interactive visual presentation?
Big Data/Algorithms/Algorithmic Transparency Response
I DIDN’T REALIZE I DIDN’T PUBLISH THIS LAST WEEK AHHH
My Experiment Opting Out of Big Data Made Me Look Like a Criminal
Janet Vertesi tried to hide her pregnancy from all the online advertising companies, because knowing that a woman is pregnant is worth as much to marketers as knowing the age, sex, and location of up to 200 other people. Hiding a secret like this takes a lot of cooperation from the people you know, but I wonder if it's ever possible to keep yourself off the grid?
The era of blind faith in big data must end
What if the algorithms are wrong? We shouldn't blindly trust big data or be intimidated by the mathematical formulas inside algorithms. Algorithms are not inherently fair, because they repeat our past patterns by learning from past data. O'Neil suggested we can fix biased algorithms through algorithmic audits- checking data integrity, the definition of success, and accuracy.
How do we raise the public awareness of biased algorithms?
Automating inequality
Eubanks started by explaining, from a historical standpoint, why we are building a digital poorhouse now. Then she gave some examples of how these systems have never been fair to everyone- the automation of Indiana's food stamp and welfare eligibility, systems biased against lower-class families…
The Computer Says No
I realized I’ve seen some clips (and a lot of other memes) from Little Britain. But this is exactly what’s going to happen if we blindly trust algorithms and data!
The black box
A black box can refer to a data-monitoring system where we can only see the inputs and outputs, but not the process in between. Credit reports are common now, for example, but no one outside the bureaus knows exactly how a score is calculated. I was surprised to learn that financial institutions can hide things from the public too.
I wonder what will be the vision of a transparent society?
AI / Predictive Analytics / Recommendation Algorithms
How TikTok Holds Our Attention
The way TikTok's algorithm works is quite different from most of our social networks- "Some social algorithms are like bossy waiters: they solicit your preferences and then recommend a menu. TikTok orders you dinner by watching you look at food". Recommendation algorithms give us what we want and serve content based on our individual interests, so we become less and less able to separate algorithmic interest from our own. I agree that the TikTok sensation will pass fairly quickly, since people do get bored of it and more and more similar platforms keep coming up. As technology becomes more customized and tailored to our interests, we should also become more cautious about it.
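To make the "watching you look at food" idea concrete, here is a toy sketch of implicit-feedback ranking. The signals, weights, and example videos are all invented for illustration; TikTok's real ranking system is not public and this is only the general idea.

```javascript
// Toy illustration of implicit-feedback ranking (not TikTok's actual algorithm):
// instead of asking users what they like, score each candidate video by how
// the user behaved while watching similar clips.
function scoreVideo({ completionRate, rewatches, liked, shared }) {
  // Weights are arbitrary placeholders for the example.
  return 1.0 * completionRate + 0.5 * rewatches + 0.8 * (liked ? 1 : 0) + 1.2 * (shared ? 1 : 0);
}

const candidates = [
  { id: "cat-video", signals: { completionRate: 0.95, rewatches: 2, liked: true, shared: false } },
  { id: "news-clip", signals: { completionRate: 0.30, rewatches: 0, liked: false, shared: false } },
];

// The highest implicit-engagement score gets served next, without ever asking for a preference.
candidates.sort((a, b) => scoreVideo(b.signals) - scoreVideo(a.signals));
console.log(candidates[0].id); // "cat-video"
```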
Inadvertent Algorithmic Cruelty
Algorithms are thoughtless- even now I still get birthday notifications for deceased friends on Facebook. I know you can now report a deceased person on Facebook and memorialize the account, so the person no longer appears in public spaces like suggested friends or reminders. I assume this is how Facebook tried to implement empathetic design.
Something is wrong on the internet
I've looked at YouTube Kids very differently since reading a lot of articles about Elsagate (mostly on Reddit/NoSleep). But I never noticed the ginormous number of videos like Surprise Eggs and Little Baby Bum. "What concerns me is that this is just one aspect of a kind of infrastructural violence being done to all of us, all of the time." I wonder why these absurd videos specifically target kids?
There’s a new obstacle to landing a job after college: Getting approved by AI
I never knew AI was also involved in the recruiting process. I feel like digitizing the process will expose more flaws in the system- what if someone hacked the AI and chose themselves as the candidate? I also wonder whether the candidates chosen by AI actually perform well. Maybe in the future, we will all be interviewed by AI!
The problem with metrics is a big problem for AI
I love the analogy between junk food and recommendation algorithms because it's accurate. Metrics are everywhere in our lives now- but their impact really depends on how we use them. I also find dark patterns interesting; they could make an interesting thesis for my research.
Research idea
I am not set on a topic for my research yet- it will probably be somewhere between social interaction and social media metrics, recommendation algorithms, and surveillance. But I do hope my work can be an interactive installation or a game.
Technology and race response
Databite No. 124: Ruha Benjamin
Benjamin discussed how structural racism operates in algorithms, social media, and other emerging technologies. As she put it, tech fixes for social problems aren't just about the technology's impact; social norms, racial norms, and structures shape which tools are imagined in the first place. The best example is the medical risk algorithm that favored white patients over sicker Black patients. Indifference to social reality on the part of tech designers and adopters can be more harmful than malicious intent. The New Jim Code shows how algorithms can extend, instead of diminishing, racial discrimination.
Main questions:
What are some other examples of discriminatory design?
What can tech designers and artists do to construct a different social reality? Is it really possible to create fair and just algorithms in an unfair and unjust society?
Laboring Infrastructures
Nakamura talks about VR 2.0. Though VR 2.0 still shares some major traits with our current VR experience (expensive equipment, male-dominated developers and users), it is pitched as engineering specific feelings like compassion, sympathy, and tears. "You can't trust people when they speak their own truth, but you need to experience for yourself to know".
Is VR a promising industry for you? Where do you see the industry going in the future?
Algorithms of Oppression
Noble discussed how search algorithms are biased against underrepresented people. I've heard of those racist search-engine examples before (the image results for three black kids vs. white kids, the results for Asian/Latino/Black girls)- but I think Google has since tweaked the algorithm, so the search results are not the same anymore. I realized how far we are from real gender equality, even though it's 2020 now.
Is Google the only search engine with this issue, or are all search algorithms like this?
Social Interaction, Social Photography, and Social Media Metrics response
Wearables and how we measure ourselves through social media
This TED talk is about dataism- how we try to objectively quantify everything now. More and more life-logging apps and products are being invented (the vessel, the bracelet that records your calories, the location-logging app- I love all of them). But we always ignore the uncertainty in the data. I agree with her when she says data shouldn't be the representation of our lives; it's just the trace we leave behind.
I wonder if we will ever overcome the uncertainty of the data? (Like the video we saw in class- a guy used 99 phones to create a fake traffic jam)
The social photo
"Social photography" is about photography and social media- the desire for life in its documented form. Almost everyone has a phone now, so anyone can be a photographer. Jurgenson also brought up the faux-vintage trend, which suggests a continuity with the nostalgia that all documentation implies. I feel like "social photography" is a fairly new term, and I would like to know what it means to the generation before us.
What Do Metrics Want? How Quantification Prescribes Social Interaction on Facebook
Ben built a piece of software called "Facebook Demetricator" that removes all the metrics from the Facebook interface, revealing users' "desire for more" culture on social media. The metrics guide users' behaviors and prescribe patterns of sociality.
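For a rough sense of how a demetricator-style tool could work in principle, here is a hedged sketch that blanks out count-plus-label text on a page. This is not Grosser's actual extension code; the regular expression and the text-node walk are assumptions for illustration, and Facebook's real markup changes constantly.

```javascript
// Hedged sketch of the general demetrication idea (not the real Facebook Demetricator):
// walk the page's text nodes and blank out anything that is just a count,
// like "1.2K likes" or "37 comments", while leaving the rest of the interface intact.
const METRIC_PATTERN = /^\s*[\d.,]+[KM]?\s*(likes?|comments?|shares?|views?)\s*$/i;

function demetricate(root = document.body) {
  const walker = document.createTreeWalker(root, NodeFilter.SHOW_TEXT);
  let node;
  while ((node = walker.nextNode())) {
    if (METRIC_PATTERN.test(node.textContent)) {
      node.textContent = ""; // hide the number, keep the surrounding layout
    }
  }
}

// Re-run whenever new posts load into the feed.
new MutationObserver(() => demetricate()).observe(document.body, {
  childList: true,
  subtree: true,
});
demetricate();
```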
Imagine a social media platform without any quantification- would it be able to compete with all the popular social media platforms?
Interface Criticism / Tactical Media / Software Art Response
Programmed Visions
Chun explains new media through different analogies, then focuses on its visual/invisible essence- software. I like the series of metaphors she draws between software and cognitive science, molecular biology, and machines. After reading this intro, I feel a bit more confident about explaining what my major is (lol), but I think everyone has their own definition of new media, and I wonder what yours is?
How to be a geek
Fuller starts from writing about software and software cultures, shifting the focus from the computational system itself to the cultural and political aspects of computing. A geek is someone who is overly enthusiastic and public about something (anything), even though others might find the topic tedious. But tech geeks also have the power to turn their fascinations into reality. Would you like to be called a "geek"? Why or why not?
Sad by Design
"Technology will be the evolutionary successor." Our entire social life is now built on the anti-human agenda of corporate capitalism. The big tech companies decide how we engage with technology. One example is how these corporations make sure people constantly check their phones by integrating FOMO into interface design. Young people now are more observed and more watched, while their voices are more drowned out.
What changes has social media brought to us?
New ways of hiding: towards metainterface realism
Pold brings up a new idea called the metainterface. This interface is more abstracted, yet at the same time more tailored to our urban, mobile lives. It combines what you see, how you see it, and how it sees you. The metainterface can help us look beyond what we normally see on the visual interface.
I've seen some of the works mentioned here- Safebook and most of Joana Moll's works. It feels like the metainterface isn't limited to a single digital interface. Do you think more and more artists will base their work on metainterface realism?
Surveillance / Privacy / Resistance response
‘I made Steve Bannon’s psychological warfare tool’: meet the data war whistleblower
This article is about Christopher Wylie, who brought big data and social media into information operations and turned them on the US electorate. That idea became Cambridge Analytica and helped Donald Trump's election campaign. The article also explains how Cambridge Analytica, spun out of SCL, acquired data from Facebook and eventually built this political message-targeting tool. I wonder if something like the 2016 election's online media manipulation will ever happen again?
Twelve Million Phones, One Dataset, Zero Privacy
I absolutely love this headline. This is a project about privacy- the reporters tracked millions of people's smartphones through a data file they obtained (I love how they describe this kind of surveillance as Tiny Brothers). All the maps visualizing this data show that we have zero privacy when our phones track us 24/7. Among all kinds of data, location data is the most powerful.
I wonder if it's even possible to go completely off the grid? Even if we do, we can still be tracked through the people around us who keep using their phones, right?
Colleges are turning students’ phones into surveillance machines, tracking the locations of hundreds of thousands
More and more schools have started using surveillance technology- apps like SpotterEDU and Degree Analytics- to monitor students' academic performance, analyze their conduct, or assess their mental health. "These administrators have made a justification for surveilling a student population because it serves their interests, in terms of the scholarships that come out of their budget, the reputation of their programs, the statistics for the school."
Does collecting all this data really benefit students' performance at school? (Does it make the learning process more enjoyable or not?) To be honest, I think college is a time when we shouldn't need tight supervision anymore. Too much surveillance technology on campus ruins our opportunities to take control of our own lives.
A Clear Case for Resisting Student Tracking
This article backs up the Washington Post piece above. She thinks that even though the data from SpotterEDU might be helpful, it isn't worth the social cost. Not only does it take away students' opportunity to be independent, the data also creates a more unequal environment for disadvantaged groups. I know we don't have this kind of technology here at UIUC yet, but is this going to become a trend in the US?
The Age of Surveillance Capitalism
"The information we provided is the least important part of the information they collected about us." Professor Zuboff pointed out that the companies collecting data use it to analyze human behavior. Based on this behavioral surplus, companies can predict our personality, emotions, or even sexual orientation. The companies that sell users' data to third parties take no responsibility for what those third parties will do with it. Services like Pokemon GO and Google Earth saturate us with enjoyable experiences so that we don't even notice how much data they take from us. At the end of the day, we are controlled by the technology instead of the other way around. Will we ever have the chance to regain control of our data?
Mapping Interventions
Helen Nissenbaum created TrackMeNot, which sends fake queries to search engines from your browser. She also introduced Cryptogram, a program that encrypts your images so that Facebook can't see them. She discussed the word obfuscation- the obscuring of the intended meaning of a communication by making the message difficult to understand, usually with confusing and ambiguous language. This strategy has been used by the military, by the orb-weaving spider, in Heather Dewey-Hagborg's project, and in Ben's project!! Her team created AdNauseam because they were worried about targeted online advertising, which relies on surveillance and the creation of profiles. I personally find the AdVault very intriguing. I've been using many ad-blocking programs and I always question their capabilities- do they actually work?
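To make the obfuscation strategy concrete, here is a hedged sketch of the decoy-query idea behind TrackMeNot. It is not the extension's real implementation; the query list, timing, and use of a Google search URL are assumptions purely for illustration.

```javascript
// Hedged sketch of obfuscation via decoy queries (in the spirit of TrackMeNot,
// not its actual code): periodically issue fake searches so that real queries
// are buried in plausible-looking noise.
const DECOY_QUERIES = ["weather radar", "bread recipe", "used bikes", "movie times", "hiking trails"];

function sendDecoyQuery() {
  const q = DECOY_QUERIES[Math.floor(Math.random() * DECOY_QUERIES.length)];
  // A real extension would issue this as a background request; here we just fetch the results page.
  fetch("https://www.google.com/search?q=" + encodeURIComponent(q), { mode: "no-cors" })
    .catch(() => {}); // ignore errors; the point is the noise, not the response
}

// Randomize the delay each time so the decoys don't form an obvious pattern.
function scheduleNextDecoy() {
  const delayMs = (30 + Math.random() * 60) * 1000; // 30-90 seconds
  setTimeout(() => {
    sendDecoyQuery();
    scheduleNextDecoy();
  }, delayMs);
}

scheduleNextDecoy();
```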