BFA Rough Draft

The Work

Alternative sketches that I’ve been playing around with: https://editor.p5js.org/tkim36/sketches/bWiZftZr0

https://editor.p5js.org/tkim36/sketches/5-LrtWq-q (3D from last year that I might incorporate)

Work’s Title

  • Sound Flower (?)
  • Frequency Flower (?)
  • Still thinking about the name

Artist Statement

  • “As an artist, I focus on how people interact with technology and how that interaction can be manipulated in order to create personal, emotional, and engaging pieces of work through code, video, and sound.”

A Representative Image

Probably a collection of these screenshots with the associated sound titles

A Personal URL

http://tonyhkim.com/

Project Update

Since the BFA show will now be online and my idea of CCTV surveillance won't work, I wanted to shift to making something interactive through the webcam and integrating that online.
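As a rough starting point, here's a minimal p5.js sketch of the kind of webcam interaction I mean (the brightness-to-circle mapping is just a placeholder I made up for this note, not the actual piece):

```javascript
// Minimal p5.js webcam sketch: the overall brightness of the camera feed
// drives the size of a circle. Placeholder interaction, not the final piece.
let cam;

function setup() {
  createCanvas(640, 480);
  cam = createCapture(VIDEO);
  cam.size(160, 120); // small buffer keeps the pixel loop fast
  cam.hide();         // draw the feed ourselves instead of as an HTML element
}

function draw() {
  background(0);
  image(cam, 0, 0, width, height);

  // Average the brightness of the downsampled camera frame.
  cam.loadPixels();
  let total = 0;
  for (let i = 0; i < cam.pixels.length; i += 4) {
    total += (cam.pixels[i] + cam.pixels[i + 1] + cam.pixels[i + 2]) / 3;
  }
  const avg = total / (cam.pixels.length / 4);

  // Brighter room (or a hand waved at the lens) = bigger circle.
  noFill();
  stroke(255);
  circle(width / 2, height / 2, map(avg, 0, 255, 20, width));
}
```

Since it's all client-side, something like this could be embedded directly in the online show page.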

I know that a lot of people have been messing around with Zoom features like changing the background, so maybe I could push that idea of interaction further. Maybe I could make people collectively interact with something online, like Ben's clicking piece we saw last year?

Another idea would be to mess around with some video archives, since I haven't done much video work in a while and it would be a good time to make something like that.

AI / Predictive Analytics / Recommendation Algorithms

James Bridle – Something is wrong on the internet (Medium)

This essay talks about how the YouTube algorithm can expose young children to inappropriate content, because creators will do anything for views (such as padding out video length, as the article describes). Especially if a family shares one computer, a child may stumble onto something violent. I remember when I was little, my mom yelled at me for watching gaming videos that involved shooting.

How can we regulate / protect children from inappropriate content? 

Rachel Metz – There’s a new obstacle to landing a job after college: Getting approved by AI (CNN)

This article freaks me out as I look for jobs and internships. The idea of being hired or rejected by unempathetic machines seems ridiculous to me. If a selective AI is looking for one specific type of individual, that seems negative because it pushes toward conformity: if everyone hired is similar, there will be fewer dynamic interactions and ideas.

How would you feel if HR (Human Resources) became autonomous?

Jia Tolentino – How TikTok Holds our Attention (New Yorker) (read or listen)

I personally get really addicted to TikTok once I start watching and could just scroll endlessly. The article talks about how the recommendation algorithm gets users hooked, but I also know that people tag their videos #fyp to try to land on the "For You" page and use the algorithm to get famous. The article mentions how young creators like Lil Nas X can become famous this way, though I also notice that former Vine stars and YouTubers get famous on TikTok much more easily than ordinary people.

Do you feel like anyone could become famous if they tried to manipulate the algorithms?

Eric Meyer – Inadvertent Algorithmic Cruelty (article)

This article talks about how algorithms are "thoughtless" and have no concern for emotions. The author argues for building awareness of, and designing for, failure modes and worst-case scenarios. The article is sad because our social media resurfaces memories from our past that we may not want to remember. I know Snapchat does something similar, showing you a memory from this date last year.

I know some people who just don't use social media. Do you think you could do that?

Rachel Thomas – The problem with metrics is a big problem for AI (article)

This article talks about how AI plays a key role in optimizing metrics. While this can be useful, the article states that metrics can also be harmful when they are "unthinkingly applied." It warns against over-emphasizing metrics and describes how doing so can confuse useful data with bad data. I agree that although metrics are useful, they can easily be gamed or misused. For example, I once saw a college "life hack" where students typed random words in white text so their essay would hit the required word count.

How can we make metrics more empathetic? 

Big Data / Algorithms / Algorithmic Transparency

Frank Pasquale – Black Box Society – chapter 1 (pp 1-11)

This chapter talks about how information becomes opaque, limited, and secret, which narrows our vision. The laws that govern these systems protect information as a commodity but threaten our security. I think it's interesting when the book points out that our lives are public while the information held about us often is not. I also recently signed a non-disclosure agreement, and I certainly didn't feel 100% comfortable.

Do you think companies should be transparent in exchange for more money?


Cathy O'Neil – The era of blind faith in big data must end (TED Talk, 13m)

This talk is about how our faith in algorithms has become so great that we have become dependent on them. I can relate, because I've been judged by an algorithm myself: as I apply to jobs, I've learned that many companies use applicant tracking systems (ATS) that scan resumes for keywords and other information. I feel this is unjust because it reduces our view of who people are to just data. She also talks about how algorithms can wreak havoc, though it takes time for the damage to show.
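Just to picture how blunt that keyword screening can be, here's a toy JavaScript sketch of the general idea (the keyword list, cutoff, and resume text are invented for illustration; real ATS software is obviously more involved):

```javascript
// Toy sketch of resume keyword screening (not any real ATS; the keywords,
// threshold, and resume text are made up for illustration).
const REQUIRED_KEYWORDS = ["javascript", "p5.js", "video editing", "teamwork"];

function keywordScore(resumeText, keywords) {
  const text = resumeText.toLowerCase();
  // Count how many required keywords appear anywhere in the text.
  return keywords.filter((kw) => text.includes(kw)).length;
}

const resume = "BFA candidate experienced in p5.js sketches, video editing, and sound.";
const score = keywordScore(resume, REQUIRED_KEYWORDS);

// A hard cutoff like this is exactly what reduces a person to data:
// miss one keyword and the resume may never reach a human.
console.log(score >= 3 ? "passes the filter" : "filtered out", `(score: ${score})`);
```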

Google’s search algorithm is what makes them money. Why do you think algorithms are considered so important?


Virginia Eubanks – Automating Inequality (talk, 45m)

This talk is about how data and technology become harmful for those who are less fortunate. She describes cases where applications for Medicare benefits are rejected because of application errors, so the people who need these resources the most lose their benefits due to technological error. She talks about how these digital tools are always hiding beside us, and how one error can spiral into other errors. I think that's why the credit score system is a little freaky: I've been trying to build my credit since freshman year of college because this small piece of data determines my house, my car, and my future.

How can we make data less personal?


Janet Vertesi – My Experiment Opting Out of Big Data…  (Time, short article)

I think this article and the experiment she carried out are very interesting. She tried to keep her pregnancy off the grid for nine months, hiding it from marketers and advertisers. To do so, she paid in cash and browsed through a private internet server, which she notes made her look like a criminal. It's interesting to see how we're monitored every day and can't escape from it without sacrificing our comfort. Recently I started using a sleep tracker app, and I was surprised that it could even figure out whether I was snoring.

Can someone truly live off the grid?


Walliams and Lucas – The Computer Says No (comedy skit, 2m)

This skit demonstrates our dependency on technology and our tendency to treat it as absolute truth. It reminds me of the earlier talks we watched, where people assumed Google search results were all accurate, factual information. The skit also points to a problem with data, big data, and data organization: we need to find a way to store and search such data effectively.

Do you think machines (such as kiosks at McDonald's) are more effective than humans?

Technology and Race

Lisa Nakamura – Laboring Infrastructures (talk, 30m)

This lecture talks about empathy and VR, and VR's potential to become a medium that allows people to feel more empathetic toward different cultures and events. One thing I heard about and can't imagine doing is listening to the audio and witnessing the tragedy of Trayvon Martin. Still, being able to bear witness to such tragic events could help a lot of people grow more empathetic toward a different community. However, she also states that one downside is the sacrifice of privacy.

Would you be comfortable being filmed for a VR experience in your own home?

Safiya Noble – Algorithms of Oppression

This lecture talks about how search algorithms display bias against people of color. The talk also mentions that around 60-70% of people believe these search results contain factual information, and how this can lead to prejudice through misinformation. She also points out that no one cared about these biases when they affected women of color, but now that algorithms have affected a presidential election, everyone cares.

How can we sort out fake news from real news? Why do some topics raise more controversy than others?

Ruha Benjamin – Race After Technology (talk, 20m)

Something I found strange was her point about how racism is "productive." It reminded me of the video we watched in Ben's class where facial recognition software wasn't able to detect an African American man. The design of technology should change so that everyone can use it effectively.

What are some possibilities of eliminating racism in the tech world?

Social Interaction / Photography / Media

Documentary Vision 

The reading talks about the "desire for life in a documented form": a generation dominated by phones, and a culture changing because of new technology and apps, which causes systemic and social behavioral changes. The article discusses our obsession with photography and how we replicate the past with new apps that make our photos look a specific way.

Why is society becoming more concerned with its public self-image?

What Do Metrics Want? How Quantification Prescribes Social Interaction on Facebook

The article talks about how metrics on a social platform such as Facebook drive our need to increase those metrics. Removing them with Facebook Demetricator demonstrates how heavily the platform focuses on quantification, and how our society is headed in that direction. Users are bombarded with numbers on the home page, yet removing them makes the platform look unfamiliar. I think a lot of these metrics make us addicted to an ideal of self-improvement and self-worth.

Would you use social media if it didn’t have quantifiable data available to you? 

Wearables and how we measure ourselves through social media

This TED Talk is about data and how we are always quantifying ourselves through media. She talks about how we use this data to improve ourselves, for example through Fitbit and baby trackers. Something I found very interesting and had never really thought about was the idea of dataism, the belief that data is the absolute truth. I both agree and disagree with this, because although data can express our non-measurable activities in a quantifiable way, it only shows the "digital traces" we leave behind.

Do you think automated technology and applications will make us greedier to become the "perfect human" that Benjamin Franklin talked about?

Interface Criticism/Tactical Media/Software Art

Sad by Design 

This podcast talks about how we, as humans, must overcome the corporate tech monopolies and the tactics that force us to depend on their applications and technology. The podcast first introduces tactical media and how it uses interactive media to promote a human agenda. This leads into platform capitalism, where tech giants such as Facebook and Uber make money by integrating their platforms into our everyday lives. I thought it was interesting how they discussed distinguishing the real from the virtual, and how the two are intertwined; with emerging VR technologies, imagine how much harder it will be to tell the two worlds apart. I also found it relatable when the podcast said that a world of distraction leads to a world of sadness, and that we not only get addicted to this sadness but are drawn into it. They talk about the "rabbit hole of sadness," where we hide in these targeted holes and sink deeper into them. That's probably why a lot of news sources focus on the negative things happening around the world.

Like the podcast asks, how can we (if we can) overcome the smartphone?
Would you use a Light Phone?

Programmed Visions

This article begins with the parable of the six blind men and compares it to New Media. Because New Media and technology are such a big topic, it states that it is difficult to know exactly what they are as a whole; individually we can only know bits and pieces at best. However, the article states that of all things New Media, software is the universal thing we might use in an effort to understand it. It describes software as becoming a metaphor for culture, while nature is the metaphor for hardware; yet software is difficult to understand as well, so a paradox occurs. I also found it interesting when the article stated that software empowers users and helps us navigate our increasingly complex world.

Should we learn how to read code? Would it be necessary in the near future?

The reading states that software helps us navigate our increasingly complex world. Does that mean that as technology advances, it will be more difficult for the elderly to adjust? How can we make technology more accessible globally?

How to be a Geek

This article talks about the gap between those who understand software and are over-enthused, over-informed, and over-excited about it, and those who aren't. It also talks about how difficult it is for people to get into software and how that discourages outsiders (e.g., writing on software is itself partially inside software, even when it is presented as a paper book). The article also describes how the geeks/tech giants run the world from the shadows, trying not to attract too much attention, and brings up the myth of Icarus and how we might fall while trying to understand software (maybe referring to AI?).

What are some ways we can introduce coding/ software without people getting frightened/ intimidated by it?

New Ways of Hiding 

This article talks about the different ways metainterfaces hide and collect your data, and how they are both omnipresent and invisible. With information hidden between objects, and with data and software now integrated into the global cloud, the interface has become more abstract and unfamiliar, which should make us more wary. The article describes two different ways the metainterface "hides": minimalist hiding and environmental hiding, and how we consent to this surveillance by "voluntarily agreeing to its terms" while being manipulated. The article argues that we should increase our critical literacy so that we can design less manipulative metainterfaces.

Would you give up your information for more comfort/ accessibility?

Residual Data

Social Media platform : Instagram

The data you provide it : Likes

App extension : Likes random things and unlikes past liked posts in order to make the Explore page and ads more random and less catered to you.

Social Media platform : Snapchat

The data you provide it : Photo

App extension : Blurs the background so people don't know the location.

Social Media platform : Facebook

The data you provide it : Age + photo

App extension : AI-generated profile picture and randomized age/info.

Surveillance / Privacy / Resistance

Digital Democracies 

This video lecture talks about online databases that collect and mine people's data, and the measures people can take to protect their digital privacy. Helen Nissenbaum first talks about achieving privacy through cryptography, which hides the message, but then turns to a similar yet different method: obfuscation. She describes obfuscation as introducing noise, creating misleading and false data to distract the people who gather it. These data gatherers usually work in the advertising industry and in search engines.
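To picture what obfuscation could look like in practice, here is a tiny JavaScript sketch in the spirit of tools like TrackMeNot (the decoy list and sendQuery() are made up for illustration; it doesn't contact any real search engine):

```javascript
// Toy obfuscation sketch: every real search is mixed in with random decoy
// searches, so the profile a data gatherer builds from my queries is mostly noise.
const DECOY_TOPICS = [
  "sourdough starter", "bird migration maps", "vintage synthesizers",
  "tide charts", "crossword tips", "houseplant care", "local hiking trails"
];

function randomDecoy() {
  return DECOY_TOPICS[Math.floor(Math.random() * DECOY_TOPICS.length)];
}

// Stand-in for issuing a query; a real extension would send an actual request.
function sendQuery(q) {
  console.log("searching:", q);
}

function obfuscatedSearch(realQuery, decoyCount = 3) {
  // Shuffle the real query in with decoys so an observer
  // can't tell which one was intentional.
  const queries = [realQuery];
  for (let i = 0; i < decoyCount; i++) queries.push(randomDecoy());
  queries.sort(() => Math.random() - 0.5);
  queries.forEach(sendQuery);
}

obfuscatedSearch("apartments near campus");
```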

  • Reading this made me think about the Netflix show Maniac and its "ad buddy": a person who follows you around and constantly advertises different products in exchange for lending you some money. So my question is, how do you think advertisements will advance in the future? Will they become even more personal?

A Clear Case for Resisting Student Tracking 

This news article talks about how universities monitor students through mobile applications, Wi-Fi, and Bluetooth. While it does discuss possible benefits of this monitoring, such as identifying and helping students with possible mental illnesses, it also discusses the inequality and discrimination that may surface from "overlaying data systems into social systems."

  • Do you think this should be integrated into our school instead of iClickers? (Since some students click in with other people's iClickers for them.)

I made Steve Bannon’s psychological warfare tool 

This news article talks about mining people's Facebook and online information to build psychological and political profiles. It also states that personality traits can be a precursor to political behavior. Targeting people with a political agenda based on their profiles is something that is emerging because of advances in technology. This also reminds me of how the NSA flags potential terrorists by monitoring suspicious activity.

  • Are you okay with your digital footprint being available to companies if it makes your life simpler/easier?

One Nation, Tracked

The article talks about how it is legal for companies in the US to collect and sell your data. Most of us don't even read the terms and conditions before clicking accept, and we would most likely accept anyway because we have become so dependent on technology. The article describes how these applications quietly track our location while running in the background. This reminds me of how some applications that don't need location-based services still ask permission for my location.

  • Do you read the terms and conditions? 

Surveillance Capitalism 

This video talks about behavioral surplus, which takes collected data that used to be considered extra, useless data and uses it to target groups with similar characteristics and propensities. We believe the tradeoff between our data (which we think isn't that important) and the services the companies provide is worth it. The video also points out that only a little of our data is used to improve the services, while most of it is sold to businesses. One of the more surprising things I learned was how uploading our photos to Facebook could end up helping facial recognition software used by regimes. The fact that this data manipulation happens secretly, without us knowing, is very frightening. The video also describes how Pokémon Go quietly lures people into businesses without users realizing it.

  • Do you think you could live without these apps/ websites and go completely off grid? 

Colleges are turning students’ phones into surveillance machines

This article talks about how colleges are using applications and networks to track students' locations and how often they visit certain places (used for attendance). The article states that these methods, although they can be beneficial, take away from the organic process of learning. I feel like this is true: freshman year, I would go to one of my giant lectures and see people come to class, sleep, only wake up when the iClicker question came up, and then leave. The purpose of education could be distorted by all this technological interference (such as laptops in classrooms).

  • Do you think technology interferes with or supplements a student's learning? When do you think technology should be introduced in the classroom?