BFA FINAL UPDATE 2

Title: “No, we can’t go home, there’s food at McDonald’s.”


I want to shed light on the constant barrage of advertisements that beckon us to consume and put money toward capitalist, for-profit establishments, conditioning the notion that assimilating oneself into American mainstream culture will bring success and equitable social status. These ads are aimed mostly at the people whose identities are not represented or recognized as being part of the Western consumer “standard” (…)

This project is an observation of the push toward assimilation into mainstream culture through a first-generation narrative look at food redlining and infrastructure, immigrant family dynamics of gratitude and guilt, and the media’s role in the enactment of algorithms that aim to deliver the personas of desirable, consumable objects.

*I need help with an overview for this statement! I know what I am trying to say, but I’m not sure how it sounds outside of my head. I’ll need to think some more about what to add or take away from the statement, if anything.

I am excited!!! It’s been going swimmingly so far.

I need to put one more coat of tinted resin on all of the hanging plastic boxes, as well as on a small empty cardboard nuggets box that I taped over so that the resin wouldn’t seep through it. It should take one night/day to fully dry; then I’ll put another coat on the small cardboard box after the initial layer.

Then, once my dried fruit has gotten a little bit drier (I’m air-drying it in the sunroom right now), I will dip each piece in resin until fully coated and attach it to the outside of the tulle-encased-in-resin sculpture (from which I will remove the underside latex skeleton so that it stands on its own). We will see if I have enough resin for all of the fruit pieces! This should also dry within 24-48 hours.

I then need to write some quick code for slow color-changing gradients in the warm ranges (reds, oranges, yellows, pinks, limes), test my analog-to-digital cable converter with the HDMI output on my laptop, and see how it looks on the mini TV (rough sketch below).
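Here is a minimal sketch of what I have in mind, in Python and assuming pygame; the hue range (starting at pink around 300° and sweeping through red, orange, and yellow to lime around 90°) and the cycle speed are placeholder values I would still tune by eye on the mini TV:

```python
# Slow warm-range color gradient, meant to run fullscreen over HDMI out.
# Minimal sketch assuming pygame; hue range and speed are placeholders.
import colorsys
import math
import pygame

pygame.init()
screen = pygame.display.set_mode((0, 0), pygame.FULLSCREEN)
clock = pygame.time.Clock()

CYCLE_SECONDS = 60.0  # one full warm sweep per minute; adjust to taste
t = 0.0

running = True
while running:
    for event in pygame.event.get():
        if event.type in (pygame.QUIT, pygame.KEYDOWN):
            running = False  # any key quits

    # Oscillate hue between pink (300°) and lime (90°), passing through
    # red, orange, and yellow; the sine keeps the drift slow and smooth.
    phase = (math.sin(2 * math.pi * t / CYCLE_SECONDS) + 1) / 2  # 0..1
    hue_degrees = (300 + phase * 150) % 360  # 300° -> 0° -> 90°
    r, g, b = colorsys.hsv_to_rgb(hue_degrees / 360, 1.0, 1.0)
    screen.fill((int(r * 255), int(g * 255), int(b * 255)))
    pygame.display.flip()

    t += clock.tick(30) / 1000.0  # cap at 30 fps, track elapsed seconds

pygame.quit()
```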

I then need to deconstruct the living room and do my set-up for a whole day, taking pictures at different times of day.

(…) to be continued

BFA Project Final Proposal

Title: “No, we can’t go home, there’s food at McDonald’s.”

I want to shed light on the constant barrage of advertisements that beckon us to consume and put money toward capitalist, for-profit establishments, conditioning the notion that assimilating oneself into American mainstream culture will bring success and equitable social status. These ads are aimed mostly at the people whose identities are not represented or recognized as being part of the Western consumer “standard” (…)

This project is an observation of the push toward assimilation into mainstream culture through a first-generation narrative look at food redlining and infrastructure, immigrant family dynamics of gratitude and guilt, and the media’s role in the enactment of algorithms that aim to deliver the personas of desirable, consumable objects.

Big Data / Algorithms / Algorithmic Transparency (4 Mar):

Cathy O’Neil – The era of blind faith in big data must end (TED Talk, 13m)

Cathy talks about data laundering, a process in which technologists hide ugly truths inside black box algorithms and call them objective. We call these things “meritocratic” when they are in fact non-transparent, and they have the importance and power to wreak destruction. They are, in fact, if we think of one of the worst possible outcomes, weapons of mass destruction. Except this is not “revolution, but evolution”: this adaptation of technology, which integrates meta-interfaces for data laundering, is only codifying biases that have been present in our society long before contemporary code was able to solidify and exacerbate them.

Virginia Eubanks – Automating Inequality (talk, 45m)

Virginia Eubanks talks about Automating Inequality as the creation of an institution that upholds this “idea of austerity in which there is seemingly not enough resources for everybody, and that we have to make really hard choices about who deserves to attain those basic rights to get those resources”. She talks about the feedback loops of inequality that live in databases and assume this idea of austerity, thereby looping it back into the system. Something very important she talks about is how the feedback loops create false positives and false negatives, which are the dangerous and debilitating parts of how this data can be used against individuals in favor of others.

Frank Pasquale – Black Box Society – chapter 1 (pp 1-11)

About big data and how there are multiple hidden layers of “firewalls” to get through: if you were to start calling on firms to be more transparent, for example, you would still have to get through all that incredibly hard-to-understand contract jargon, which might end up being the discouragement needed not to press the subject further: “However, transparency may simply provoke complexity that is as effective at defeating understanding as real or legal secrecy”.

“Transparency is not just an end in itself, but an interim step on the road to intelligibility.”

Janet Vertesi – My Experiment Opting Out of Big Data…  (Time, short article)

This experiment is based on a woman using a tool called Tor to route her searches through random servers. She did this because she wanted to evade the millions of data bots that would otherwise have invaded her online and in-person existence through data referrals or advertisements. She bought everything in cash, using gift cards to purchase things on Amazon, and had to act like a criminal, buying those gift cards via a site known for illicit activity.

This may seem insignificant (so what if companies send me coupons for baby stuff?), but this exact same kind of system, “paved with the heartwarming rhetoric of openness, sharing and connectivity—actually undermines civic values and circumvents checks and balances”.

Walliams and Lucas – The Computer Says No (comedy skit, 2m)

I feel like this is something my parents would complain about, especially in hospital or administrative settings in which unnecessary bureaucracy takes place and everything is handled through paperwork or computers, passed off by real hands acting only as a middleman. The idea of having a middle-person there for all these bureaucratic processes helps ease and alleviate the thought that these centers of “help” and importance (hospitals, DMVs, tax centers, banks, etc.) are just databases that act as black boxes, storing our data as willingly or unwillingly as we give it.

NOTES:

Cathy O’Neil – The era of blind faith in big data must end (TED Talk, 13m)

Separates winners from losers

Or a good credit card offer

You choose the success of something

Algorithms are opinions embedded in code

Reflect our past patterns

Automate the status quo

That would work if we had a perfect world

But we don’t, we are reinforcing what we put in, our own collective memories and biases

They could be codifying sexism

Or any kind of bigotry

DATA LAUNDERING

Its a process by which:

  1. Technologists hide ugly truths
  2. Inside black box algorithms
  3. And call them objective

Call them meritocratic

When they are secret, important, and destructive

Weapons of mass destruction

These are private companies

Building private algorithms 

For private ends

Even the ones I talked about with police and teachers

Those are built by private institutions

And sold to the government 

This is called: Secret Sauce

PRIVATE POWER

They are profiting from wielding the authority of the inscrutable (impossible to understand or interpret)

Now you might think

Since all this stuff is private

Maybe the free market will solve this

There is a lot of money in unfairness

Also, we’re not economically rational agents

We’re all racist and bigoted in ways that we don’t understand and know

We can check them for fairness

Algorithms can be interrogated 

And they will tell us the truth every time

This is an algorithmic audit

1. Data integrity check

2. Definition of success; audit that

Who does this model fail?

What is the cost of that failure?

We need to think of the long-term effects of algorithms and the feedback loops that are enacted (a toy audit sketch below).
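A toy sketch of the kind of audit O’Neil describes, with invented data, just to make “who does this model fail?” concrete as a computation:

```python
# Toy algorithmic-audit sketch (invented data): fix a definition of
# success, then ask who the model fails, and whether failures skew by group.
from collections import defaultdict

# Each record: (group, model_prediction, actual_outcome) -- made-up values.
records = [
    ("group_a", 1, 1), ("group_a", 0, 1), ("group_a", 1, 0), ("group_a", 0, 0),
    ("group_b", 0, 1), ("group_b", 0, 1), ("group_b", 1, 1), ("group_b", 0, 0),
]

counts = defaultdict(lambda: {"fp": 0, "fn": 0, "pos": 0, "neg": 0})
for group, predicted, actual in records:
    c = counts[group]
    if actual == 1:
        c["pos"] += 1
        if predicted == 0:
            c["fn"] += 1  # a real case the model missed
    else:
        c["neg"] += 1
        if predicted == 1:
            c["fp"] += 1  # flagged where nothing was wrong

for group, c in counts.items():
    print(group,
          "false positive rate:", c["fp"] / c["neg"],
          "false negative rate:", c["fn"] / c["pos"])
```

Interrogating the model this way shows not just that it errs, but which direction the errors lean for each group, which is where the cost of failure lives.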

Data scientists

We should not be the arbiters of truth

We should be interpreters of the voices…

__ VIRGINIA EUBANKS Automating Inequality

We are creating an institution 

This idea of austerity

That there is not enough for everybody 

And that we have to make really hard decisions/choices about who deserves to attain their basic human rights

Tools of AUTOMATING INEQUALITY

More a part of evolution than revolution

Their historical roots go back all the way to the 1820s

  1. Digital poorhouse assumes austerity (by assuming it, it recreates it)
    1. DATA WAREHOUSES – private healthcare stores that information secretly; public health databases may not / do not
    2. If you can afford to pay your babysitter out of pocket, then that information about your family will not end up in the data warehouse

FALSE POSITIVES

Seeing risk of harm where no harm actually exists

The system confusing parenting while poor with poor parenting

Creating a system of poverty profiling – so much time was spent investigating and risk-rating families in their communities that it created a feedback loop of injustice

that began with 

-families getting more data collected about them because they were interacting with county systems 

-having more interactions meant their score was higher

-having a higher score meant they were investigated more often

-and because they were investigated more data was collected on them and so forth so on the loop continues and data collection grows 

Feedback loop – same as predictive policing (toy simulation below)
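A toy simulation of the loop these notes describe, with invented numbers, just to show how it compounds:

```python
# Toy simulation of the feedback loop: more data -> higher score ->
# more investigation -> more data. All numbers are invented.

def simulate(initial_records: int, contact_rate: float, rounds: int) -> int:
    """contact_rate: how much new data each round of scrutiny generates."""
    records = initial_records
    for round_number in range(1, rounds + 1):
        score = records  # the score rises with the amount of data on file
        new_records = int(score * contact_rate)  # higher score -> more scrutiny
        records += new_records
        print(f"round {round_number}: score={score}, records={records}")
    return records

# A family already interacting with county systems vs. one that isn't:
simulate(initial_records=10, contact_rate=0.5, rounds=5)  # grows every round
simulate(initial_records=1, contact_rate=0.5, rounds=5)   # never enters the loop
```

The family that starts with more records on file accumulates more every round, while the family with almost none stays invisible to the system, which is the false-negative side of the same loop.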

FALSE NEGATIVES

Not seeing issues where issues might actually exist

Because there is barely any data on abuse in upper- and middle-class families in the data warehouse, the kinds of behaviors that might lead to abuse or neglect in those families can be ignored or misinterpreted by this algorithm, because it’s not logged

Geographically isolated places, or suburbs (the system misses key opportunities for outreach to places like these)

Where discrimination occurs or gets initiated the most is within the community, when the COMMUNITY calls in to share those cases with the Child Welfare phone responders

-When call screeners receive these, there is also a bit of disproportion that happens in that moment

Referral – which is embedded in our cultural understandings of what a safe and healthy family looks like; and in the United States, that family looks white, heterosexual, and rich.

-Removing discretion from those frontline workers could remove a stop to the massive amount of discrimination that’s entering earlier in the process, and could potentially worsen inequality in the system rather than making it better

Discretion – like energy:

Never created or destroyed, it’s just moved

Who are we taking discretion away from, who are we giving it to?

-Removing discretion from frontline child welfare workers, who make up a largely diverse, female workforce

Giving it to the economists, computer engineers, and social scientists who are building the models

These tools, at their worst, can serve as an ~empathy override~

We’re allowing ourselves to outsource to computers some of the most difficult problems and decisions we face as a society

Coordinated Entry System (used around the country, around the world)

Responds to the county’s extraordinary housing crisis

-works by assigning each unhoused person who they have managed to survey a number/score that falls on a spectrum of vulnerability

VI-SPDAT

Vulnerability Index – Service Prioritization Decision Assistance Tool

-Serves those at the top of the scale: the chronically homeless

-Serves those at the bottom of the scale: the newly homeless, who need just a little help to get back on their feet

Those in the middle are labeled as: not vulnerable enough to merit immediate assistance, BUT

not STABLE enough to be served by the time-limited resources of rapid re-housing

Leaves people feeling as if they are included in a system that asks them to incriminate themselves in return for a higher lottery number in the system

-Give folks your data and HOPE you get matched with a better housing opportunity

-or close yourself out of most housing resources in the community altogether

Data from the organization’s survey is shared with 161 organizations, and because of federal law and databases, one of those orgs is the LAPD (via the homeless management information system)

___BLACK BOX SOCIETY

There is even an emerging field of “agnotology” that studies the “structural production of ignorance, its diverse causes and conformations, whether brought about by neglect, forgetfulness, myopia, extinction, secrecy, or suppression.”

But what if the “knowledge problem” is not an intrinsic aspect of the market, but rather is deliberately encouraged by certain businesses? What if financiers keep their doings opaque on purpose, precisely to avoid or to confound regulation? That would imply something very different about the merits of deregulation. The challenge of the “knowledge problem” is just one example of a general truth: What we do and don’t know about the social (as opposed to the natural) world is not inherent in its nature, but is itself a function of social constructs. Much of what we can find out about companies, governments, or even one another, is governed by law.

Laws of privacy, trade secrecy, the so-called Freedom of Information Act—all set limits to inquiry. They rule certain investigations out of the question before they can even begin. We need to ask: To whose benefit?

Some of these laws are crucial to a decent society. No one wants to live in a world where the boss can tape our bathroom breaks. But the laws of information protect much more than personal privacy. They allow pharmaceutical firms to hide the dangers of a new drug behind veils of trade secrecy and banks to obscure tax liabilities behind shell corporations. And they are much too valuable to their beneficiaries to be relinquished readily.

Even our political and legal systems, the spaces of our common life that are supposed to be the most open and transparent, are being colonized by the logic of secrecy. The executive branch has been lobbying ever more forcefully for the right to enact and enforce “secret law” in its pursuit of the “war on terror,” and voters contend in an electoral arena flooded with “dark money”—dollars whose donors, and whose influence, will be disclosed only after the election, if at all.

But while powerful businesses, financial institutions, and government agencies hide their actions behind nondisclosure agreements, “proprietary methods,” and gag rules, our own lives are increasingly open books. Everything we do online is recorded; the only questions left are to whom the data will be available, and for how long.

Knowledge is power. To scrutinize others while avoiding scrutiny oneself is one of the most important forms of power. Firms seek out intimate details of potential customers’ and employees’ lives, but give regulators as little information as they possibly can about their own statistics and procedures.

Sometimes secrecy is warranted. We don’t want terrorists to be able to evade detection because they know exactly what Homeland Security agents are looking out for. But when every move we make is subject to inspection by entities whose procedures and personnel are exempt from even remotely similar treatment, the promise of democracy and free markets rings hollow. Secrecy is approaching critical mass, and we are in the dark about crucial decisions. Greater openness is imperative. (No transparency!)

Financial institutions exert direct power over us, deciding the terms of credit and debt. Yet they too shroud key deals in impenetrable layers of complexity. In 2008, when secret goings-on in the money world provoked a crisis of trust that brought the banking system to the brink of collapse, the Federal Reserve intervened to stabilize things—and kept key terms of those interventions secret as well. Journalists didn’t uncover the massive scope of its interventions until late 2011. That was well after landmark financial reform legislation had been debated and passed—without informed input from the electorate—and then watered down by the same corporate titans whom the Fed had just had to bail out.

Deconstructing the black boxes of Big Data isn’t easy. Even if they were willing to expose their methods to the public, the modern Internet and banking sectors pose tough challenges to our understanding of those methods. The conclusions they come to—about the productivity of employees, or the relevance of websites, or the attractiveness of investments—are determined by complex formulas devised by legions of engineers and guarded by a phalanx of lawyers.

So why does this all matter? It matters because authority is increasingly expressed algorithmically. Decisions that used to be based on human reflection are now made automatically.

Software encodes thousands of rules and instructions computed in a fraction of a second. Such automated processes have long guided our planes, run the physical backbone of the Internet, and interpreted our GPSes. In short, they improve the quality of our daily lives in ways both noticeable and not.

The same goes for status updates on Facebook, trending topics on Twitter, and even network management practices at telephone and cable companies. All these are protected by laws of secrecy and technologies of obfuscation.

Though this book is primarily about the private sector, I have called it The Black Box Society (rather than The Black Box Economy) because the distinction between state and market is fading. (CAPITALISM IS TURNING INTO GOVERNMENT PROXIES)

We are increasingly ruled by what former political insider Jeff Connaughton called “The Blob,” a shadowy network of actors who mobilize money and media for private gain, whether acting officially on behalf of business or of government. In one policy area (or industry) after another, these insiders decide the distribution of society’s benefits (like low-interest credit or secure employment) and burdens (like audits, wiretaps, and precarity).

But a market-state increasingly dedicated to the advantages of speed and stealth crowds out even the most basic efforts to make these choices fairer.

Obfuscation involves deliberate attempts at concealment when secrecy has been compromised. For example, a firm might respond to a request for information by delivering 30 million pages of documents, forcing its investigator to waste time looking for a needle in a haystack. And the end result of both types of secrecy, and obfuscation, is opacity, my blanket term for remediable incomprehensibility.

However, transparency may simply provoke complexity that is as effective at defeating understanding as real or legal secrecy. 

 

Technology + Race

Safiya Noble

We are on a quest toward “curating human information needs”.
Why don’t we practice and promote cross-disciplinary involvement and discussion, as in the example of this institution?
These capitalistic institutions, such as UIUC, keep us separate. I remember us talking in Interactions I about how difficult it is to double major in two majors that are not from the same “sides” of campus, such as New Media alongside Pre-Med. Trying to schedule classes around curriculums that are not designed to be flexible between different majors is made even more complicated when the website’s algorithm denies you registration access in some cases, and is programmed with restrictions that cannot be fixed by anyone but the programmers. And who wants to program a new site if they don’t understand the implications of the old one? Computer engineers, and STEM majors in general, are trained to think critically in terms of “X, Y, Z” ideologies and theories, not socially, politically, or communally.

Ruha Benjamin

The idea that Ruha keeps unpacking is that nothing, especially technological programming, is made without intention. She said that even programs portrayed as compassionate and proactive can be the most dangerous, because they hide the larger inputs’ desires for intentions that are not based around an abolitionist commitment.
There is this idea of putting out all knowledge ever (“MORE MORE MORE”, the capitalistic and democratic tendency of having everything out in the open), yet there are politics within that restrict and withhold information. This ideology that more is better and putting it all out there is a homogenous way of thinking about everyone involved (or of not thinking about everyone involved: assuming that all those who participate have shared equity, rather than coming from different social environments).
“We can’t lose sight of the larger structures that continue to fuel the problem”, and we especially cannot ignore the programmers and CEOs fueling the fire. We can try doing this by breaking down how we are taught to think in very systematic ways, and by teaching others that you cannot create algorithms for culture and society if you do not know anything about those things.

Lisa Nakamura

Lisa’s presentation was interesting because it talked about how open virtual reality is as a field; there are “no eggs in the basket”, or rather there are, but they have not been studied to their full extent. She talked about how VR is the “harbinger” of the third industrial revolution, and how it redresses the problem of the second industrial revolution, the “immiseration of humans as machines taking our jobs (…) by making available the last kind of work that machines can’t do, create the right kinds of emotions that humans need”.
She talked a lot about emotional labor, and argues that VR “automates the labor of feeling pain and sadness on behalf of another”. This emotional labor that VR is able to hold puts into place feelings of compassion and empathy that create an “alibi” for the “material conditions of labour for racialized and gendered people” that have always been present. In the case of Trayvon Martin starring in the VR piece One Dark Night, giving viewers the experience of witnessing how Trayvon was murdered that night, Lisa emphasizes that we need to think of VR as more than just a source of empathy, shock, or compassion. VR needs to “invite you to be with you, instead of as, in a virtual space/experience”.

NOTES:

First talk (45m)


Technology is not flat; it is the construction of human beings. What are those human beings putting in? What are their experiences?

Critical race theory
Things that are actionable

Hyper-sexualization is not a product or an observation of the NATURE of these black women
These ideologies are tied to old media practices

Hundred year history
Counter-narrative to what it’s been portrayed as

In these stereotypes of black women as Jezebel
Of course, there has to be a mass justification for the reproduction of the slave labor force
Part of why that mass justification of the labor force comes into existence is the characterizing of black women as sexual, as women who like sex and who want to give babies to the “labor force”
Racist capitalistic stereotypes used as economic subjugation of black people and women

When the enslavement of black labor force became illegal
How that justification was imagined and instilled

Hyperlinks, that have capital underneath them
They are well trafficked images

———


Imagination is a contested (to gain power) field of action



People who create tech companies aimed at helping out social causes, like Jay-Z with Promise, who don’t have an abolitionist commitment (…)
He is seen as empathetic to the cause because his decarceration startup addresses the problem of pre-trial detention, but his app sells the GPS data (the GPS company is in business with them) that tracks those individuals, trapping them further in the prison industrial complex’s surveillance system.
Promise exemplifies the New Jim Code
It’s insidious because it’s packaged as betterment

AI makes drone strikes more effective
ICE CONTRACT – Microsoft. The workers basically said: we will not work alongside people who support the development of warfare and surveillance in the war.
Workers’ efforts to sway the industry – we can’t wait for them to change the system.

Professionalism, individualism, and reformism
to contribute to radical labor organizing

Racism is not the “white boogeyman” that everyone thinks is hiding behind the screen
She is trying to distinguish that racism can indulge systematic oppressions by having ulterior motives
(Are they trying to be racist? ***notes)
You cannot design something without intention…
Someone designed the thing to have intentions, and perhaps, by not being aware of the political and social environments they are entering by creating these technologies, they make for things like… (weapons of warfare!)
*It’s about large and small inputs that cater to the metainterface entrapment of surveillance capitalism.
It’s not a singular person out to get someone through an app or computer screen that is “malicious”, but the thoughts and patterns and predictions of the person who intends for this algorithm to have certain behaviors that are based off of desires or gains, and/or ignorance, of race, gender, class, economy, professionalism, capital, politics, etc. etc.
We can’t lose sight of the larger structures that continue to fuel the problem





…because of the ways it can be MISUSED and TURNED against them

Don’t leave it up to the technicality nerds to let us know what is ethical


“Thin description”
Bios?



“Whether exposure of their practices is necessarily the most prudent way of going about that” – student

_____

Invite you to be with you, instead of as, in a virtual space

Emotional labor
The third industrial revolution
Norbert Wiener; Fred Turner

Facebook and Oculus Quest

VR as a harbinger of the third industrial age
Redress problem of second Industrial Age
Immiseration of humans as machines take our jobs (…) by making available the last kind of work that machines can’t do: create the right kind of emotion/feelings.
Empathy and compassion are:
Valuable and fundamental, as perceived by Lauren Berlant
VR takes the place of the progress of rights and resources
Refugees, women of color, and disabled people seek human recognition, not white men proxied in VR
Norbert Wiener
Claims: compassion is something you can make, you can do that.
Automates the labor of feeling pain and sadness on behalf of another.
Empathy into the realm of non-human virtual witnessing and connection, or non-virtual witnessing and connection



Social Interaction / Photography / Media

TED Talk:

Babies and baby monitors
Even things that do not need to be measured, such as the best time for babies to fall asleep and how to make that happen, are normalized in turn to create more perfect schedules that cater to the 9-to-5 working class. Because we live in such a capitalistic state, a lot of our forefathers had intentions to “monopolize” time. When the industrial revolution hit, it became systematic for workers’ time and schedules to be documented and, in turn, commodified. In this way, time has steadily progressed into a commodity that can be transformed and utilized through machinery and everyday “pleasure” mechanics that prioritize keeping the same daily schedule. It is natural for people to want consistency in their lives, of course, yet this consistency is bending toward an idealized system of production, complacency, and behavior.

Technologies or medicines that are applied for one specific reason might be transformed into everyday usage for different data gathering, working more on convenience than necessity. Fitbits, for example, which once may have related to step monitors for people trying to track their activity, are now widely used even by the healthiest of people, so they can continue being healthy.

Social Photography:

Are we more aware of the present, then? Underneath it all, or aware that the present could very well be the past in a matter of seconds? Are we choosing to see five steps ahead instead of living in the moment?
I think this could be a phenomenon that’s invoking a lot of quarter-life crises. I’ve noticed this in myself, and the peers I’ve been talking to have been feeling like their future is so uncertain because of certain certainties. These could include loans, job prospects, debt, etc. I also think this immortalization of online media, which captures and solidifies moments such as births, deaths, diseases, life events, life falls, etc., makes us more susceptible to looking into the future, expecting these moments to happen. I don’t think that being able to see so many social photos documenting loss or pain is exactly making us more empathetic to it. I think we are rather disconnected from the moment, because what we see on our social media feeds confirms our worst fears, and perhaps we become desensitized to it, as well as compartmentalize it.

Social Media:

This enumeration of Facebook’s metrics makes me think of grade school, and how teachers would create a “good-behavior” chart and write everyone’s names down in columns and rows waiting to be labeled with glittery star stickers. This type of comparison to other students, as well as being able to see who got the most stickers, is a bit cruel to me. For some, this worked, as they saw that others were getting more stars than them. For others, this could be very disheartening, causing them to fight back against the grain. This mentality, that we will always be comparing ourselves to others even though we might not want to, is ever more present in meta-interfaces like social media that frame the individual viewing and the subject as comparable. This is further solidified through the use of numbers, of course, like the article said.

Interface Criticism / Tactical Media / Software Art (Feb 5)

Wendy Chun – Programmed Visions (book, pp. 1-2, and optionally pp. 3-10)

Software is this intangible but concrete thing. A metaphor I likened it to is perhaps how our words are an ephemeral, intangible thing, yet this invisible thing generates incredibly visceral effects. I think about this in how leaders, CEOs, and teachers are literally able to change the trajectory of a person’s life, sometimes simply through a powerful phrase. I think about how people copyright expressions and phrases, and how even a thing as concrete as language, yet as fleeting as the moment, can be commodified. I also think about the many people who have perhaps said similar words to similar effect a long time ago, and how that information was naturally spread through conversation, until it got to the hands of someone who could write it down or archive it, or speak it in front of some mass audience that would remember and pass it down to the next generation.

Matthew Fuller – How to be a Geek (book, pp. 12-14, and optionally pp. 63-71)

This article talks about bringing the conversation of software culture to everyone in an accessible way, using language and forms of expression that take this “concept” away from just technical experts and turn it into discernible questions and realizations of the problems within. I love the way the Geek is described, and I totally agree with the notion that geeks are in fact running the world. Every company wants an employee that is over-accommodating and zealous in their chase for information, because that chase usually brings new revelations and inventions from within the Geek themselves. This is something rare, because concepts can now be commodified, and it is sad to think about someone’s life idea or revelation being turned against them for profit, if their only way of thinking was to push that idea to the max, simply because of how their brain (and influencing environmental conditions) decided to function around certain ideas. But it is hopeful that there are geeks who geek out about the ideas of problematics, perhaps going beyond their individual scope of understanding their own relation to the issue, to find the relation between us all.

Geert Lovink – Sad by Design (podcast w/ Douglas Rushkoff, 60m)

I believe that the corporations are really trying to separate us, dividing and conquering our individualities through our more-than-willing use of social media and metainterfaces. It is easier to take down individuals rather than a whole group of people standing as one, yet even this concept is frightening, even if we did stand together as one against the elites who perhaps “no longer need us” and who decide to use all the data ever to target individuals. An army is only as strong as its “weakest link”, but by those means, almost everyone is a weak link, because we do not even know what we don’t know. It was really interesting to hear Geert Lovink use the metaphor of open-faced and honest technology as free-range chickens: someone is always getting victimized, and there is no such thing as open-faced in private or public software interaction. It is a “compromise that rubs the wrong way (…) the companies are not in it for the humans (…)”. So creating a “demilitarized” zone between humans and these companies, who still enable conflict on both sides, is in fact just a scapegoat.

Soren Pold – New ways of hiding: towards metainterface realism (article)

The metainterface paradigm is a really scary thing to think about, because it has so many lines of communication: as a concept it is limitless, it is connected to the industry as a product, and it is an entire art/design practice of its own. These core facets of how the metainterface is interacted with today make it nearly impossible to pinpoint its effects on just one thing. It’s a new form of the actual expansion of globalization itself, but one that will continue to surpass itself and become more abstract and complicated, whilst having connections to real, tangible forms of mass communication devices. It has already forever changed how globalization has evolved, and has planted a seed in people’s minds that the surface convenience of such metainterfaces is a normal progression of technology. But how can anyone think of this as a normal progression of technology when there is a concept that can be turned tangible, and then again turned back into an influenceable concept, as just another effect of the progression of computer and software engineering? Something that is able to give influence back is no longer a simple two-way interaction; it is a form that is a medium, and it has the power to influence and deceive, not just receive.

Surveillance Capitalism (e-mailed for late login!)

(1)
This article is about the residual data of individuals, gathered from facial recognition software planted on social media sites such as Facebook and exploited by third-party beneficiaries, and it discusses the potential for cyber-warfare, militaristic regime control, and targeted political ads that could influence potential voters. Isn’t it too late to resist surveillance capitalism if the government already has enough data to annex it and use it against us? These technologies have already come out and have been disassembled by many individuals who know how to unmask and reprogram data to replicate or alter its software. You cannot take back an invention once it has come out, making it harder to take off the market and regulate, and easier to get into the hands of positions of power and wealth.
(2)
I am not going to lie, this conference was a big run-on sentence for me. Helen talked a lot about how data obfuscation is planted via privacy statements and agreements that are hard for the average user to discern and analyze. Data obfuscation hides behind contracts such as the privacy agreements we sign before installing an app, sharing a location, making a purchase on iTunes, etc. These incredibly long and dull contracts are hard to read for someone who is trying to download an app for a very temporary use, for example. I began seeing these privacy agreements when I was young, but quickly became desensitized to them after trying to read one. I believe this type of data obfuscation is ethically okay in surveys, where you are agreeing to be deceived and the terms are easy to understand.
(3)
This article is about Christopher Wylie, a gifted entrepreneur who had created an algorithm that determines a social media user’s preferences, and uses that information to target information at them that the algorithm finds relatable, based on the information the user gives it(?). Wylie had no idea where it was going to go, and he has been trying to bring it out of the shadows for years. It is strange to think about, but a computer only holds as much information as it is given at the beginning by its creator, or so we believe given our technology today. While certain AIs develop connections to their previous databases of knowledge, I think it is crazy but amazing to think about how some computers can perhaps analyze a person’s personality better than a human can. But this is a different usage of computer intelligence, and it plays on unknowing victims of the web. These “researchers” are using computer programming as proxies to gather personal data about people, using methods such as data obfuscation within privacy agreements in order to prompt users into handing over their data. There is nothing academic in the way this data is being sold to third parties, some of which are military regimes, government programs, and unknown international companies. The potential for these vast amounts of personal data spreading to the wrong parties is almost inevitable if you own a phone or even any “smart” appliances or applications, even a car GPS.
(4)
This article is about the personal location data of millions of people, which is being integrated into new technological companies or apps, or collected from existing ones, and becoming easily accessible even to those closest to you who might try to track your location routes and whereabouts. I think about how technology surveillance is getting more advanced every single day, and how, apart from your employer or school being able to track you, this is also very concerning for victims of domestic abuse, or those with unstable home situations. Right now, having a phone on you is an extra security measure. It could save your life on the off chance, but it could also tip others off and make you a potential victim of those that might be wishing you harm. Technological surveillance could affect the individual and their relationship dynamics just as much as it could affect the relationship between them and their employer.
(5)
This article is about how colleges use technologies such as tracker “social credit” apps to monitor students’ whereabouts, attendance, and habits. When students feel like they are being watched or monitored at all times, the psychological effects could suggest that they are being “infantilized where they’re expected to grow into adults”. This reminds me of that one episode of Black Mirror, where everyone’s lives were based on a “social credit” system, and you were kicked out of society once your rating fell a certain amount below the standard. Apart from that, labeling a student as performing below average and monitoring them more than “average” students does more to lower their self-esteem than to boost it. If you are already in a vulnerable position, and someone comes to your dorm to say that they have been watching you in the state that you’ve been in for the last however many weeks or months, that’s a terrifying thought for someone who might be going through something traumatic or depressive. They are already flagged as being “below average”, while they themselves may not be thinking too highly of themselves. This categorization and monitoring would only reaffirm their worst thoughts, never dispel them. Why not give them more accessible resources around campus and change the very culture, instead of trying to micromanage and turn everyone into a “mono crop” (if you’re in the Midwest, haha)?
(6)
This article talks about how the tracking of students and teachers on “social credit apps” is inherently flawed, because the data being extracted is based on society, which is inherently unequal. This makes a lot of sense, and it is even harder to think about when I consider data being valued over a person’s own recollection or opinion. In the other article, a student lamented the fact that faculty did not believe him when he said his app wasn’t working, and that he had in fact been present and early to class, when the monitoring app for his institution claimed he had not been. I do not want to live in a society where the physical presence of data is deemed a more tangible truth than a person’s beliefs and truths.