AI / Predictive Analytics / Recommendation Algorithms

Something is wrong on the internet

I’ve been saying this for YEARS, but usually as a pointed gripe about some pretty specific and, honestly, random if not menial things. You know what? All those gripes were valid. From the lengthy downtime for incredibly expensive services to the extremism on Facebook to all the goddamn Spider-Man & Elsa shit on YouTube. We need to go beyond pointing out that something is wrong and venture into the realm of just flat out saying “dude, what the actual fuck is going on???” I mean, I ~know~ what’s going on. It speaks to what’s immediately apparent and wrong with how we use algorithms. I think we all get the main idea here – if it makes money, companies will not assess flaws unless there is legal liability. Moral liabilities aren’t contested until there’s media coverage, and even then – it’s about… stocks. That’s what’s happening, and it’s just how YouTube operates.

Getting approved by AI

Using technology to speed up processes that are highly personal and judged on a largely subjective basis is pretty much the easiest road to dystopia. Processing applications against clear, concise goalposts is perfectly fine! Having a computer decide, based on a video of you talking, whether you’re worthy of a job is not. That computer cannot discern whether you’d get along with the rest of the team at this company. Much like facial recognition, welfare systems, and literally any given system used in the West, Blackness will correlate with failure and Black people will be harmed by this system.

How TikTok Holds Our Attention

Okay, the Anne Frank thing was funny. But also, like, I’m not a Nazi, so maybe those guys find it funny in a more racist and less morbidly absurdist sense. TikTok doubles as a creative platform and a deeply interesting, maybe concerning, example of how algorithms feed content. Its feed moves much, much faster than YouTube’s, though each item is lower in volume (in minutes, I mean). The algorithmic feed is so new (generally speaking) that its effects won’t be well understood for another decade. What does this mean for the future? Users create whatever they want, but the algorithm exists to keep you watching – the same issue YouTube has, where it will supply you with things that keep you watching even if those things are extremist, alarmist, or deceptive. Where do we find the check on the runaway effects this has on the rest of society?
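To make the “it only cares whether you keep watching” point concrete, here’s a toy sketch. This is my own illustration, not TikTok’s or YouTube’s actual system – the fields, numbers, and scoring are entirely made up – but it shows how a feed ranked purely on predicted watch time never even looks at what the content is.

```python
# Toy illustration of an engagement-first feed ranker.
# Nothing here is a real platform's system; the fields and numbers are invented.

from dataclasses import dataclass

@dataclass
class Clip:
    title: str
    predicted_watch_seconds: float  # a model's guess at how long you'll watch
    is_alarmist: bool               # the ranker never reads this field

def rank_feed(clips: list[Clip]) -> list[Clip]:
    # The only sort key is predicted watch time. Whether a clip is extremist,
    # alarmist, or deceptive never enters the ranking at all.
    return sorted(clips, key=lambda c: c.predicted_watch_seconds, reverse=True)

feed = rank_feed([
    Clip("calm craft tutorial", 21.0, is_alarmist=False),
    Clip("THEY are coming for YOU", 48.0, is_alarmist=True),
    Clip("nice dog video", 30.0, is_alarmist=False),
])
for clip in feed:
    print(clip.title)
# The alarmist clip tops the feed simply because it holds attention longest.
```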

Inadvertent Algorithmic Cruelty

I think this anecdote is pretty slanted, in the sense that it leans on technology to fill a void of meaning in a somewhat oblivious way. You fed that algorithm. It’s showing you what you gave it. Just because someone passed away, is a person supposed to know that you’d rather not see their face? If not, how would an algorithm? This is asking for some pretty particular sensitivity in a situation where, if anything, maybe you should seek other assistance if seeing a photograph of your child gives you this much stress. Seriously. I was an abused child who would break down at the mention of my abuser’s name – just the name. I understand some of these suggestions are largely meant to mitigate unintentional harm on the user end, but frankly I don’t think these systems can be smoothed out that completely. I don’t think they should be, either. Trying to change the system to fix a problem that maybe twenty users in total will ever have is the easy way to break the rest of the system. If your goal is to reduce harm, you need to manage your expectations of what reduction means.

The problem with metrics is a big problem for AI

Metrics, success, failure, optimization. I smell the paycheck I earned for all those high-viewership and somehow ad-friendly Spider-Man & Elsa videos. So many companies employ metrics (money stats, as I like to call them here) as a means of maximizing profit. This is perfectly reasonable until every game that comes out has microtransactions, a reduced budget, procedurally generated content, and weekly updates tuned to trends across the consumer base. According to the metrics, which feed our algorithms, which determine our decisions as a company, there’s a big issue with everything being fucking trash lately. Maybe we’re not maximizing the money stats enough. Maybe our optimization has too many people involved. With success in media defined as maximum engagement and maximum profit, addictive, violent, exploitative, or otherwise fucking horrendous content and business practices will win out every single time.
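As a rough sketch of that “metrics feed the algorithms, the algorithms feed the decisions” loop – again, purely illustrative, with made-up product names and numbers – here’s what a company “decision” looks like when the only success metric is engagement:

```python
# Purely illustrative: a business decision reduced to a single money stat.
# The numbers are invented; the point is that only one of them ever gets read.

variants = {
    "thoughtful, finished game":        {"engagement_hours": 12.0, "player_goodwill": 0.9},
    "microtransaction grind treadmill": {"engagement_hours": 55.0, "player_goodwill": 0.2},
}

def pick_winner(options: dict) -> str:
    # Success is defined as maximum engagement, so that's the only thing optimized.
    # Goodwill, quality, and harm are in the data but never touch the decision.
    return max(options, key=lambda name: options[name]["engagement_hours"])

print(pick_winner(variants))  # the grind treadmill wins every single time
```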
