Automation and its discontents

Long story short: children are being traumatized by videos from a new app that YouTube claims offers content specially selected to be suitable for children. Several articles have been written on the topic in recent weeks by outlets including Medium, The Sun and even The New York Times. The reason YouTube has not done a good job of filtering these videos is complicated, but it is the root of the problem, and it is an aspect of the situation that has not been sufficiently discussed.

To explain what is going on, I must first dispel the notion that you, as a citizen of the internet, are free to see what you want to see and to access whatever information you are looking for. On the surface, this appears to be the case. In reality, it is not true at all. The reason is the sheer quantity of information available online. Surely you know there’s lots of stuff on the internet, but it is difficult for the human brain to imagine the actual scale of the thing; it simply isn’t designed to deal with numbers that big. For context, consider that, as far as we know, Google has indexed only about 0.004 percent of the data that exists online. This creates a problem that is fairly new to us humans: we have too much information, rather than not enough. As such, we are willing to pay for this content to be curated.

Without someone to sort through this stuff for us and pick out the interesting bits, we’re shipwrecked and drowning in a sea of discontent, the whirlpools of ideology dragging us down into the depths. The problem, I should clarify, is not whether the information you read on the internet is true. There’s so much truth out there already that no one needs lies anymore. Being exposed to selective facts is a much greater danger. Perhaps there is no government agency preventing you from accessing certain content. There doesn’t need to be, because something with far greater control over what we do and do not see is already at work, determining, for example, whether or not you are reading this article right now.

Content on the internet is monetized. Monetized videos, one variety of such content, allow the platform that hosts them to display advertisements in exchange for a cut of the money. Note that there are very few human elements here. Advertisers pay to have their products seen, the ads are placed automatically and videos are suggested to users automatically. Sophisticated algorithms control every interaction between content creators, advertisers and viewers. This is a task delegated to software rather than to people. Machine learning, a field that aims to allow computers to learn new things without being explicitly programmed, takes the place of human judgement when the scale of the information in question is far beyond what humans can sort through manually. These algorithms are good at what they do, and they keep getting better, but they are not perfect. Videos that should never get past the filters still make their way into the collection of videos YouTube deems fit for children.

These algorithms are not new. Some content aggregation websites can even tell their writers how many views to expect from an article knowing little more than its title. We are getting very good at knowing what sells. With this in mind, the YouTube videos generated purely for profit feature bizarre titles that string together popular search terms. Common phrases include “wrong heads,” “learn colors” and “finger family,” as well as the names of children’s television shows. These videos usually feature popular characters acting out songs with sub-par animation. On occasion, a video will feature something bizarre or violent. People make videos of this nature so that children will view them, knowing that anything that looks sufficiently like content designed for children may be waved through by the algorithms that filter YouTube’s videos. This is the problem. There is no simple solution, but if YouTube intends to keep marketing videos to children, it is a problem that needs to be addressed.
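To make the mechanism concrete, here is a deliberately simplified sketch in Python. It is my own illustration, not YouTube’s actual system, and the term list and function are invented for the example; it only shows why a filter that leans on keyword matching can be fooled by exactly this kind of title stuffing.

# Hypothetical sketch of a naive "kid-friendly" filter that looks only at titles.
# This is not YouTube's real system; it illustrates why keyword matching alone
# lets imitation content through.
KID_FRIENDLY_TERMS = {"learn colors", "finger family", "wrong heads"}

def looks_kid_friendly(title: str) -> bool:
    """Return True if the title contains any phrase associated with children's content."""
    lowered = title.lower()
    return any(term in lowered for term in KID_FRIENDLY_TERMS)

# A legitimate children's video passes, as expected.
print(looks_kid_friendly("Learn Colors with Songs for Toddlers"))        # True

# A title stuffed with the same search terms also passes, no matter what
# the video actually shows.
print(looks_kid_friendly("WRONG HEADS Finger Family Learn Colors #53"))  # True

A real filter would weigh far more signals than the title, but the imitation strategy is the same: make the metadata indistinguishable from genuine children’s content, and the software has little reason to object.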


Eli Udler is a contributor for The Daily Campus. He can be reached via email at eli.udler@uconn.edu.