Content Farms and YouTube Algorithms: Match Made in Heaven or Partners in Crime?

By Quynh (Stephanie) Bui

As quarantine life stretches into its third month, many of us have ventured into the Do-It-Yourself section of YouTube, taking up the challenges of baking, whipping up dalgona coffee, or creating decor from everyday objects. DIY videos have become an entrancing resource for testing our skills or passing the time. But what if these Internet creations were made purely for viewing entertainment and are not actually replicable? Welcome to YouTube’s “content farms,” where multi-million-dollar companies mass-produce seemingly harmless DIY video mashups that pose serious risks to YouTube’s young users and independent creators.

What exactly are content farms?
Although the term initially referred to large companies producing massive amounts of textual content, the advent of YouTube has allowed the video format to be exploited and monetized in the same way. The mechanism stays the same: content is designed to satisfy the algorithms for maximum retrieval by automated search engines. Since YouTube prioritizes watch time over clicks, these companies churn out three to four videos daily, typically 10-minute compilations of mini Facebook/Instagram tutorials sharing “life hacks” that supposedly enhance our “inefficient” lives.

These videos lure bored audiences in with wild clickbait titles like “HACKS YOU NEED TO KNOW!!!” and “TUTORIALS ANYONE CAN DO.” Their success hinges on catering to popular SEO terms in the titles and on visually stimulating, poppy, vibrant thumbnail images. Some of the major corporate “farms” include 5-Minute Crafts (66.8 million subscribers), Troom Troom (18.6 million), Blossom, and So Yummy. This business model is preposterously lucrative: 5-Minute Crafts reportedly earns between $2 million and $34 million annually, with 550 employees producing 1,500 videos a month, posted across 40 Facebook pages in 10 languages.

How YouTube Algorithms Mess with Video Recommendations
YouTube’s recommendation system has incentivized these content farms to flourish. According to TechCrunch, these powerhouses supply a plethora of YouTube videos that “drive ad revenue” and “edge out other content.” The rule is simple: “The more videos that are watched, the more ads that are seen, and the more money YouTube makes.” Since Google Brain (Google’s AI division) started operating YouTube recommendations, watch time from the YouTube homepage has grown 20-fold, with more than 70 percent of watch time driven by algorithmic recommendations. Thus, big companies with recycled, low-quality content thrive on the platform as long as they satisfy the algorithms’ watch-time demands. YouTube, however, regards this technology merely as a reflection of users’ tastes based on individual profiles and viewing history, or in their words, a “steady stream, almost a synthetic or personalized channel.”

Content Farms, YouTube Algorithms, and the Trivialization of Internet Entertainment
At first glance there seems to be no problem: this is a win-win for both content farms and YouTube. The channels make clickbaity videos that garner millions of views and are funneled into our mindless YouTube feeds by the algorithms, translating to more watch time and ad revenue for the channels and for YouTube. This wouldn’t be a problem if these “instructional” videos didn’t backfire: not only are they useless, but they are also harmful to children who cannot recognize the nonsense they consume.

As the competition for audiences’ attention intensifies, viewers are instantly drawn to eye-catching, absurd, entertaining videos that seem too easy or too crazy to be true. This troubling trend shows where Internet entertainment has ended up: the average, mainstream audience no longer cares for well-written, informative content but rather for fast-paced, generalized products. Content farms exploit this appetite for instant gratification; one of Troom Troom’s producers admitted their videos are not intended to be replicable or educational but were made “just for fun.”

However, these videos present a real threat when children who follow these “recipes” or “hacks” get discouraged or physically hurt. Because these videos are not grounded in scientific principles or even common sense, a new sub-genre of debunking videos has emerged, ridiculing the absurdity of these DIY tutorials and revealing their dangers. One of the pioneers of this crusade is Ann Reardon of the baking channel “How to Cook That,” who calls these hacks the “fake news of the baking world.”

Reardon explains how these microwave-based tricks trivialize the process of cooking, leaving children to waste ingredients with nothing to show for it. From making flan in a milk carton with marshmallows in a microwave to making gummy bear jello, these unrealistic methods dampen children’s passion for cooking. Some “hacks” are outright dangerous: one video instructs viewers to make swirls with heated caramel (hotter than boiling water) using a hand mixer, and another tells them to dye strawberries with bleach to discolor them. And much like Donald Trump “sarcastically” urging people to inject disinfectant, these instructions will have gruesome consequences if someone follows through.

A New Threat to Independent YouTubers and Viewers
Reardon also stresses how content farm videos and YouTube algorithms have forced many genuine YouTubers off the platform. These are only a few warning signs of the direction YouTube is heading. Long gone are the days when YouTube existed to “broadcast yourself.” The modern-day YouTube focuses on making money with little consideration for content quality. The PewDiePie vs. T-Series feud indirectly encapsulates this large-scale battle between creators and YouTube. Independent creators cannot keep up with the massive output of “automated, fully monetized corporate channels” with enormous resources. Reardon specifically notes how baking channels struggle to stay afloat because YouTube’s ad revenue cannot cover the cost of ingredients and the effort of developing real recipes.

When criticized for allowing these dangerous videos on its platform, YouTube dismissed responsibility, saying the videos “do not violate [their] guidelines.” Reardon also notes that leaving negative comments will not solve the problem: comments are one of the popularity signals the algorithms track, so the harmful videos end up being recommended to even more viewers.

The Future of YouTube
So the question remains: what can YouTube do to address these alarming concerns, given that the companies themselves will not try to improve their “cash-cow” content? Many experts have suggested more human involvement in the algorithms. After all, machines cannot overrule human judgment, and human moderators are needed to identify problematic content. As Tristan Harris, ex-Google employee and leader of the Center for Humane Technology, states: “You can’t possibly have exponential consequences with exponential responsibility unless you have an exponential amount of human thought to be dedicated to those challenges, and you can’t put that just into an algorithm.”

Cover: 5-Minute Crafts via Gena Radcliffe Does Things

