Elsagate is a YouTube phenomenon of recurring themes, animations, and videos on inappropriate topics, made available to and targeted at children.
Origins and a short history
Elsagate has existed since at least 2012 but peaked in 2017, when the term itself was coined and the phenomenon gained widespread attention. That same year, YouTube announced new guidelines on content and monetization and deleted over 50 channels and thousands of videos that violated them.
The origins of the Elsagate videos have never been officially established, but according to an in-depth investigation, a large number of them have been linked to Vietnamese media studios that officially produce harmless live-action videos.
The dark side of YouTube
These videos often feature Spiderman, Elsa, Mickey Mouse, and other characters popular with kids, either animated or played by adults in cheap costumes. Most of the time there is no dialogue at all, only background music (nursery rhymes, naturally), and the harmful content is difficult to recognize at first sight. Only after watching for a few minutes do we encounter disturbing or even shocking images aimed at toddlers. The list of depicted topics is long and covers everything from insects through violence and gore to the most brutal sexual fetishes.
At the same time, thumbnails featuring characters associated with kids and family make it difficult for YouTube’s algorithm to fight this phenomenon.
Today’s situation
Although roughly 90% of the described channels and videos have been shut down by YouTube, unfortunately, new ways to trick the algorithm and deceive viewers for some extra clicks keep appearing.
This March, Wired investigated the Topic pages of the most popular games on YouTube. They found various concerning visuals (blood, poop, violence) in the thumbnails of videos of the two most popular games among tweens: Minecraft and Among Us. Neither game is officially kids’ content, but in practice over half of their most-viewed videos on YouTube proper are marketed to children.
YouTube’s hashtag system also turned out to have a problem: videos under the Among Us hashtag included adult content, such as female avatars removing their undergarments, or their male counterparts spanking them or looking up their skirts. Because of how the system works, this content is easily reachable from any Among Us video that carries the hashtag.
However, as Wired also emphasized, this is not a direct repeat of Elsagate. Most of the time the disturbing content was limited to shocking thumbnails, and the animations themselves featured only minor violence. Other games equally popular with kids (Fortnite, Roblox) did not have this problem.
Still, this makes clear that there is a lot left to be done, and we need to pay attention to the content children consume. This is especially true at ages 13-16, when they are no longer protected by the filters of YouTube Kids but are experiencing intense changes in their personality and may be emotionally and mentally vulnerable.
Obviously, it is impossible (and pointless) to try to track everything a teenager consumes on the internet. What we can do instead is create an atmosphere at home where they feel safe and unashamed to share when they encounter something disturbing or unsettling online.