I dislike the advertising business model that currently dominates the Internet. My biggest gripe with it is that it encourages website operators to spy on their users. While nothing so far has been able to supplant advertising as a revenue source, we may see website operators looking more seriously for an alternative. Why? Because advertisers are starting to realize the impossibility of perfectly enforcing rules on globally accessible services that allow users to upload content:
YouTube is still grappling with predatory comments on child videos, and it’s once again facing the consequences. Bloomberg has learned that Disney, Fortnite creator Epic Games, Nestle and Oetker have “paused” spending on YouTube ads after video blogger Matt Watson shared a video showing how comments on videos with children were being used to enable an ad hoc softcore child porn ring. Commenters would flag videos where underage girls were performing supposedly suggestive actions, such as gymnastics, while YouTube’s own algorithms would inadvertently suggest similar videos.
The YouTube comments section: where naivety and predatory behavior collide.
Children are able to upload videos to YouTube. Pedophiles are able to mark timestamps and post comments on those videos. Needless to say, when a young girl uploads a video of herself modeling swimwear, matters play out exactly as you would expect. And when another person uploads a video documenting this predictable outcome, advertisers with reputations to maintain understandably become skittish and pull their ads.
I’m sure Google, YouTube’s parent company, will remove the offending videos, ban the users who made sexual remarks on them, and claim that it’s working behind the scenes to ensure this never happens again… just as it has every other time something like this has happened. That will likely appease advertisers enough that they return. But eventually YouTube’s brand could become tarnished enough that advertisers refuse to come back after yet another one of these controversies.
YouTube, like most websites, is globally accessible. While only a fraction of the seven billion people who inhabit this planet will ever access YouTube, even a tiny fraction of seven billion is too many people for a single company to watch over effectively. What makes this problem even worse is that these users are somewhat anonymous (especially if they’re outside the jurisdictions in which YouTube operates), so permanently banning them is difficult: they can simply create new accounts.
Do I see a hand in the audience going up? I do! It’s a hypothetical Google employee! What’s that, hypothetical Google employee? You’ll just “create an algorithm to fix this?” Good luck. Humans have so far proven less clever at creating content filtering algorithms than at bypassing them. Maybe you’ll finally create the perfect algorithm (my money is against you, by the way), but at best that will just push the timestamps and comments to another site. Then it’s only a matter of time until some YouTuber posts a video exposing that site, and you’re facing the same controversy all over again (this time without the benefit of control over the offending site).