FTC to fine YouTubers 42,000 USD per video if it is "directed towards kids" and has advertisements on it

First of all, holy fucking shit! So many people are going to go bankrupt from this... not to mention all the young kids who upload to YouTube and have no idea this is happening! I myself had barely heard about it, and this is huge!

The video to watch from The Quartering:

What is going on?

The FTC took YouTube/Google to court and got a huge fine slapped on them: about 170 million USD in a settlement.

Also, all YouTube creators will be subject to a new system where they must check a box for every video (including past videos) declaring whether or not it "targets kids". Every single video on your channel will need to be reviewed and labelled according to whether "it could be considered to be targeting kids"...


Another good video to watch and the change.org petition here:


Some of the insanity that stands out to me

"the FTC has also said that creators who don’t label their content correctly could be held liable for COPPA violations. TubeFilter says that the fine for such violations can be more than $42,000 per video."

"the FTC wants to broaden the scope of content made for kids. "

Let's also remember that the "guidelines" the FTC is going to put into place are almost never voted on by elected representatives (whether that actually matters is another discussion). The FTC does whatever the hell it wants and will continue to expand its reach indefinitely.

Any content deemed "content targeting kids" could include sports, gaming, how-to videos, podcasting, livestreaming... anything, really. It is totally subjective and up to the FTC's determination (good luck winning against an org that can make up its own rules!).

Comments (4)

Guessing swearing in a video would/should automatically mean it's not aimed at kids?
Video creators could use that defence.
"I was swearing; kids shouldn't be watching my video anyway."
Will this lead to an increase in swearing on YouTube?

08.11.2019 02:46

It's a government thing with the FTC, so I would imagine they will just do whatever they want. The only way to be safe from it is to check the box and say "this content targets children"; otherwise, every video you upload, regardless of what YOU think it is targeting, is subject to the government's judgement of whether it targets kids or not.

Some common examples that WILL be considered are Let's Plays, gaming, movies, music... anything that a "pre-teen" might watch (which could quite literally be considered EVERYTHING). So it's pretty much going to be everything. The government does whatever it wants, it will not be held accountable in any way, and people will go bankrupt for not clicking a box on YouTube (and on every other video site that allows uploads, runs advertising, and collects user data, which is all of them!). It's insanity, and there's nothing we can do.

09.11.2019 16:49

I don't think it will be seen that way. The FTC is looking at whether content would appeal to kids, not only whether it is aimed at kids. The FTC's examples of things that would indicate content appeals to kids include having animated characters on a website, and I hope they don't, but suspect they will, carry that over to videos too. Most animation will appeal to kids even when it isn't aimed at them (I'm not sure about the art style of shows like Archer, BoJack Horseman or Ugly Americans, but most animation would; BoJack might still qualify just for being about animals).

YouTube is also kind of concerning in this regard if they don't have an actual person doing this stuff, because their algorithm doesn't handle it very well. Charlie the Unicorn got the first part of its finale this year, and when I re-watched the old ones on my partner's YouTube account to refresh my memory, YouTube was recommending toilet-training videos and nursery rhymes alongside them. I've also had comedy songs with animated music videos trigger recommendations for nursery rhymes.

Regardless of YouTube's algorithm issues, it's the FTC that is ultimately doing this, and they have stated they are focusing on whether content would appeal to kids, not just whether it is aimed at kids. That is concerning, because it will cover a lot of content.

Mind you, this might not be the biggest problem. YouTube has just released newer terms, taking effect in December, that cover them to remove accounts deemed no longer commercially viable. This could just be a quick, easy way to deal with people who make the platform look bad (which might mean fewer future adpocalypses), but it could also be quite a problem: accounts could be removed because YouTube doesn't like their views, or because they aren't monetisable, whether from being advertiser-unfriendly, being kids' content that can no longer be advertised on, or any other reason. It depends on the intention behind that line.

It's a good clause for covering themselves so they can respond quickly to the worst situations, but really they don't do that anyway: they manually reviewed the suicide forest video and kept it up until people kicked up enough of a stink for them to remove it, while punishing PewDiePie much more harshly for something that wasn't good but was nowhere near as bad as what Logan Paul did. Either way, it's a concerning term to have put in there. I hope it's just for worst-case scenarios and not an easier out to remove any account they want.

12.11.2019 02:52

This is one of a few concerning things with YouTube at the moment. They are also looking at whether content appeals to children, not just whether it is actually targeted at them, which makes for a concerning situation for both animation and gaming videos.

12.11.2019 02:54