While Facebook might be adding ways to engage with an audience, YouTube is narrowing down what content creators can monetize. The company has released new monetization guidelines, and they leave a notably broad definition of what the platform considers acceptable.
The three main types of content now considered non-monetizable are: inappropriate use of family entertainment characters, hateful content, and incendiary and demeaning content. The first is fairly clear: don't use kids' cartoons in suggestive scenarios. If you do, such as in parody content, you will be able to keep the video online, but you will not be able to run any advertising alongside it.
The second, hateful content, is defined by YouTube as content that "promotes discrimination or disparages or humiliates an individual or group of people on the basis of the individual's or group's race, ethnicity, or ethnic origin, nationality, religion, disability, age, veteran status, sexual orientation, gender identity, or other characteristic associated with systematic discrimination or marginalization."
The last category, however, is very open to interpretation, in a way that seems entirely deliberate. It gives YouTube's content monitors the ability to pull advertising from nearly any video when needed to preserve a relationship with an advertiser. Google recently hit a snag with its previous rules and content-review processes, which resulted in major advertisers pulling their campaigns from all of Google's non-search platforms.
With these new guidelines, and their purposeful vagueness, Google hopes to bring those advertisers back to YouTube.