
A Commentary Gets Caught In A Crypto-Crackdown

[Image: a keyboard modified with Facebook-style thumbs-up and thumbs-down keys]

In the 1980s hit “Pee-wee’s Playhouse,” each episode included a secret word. If someone said the secret word, the show would stop dead as everyone stood around and screamed.

This seems to be the basis for some major social platforms’ approach to ad content these days.

Advertising is Facebook’s main revenue engine, so businesses trying to reach people on the platform increasingly have to pay to play. That’s as true for us at Palisades Hudson as it is for major brands like Coca-Cola or Walmart. But recently we inadvertently said the secret word in our attempt to reach a broader audience – in this case, “bitcoin.”

Linda Elkin, our director of marketing, decided to boost a post featuring a link to my blog entry about rapper 50 Cent’s forgotten bitcoin stash. (Since that post, 50 Cent has, through his legal team, denied ever owning any bitcoin.) Our post’s text was mainly a statement of what was then perceived as fact: A musician who had once been bankrupt was now reportedly millions of dollars richer than he thought. But Facebook rejected the ad, stating, “Your ad isn’t approved because it doesn’t comply with our Advertising Policies.”

Those policies forbid promoting financial products or services “that are frequently associated with misleading or deceptive promotional practices,” specifically including cryptocurrency. Our ad, however, was promoting commentary about a news item that happened to mention bitcoin, not promoting bitcoin itself. In fact, my colleagues have been overtly critical of bitcoin, initial coin offerings and cryptocurrency’s use in investing generally. Even in my original post about 50 Cent, I observed, “…I hope that his first step is to convert his reported 700 or so bitcoins into a more stable form of value, if he hasn’t already.” Hardly a ringing endorsement.

Linda appealed the ad’s rejection, hoping that a pair of human eyes might catch the distinction an algorithm would miss. However, the Facebook representative who responded held firm, simply re-quoting the policy on prohibited financial products and services.

For us, this incident was frustrating but not truly a big deal. At Palisades Hudson, we don’t make our living on social media; while we occasionally use it to reach out to existing and potential clients, there will be plenty of other blog posts on plenty of other topics that we can promote instead. But some people – including some of our clients – use Facebook, Instagram, YouTube and other online platforms as significant components of their livelihoods. For them, such bluntly enforced policies can create major problems.

For instance, a variety of advertisers have argued that they see a gender bias in the way Facebook accepts or rejects images of the human body. April Ray, a book blogger, reported that a photo of her reading in a dark room was incorrectly tagged as adult content. Though she was eventually able to get the photo approved, the time it took to reach a human employee meant it was too late to use it for its intended promotional purpose. George Stamelos, a co-founder of a fashion company, said that his brand couldn’t advertise swimwear on Facebook because such ads kept getting rejected for showing too much of the models’ bodies, even in a clearly nonsexualized context.

Facebook can, of course, set whatever limits on ads and other content it deems fit. But when those standards are applied inconsistently or without nuance, the company risks alienating the very advertisers it works so hard to attract.

The situation on YouTube is slightly different, but similarly frustrating for users. Last fall, many YouTubers bemoaned the “adpocalypse,” in which an automated system determined whether a video could be sponsored by all advertisers, only some, or none. Users complained that, from a human perspective, these distinctions often seemed arbitrary. Many LGBTQ creators have said their videos have been incorrectly flagged simply because of the creator’s gender or sexual orientation. Those who discuss mental health or disabilities have made similar claims. YouTube has denied that this is the case, but given the ongoing user complaints about the lack of transparency in the process, we more or less have to take the company’s word for it.

YouTube encourages creators to appeal when a video is flagged, but as many users have pointed out, even a successful appeal often means missing the financially important window after a video is first posted. And as of this writing, YouTube does not offer manual reviews to everyone: Users must have a channel with more than 10,000 subscribers or the video itself must have at least 1,000 views in the seven days prior to the request. Accusations that the biggest creators can monetize even questionable content were thrown into stark relief earlier this year after the popular vlogger Logan Paul broadcast a video including the image of an apparent suicide victim. The video remained on the site until Paul himself took it down.

As our company’s chief compliance officer, I certainly understand the importance of being careful about financial claims made in advertisements. Palisades Hudson’s asset management affiliate is subject to the Securities and Exchange Commission’s rules prohibiting investment advisers from using advertisements containing untrue statements of material fact, or that are otherwise false or misleading. Our compliance policy is designed to make sure we adhere to this rule, and a key element is a human being’s review.

It makes sense that massive platforms like Facebook and YouTube automatically flag certain types of content. These companies are under significant pressure to clean up objectionable posts, especially those connected to ad revenue, given the proliferation of hateful or deceptive material they are battling. But today’s outcomes suggest the pendulum has swung too far toward trusting machine learning.

From the outside, I can’t say for sure what happens when a human being receives an appeal like ours. Maybe employees don’t feel empowered to make fine distinctions between content that mentions a topic and content that promotes it. Maybe they are so worried about losing their jobs over making the wrong call that they always play it safe. Appeals from big, high-profile creators and advertisers may get escalated to someone more senior who can actually help. But many smaller posts and videos that are truly innocuous may fall by the wayside because companies like Facebook are unwilling or unable to apply some common sense, or their appeals departments are significantly understaffed, or both.

There is nothing wrong with enforcing advertising rules for your platform. But companies need to write smarter algorithms and empower employees to overrule them when necessary. It is unfair to penalize creators attempting to monetize harmless content just because it happens to discuss – or even appears to discuss – potentially controversial subjects.
