May 13, 2022

Top management at LinkedIn and Meta needs to ask whether growing controversies about the ability of their moderators to use their authority responsibly could constitute business risks. Advertisers should, in the absence of corrective action, take a hard look at whether they want to associate their brand names with some of the behavior depicted below. While LinkedIn is privately held, managers of pension and investment funds with stakes in Meta should assess any risks to their portfolios that might arise from the conduct of the moderators in question. (Disclosure: I do not own stock in Meta directly, but some of the mutual funds I own may have stakes in Meta.)


Putinist Trolls and LinkedIn

“Russia’s Weaponization and Manipulation of LinkedIn” reported in 2019 that Vladimir Putin’s troll farms were infesting LinkedIn with propaganda. “How Russia Is Using LinkedIn as a Tool of War Against Its U.S. Enemies” adds, “[On LinkedIn], however, Newsweek has found that pro-Moscow forces have put constant pressure on the company to suspend or evict adversaries, many with long, distinguished careers in the U.S. military or its intelligence agencies.” A distinguished soldier alleged this month that some of his content was removed, and his account possibly suspended, because of material he posted about the behavior of the Russian Federation. If moderators are in fact responding to “constant pressure” rather than enforcing LinkedIn’s posted rules, the company needs to retrain, reassign or, as a last resort, fire them. The Newsweek article adds that Putin’s secret police have used LinkedIn to cyberstalk Putin’s critics and threaten them with physical violence. The latter, however, is criminal conduct by Putin’s Oprichniki or Chekists, for which LinkedIn is not responsible.

I have personally seen numerous examples of questionable moderation during the past two months. The position that Russian behavior in Ukraine rises to the level of genocide and/or Nazism was recently adopted by, among others, President Biden and the governments of the Czech Republic and Lithuania. LinkedIn’s moderators have said, however, that depicting Putin’s actions as genocide, or as comparable to those of Hitler and Stalin, constitutes “bullying and harassment.” According to the user affected, LinkedIn cited as a violation of its Professional Community Policies a depiction of Putin’s invasion of Ukraine as an illegal act by a totalitarian government. A picture of three Ukrainian soldiers with masks photo-manipulated onto their faces (possibly to conceal their identities from Russian trolls) allegedly “goes against our Professional Community Policies.” Yet another person credibly alleges that his or her condemnation of a Russian poster reading “We are against Nazism, but they are not,” in which “they” refers to prominent Swedes including the creator of Pippi Longstocking, constituted “bullying and harassment.” These are but samples of the complaints I have seen about LinkedIn moderators censoring anti-Putin content, and even suspending people’s ability to post for saying bad things about Putin and Russia.


Depiction of Zionism as Nazism does not, however, seem to violate what LinkedIn calls its Professional Community Policies, just as a page called “Jewish Ritual Murder” did not violate what Facebook calls its community standards until widespread publicity compelled its removal. More content of this nature can be found with a Google search on site:linkedin.com and “zionazis,” “zio-nazis,” “Zionist Nazis,” “Jewish Nazis,” and so on. I have personally seen the following material although, in fairness to LinkedIn, I do not know if it was ever flagged for the moderators. The fact that supporters of “Palestine” believe they are free to post this kind of material on LinkedIn, however, says quite a bit about them personally, their agenda, and also what they believe, whether rightly or wrongly, LinkedIn will tolerate.

  • The Executive Director at Al-Awda, The Palestine Right of Return Movement, calls for the destruction of Israel: “From the river to the sea, Palestine will be free.”
  • “Zio Zelensky openly flaunts neo-Nazi Azov to the Greek Parliament and confirming what Putin has been saying all along.”  
  • “The real criminal Israeli Jewish Nazis”

Facebook is meanwhile well known for similar issues, with moderators making arbitrary decisions to restrict people’s accounts for violating what passes for Facebook’s community standards while tolerating content such as a conspiracy theory that blames the Mossad for 9/11. This was still online as of May 11, 2022. Raymond Ibrahim’s “Facebook Censors Persecuted Christians” (American Thinker, May 11, 2022) reinforces what I have said here. In addition, while Facebook’s moderators were busy sanctioning users for pretty much whatever they felt like, a child sex trafficking ring was using the site to target juvenile victims.

The Daily Beast reports, “Facebook a Hotbed of ‘Child Sexual Abuse Material’ With 20.3 Million Reports, Far More Than Pornhub.” This BBC article contends meanwhile, “Facebook failed to remove sexualised images of children” and adds, “When provided with examples of the images, Facebook reported the BBC journalists involved to the police and cancelled plans for an interview.” The article adds that the BBC reported 100 images to Facebook by means of the reporting function, and only 18 were removed.

Social Media Advertisers Need to Look Into This

LinkedIn is apparently privately held, so this kind of moderation is a risk only to LinkedIn’s owners should advertisers decide they do not want to associate their names with it. Meta is not privately held, so its investors, including mutual and pension funds with billions of dollars in assets, need to take a very close look at how Facebook’s moderators conduct themselves while representing their employer. Facebook said (before bad press forced it to take action) that the “Jewish Ritual Murder” page did not violate its so-called community standards, so it would have been entirely fair to share that page with Facebook’s advertisers.

Remember, “You keep what you don’t delete.” The instant you decide what users can and cannot post (other than clearly illegal material and spam), and delete content and/or sanction users for posting what you don’t like, you become responsible for everything you allow to remain, including anti-Semitic hate speech, 9/11 conspiracy theories, and support for Putin’s invasion of Ukraine. When you ban people like Donald Trump, you are also responsible for those you don’t ban, such as Iran’s “Supreme Leader” Khamenei, who is still on both Twitter and Facebook. Vladimir Putin still has (as of May 12) what looks like an authentic Facebook account, albeit with only 5,200 followers. Even if it is a fan page, it promotes the Russian agenda. If Facebook can ban Donald Trump, it ought to be able to ban Putin.