The ethics of internet culture: a conversation with Taylor Lorenz

Taylor Lorenz was in high demand this week. As a prolific journalist at The Atlantic and about-to-be member of Harvard’s prestigious Nieman Fellowship for journalism, that’s perhaps not surprising. Nor was this the first time she’s had a bit of a moment: Lorenz has already served as an in-house expert on social media and the internet for several major companies, while having written and edited for publications as diverse as The Daily Beast, The Hill, People, The Daily Mail, and Business Insider, all while remaining hip and in touch enough to currently serve as a kind of youth zeitgeist translator, on her beat as a technology writer for The Atlantic.

Lorenz is in fact publicly busy enough that she’s one of only two people I personally know to have openly ‘quit email,’ the other being my friend Russ, an 82-year-old retired engineer and MIT alum who literally spends all day, most days, working on a plan to reinvent the bicycle.

I wonder if any of Lorenz’s previous professional experiences, however, could have matched the weight of the events she encountered these past several days, when the nightmarish massacre in Christchurch, New Zealand brought together two of her greatest areas of expertise: political extremism (which she covered for The Hill), and internet culture. As her first Atlantic piece after the shootings said, the Christchurch killer’s manifesto was “designed to troll.” Indeed, his entire heinous act was a calculated effort to manipulate our current norms of Internet communication and connection, for fanatical ends.

Lorenz responded with characteristic insight, focusing on the ways in which the stylized insider subcultures the Internet supports can be used to confuse, distract, and mobilize millions of people for good and for truly evil ends:

“Before people can even begin to grasp the nuances of today’s internet, they can be radicalized by it. Platforms such as YouTube and Facebook can send users barreling into fringe communities where extremist views are normalized and advanced. Because these communities have so successfully adopted irony as a cloaking device for promoting extremism, outsiders are left confused as to what is a real threat and what’s just trolling. The darker corners of the internet are so fragmented that even when they spawn a mass shooting, as in New Zealand, the shooter’s words can be nearly impossible to parse, even for those who are Extremely Online.”

Such insights are among the many reasons I was so grateful to be able to speak with Taylor Lorenz for this week’s installment of my TechCrunch series interrogating the ethics of technology.

As I’ve written in my previous interviews with author and inequality critic Anand Giridharadas, and with award-winning Google exec turned award-winning tech critic James Williams, I come to tech ethics from 25 years of studying religion. My personal approach to religion, however, has essentially always been that it plays a central role in human civilization not only or even primarily because of its theistic beliefs and “faith,” but because of its culture — its traditions, literature, rituals, history, and the content of its communities.

And because I don’t mind comparing technology to religion (not saying they are one and the same, but that there is something to be learned from the comparison), I’d argue that if we really want to understand the ethics of the technologies we are creating, particularly the Internet, we need to explore, as Taylor and I did in our conversation below, “the ethics of internet culture.”

What resulted was, like Lorenz’s work in general, at times whimsical, at times cool enough to fly right over my head, but at all times fascinating and important.

Editor’s Note: we ungated the first of 11 sections of this interview. Reading time: 22 minutes / 5,500 words.

Joking with the Pope

Greg Epstein: Taylor, thanks so much for speaking with me. As you know, I’m writing for TechCrunch about religion, ethics, and technology, and I recently discovered your work when you brought all those together in an unusual way. You subtweeted the Pope, and it went viral.

Taylor Lorenz: I know. [People] were freaking out.

Greg: What was that experience like?

Taylor: The Pope tweeted some insane tweet about how Mary, Jesus’ mother, was the first influencer. He tweeted it out, and everyone was spamming that tweet to me because I write so much about influencers, and I was just laughing. There’s a meme on Instagram about Jesus being the first influencer and how he killed himself or faked his death for more followers.

I just tweeted it out. I think a lot of people didn’t know the joke, the meme, and I think they just thought that it was new & funny. Also [some people] were saying, “how can you joke about Jesus wanting more followers?” I’m like, the Pope literally compared Mary to a social media influencer, so calm down. My whole family is Irish Catholic.

A bunch of people were sharing my tweet. I was like, oh, god. I’m not trying to lead into some religious controversy, but I did think whether my Irish Catholic mother would laugh. She has a really good sense of humor. I thought, I think she would laugh at this joke. I think it’s fine.

Greg: I loved it because it was a real Rorschach test for me. Sitting there looking at that tweet, I was one of the people who didn’t know that particular meme. I’d like to think I love my memes but …

Taylor: I can’t claim credit.

Greg: No, no, but anyway most of the memes I know are the ones my students happen to tell me about. The point is I’ve spent 15-plus years being a professional atheist. I’ve had my share of religious debates, but I also have had all these debates with others I’ll call Professional Strident Atheists, who are more aggressive in their anti-religion than I am. And I’m thinking, “Okay, this is clearly a tweet that Richard Dawkins would love. Do I love it? I don’t know. Wait, I think I do!”

Taylor: I treated it with the greatest respect for all faiths. I thought it was funny to drag the Pope on Twitter.

Daily Crunch: Social media struggles with shooting tragedy

The Daily Crunch is TechCrunch’s roundup of our biggest and most important stories. If you’d like to get this delivered to your inbox every day at around 9am Pacific, you can subscribe here.

1. Videos of shooting tragedy in New Zealand continue resurfacing on social media

Earlier today there was a horrendous mass mosque shooting in New Zealand that killed 49 people — and because this is 2019, social media was used by the apparent murderers to plan, announce, broadcast and virally amplify what they did.

Some of that — such as the Facebook and Twitter accounts of the perpetrator — have been deleted. Yet nearly 12 hours later, you can still find multiple copies of the shooting videos on YouTube and Twitter, with some being used to promote other things.

2. Facebook loses CPO Chris Cox and WhatsApp VP Chris Daniels

Chief Product Officer Chris Cox is departing the company, reportedly after two years of looking to do something new. More surprising is today’s departure of Chris Daniels, an eight-year employee who was moved from being head of Internet.org to VP of WhatsApp just last May.

3. Apple addresses Spotify’s claims, but not its demands

In a lengthy statement on its site called “Addressing Spotify’s Claims,” Apple walks through and dismantles some of the key parts of Spotify’s accusations about how the App Store works — covering app store approval times, Spotify’s actual cut on subscription revenues and Spotify’s rise as a result of its presence on iOS.

4. The Tesla Model Y is a 300-mile-range Model 3 doppelgänger coming in fall 2020

After years of teasers and hints, Tesla CEO Elon Musk finally unveiled the Model Y, a mid-sized all-electric vehicle that is slated to hit the marketplace in fall 2020.

5. Bird lays off up to 5 percent of workforce

“As we establish local service centers and deeper roots in cities where we provide service, we have shifting geographic workforce needs,” a Bird spokesperson told us.

6. Slack removes 28 accounts linked to hate groups

To date, Slack has managed to stay out of the conversation around what happens when sometimes violent politically extreme organizations use popular social platforms to organize.

7. Apple’s iCloud recovers after a four-hour outage

Facebook has only just recovered from one of its worst outages to date, and Gmail and Google Drive also experienced a worldwide outage this week. Now, apparently, it was Apple’s turn.

Videos of shooting tragedy in New Zealand continue resurfacing on social media

This is a tragedy that doesn’t deserve a snappy lede, but it is one that needs to be highlighted because tech companies should be held to account.

Earlier today, there was a horrendous mass mosque shooting in New Zealand that killed 49 people — and because this is 2019, social media was used by the apparent murderers to plan, announce, broadcast and virally amplify what they did.

Some of that — such as the Facebook and Twitter accounts of the perpetrator — have been deleted. Yet nearly 12 hours later, you can still find multiple copies of the shooting videos on YouTube and Twitter, with some being used to promote other things.

YouTube issued a statement several hours ago condemning the snuff videos, adding that it is removing them as soon as it is made aware of them:

“Our hearts go out to the victims of this terrible tragedy. Shocking, violent and graphic content has no place on our platforms, and is removed as soon as we become aware of it,” said a spokesperson in an email to TechCrunch. “As with any major tragedy, we will work cooperatively with the authorities.”

The turnaround time of “as soon as we become aware” isn’t quite as fast as you might think. Randomly browsing on the most basic of searches in YouTube at around 9am Eastern, we found a number of copies of the shooting incident — the same ones posted as livestreams on Facebook — and reported all of them.

As of writing, three of the first four we clicked on are still up. One of them is even used to promote other videos — gaming-related as it happens.

To be clear, the YouTube links we clicked on are re-uploads of the primary source video from the event, not any allegedly legitimate “news” coverage that’s been uploaded in response to it, much of it coming from chancers who are essentially just hoping to make a little click-money from people browsing for more information. Those don’t only include no-name video posters. The likes of the Daily Mail sickeningly used clips of the video in the name of news.

On Twitter, it was just as easy to find embedded video clips, by way of the New Zealand hashtag plus a keyword or two. These included clips of the shooting, plus several taken by motorists of the police chase for the perpetrators, which carry their own casual eeriness along with off-color comments from the people recording them.

Twitter also issued a statement that mirrored YouTube’s.

“We are deeply saddened by the shootings in Christchurch today,” a spokesperson said in an email to us. “Twitter has rigorous processes and a dedicated team in place for managing exigent and emergency situations such as this. We also cooperate with law enforcement to facilitate their investigations as required.”

Like YouTube, Twitter is also monitoring the platform and has both human and computer-based screenings to field reports. But the situation highlights how — despite the stated commitments from companies that work in social media to track malicious or harmful content on their platforms, and despite all of the tracking algorithms and teams of humans that they have built to help — social media services continue to fail the public when it comes to keeping their platforms from getting exploited for horrendous ends.

Meanwhile, the lead comment on the Reddit thread about the news states, “New Zealand Police has requested the Footage not be shared on social media. Please do not post the Videos. If you see the Videos, bring it to the moderators attention,” and judging from the first couple pages, this appears to be working.

A Reddit spokesperson sent us the following statement: “We are actively monitoring the situation in Christchurch, New Zealand. Any content containing links to the video stream are being removed in accordance with our site-wide policy.”

It’s also worth noting that after a shooter apparently encouraged viewers of his livestream to “subscribe to PewDiePie,” the YouTube star (who has attracted controversy for anti-Semitic messages in the past) tweeted, “I feel absolutely sickened having my name uttered by this person. My heart and thoughts go out to the victims, families and everyone affected by this tragedy.”

Camelot lets Twitch and YouTube audiences pay for what they want to see

As the streaming world continues to grow, startups are looking to take advantage of the opportunity, grab a slice of the pie and, in some cases, create entirely new revenue models around it.

Camelot, a YC-backed startup, is one of them.

Camelot allows viewers to place bounties on their favorite streamers, putting a monetary value on the things they want to see on stream. This could include in-game challenges like “win with no armor,” as well as stream bounties like “Play Apex” or “add a heartbeat monitor to the stream.”

When a viewer posts a bounty, other viewers can join in and contribute to the overall value, and the streamer can then choose whether or not to go through with it from an admin dashboard.

Because internet platforms can often be used for evil alongside good, cofounder and CEO Jesse Zhang has thought through ways to minimize inappropriate requests.

There is an option for streamers to see and approve the bounty before it’s ever made public to ensure that they avoid inappropriate propositions. Bounties are also paid for up front by viewers, and either returned if the creator declines the bounty or pushed through when the streamer completes the task, raising the barrier to entry for nefarious users.

Camelot generates revenue by taking a five percent stake in every bounty completed.
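
To make that flow concrete, here is a minimal sketch of the bounty lifecycle in Python. The class and method names are hypothetical, not Camelot’s actual API; it only models what’s described above: contributions pooled up front, a streamer decision step, refunds on decline, and a five percent platform fee on completion.

    # Hypothetical model of the bounty flow described above; not Camelot's real API.
    from dataclasses import dataclass, field

    PLATFORM_FEE = 0.05  # Camelot's stated cut of every completed bounty

    @dataclass
    class Bounty:
        description: str
        contributions: dict = field(default_factory=dict)  # viewer -> amount held up front
        status: str = "pending"  # becomes "declined" or "completed"

        def contribute(self, viewer: str, amount: float) -> None:
            # Other viewers can join in and raise the bounty's total value.
            self.contributions[viewer] = self.contributions.get(viewer, 0.0) + amount

        @property
        def total(self) -> float:
            return sum(self.contributions.values())

        def decline(self) -> dict:
            # Streamer rejects the request; every contribution is refunded.
            self.status = "declined"
            return dict(self.contributions)

        def complete(self) -> float:
            # Streamer finishes the task; payout is the pool minus the platform fee.
            self.status = "completed"
            return self.total * (1 - PLATFORM_FEE)

    bounty = Bounty("Win with no armor")
    bounty.contribute("viewer_a", 20.0)
    bounty.contribute("viewer_b", 30.0)
    print(round(bounty.complete(), 2))  # 47.5 to the streamer; 2.50 kept by the platform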

The platform isn’t just for Twitch streamers — YouTubers can also get in on the mix using Camelot and making asynchronous videos around each bounty. Not only does it offer a new way to generate revenue, but it also offers content creators the chance to get new insights on what their viewers want to see and what they value.

Cofounder and CEO Jesse Zhang believes there is opportunity to expand to streamers and YouTube content creators outside of the gaming sphere in the future.

For now, however, Camelot is working to bring on more content creators. Thus far, streamers and viewers have already come up with some interesting use cases for the product. One streamer’s audience bought his dog some treats, and one viewer of Sa1na paid $100 to play against the streamer himself.

Camelot declined to share how much funding it has received thus far, but did say that lead investors include Y Combinator, the Philadelphia 76ers, Soma Capital, and Plaid cofounders William Hockey and Zach Perret.

UK Far Right activist circumvents Facebook ban to livestream threats

Stephen Yaxley-Lennon, a Far Right UK activist who was permanently banned from Facebook last week for repeatedly breaching its community standards on hate speech, was nonetheless able to use its platform to livestream harassment of an anti-fascist blogger whom he doorstepped at home last night.

UK-based blogger Mike Stuchbery detailed the intimidating incident in a series of tweets earlier today, writing that Yaxley-Lennon appeared to have used a friend’s Facebook account to circumvent the ban on his own Facebook and Instagram pages.

In recent years Yaxley-Lennon, who goes by the moniker ‘Tommy Robinson’ on social media, has used online platforms to raise his profile and solicit donations to fund Far Right activism.

He has also, in the case of Facebook and Twitter, fallen foul of mainstream tech platforms’ community standards, which prohibit use of their tools for hate speech and intimidation, earning himself a couple of bans. (At the time of writing Yaxley-Lennon has not been banned from Google-owned YouTube.)

Though circumventing Facebook’s ban appears to have been trivially easy for Yaxley-Lennon, who, as well as selling himself as a Far Right activist called “Tommy Robinson”, previously co-founded the Islamophobic Far Right pressure group, the English Defence League.

Giving an account of being doorstepped by Yaxley-Lennon in today’s Independent, Stuchbery writes: “The first we knew of it was a loud, frantic rapping on my door at around quarter to 11 [in the evening]… That’s when notifications began to buzz on my phone — message requests on Facebook pouring in, full of abuse and vitriol. “Tommy” was obviously livestreaming his visit, using a friend’s Facebook account to circumvent his ban, and had tipped off his fans.”

A repost (to YouTube) of what appears to be a Facebook Live stream of the incident corroborates Stuchbery’s account, showing Yaxley-Lennon outside a house at night, where he can be seen shouting for “Mike” to come out and banging on doors and/or windows.

At another point in the same video Yaxley-Lennon can be seen walking away when he spots a passerby and engages them in conversation. During this portion of the video Yaxley-Lennon publicly reveals Stuchbery’s address — a harassment tactic that’s known as doxxing.

He can also be heard making insinuating remarks to the unidentified passerby about what he claims are Stuchbery’s “wrong” sexual interests.

In another tweet today Stuchbery describes the remarks as defamatory, adding that he now intends to sue Yaxley-Lennon.

Stuchbery has also posted several screengrabs to Twitter, showing a number of Facebook users who he is not connected to sending him abusive messages — presumably during the livestream.

During the video Yaxley-Lennon can also be heard making threats to return, saying: “Mike Stuchbery. See you soon mate, because I’m coming back and back and back and back.”

In a second livestream, also later reposted to YouTube, Yaxley-Lennon can be heard apparently having returned a second time to Stuchbery’s house, now at around 5am, to cause further disturbance.

Stuchbery writes that he called the police to report both visits. In another tweet he says they “eventually talked ‘Tommy’ into leaving, but not before he gave my full address, threatened to come back tomorrow, in addition to making a documentary ‘exposing me'”.

We reached out to Bedfordshire Police to ask what it could confirm about the incidents at Stuchbery’s house and the force’s press office told us it had received a number of enquiries about the matter. A spokeswoman added that it would be issuing a statement later today. We’ll update this post when we have it.  

Stuchbery also passed us details of the account he believes was used to livestream the harassment — suggesting it’s linked to another Far Right activist, known by the moniker ‘Danny Tommo’, who was also banned by Facebook last week.

Though the Facebook account in question was using a different moniker — ‘Jack Dawkins’. This suggests, if the account did indeed belong to the same banned Far Right activist, he was also easily able to circumvent Facebook’s ban by creating a new account with a different (fake) name and email.

We passed the details of the ‘Jack Dawkins’ account to Facebook and since then the company appears to have suspended the account. (A message posted to it earlier today claimed it had been hacked.)

The fact of Yaxley-Lennon being able to use Facebook to livestream harassment a few days after he was banned underlines quite how porous Facebook’s platform remains for organized purveyors of hate and harassment. Studies of Facebook’s platform have previously suggested as much.

Which makes high profile ‘Facebook bans’ of hate speech activists mostly a crisis PR exercise for the company. And indeed easy PR for Far Right activists who have been quick to seize on and trumpet social media bans as ‘evidence’ of mainstream censorship of their point of view — liberally ripping from the playbook of US hate speech peddlers, such as the (also ‘banned’) InfoWars conspiracy theorist Alex Jones. Such as by posting pictures of themselves with their mouths gagged with tape.

Such images are intended to make meme-able messages for their followers to share. But the reality for social media savvy hate speech activists like Jones and Yaxley-Lennon looks nothing like censorship — given how demonstrably easy it remains for them to circumvent platform bans and carry on campaigns of hate and harassment via mainstream platforms.

We reached out to Facebook for a response to Yaxley-Lennon’s use of its livestreaming platform to harass Stuchbery, and to ask how it intends to prevent banned Far Right activists from circumventing bans and carrying on making use of its platform.

The company declined to make a public statement, though it did confirm the livestream had been flagged as violating its community standards last night and was removed afterwards. It also said it had deleted one post by a user for bullying. It added that it has content and safety teams which work around the clock to monitor Live videos flagged for review by Facebook users.

It did not confirm how long Yaxley-Lennon’s livestream was visible on its platform.

Stuchbery, a former history teacher, has garnered attention online writing about how Far Right groups have been using social media to organize and crowdfund ‘direct action’ in the offline world, including by targeting immigrants, Muslims, politicians and journalists in the street or on their own doorsteps.

But the trigger for Stuchbery being personally targeted by Yaxley-Lennon appears to be a legal letter served to the latter’s family home at the weekend informing him he’s being sued for defamation.

Stuchbery has been involved in raising awareness about the legal action, including promoting a crowdjustice campaign to raise funds for the suit.

The litigation relates to allegations Yaxley-Lennon made online late last year about a 15-year-old Syrian refugee schoolboy called Jamal who was shown in a video that went viral being violently bullied by white pupils at his school in Northern England.

Yaxley-Lennon responded to the viral video by posting a vlog to social media in which he makes a series of allegations about Jamal. The schoolboy’s family have described the allegations as defamatory. And the crowdjustice campaign promoted by Stuchbery has since raised more than £10,000 to sue Yaxley-Lennon for defaming the teen.

The legal team pursuing the litigation has also written that it intends to explore “routes by which the social media platforms that provide a means of dissemination to Lennon can also be attached to this action”.

The video of Yaxley-Lennon making claims about Jamal can still be found on YouTube.

As indeed can Yaxley-Lennon’s own channel — despite the equivalent pages having been removed from Facebook and Twitter (the latter pulled the plug on Yaxley-Lennon’s account a year ago).

We asked YouTube why it continues to provide a platform for Yaxley-Lennon to amplify hate speech and solicit donations for campaigns of targeted harassment but the company declined to comment publicly on the matter.

It did point out it demonetized Yaxley-Lennon’s channel last month, having determined it breaches its advertising policies.

YouTube also told us that it removes any video content that violates its hate speech policies — which do prohibit the incitement of violence or hatred against members of a religious community.

But by ignoring the wider context here — i.e. Yaxley-Lennon’s activity as a Far Right activist — and allowing him to continue broadcasting on its platform YouTube is leaving the door open for dog whistle tactics to be used to signal to and stir up ‘in the know’ followers — as was the case with another Internet savvy operator, InfoWars’ Alex Jones (until YouTube eventually terminated his channel last year).

Until last week Facebook was also ignoring the wider context around Yaxley-Lennon’s Far Right activism — a decision that likely helped him reach a wider audience than he would otherwise have been able to.

So now Facebook has another full-blown hate speech ‘influencer’ going rogue on its platform and being cheered by an audience of fans its tools helped amass.

There is, surely, a lesson here.

Yet it’s clear mainstream platforms are unwilling to pro-actively and voluntarily adapt their rules to close down malicious users who seek to weaponize social media tools to spread hate and sow division via amplified harassment.

If platforms won’t do it, it’ll be left to governments to curb social media’s ‘antisocial’ impacts with regulation. And in the UK there’s no shortage of appetite to try; the government has a White Paper on social media and safety coming this winter.

The official opposition, meanwhile, has said it wants to create a new regulator to rein in online platforms, and even to look at breaking up tech giants. So watch this space.

Public attitudes to (anti)social media have certainly soured — and with livestreams of hate it’s little wonder.

“Perhaps the worst thing, in the cold light of day, is the near certainty that the “content” “Tommy” produced during his stunt will now be used as a fundraising tool,” writes Stuchbery, concluding his account of being on the receiving end of a Facebook Live spewing hate and harassment. “If you dare to call him out on his cavalcade of hate, he usually tries to monetize you. It is a cruel twist.

“But most of all, I wonder how we got in this mess. I wonder how we got to a place where those who try to speak out against hatred and those who peddle it are threatened at their homes. I despair at how social media has become a weapon wielded by some, seemingly with impunity, to silence.”

‘Momo’ videos on YouTube cannot be monetized…but that’s not a new policy

Be warned, YouTube creators: making videos about the latest viral hoax, the “Momo challenge,” will not make you money. Over the past couple of days, the Momo challenge has gone viral once again, leading to a sharp increase in news coverage and in the number of YouTube videos discussing the creepy character and the supposed “challenge” that encourages kids to commit acts of self-harm.

The Momo challenge itself isn’t real, to be clear.

As meticulously documented by Taylor Lorenz at The Atlantic, it’s just the latest resurgence of an urban myth that has reared its head repeatedly over the years. In reality, “Momo” was a sculpture created by the artist Keisuke Aisawa. Photographs of its frightening form made their way to Instagram and Reddit after the piece was exhibited in Tokyo a couple of years ago. Thus, an urban legend was born, Lorenz explained.

According to one version of the myth, Momo sends kids instructions to harm themselves on WhatsApp. But urban legends take on many variations over time.

For example, my child’s entire 3rd grade class currently believes that Momo will randomly appear in YouTube videos and then come out of your sink drain. (This, also, is not true!)

Over the past few days, a social media post from Kim Kardashian and a lot of irresponsible reporting by local news outlets amplified the hoax, warning parents and schools of the dangerous “self harm” challenge. That, in turn, led to more “Momo” videos on YouTube, and a flood of posts across all other social media sites.

The Verge reported this morning that YouTube had begun demonetizing Momo videos on YouTube.

However, a spokesperson at YouTube clarified to TechCrunch that it wasn’t taking action against Momo videos as some sort of new policy or decision on the company’s part. It was simply enforcing its current policies.

The company’s existing advertiser-friendly guidelines, which govern the kinds of videos it shows ads on, do not allow any videos that discuss a harmful or dangerous act to be monetized. That includes any videos from news outlets referencing the Momo challenge, or those from other YouTube creators. This is the same policy that prevented prior YouTube videos about other dangerous challenges and hoaxes from showing advertising, they also noted. For example, any video about the Tide Pods challenge or the choking challenge could not show ads.

Demonetizing videos, to be clear, is not the same thing as disallowing the videos from showing on YouTube. The site today permits news stories and videos that are intended to raise awareness of and educate against the challenge, the spokesperson explained – like those from news outlets.

However, content that promotes the Momo challenge that is not news, educational, or documentary footage is prohibited on the site.

YouTube additionally reaffirmed that the company hadn’t seen any evidence of Momo videos on its platform until widespread media coverage began. And it had not received any links flagged or otherwise shared with the company about videos that either showed or promoted the Momo challenge directly.

“Contrary to press reports, we’ve not received any recent evidence of videos showing or promoting the Momo challenge on YouTube. Content of this kind would be in violation of our policies and removed immediately,” YouTube said, in a statement.

In addition, no Momo videos should be discoverable on YouTube’s kid-friendly app, YouTube Kids, the spokesperson said. And no such content has ever been found in the YouTube Kids app, to date.

Though YouTube hasn’t implemented a new policy here, simply having its name in the press around unsafe, scary content targeting children comes at a bad time for the company, which only yesterday turned off comments on videos of children after reports of a pedophile ring operating within the comments sections of videos. And it’s the latest in a longer string of controversies around advertiser-unfriendly content and false information which has led to other changes around its policies, including, most recently, the demonetization of anti-vaccination videos.

But in the case of Momo, YouTube isn’t the only platform afflicted by the hoax – the topic is being discussed across social media sites, including Facebook, Instagram, and Twitter.

Facebook wants up to 30% of fan subscriptions vs Patreon’s 5%

Facebook will drive a hard bargain with influencers and artists, judging by the terms of service for the social network’s Patreon-like Fan Subscriptions feature that lets people pay a monthly fee for access to a creator’s exclusive content. The policy document obtained by TechCrunch shows Facebook plans to take up to a 30 percent cut of subscription revenue minus fees, compared to 5 percent by Patreon, 30 percent by YouTube (which covers fees) and 50 percent by Twitch.

Facebook also reserves the right to offer free trials to subscriptions that won’t compensate creators. And Facebook demands a “non-exclusive, transferable, sub-licensable, royalty-free, worldwide license to use” creators’ content and “This license survives even if you stop using Fan Subscriptions.”

Distrust of Facebook could scare creators away from the platform when combined with its significant revenue share and ability to give away or repurpose creators’ content. Facebook has consistently shown that it puts what it thinks users want and its own interests above those of partners. It cut off game developers from viral channels, inadequately warned Page owners their reach would drop over time, decimated referral traffic to news publishers and, most recently, banished video makers from the feed. If Facebook wants to win creators’ trust and the engagement of their biggest fans, it may need a more competitive offering with larger limits on its power.

“Facebook reached out to offer Hard Drive early access to a ‘fan subscription’ product” tweeted Matt Saincome, who also runs satirical news site The Hard Times. “I asked my editors about it and the complete distrust amongst our team was kinda funny. We read through the terms and found a couple things that were hilarious when compared to Patreon’s 5% . . . Up to 30% and the rights to all our stuff? From the people who let us build an audience on their platform before pulling it out from under our feet? Hilarious. Here’s a crazy alternative: let people who signed up to see our content see it and then we can monetize that hahah.”

Instagram is refocusing on creators too. Instagram’s Android app reveals the prototype of a feature that lets users switch their profile into a Creator Account, similar to the Business Profiles it launched in 2016. Instagram first told The Hollywood Reporter about Creator Accounts in December, but now it’s showing up in the code. Reverse-engineering specialist Jane Manchun Wong generated this screenshot showing the option for Creator Accounts to hide their contact info or profile category. Fellow code digger Ishan Agarwal gave TechCrunch an exclusive look at the Instagram code that shows the Creator Accounts are “Best for public figures, content producers, artists, and influencers.” Creator Accounts give users “more advanced insights and reach more people with promotions,” “more growth tools” and “a new inbox that makes it easier to manage message requests and connect with fans.”

Trading control for subscribers

Facebook began testing Fan Subscriptions a year ago to give creators a financial alternative to maximizing ad views after watching the rise of Patreon, which now has 3 million patrons who’ll pay 100,000 artists, comedians, models and makers more than $500 million this year. This month Facebook expanded the test to the U.K., Spain, Germany and Portugal to allow users to pay $4.99 per month to a creator for exclusive content, live videos and a profile badge that highlights them as a subscriber. While Twitch owns gamers, YouTube rules amongst videographers and Patreon is a favorite with odd-ball creators, Facebook may see an opportunity to popularize Fan Subscriptions internationally and turn mainstream consumers into paid supporters.

The terms for Fan Subscriptions are not publicly available, and only visible on Facebook’s site to Pages it’s invited to test the feature. But TechCrunch has published the full policy document below.

Thankfully, Facebook isn’t taking a cut of Fan Subscription revenue during the test phase, and creators get to keep 100 percent of the money paid by any patrons they sign up before the official launch. Facebook tells me that it hasn’t finalized its percentage cut, though the terms permit it to take as much as 30 percent. Even a cut that large would technically qualify, given that Facebook says its rake will be in line with industry standards and that creators will retain the majority of their earnings.

But whatever cut it takes will come after processing fees and the 15 to 30 percent tax Apple and Google levy on iOS and Android in-app purchases. We’ll see if Facebook tries a workaround that pushes users to their mobile browser, where it can take their subscription money tax-free. And if Facebook decides it wants to give users a free one-month trial or discount to any creator, creators can’t stop it, even if that lets people download all their exclusive content and then cancel without ever paying.
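
To see how those layered cuts could stack up, here is a rough back-of-the-envelope sketch in Python. The specific percentages are assumptions for illustration (the maximum 30 percent in-app purchase fee, then the maximum 30 percent Facebook share permitted by the terms reproduced below); Facebook has not finalized its actual cut.

    # Hypothetical worst-case payout math for a $4.99/month Fan Subscription.
    # Assumes the full 30% in-app purchase fee, then Facebook's maximum 30% share
    # of what remains, per the terms reproduced below.
    price = 4.99
    app_store_fee = 0.30      # Apple/Google levy on in-app purchases (15-30%)
    facebook_share = 0.30     # maximum cut the Fan Subscriptions terms allow

    net_after_store = price * (1 - app_store_fee)        # what Facebook receives
    creator_payout = net_after_store * (1 - facebook_share)

    print(round(net_after_store, 2))   # ~3.49
    print(round(creator_payout, 2))    # ~2.45, roughly half the sticker price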

But what’s sure to raise the most hairs is the clause about “Supplemental Data” that gives Facebook a license to display a creator’s content as they might expect, but also a royalty-free license to use it however they want, even after a creator abandons Facebook Fan Subscriptions. A Facebook spokesperson confirmed that Supplemental Data does in fact cover all content provided by the creator. They claim it’s so if a creator made a custom fan sticker, a subscriber could use it in their own Facebook post, but the rule gives Facebook vast power beyond that. Patreon has a similar clause, but gets the benefit of the doubt in a way Facebook doesn’t after so many scandals.

Facebook’s spokesperson claimed that the Supplemental Data terms were similar to Facebook’s standard terms, but the normal Facebook terms say “You can end this license any time by deleting your content or account.” Not so with Fan Subscriptions. I don’t expect Facebook is going to try to outright steal and resell creators’ content, but it will have jurisdiction to use their art however it wants to fuel its war with Patreon, Twitch and YouTube.

Creators will have to decide whether access to Facebook’s 2.3 billion users is worth the platform risk of building a following somewhere they don’t control and that has other business priorities. If Facebook’s strategy suddenly veers away from Fan Subscriptions, it could be hard for creators to score new signups or retain their old ones. At least with a dedicated site like Patreon, creators know the platform can’t abuse them without the threat of ruin.

Here’s the full Terms of Service for Facebook’s Patreon competitor Fan Subscriptions:

Fan Subscriptions creator terms

The fan funding feature (“Fan Subscriptions”) allows Facebook users to support their favorite pages, creators, group administrators, gamers, or others (“Pages”) through a monthly subscription with Facebook (“Subscription”) that gives those people (“fans”) access to digital content offered by Pages, such as exclusive digital content, fan recognition, and merchandise discounts. These Terms (“Terms”) govern how Pages use Fan Subscriptions.

With regard to your use of Fan Subscriptions, you agree to the following:

  1. Your use of the Platform with respect to Fan Subscriptions is subject to, and you agree to comply with, the Platform Policy currently available at https://developers.facebook.com/policy/.
  2. Your use of Fan Subscriptions to offer digital content and/or services to Facebook fans is subject to, and you agree to comply with the (a) Monetization Eligibility Standards currently available at https://www.facebook.com/help/publisher/169845596919485, and (b) Content Guidelines for Monetization currently available at https://www.facebook.com/facebookmedia/get-started/monetization_contentguidelines. You agree to follow any additional instructions and/or technical documentation we provide to you for Fan Subscriptions.
  3. You will provide accurate information to fans in connection with your use of Fan Subscriptions, including but not limited as part of any digital content or services you choose to offer to them. You must clearly and conspicuously disclose all material terms regarding your offer and the nature of content or services you will provide to fans once they choose to subscribe. You agree to comply with all laws applicable to your use of Fan Subscriptions.
  4. You confirm that the content you offer via Subscription does not infringe upon the intellectual property rights of any third party and that you have secured all rights necessary to distribute, copy, display, publicly perform, or otherwise use the content.
  5. You will not use, incorporate, or provide any music or physical goods in connection with your use of Fan Subscriptions without FB’s prior written approval (email is sufficient).
  6. You will not offer discounts on physical goods that exceed 80 percent of the offered goods’ retail value.
  7. Your fans’ Subscriptions may be processed as payments to Facebook via Apple’s In-App Purchase or Google’s In-App Billing services, which are subject to Apple’s and Google’s separate payment terms and conditions. Apple and Google may charge Facebook a revenue share and/or other fees for such payment services according to their respective terms and conditions. FB will pay you a revenue share calculated as a percentage (“Your Share”) of what’s left after deduction of those fees/charges and of any other fees or taxes incurred by Facebook. As of the date of these Terms, Your Share of that net revenue is 100%. However, Facebook may in the future change these Terms such that Facebook keeps a revenue share of up to 30%. We will give 30 days’ notice of any such change.
  8. Facebook reserves the right to offer discounted and free trials for fans from time to time in our discretion, whether to incentivize Subscription sign-ups or otherwise. Where we do so in relation to your Fan Subscriptions, your revenue share will be reduced accordingly: we only pay you a revenue share based on the amounts fans actually pay (less fees and taxes we incur).
  9. If you have accurately completed and timely provided to FB any forms or documentation that FB reasonably determines are required to set up payment to you, and subject to and only in the event that you are in compliance in all respects with these Terms, payment of any net amounts FB receives from Apple and Google for Subscriptions made by your fans during your use of Fan Subscriptions will be on a monthly basis, within approximately 60 days after the end of the applicable month. Facebook will not be responsible for any subsequent fees applied by your financial institution to complete payment of the Monthly Fee to you. Furthermore, in the event the payment due to you would be less than One Hundred U.S. Dollars ($100.00), Facebook reserves the right to roll such payment over month to month until such payment threshold is met, at which time Facebook will make the applicable payment to you. Facebook also reserves the right to set off and/or withhold any amounts that Facebook reasonably considers are or are likely to be payable by you to Facebook under these Terms (including under any indemnities).
  10. If you are providing (or allowing us to access) any data, content, or other information in connection with your use of Fan Subscriptions (collectively, “Supplemental Data”), then you grant us (and our affiliates) a non-exclusive, transferable, sub-licensable, royalty-free, worldwide license to use such Supplemental Data. This license survives even if you stop using Fan Subscriptions. You are responsible for obtaining the necessary rights from all applicable rights holders to grant this license.
  11. You are responsible for paying any applicable taxes owed with respect to any amounts you receive through your use of Fan Subscriptions. Facebook will charge you taxes with respect to such amounts if it is required to do so under applicable law.
  12. Facebook can terminate or suspend your use of Fan Subscriptions at any time in our sole discretion, and we may change or stop offering Fan Subscriptions at any time in our sole discretion. In no event will we be liable in any way for terminating or suspending your use of Fan Subscriptions, for the discontinuation of Fan Subscriptions, for the removal of or disabling of access to content, or for the withdrawal of the content or Fan Subscriptions.
  13. If you change what’s included in a Subscription in a way that could be considered material, you must give fans reasonable prior notice such that they have a reasonable opportunity to cancel their Subscriptions if they so choose, with the change only taking effect after their next Subscription fee payment.
  14. If you are using Fan Subscription on behalf of a third party (including, but not limited to, as an agent or representative of a Creator), you represent and warrant that you have the authority as agent of such party to use such features on their behalf, agree to these Terms, and hereby bind such party to these Terms. You agree to indemnify and hold Facebook harmless from any claims, suits, losses, liabilities, damages, costs, and expenses resulting from your breach of the Terms. If you are accepting these Terms as admin of Facebook Business Manager for your business, these Terms shall apply to Content on all Facebook Pages and profiles owned or operated by your business at the time of acceptance and thereafter.
  15. By using the Fan Subscriptions feature, you agree that we may communicate with you electronically any important information regarding your use of Fan Subscriptions, including without limitation as to Your Share and any payments. We may also provide notices to you by posting them on our website, or by sending them to an email address or street address that you previously provided to us. Website and email notices shall be considered received by you within 24 hours of the time posted or sent; notices by postal mail shall be considered received within three (3) business days of the time sent.
  16. Facebook reserves the right to update these Terms from time to time. If any change to these Terms will materially disadvantage you, or materially affect the availability of the Subscription, we will provide you with notice before the changes become effective and you can choose to cancel your Subscription. Your continued use of this feature constitutes acceptance of those changes.
  17. Fan Subscriptions is part of the “Facebook Products” under Facebook’s Terms of Service (“Facebook Terms”), and your use of Fan Subscriptions is deemed part of your use of Facebook Products. In the event of any express conflict between these Terms and the Facebook Terms, these Terms will govern solely with respect to your use of Fan Subscriptions and solely to the extent of the conflict. These Terms do not alter in any way the terms or conditions of any other agreement you may have with Facebook. Facebook reserves the right to monitor or audit your compliance with these Terms and to update these terms from time to time, and your continued use of Fan Subscriptions constitutes acceptance of those changes.

Patreon’s future and potential exits

Through the Extra Crunch EC-1 on Patreon, I dove into Patreon’s founding story, product roadmap, business model and metrics, underlying thesis, and competitive threats. The six-year-old company, last valued at around $450 million and likely to soon hit $1 billion, is the leading platform for artists to run membership businesses for their superfans.

As a conclusion to my report, I have three core takeaways and some predictions on the possibility of an IPO or acquisition in the company’s future.

The future is bright for creators

First, the future is promising for independent content creators who are building engaged, passionate fanbases.

There is a surge of interest from the biggest social media platforms in creating more features to help them directly monetize their fans — with each trying to one-up the others. There is also a growing number of independent solutions for creators to use (Patreon and Memberful, Substack, Pico, etc.).

We live in an economy where a soaring number of people are self-employed, and the rise of more monetization tools for creators to earn a stable income will open the door to more people turning their creative talents into a part-time or full-time business pursuit.

Membership is a niche market and it’s unclear how big the opportunity is

Patreon’s play is to own a niche category of SMBs that it recognizes have particular needs, and to provide them with the comprehensive suite of tools and services they need to manage their businesses. A large portion of creators’ incomes will need to go to Patreon for it to someday earn billions of dollars in annual revenue.

The market for content creators to build membership businesses appears to be growing; however, membership will be only one piece of the fan-to-creator monetization wave. The number of creators who are a fit for the membership business model and could generate $1,000-500,000 per month through Patreon (its target customer profile) is likely measured in the tens of thousands or low hundreds of thousands right now, rather than in the millions.

To get a sense of the revenue math here, Patreon will generate about $35 million this year from the 5,000-6,000 creators who fit its target customer profile; if you believe this market is expanding at a fast clip, capturing 10% of the revenue (Patreon’s current commission) from 20,000 such creators could bring in $140 million. And that’s without factoring in the potential success of Patreon implementing premium pricing options, which is a high priority. If Patreon can increase its commission from 10% to 15%, it would need around 47,500 creators in the $1,000-$500,000/month range (9.5x its current number) to reach $500 million in revenue from them.
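
As a quick sanity check of that arithmetic, here is a minimal sketch in Python; the per-creator figure is derived from the article’s own estimates rather than anything Patreon has disclosed.

    # Back-of-the-envelope check of the revenue scenarios above.
    current_creators = 5_000         # low end of the 5,000-6,000 target-profile creators
    patreon_revenue = 35_000_000     # estimated take from those creators this year
    commission = 0.10                # Patreon's current cut

    # Implied gross earnings per target-profile creator per year (~$70,000).
    gross_per_creator = patreon_revenue / current_creators / commission

    # Scenario 1: same 10% commission across 20,000 such creators.
    print(20_000 * gross_per_creator * 0.10)          # ~$140M

    # Scenario 2: a 15% commission; creators needed to reach $500M in revenue.
    print(500_000_000 / (gross_per_creator * 0.15))   # ~47,600 creators, ~9.5x today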

There is a compelling opportunity for a company to provide the dominant business hub for creators, with tools to manage their fan (i.e. customer) relationships across platforms and to manage back-office logistics. At a certain point it taps out though.

That’s one of the reasons why Patreon’s vision includes extending into areas like business loans and healthcare. For companies targeting small and medium businesses like Shopify, Salesforce and Dropbox, there is so much more growth tied to their core products that there is no need for them to consider such unrelated offerings as business loans. Patreon has to both expand its market share and also expand the services it offers to those customers if it wants to reach massive scale.

Patreon faces serious competition but is evolving in the right direction

Patreon is the leading contender in this market, and there’s a role for an independent player even if Facebook, YouTube, and other distribution platforms push directly competing functionality. Patreon will need to make three important changes to compete effectively: more aggressively segment its customers, make the consumer-facing side of its platform more customizable by creators, and build out more lightweight talent management services.

What’s next for Patreon?

Having raised over $100 million in funding over the last six years, what is the path to a liquidity event for investors and employees? 

In a worst case scenario, it is unlikely the company would go out of business even if it fell into disarray, because it would be strategic for several large companies to take it over at a discount. Patreon may be on the path to IPO (as CEO Jack Conte hopes), but I find it more likely that the company gets acquired sometime in the next couple years.

Path to IPO?

If a public offering is in Patreon’s future, it’s several years out. It now defines itself as a SaaS company and has a plan to earn a higher blended commission on the sales of its customers through premium pricing options. It is a frequently misunderstood company, however, and needs to prove that a big market exists for mid-tail creators building membership businesses. 

According to a summary by Spark Capital’s Alex Clayton, SaaS companies who went public in 2018 typically:

    • had $100-200 million in revenue over the prior twelve months,
    • were 14 years old,
    • had an average year-over-year revenue growth rate of ~40%,
    • earned 90% of revenue from subscriptions,
    • had a median gross margin of 73%,
    • ranged from roughly 500 to 2500 employees,
    • had raised a median of $300 million in VC funding,
    • and IPO’d with a median market cap of $2 billion.

Public market companies to benchmark it against will be Shopify (as SaaS infrastructure for small businesses selling to, and managing payments from, consumers) and Zuora (Patreon can be viewed as a media-specific SMB alternative to Zuora’s “Subscription Relationship Management” system). Compared to Shopify, whose market of SMB e-commerce businesses globally is easily understood to be enormous, Patreon would face more skepticism from public investors about the market size of mid-tail content creators.

Patreon’s gross margins can’t be much more than 50% given that almost half of revenue is going toward payment processing. Patreon mirrors Shopify’s topline revenue growth in the run up to its 2015 IPO: Shopify reported $23.7 million for 2012, $50.3 million for 2013, $105 million for 2014 and I estimate Patreon brought in $15 million for 2017, $30 million for 2018, and will hit $55 million for 2019. Most of Shopify’s revenue came from subscriptions, however, with only 37% coming from the “merchant solutions” services where Shopify had to pay out payment processing fees. Patreon’s revenue net of payment processing fees is closer to $7.5 million for 2017, $15 million for 2018, and $27 million (predicted) for 2019.
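
To make the distinction between topline and net revenue concrete, here is a small illustrative sketch; the 50 percent processing share is the assumption stated above, and the revenue figures are my estimates, not reported numbers.

    # Estimated Patreon topline revenue vs. revenue net of payment processing fees.
    topline = {2017: 15_000_000, 2018: 30_000_000, 2019: 55_000_000}  # estimates above
    processing_share = 0.5   # roughly half of revenue goes to payment processing

    for year, revenue in topline.items():
        net = revenue * (1 - processing_share)
        print(year, f"net ~${net / 1e6:.1f}M")
    # 2017 net ~$7.5M, 2018 net ~$15.0M, 2019 net ~$27.5M -- close to the figures cited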

There’s a lot of capital chasing late-stage startups right now. How long that remains the case is unknown, but Patreon can likely raise the funding to operate unprofitably a few more years — getting topline revenue closer to $150-200 million, proving creators will adopt premium pricing, and showcasing its ability to compete with Facebook and YouTube in a growing market. In that case, it could become a strong IPO candidate.

The acquisition route

The other scenario, of course, is that a larger company buys Patreon. In particular, one of the large social media platforms building directly competitive features may decide it is easier to buy their expansion into membership than build it from scratch. Patreon is the dominant platform without any noteworthy direct competitor among independent companies, so acquiring it would immediately put the parent company in a market-leading position. Competing social platforms wouldn’t have another large Patreon-like startup to acquire in response.

There are three companies that jump out as the most likely acquirers. Each of these M&A scenarios would be mutually beneficial: advancing Patreon’s mission and providing strategic value to the parent. The first two companies are probably obvious, but the last one may be less known to TechCrunch readers.

Facebook

I highlighted Facebook as the top competitive threat to Patreon. This is also why it’s a natural acquirer. Patreon would bring fan relationship management to the Facebook ecosystem and particularly the company’s Creator App with CRM and analytics specifically fit for creators’ needs. It would also bring a stable of 130,000 creators of all types to make Facebook the primary infrastructure through which they engage their core fans.

Facebook is prioritizing human relationships more and clickbait content less. A natural replacement for the flood of news articles and viral videos is deeper engagement with the creators that Facebook users care the most about.

Since the annual churn rate of Patreon creators who earn $500 per month or more is under 1%, the ~9,200 creators who fit that category would likely stick around as Patreon’s infrastructure integrates with Facebook’s; the vast majority probably already have Facebook pages and possibly use the Creator App.

Facebook’s data on who fans are, what they like, and who their friends are is unrivalled. The insights Facebook could provide Patreon’s creators on their fans could help them substantially grow their number of patrons and build stronger relationships with them.

Like all major social media platforms, Facebook has partnership teams vying to get major celebrities to use its products. Patreon could lock the mid-tail of smaller (but still established) creators into its ecosystem, which means more consumer engagement, more time well spent, and more revenue through both ads and fan-to-creator transactions. Owning and integrating Patreon could have a much bigger financial benefit than solely revenue from the core Patreon product.

As a Facebook subsidiary, Patreon would stick more closely to being a software solution; it wouldn’t develop as robust a creator support staff, and the vision that it may expand to offer business loans and health insurance to creators would almost surely be cut. Facebook would also probably discontinue supporting the roughly 23% of Patreon creators who make not-safe-for-work (NSFW) content.

Given Patreon’s mission to help creators get paid, it may make a bigger impact as part of Facebook nonetheless. Facebook’s ecosystem of apps is where creators and their fans already are. Tens of thousands of creators could start using Patreon’s CRM infrastructure overnight and activating fan memberships to earn stable income.

A Facebook-Patreon deal could happen at any point. I think a deal could just as likely happen in a few months as in a few years. The key will be Facebook’s business strategy: does it want to build serious infrastructure for creators? And does it believe paywalled access to some content and groups fits the future of Facebook? The company is experimenting with both of those right now, but doesn’t appear to be committed as of yet.

YouTube

The other most likely acquirer is Google-owned YouTube. Patreon was birthed by a YouTuber to support himself and fellow creators after their AdSense income dropped substantially. YouTube is becoming a direct competitor through YouTube Memberships and merchandise integrations.

If Patreon shows initial success in getting creators to adopt premium pricing tiers and YouTube sees a strong response to the membership functionality it has rolled out, it’s hard to imagine YouTube not making a play to acquire Patreon and make membership a priority in product development. This would create a whole new market for it to dominate, making money by selling business features to creators and encouraging fan-to-creator payments to happen through its platform.

In the meantime, it seems that YouTube is still searching for an answer to whether membership fits within its scope. It previously removed the ability for creators to paywall some videos, and it could view fan-to-creator monetization efforts as a distraction from its dominance as an advertising platform and its growing strength in streaming TV online (through the popular $40/month YouTube TV subscription).

YouTube is also a less compelling acquirer than Facebook because the majority of Patreon’s creators don’t have a place on YouTube, since they don’t produce video content (at least not as their primary content type). Unless YouTube expands its platform to support podcasts and still images as well, it would be paying a premium to acquire the subset of Patreon creators it wants. Moreover, as much as a quarter of those may be creators of NSFW content that YouTube prohibits.

YouTube is the potential Patreon acquirer people immediately point to, but it’s not as tight a fit as Facebook would be…or as Endeavor would be.

Endeavor

The third scenario is that a major company in the entertainment and talent representation sphere sees acquiring Patreon as a strategic play to expand into a whole new category of talent representation with a technology-first approach. There is only one contender here: Endeavor, the $6.3 billion holding company led by Ari Emanuel and Patrick Whitesell that is backed by Silver Lake, SoftBank, Fidelity, and Singapore’s GIC and has been on an acquisition spree.

This pairing shows promise. Facebook and YouTube are the most likely companies to acquire Patreon, but Endeavor may be the company best suited to acquire it.

Endeavor is an ecosystem of companies — with the world’s top talent agency WME-IMG at the center — that can integrate with one another in different ways to collectively become a driving force in global entertainment, sports and fashion. Among the 25+ companies it has bought are sports leagues like the UFC (for $4 billion) and the video streaming infrastructure startup NeuLion (for $250 million). In September, it launched a division, Endeavor Audio, to develop, finance and market podcasts.

Endeavor wants to leverage its talent and evolve its revenue model toward scalable businesses. In 2015, Emanuel said revenue was 60% from representation and 40% from “the ownership of assets,” but that the mix was shifting quickly; last year Variety put the split at 50/50.

In alignment with Patreon, Endeavor is a big company centered on guiding the business activities of all types of artists and helping them build out (and maximize) new revenue streams. When you hear Emanuel and Whitesell speak, they reiterate the same talking points that Patreon CEO Jack Conte does: artists are now multifaceted and not stuck to one activity. They are building their own businesses and don’t want to be beholden to distribution platforms. Patreon could thrive under Endeavor given their alignment of values and mission. Endeavor would want Patreon to grow in line with Conte’s vision, without fearing that it would cannibalize ad revenue (a concern Facebook and YouTube would both have).

In a June interview, Whitesell noted that Endeavor’s M&A is targeted at companies that either expand their existing businesses or ones where they can uniquely leverage their existing businesses to grow much faster than they otherwise could. Patreon fits both conditions.

Patreon would be the scalable asset that plugs the mid-tail of creators into the Endeavor ecosystem. Whereas WME-IMG is high-touch relationship management with a little bit of tech, Patreon is a tech company with a layer of talent relationship management. Patreon can serve tens of thousands of money-making creators at scale. Endeavor can bring its talent expertise to help Patreon provide better service to creators; Patreon would bring technology expertise to help Endeavor’s traditional talent representation businesses better analyze clients’ fanbases and build direct fan-to-creator revenue streams for clients.

If there’s opportunity to eventually expand the membership business model among the top tiers of creators using Patreon.com or Memberful (which Conte hinted at in our interviews), Endeavor could facilitate the initial experiments with major VIPs. If memberships are shown to make more money for top artists, that means more money in the pockets of their agents at WME-IMG and for Endeavor overall, so incentives are aligned.

Endeavor would also rid Patreon of the “starving artist” brand that still accompanies it and could open a lot of doors for Patreon creators whose careers are gaining momentum. Perhaps other Endeavor companies could access Patreon data to identify specific creators fit for other opportunities.

An Endeavor-Patreon deal would need to occur before Patreon’s valuation gets too high. Endeavor doesn’t have tens of billions in cash sitting on its balance sheet like Google and Facebook do. Endeavor can’t use much debt to buy Patreon either: its leverage ratio is already high, which led Moody’s to put its credit rating under review for downgrade in December. Endeavor has repeatedly raised more equity funding, though, and is likely to do so again; it canceled a $400M investment from the Saudi government at the last minute in October due to political concerns, but is likely pitching other investors to take its place.

Patreon has strong revenue growth and the opportunity to retain dominant market share in providing business infrastructure for creators — a market that appears to be growing. Whether it stays independent and eventually thrives in the public markets, or finds more success under the umbrella of a strategic acquirer, remains to be seen. Right now the latter path is the more compelling one.

YouTube demonetizes anti-vaccination videos

YouTube will demonetize channels that promote anti-vaccination views, after a report by BuzzFeed News found ads, including from health companies, running before anti-vax videos. The platform will also place a new information panel that links to the Wikipedia entry on “vaccine hesitancy” before anti-vax videos. Information panels (part of YouTube’s efforts to combat misinformation) about the measles, mumps, and rubella (MMR) vaccine had already appeared in front of anti-vaccination videos that mentioned it.

In a statement to BuzzFeed News, a YouTube spokesperson said “we have strict policies that govern what videos we allow ads to appear on, and videos that promote anti-vaccination content are a violation of those policies. We enforce these policies vigorously, and if we find a video that violates them, we immediately take action and remove ads.”

This is the second issue this week that has prompted YouTube advertisers to suspend their ads. BuzzFeed News’ initial report on Feb. 20 came as several major advertisers, including Nestle and Epic Games, said they were pausing ads after YouTube creator Matt Watson revealed how the platform’s recommendation algorithm was being exploited by what he described as a “soft-core pedophilia ring.”

BuzzFeed News found that the top search results for queries about vaccine safety were usually from legitimate sources, like hospitals, but then YouTube’s Up Next algorithm would often recommend anti-vaccination videos. Ads, which are placed by YouTube’s advertising algorithm, appeared in front of many of those videos. YouTube also told BuzzFeed it would implement changes to its Up Next algorithm to prevent the spread of anti-vax videos.

Outbreaks of measles throughout the United States and in other countries have prompted scrutiny into the role of social media and tech companies, including Facebook and Google, in spreading misinformation.

Advertisers contacted by BuzzFeed News who said they will take action to prevent their ads from running in front of anti-vax videos include Nomad Health, Retail Me Not, Grammarly, Brilliant Earth, CWCBExpo, XTIVIA, and SolarWinds. Vitacost told BuzzFeed News that it had already pulled ads after the child exploitation issues became known.

Anti-vax channels now demonetized include VAXXED TV, LarryCook333, and iHealthTube.

Companies including Nestle, Epic and reportedly Disney suspend YouTube ads over child exploitation concerns

Days after a YouTube creator accused the platform of enabling a “soft-core pedophilia ring,” several companies have suspended advertising on the platform, including Nestle, Epic, and reportedly Disney and McDonald’s.

Nestle told CNBC that all of its companies in the U.S. have paused advertising on YouTube, while a spokesperson for Epic, maker of the massively popular game Fortnite, said it has suspended all pre-roll advertising. Other companies that confirmed publicly they are pausing YouTube advertising include Purina, GNC, Fairlife, Canada Goose, and Vitacost. Bloomberg and the Wall Street Journal report that Walt Disney Co. and McDonald’s, respectively, have pulled advertising, too.

Other advertisers, including Peloton and Grammarly, said they are calling on YouTube to resolve the issue.

The latest scandal over YouTube’s content moderation problems took off on Sunday when YouTube creator Matt Watson posted a video and in-depth Reddit post describing how pedophiles are able to manipulate the platform’s recommendation algorithm to redirect a search for “bikini haul” videos, featuring adult women, to exploitative clips of children. Some otherwise innocuous videos also had inappropriate comments, including some with timestamps that captured children in compromising positions.

A YouTube spokesperson sent a statement to TechCrunch that said “Any content – including comments – that endangers minors is abhorrent and we have clear policies prohibiting this on YouTube. We took immediate action by deleting accounts and channels, reporting illegal activity to authorities and disabling comments on tens of millions of videos that include minors. There’s more to be done, and we continue to work to improve and catch abuse more quickly.”

The platform has also reported comments to the National Center for Missing and Exploited Children and is taking further steps against child exploitation, including hiring more experts.

Watson’s report, however, highlights that YouTube continues to struggle with content that violates its own policies, even after a series of reports two years ago led to what creators dubbed the “adpocalypse.” In an effort to appease advertisers, YouTube gave them more control over what videos their ads would appear before and also enacted more stringent policies for creators. Many YouTubers, however, have complained that the policies are unevenly enforced with little transparency, dramatically lowering their revenue while giving them little recourse to fix issues or appeal the platform’s decisions, even as objectionable content remains on the platform.