YouTube releases its first report about how it handles flagged videos and policy violations

YouTube has released its first quarterly Community Guidelines Enforcement Report and launched a Reporting Dashboard that lets users see the status of videos they’ve flagged for review. The inaugural report, which covers the last quarter of 2017, follows up on a promise YouTube made in December to give users more transparency into how it handles abuse and decides what videos will be removed.

“This regular update will help show the progress we’re making in removing violative content from our platform,” the company said in a post on its official blog. “By the end of the year, we plan to refine our reporting systems and add additional data, including data on comments, speed of removal and policy removal reasons.”

But the report is unlikely to quell complaints from people who believe YouTube’s rules are haphazardly applied in an effort to appease advertisers upset that their commercials had played before videos with violent extremist content. The issue came to the forefront last year after a report by The Times, but many content creators say YouTube’s updated policies have made it very difficult to monetize on the platform, even though their videos don’t violate its rules.

YouTube, however, claims that its anti-abuse machine learning algorithm, which it relies on to monitor and handle potential violations at scale, is “paying off across high-risk, low-volume areas (like violent extremism) and in high-volume areas (like spam).”

The report says YouTube removed 8.2 million videos during the last quarter of 2017, most of which were spam or contained adult content. Of that number, 6.7 million were first flagged automatically by its anti-abuse algorithms.

Of the videos reported by a person, 1.1 million were flagged by a member of YouTube’s Trusted Flagger program, which includes individuals, government agencies and NGOs that have received training from the platform’s Trust & Safety and Public Policy teams.

YouTube’s report positions the number of views a video received before being removed as a benchmark for the success of its anti-abuse measures. At the beginning of 2017, 8% of videos removed for violent extremist content were taken down before clocking 10 views. After YouTube started using its machine-learning algorithms in June 2017, however, it says that percentage increased to more than 50% (in a footnote, YouTube clarified that this data does not include videos that were automatically flagged before they could be published and therefore received no views). From October to December, 75.9% of all automatically flagged videos on the platform were removed before they received any views.

During that same period, 9.3 million videos were flagged by people, with nearly 95% coming from YouTube users and the rest from its Trusted Flagger program and government agencies or NGOs. People can select a reason when they flag a video. Most were flagged for sexual content (30.1%) or spam (26.4%).

Last year, YouTube said it wanted to increase the number of people “working to address violative content” to 10,000 across Google by the end of 2018. Now it says it has almost reached that goal, having hired more full-time anti-abuse experts and expanded its regional teams. It also claims that the addition of machine-learning algorithms enables more people to review videos.

In its report, YouTube gave more information about how those algorithms work.

“With respect to the automated systems that detect extremist content, our teams have manually reviewed over two million videos to provide large volumes of training examples, which improve the machine learning flagging technology,” it said, adding that it has started applying that technology to other content violations as well.

YouTube’s report may not ameliorate the concerns of content creators who saw their revenue drop during what they refer to as the “Adpocalypse” or help them figure out how to monetize successfully again. On the other hand, it is a victory for people, including free speech activists, who have called for social media platforms to be more transparent about how they handle flagged content and policy violations, and may put more pressure on Facebook and Twitter.

YouTube ads for hundreds of brands still running on extremist and white nationalist channels

It’s been more than a year since YouTube promised to improve controls over what content advertisers would find their ads in front of; eight months since it promised to demonetize “hateful” videos; two months since it said it would downgrade offensive channels; and yet CNN reports that ads from hundreds of major brands are still appearing as pre-rolls for actual Nazis.

The ongoing failure to police billions of hours of content isn’t exactly baffling — this is a difficult problem to solve — but it is disappointing that YouTube seems to have repeatedly erred on the side of monetization.

As with previous reports, CNN’s article shows that ads were running on channels that, if YouTube’s content rules are to be believed, should have been demonetized and demoted instantly: Nazis, pedophiles, extremists of the right, left, and everywhere in between. Maybe even Logan Paul.

And the system appears to be working in strange ways: one screenshot shows a video by a self-avowed Nazi, entitled “David Duke on Harvey Weinstein exposing Jewish domination. Black/White genetic differences.” Below it a YouTube warning states that “certain features have been disabled for this video,” including comments and sharing, because of “content that may be inappropriate or offensive to some audiences.”

A cheerful ad from Nissan is running ahead of this enlightening piece of media, and CNN notes that ads from the Friends of Zion Museum and the Jewish National Fund also ran on it! Ads from the Toy Association ran on the channel of a guy who argued for the decriminalization of pedophilia!

I can’t really add anything to this. It’s so absurd I can barely believe it myself. Remember, this is after the company supposedly spent a year (at the very least) working to prevent this exact thing from happening. I left the headline in the present tense because I’m so certain that it’s still going on.

The responsibility really is YouTube’s, and if it can’t live up to its own promises, companies are going to leave it behind rather than face viral videos of their logo smoothly fading into a swastika on the wall of some sad basement-dwelling bigot. “Subway — eat fresh! And now, some guy’s thoughts on genocide.”

Some of the other brands that had ads run against offensive content: Amazon, Adidas, Cisco, Hilton, Hershey, LinkedIn, Mozilla, Netflix, Nordstrom, The Washington Post, The New York Times, 20th Century Fox Film, Under Armour, The Centers for Disease Control, Department of Transportation, Customs and Border Protection, Veterans Affairs and the US Coast Guard Academy.

I asked YouTube for comment on how this happened — or rather, how it never stopped happening. The company did not address my specific questions, but offered the following statement:

We have partnered with our advertisers to make significant changes to how we approach monetization on YouTube with stricter policies, better controls and greater transparency. When we find that ads mistakenly ran against content that doesn’t comply with our policies, we immediately remove those ads. We know that even when videos meet our advertiser friendly guidelines, not all videos will be appropriate for all brands. But we are committed to working with our advertisers and getting this right.

Very similar to previous statements over the last year or so. I look forward to hearing what brands it thought Nazism and pedophilia were appropriate for.

YouTube promises expansion of sponsorships, other monetization tools for creators

YouTube says it’s rolling out more tools to help its creators make money from their videos. The changes are meant to address creators’ complaints over YouTube’s new monetization policies announced earlier this year. Those policies were designed to make the site more advertiser-friendly following a series of controversies over video content from top creators, including videos from Logan Paul, who had filmed a suicide victim, and PewDiePie, who repeatedly used racial slurs.

The company then decided to set a higher bar to join its YouTube Partner Program, which is what allows video publishers to make money through advertising. Previously, creators only needed 10,000 total views to join; they now need at least 1,000 subscribers and 4,000 hours of view time over the past year to join. This resulted in wide-scale demonetization of videos that previously relied on ads.

The company has also increased policing of video content in recent months, but its systems haven’t always been accurate.

YouTube said in February it was working on better systems for reviewing video content when a video is demonetized over its content. One such change, enacted at the time, involved the use of machine learning technology to address misclassifications of videos related to this policy. This, in turn, has reduced the number of appeals from creators who want a human review of their video content instead.

According to YouTube CEO Susan Wojcicki, the volume of appeals is down by 50 percent as a result.

Wojcicki also announced another new program related to video monetization, which is launching into pilot testing with a small number of creators starting this month.

This system will allow creators to disclose, specifically, what sort of content is in their video during the upload process, as it relates to YouTube’s advertiser-friendly guidelines.

“In an ideal world, we’ll eventually get to a state where creators across the platform are able to accurately represent what’s in their videos so that their insights, combined with those of our algorithmic classifiers and human reviewers, will make the monetization process much smoother with fewer false positive demonetizations,” said Wojcicki.

Essentially, this system would rely on self-disclosure regarding content, which would then be factored in as another signal for YouTube’s monetization algorithms to consider. This was something YouTube had also said in February was in the works.

Because not all videos will be brand-safe or meet the requirements to become a YouTube Partner, YouTube now says it will also roll out alternative means of making money from videos. 

This includes an expansion of “sponsorships,” which have been in testing since last fall with a select group of creators.

Similar to Twitch subscriptions, sponsorships were introduced to the YouTube Gaming community as a way to support favorite creators through monthly subscriptions (at $4.99/mo), while also receiving various perks like custom emoji and a custom badge for live chat.

Now YouTube says “many more creators” will gain access to sponsorships in the months ahead, but it’s not yet saying how those creators will be selected, or whether they’ll have to meet certain requirements. It’s also unclear if YouTube will roll these out more broadly to its community, outside of gaming.

Wojcicki gave updates on various other changes YouTube has enacted in recent months. For example, she said that YouTube’s new moderation tools have led to a 75-plus percent decline in comment flags on channels where they’re enabled, and these tools will now be expanded to 10 languages. YouTube’s newer social network-inspired Community feature has also been expanded to more channels, she noted.

The company also patted itself on the back for its improved communication with the wider creator community, saying that this year it has increased replies to tweets addressed to its official handles – @TeamYouTube, @YTCreators and @YouTube – by 600 percent and improved its reply rate by 75 percent.

While that may be true, it’s notable that YouTube isn’t publicly addressing the growing number of complaints from creators who – rightly or wrongly – believe their channel has been somehow “downgraded” by YouTube’s recommendation algorithms, resulting in declining views and loss of subscribers.

This is the issue that led a disturbed individual, Nasim Najafi Aghdam, to attack YouTube’s headquarters earlier this month. Police said that Aghdam, who shot at YouTube employees before killing herself, was “upset with the policies and practices of YouTube.”

It’s obvious, then, why YouTube is likely proceeding with extreme caution when it comes to communicating its policy changes, and isn’t directly addressing complaints similar to Aghdam’s from others in the community.

But the creator backlash is still making itself known. Just read the Twitter replies or comment thread on Wojcicki’s announcement. YouTube’s smaller creators feel they’ve been unfairly punished because of the misdeeds of a few high-profile stars. They’re angry that they don’t have visibility into why their videos are seeing reduced viewership – they only know that something changed.

YouTube glosses over this by touting the successes of its bigger channels.

“Over the last year, channels earning five figures annually grew more than 35 percent, while channels earning six figures annually grew more than 40 percent,” Wojcicki said, highlighting YouTube’s growth.

In fairness, however, YouTube is in a tough place. Its site became so successful over the years that it became impossible to police all the uploads manually. At first, this was cause for celebration and a chance to put Google’s advanced engineering and technology to work. But these days, as with other sites of similar scale, the challenge of policing bad actors among billions of users is becoming a Herculean task – and one other companies are failing at, too.

YouTube’s over-reliance on algorithms and technology has allowed a lot of awful content to see daylight – including inappropriate videos aimed at children, disturbing videos, terrorist propaganda, hate speech, fake news and conspiracy theories, unlabeled ads disguised as product reviews or as “fun” content, videos of kids that attract pedophiles, and commenting systems that allow harassment and trolling at scale.

To name a few.

YouTube may have woken up late to its numerous issues, but it’s not ignorant of them, at least.

“We know the last year has not been easy for many of you. But we’re committed to listening and using your feedback to help YouTube thrive,” Wojcicki said. “While we’re proud of this progress, I know we have more work to do.”

That’s putting it mildly.

‘Despacito’ and other popular music videos targeted by hackers

It’s been a rough couple of weeks on the hacking front. Earlier today, music syndication service Vevo was hit with a doozy through its YouTube page, targeting some of its most high-profile artists. The list includes, notably, “Despacito” by Luis Fonsi and Daddy Yankee, which rocketed to the top of the video service’s most viewed list last year.

The names of the songs were replaced by the words “Hacked by Prosox & Kuroi’sh,” with the words “Free Palestine” posted beneath them, according to the BBC. Other impacted videos read like a who’s who of pop stars, including Taylor Swift, Drake and Katy Perry.

A spokesperson for the video service told TechCrunch, “Vevo can confirm that a number of videos in its catalogue were subject to a security breach today, which has now been contained. We are working to reinstate all videos affected and our catalogue to be restored to full working order. We are continuing to investigate the source of the breach.”

At the time of this writing, it appears that most of the videos have already been restored to their proper pop glory, though a number were up for some time after the hack was first discovered. One account taking credit for the hack claimed it was ‘just for fun,’ saying the whole thing was pulled off with a simple script — though the actual process was likely a fair bit more involved. 

Breaking: Active shooter reported at YouTube HQ

Reports of an active shooter at YouTube’s San Bruno, CA headquarters have surfaced on social media. Local news station KRON is reporting a number of 911 calls in the area related to the event. Local police have also issued a warning for bystanders to stay out of the area. It appears as though emergency teams have begun evacuating the facility.

The video service’s Bay Area headquarters houses more than 1,100 employees, according to Google. TechCrunch has confirmed with San Bruno Police that there is an ongoing incident in the area, but the department has yet to issue an official statement. It has, however, confirmed to The Hollywood Reporter that the situation constitutes a “very active scene.”

We’ve reached out to Google for more information. We will update this story as we learn more. 

YouTube TV becomes first-ever presenting partner for the NBA Finals, following similar deal with MLB

YouTube TV announced this morning it will become the first-ever presenting sponsor of the 2018 NBA Finals as the result of a multi-year partnership focused on raising consumer awareness of YouTube’s streaming TV service. In addition to becoming the presenting sponsor of the NBA Finals, the deal will also see YouTube TV becoming the presenting sponsor of the Women’s National Basketball Association (WNBA) Finals and the NBA’s minor league, the NBA G League.

The deal is one of many YouTube TV has forged in recent weeks. It has also partnered with the Los Angeles Football Club, Seattle Sounders FC and, most notably, Major League Baseball on similar promotional efforts.

With the Major League Baseball partnership, YouTube TV becomes the presenting sponsor of the World Series for the next two years. That means the game will be referred to as the 2018 “World Series presented by YouTube TV.” The arrangement includes on-air callouts, national TV spots featuring players and in-stadium ads, as well.

Similarly, the NBA deal will see the games referred to as “The Finals presented by YouTube TV.” The YouTube TV logo will also be featured on the court and in the arena. Plus, YouTube TV will get on-air callouts, and be featured in ABC commercial spots, as well as across the NBA’s social media channels and digital assets.

ABC has aired the Finals since 2003, but today, more people are cutting the cord with traditional broadcast and cable television, and losing access to some sports programming as a result – unless they install an antenna. One firm, eMarketer, estimated there are 22.2 million cord cutters in the U.S., but combined with the “cord nevers” who never sign up to begin with, there are 56.6 million U.S. consumers going without pay TV.

In particular, the younger demographic is giving up TV in greater numbers, as other entertainment options – like the internet itself, mobile apps, and YouTube – have gained their attention. This has led to sports leagues trying to find new ways to attract this audience. Other efforts over the years have included streaming games through social media sites like Facebook and Twitter, and doing deals with Snapchat to produce original shows, among other things.

The NBA previously worked with Twitter on original programming, for example.

Amid all the cord cutting, a new crop of streaming TV services has been trying to bring back TV – including major channels like ABC, Fox, NBC and CBS. YouTube TV is one of several services in this space, where it competes with Dish’s Sling TV, AT&T’s DirecTV Now, Hulu with Live TV, PlayStation Vue and fuboTV.

The oldest, Sling TV, is the largest of the bunch, with 2.21 million subscribers, followed by AT&T’s DirecTV Now with 1 million. YouTube TV is battling with Hulu for third place, which Hulu currently holds.

Going directly after sports fans with all these partnership deals is one way YouTube TV is getting the word out to those who don’t yet know the service even exists.

The Finals will air on ABC starting on Thursday, May 31, and will also be streamed on YouTube TV through ABC. The streaming service is currently available in nearly 100 U.S. markets, or over 85 percent of U.S. households.

The deal also follows an expansion of the YouTube TV service, which saw additions of Turner-owned networks, NBA TV and MLB Network, and a price increase of $5, bringing YouTube TV to $40 per month.

In addition, YouTube itself has worked with the NBA for over a decade. The NBA was the first league to launch its own YouTube channel in 2005, and joined YouTube’s “Claim Your Content” program in 2007. Its channel now has over 4.6 billion views.

“The NBA and ESPN have a history of creating innovative sponsorships, and this certainly qualifies,” said Wendell Scott, Senior Vice President, Multimedia Sales, ESPN, in a statement. “YouTube TV is a next generation partner for an ascendant league and we look forward to working with them to maximize the impact of their investment across ABC and the ESPN platform.”

Get ready to start seeing more local ads on YouTube

YouTube’s video ad creation service aimed at helping small businesses reach YouTube viewers is now available more broadly across the U.S. The company announced this morning that YouTube Director onsite, as the service is called, is now live in over 170 U.S. cities, up from only nine previously – Atlanta, Boston, Chicago, Los Angeles, San Francisco, Washington, D.C., New York, Tampa and Seattle.

This is a significant expansion in terms of reaching potential YouTube advertisers who would otherwise not have had the resources to write, film and edit a professional ad for YouTube.

The service is kind of a bargain for the small businesses, too. Hiring a pro to create a professionally produced video could cost $1,000 or more. But YouTube is basically doing it for free – well, free with a catch.

It’s available at no charge for any business that commits to spending at least $350 to advertise the video on YouTube. That’s in line with the low end of buying airtime for a 30-second local TV ad, which ranges from $200 to $1,500+, depending on the time slot.

YouTube Director onsite works by connecting area businesses with YouTube-approved filmmakers, who schedule a call with the advertiser to learn about the business and help write a script. The filmmaker then comes to the business to film the video, and returns an edited version the next week. YouTube’s ad experts help get the video uploaded to the site, and aid the business in crafting its YouTube ad campaign.

The company hasn’t shared any comprehensive metrics on how well these ads perform, but did note in a blog post a single case study where a custom guitar shop saw a 13x return on ad spend, and a 130 percent increase in revenue from the ad. The YouTube Director onsite website also features a number of other ads created via the service, to showcase the professional quality of what can be produced.

The company has claimed for years that YouTube ads are more effective than TV because they allow targeting – but that’s an argument that can be made for many sorts of online ads. In addition, YouTube reaches a younger demographic, so small businesses should keep in mind that they may need other ways to reach those over the age of 35, for example.

The timing of this U.S. expansion is relevant because YouTube just last week announced new AdWords experiences that tie together Google searches with YouTube advertising and calls-to-action.

“Soon you’ll be able to reach people on YouTube who recently searched for your products or services on Google. For example, an airline could reach people on YouTube who recently searched Google.com for ‘flights to Hawaii.’ We call this custom intent audiences,” Google explained in its recent announcement.

The company had previously allowed Google account user data to influence YouTube ads, starting in 2017. With custom intent audiences, advertisers can now create a keyword list for their video in AdWords. They can then combine this targeting feature with YouTube’s new direct response video ad format, TrueView, which offers a customizable call-to-action in a video ad.

The ads created by YouTube Director onsite will support this feature as well, allowing businesses to capture leads or referrals, or whatever else is important to their specific business.

In other words, if you thought having the shoes you abandoned in a retailer’s shopping cart following you around the web was weird, wait until YouTube starts showing you ads for local businesses that match up with what you’ve just been googling. (By the way, Google does let you opt out of personalized ads if that’s how you roll.)

YouTube is reportedly introducing your kids to conspiracy theories, too

During a recent appearance at the South by Southwest festival, YouTube CEO Susan Wojcicki suggested that YouTube is countering the conspiracy-related videos that have been spreading like wildfire on the platform — including videos telling viewers that high school senior and Parkland, Fla. shooting survivor David Hogg is an actor.

Specifically, Wojcicki outlined YouTube’s plans to add “information cues,” including links to Wikipedia pages that debunk garbage content for viewers if they choose to learn more. (Somewhat strangely, no one had told Wikipedia about this plan.)

Either way, the platform is going to have to do much better than that, suggests a new Business Insider report that says YouTube Kids has a huge problem with conspiracy videos, too. To wit, the three-year-old, ostensibly kid-friendly version of YouTube is showing its young viewers videos that preach the nonsensical, including “that the world is flat, that the moon landing was faked, and that the planet is ruled by reptile-human hybrids,” according to BI’s own first-hand findings.

In fact, when BI searched for “UFO” on YouTube Kids, one of the top videos to appear was a nearly five-hour-long lecture by professional conspiracy theorist David Icke, who covers everything in the clip from “reptile human bloodlines” to the Freemasons, whom he credits with building the Statue of Liberty, Las Vegas, Christianity and Islam, among other things. (The Freemasons also killed President John F. Kennedy, he tells viewers.)

Business Insider says YouTube removed the videos from YouTube Kids after its editorial team contacted the company. YouTube also issued the following statement: “The YouTube Kids app is home to a wide variety of content that includes enriching and entertaining videos for families. This content is screened using human trained systems. That being said, no system is perfect and sometimes we miss the mark. When we do, we take immediate action to block the videos or, as necessary, channels from appearing in the app. We will continue to work to improve the YouTube Kids app experience.”

That’s not going to be good enough for parents who are paying attention. Hunter Walk, a venture capitalist who previously led product at YouTube and has a young daughter, may have summed it up best in a tweet that he published earlier this afternoon, writing that “when you create and market an app to kids, the level of care and custodial responsibility you need to take is 100x usual. Clean it up or shut it down pls.”

YouTube has been reluctant to tinker with its recommendation algorithm because its “main objective is to keep you consuming YouTube videos for as long as possible,” Wired noted this past week. (Crazy theories are apparently quite sticky.) Wired also reported that, despite a recent uproar about all the conspiracy theory content, YouTube still doesn’t have clear rules around whether these videos violate its community guidelines, which cover bullying, hate speech, graphic violence and sexually explicit content.

Wojcicki said during her festival appearance that “People can still watch the videos, but then they have access to additional information.”

Hopefully, YouTube will come up with a more sophisticated solution to the spread of misinformation, especially when it comes to its younger viewers. We don’t yet know the scale of this particular issue (we’ve reached out to YouTube to see if the company is able and willing to discuss it in further detail). But as it is, this editor doesn’t allow her kids to watch YouTube Kids without strict supervision for fear of what they might see. At this point, we’d be surprised if parents at YouTube did otherwise.

Wikipedia wasn’t aware of YouTube’s conspiracy video plan

YouTube has a plan to combat the abundant conspiracy theories that feature in credulous videos on its platform; not a very good plan, but a plan just the same. It’s using information drawn from Wikipedia relevant to some of the more popular conspiracy theories, and putting that info front and center on videos that dabble in… creative historical re-imaginings.

The plan is being criticized from a number of quarters (including this one) for essentially sloughing responsibility for this harmful content onto another, volunteer-based organization. But it turns out that’s not a responsibility Wikipedia even knew it was taking on.

Wikimedia Foundation executive director Katherine Maher notes that YouTube did this whole thing “independent” of the organization, and an official statement from Wikimedia says that it was “not given advance notice of this announcement.”

Everyone on the Wikimedia side is taking this pretty much in stride, however, expressing happiness at seeing their content used to drive the sharing of “free knowledge” – but it does seem like something YouTube could’ve flagged before announcing the new feature on stage at SXSW.

Maybe YouTube couldn’t say anything because the Illuminati bound them to secrecy… because of the chemtrails.