Apple and Google bow to pressure in Russia to remove Kremlin critic’s tactical voting app

Apple and Google have removed a tactical voting app created by the organization of jailed Kremlin critic Alexei Navalny from their respective mobile app stores in Russia.

Earlier this week, Reuters reported that the Russian state had been amping up the pressure on foreign tech giants ahead of federal elections — appropriating the language of “election interference” to push US companies to censor the high-profile political opponent of President Putin.

On Twitter today, a key Navalny ally, Ivan Zhdanov, tweeted that his organization is considering suing Apple and Google over the removal of the app — dubbing the act of censorship a “huge mistake”.

Zhdanov has also published what he says is Apple’s response to Team Navalny — in which the tech giant cites the Kremlin’s classification of a number of pro-Navalny organizations as “extremist” groups to justify its removal of the software.

(Image credit: Screengrab of detail from Apple’s notification to the developer, via Zhdanov’s tweet)

Apple and Google routinely say they comply with ‘all local laws’ in the countries where they operate.

However, in Russia that stance means they have become complicit in acts of political censorship.

“We note that the Prosecutor’s Office of the Russian Federation and the Prosecutor’s Office of the City of Moscow have also determined that the app violates the legislation of the Russian Federation by enabling interference in elections,” Apple writes in the notification of takedown it sent to the developer of the tactical voting app.

“While your app has been removed from the Russia App Store, it is still available in the App Stores for the other territories you selected in App Store Connect,” Apple adds.

Apple and Google have been contacted for comment on the removal of Navalny’s app.


Also via Twitter, Zhdanov urged supporters to focus on the tactical voting mission — tweeting a link to a video hosted on Google-owned YouTube which contains recommendations to Russians on how to cast an anti-Putin vote in the parliamentary elections, which run from today through Sunday.

Navalny’s supporters are hoping to mobilize voters across Russia to cast tactical ballots in a bid to undermine Putin by voting for whichever candidate has the best chance of defeating his ruling United Russia party.

Their tactical voting strategy has faced some criticism — given that many of the suggested alternatives are, at best, only very weakly opposed to Putin’s regime.

However, Navalny’s supporters would surely point out that they are having to operate within a flawed system.

After Apple and Google initially refused to remove Navalny’s ‘Smart Voting’ app last month, the Russian state began attempting to block access to his organization’s website.

It has even reportedly targeted Google Docs — which supporters of Navalny have also been using to organize tactical voting efforts.

Screengrab of the Smart Voting app on the UK iOS app store (Image credits: Natasha Lomas/TechCrunch)

Earlier this month Reuters reported that Russia’s communications regulator, Roskomnadzor, had threatened Apple and Google with fines if they did not remove the Smart Voting app — warning that failure to comply could be interpreted as election meddling.

Russian press has also reported that Apple and Google were summoned to a meeting at the Federation Council on the eve of the election — as Putin’s regime sought to force them to do his anti-democratic bidding.

According to a report by Kommersant, the tech giants were warned the Russian Federation was preparing to tighten regulations on their businesses — and told to “come to their senses”, facing another warning that they were at a “red line”.

That last-ditch effort to force the platforms to remove Navalny’s app evidently paid off.

In recent weeks, Roskomnadzor has also been targeting VPN apps in the country for removal — making it hard for Russians to circumvent the local ban on Navalny’s app by accessing the software through the stores of other countries.

Local search giant, Yandex, has also reportedly been ordered not to display search results for the Smart Voting app.

Earlier this year, Putin’s regime also targeted Twitter — throttling the service for failing to remove content it wanted banned, although Roskomnadzor claimed the action related to non-political content, such as material promoting suicide among minors, child sexual exploitation and drug use.

Flipboard rolls out newsfeed personalization tools to save you from doomscrolling

Facebook is preparing to adjust its News Feed to de-emphasize political posts and current events, but news reader Flipboard is instead rolling out an update that puts users in control of their own feeds. The company announced this morning the launch of a new controller on the cover of its own main newsfeed, aka the “For You” feed, which now allows users to select new topics to follow and deselect those they no longer want to hear about. The feature, which Flipboard dubs “an antidote to doomscrolling,” allows users to customize their For You feed to deliver a wider selection of stories related to their various interests, instead of focusing their home page on breaking news and politics.

Given today’s current events — a pandemic that’s dragging on, climate change-induced wildfires and major storms, the fall of Afghanistan and other disasters — it’s no wonder why people want to take a break from the daily news. But for Flipboard, that trend could mean reduced use of its news-reading app, as well.

While Flipboard notes that millions do use its app to keep up with breaking stories and politics, a majority of its user base also spends their time engaging with other topics — like travel, food, photography, fitness and parenting.

Image Credits: Flipboard

The company believes that by introducing tools that allow users to customize their own feeds, users will not only see improved mental health but will also spend more time in the Flipboard app. Already, this appears to be true, based on other recent changes Flipboard has made.

The company recently introduced topic personalization features, which allowed users to zero in on more niche interests — think, not just cooking but keto cooking; not just health, but mindfulness and sleep, for example. Users who customized their preferences spent between nine and 12 minutes per day reading stories about these topics, on average, Flipboard found.

With the launch of For You newsfeed controls, Flipboard wants to bring a similar level of customization and control to users’ own homepages.

The company said the feature also addresses the top request from users — they’ve been asking to have more control over the content selection in their For You feed.

To use the feature, you’ll look for the new filter toggles at the top of the main page. After tapping the icon, you’ll be launched into a window where you can tap and untap a range of topics. You can also use the search bar to discover other interests that may not be listed. When you’re finished customizing, you’ll just tap “Save” to exit back to your newly customized For You feed.

Image Credits: Flipboard

Flipboard hopes its customization capabilities will help it stand out from other news reading experiences — whether that’s browsing news inside social media feeds or even in dedicated news reading apps.

“This level of content control is unique to Flipboard; just think about how hard it is to adjust your feed on any other platform,” noted Flipboard CEO Mike McCue, when introducing the update. “A highly personalized feed empowers people to focus on the things that matter to them, without being distracted by doomscrolling, misinformation or browsing through other people’s lives. We build a platform that lets people take control of their media consumption rather than letting it control them,” he added.

Court orders US Capitol rioter to unlock his laptop ‘with his face’

A federal judge in Washington DC has ordered a man accused of participating in the U.S. Capitol riot on January 6 to unlock his laptop “with his face,” after prosecutors argued that the laptop likely contains video footage that would incriminate him in the attempted insurrection.

Guy Reffitt was arrested in late January, three weeks after he participated in the riot, and has been in jail since. He has pleaded not guilty to five federal charges, including bringing a firearm to the Capitol grounds and obstructing justice. His Windows laptop was one of several devices seized by the FBI; investigators said the laptop was protected with a password but could also be unlocked using Reffitt’s face.

Prosecutors said forensic evidence suggested that the laptop contained gigabytes of footage from Reffitt’s helmet-worn camera that he allegedly used to record some of the riot, and asked the court if it could compel Reffitt to sit in front of the computer to unlock it.

Reffitt’s lawyer told the court that his client could “not remember” the password, but the court sided with the government and granted the motion to compel his biometrics. Reffitt’s lawyer told CNN, which first reported the court order, that the laptop is now unlocked.

The government took advantage of a loophole in the Fifth Amendment, the constitutional protection that grants anyone in the U.S. the right to remain silent, which includes the right not to turn over information that could implicate them in a crime, such as a password. But some courts have ruled that those protections don’t extend to a person’s physical attributes that can be used in place of a password, such as a face scan or fingerprint.

In Reffitt’s indictment, the FBI made that argument, stating that compelling Reffitt to unlock his computer by sitting in front of it “would not run afoul of the defendant’s Fifth Amendment right against self-incrimination.”

Courts across the U.S. are still divided on the reading of the Fifth Amendment and whether or not it applies to the compelled use of a person’s biometrics. The U.S. Supreme Court isn’t likely to address the issue any time soon, rejecting two petitions in as many years to rule on the matter, leaving it largely up to the states to decide.

Biden admin will share more info with online platforms on ‘front lines’ of domestic terror fight

The Biden administration is outlining new plans to combat domestic terrorism in light of the January 6 attack on the U.S. Capitol, and social media companies have their own part to play.

The White House released a new national strategy on countering domestic terrorism Tuesday. The plan acknowledges the key role that online platforms play in bringing violent ideas into the mainstream, going as far as calling social media sites the “front lines” of the war on domestic terrorism.

“The widespread availability of domestic terrorist recruitment material online is a national security threat whose front lines are overwhelmingly private–sector online platforms, and we are committed to informing more effectively the escalating efforts by those platforms to secure those front lines,” the White House plan states.

The Biden administration committed to more information sharing with the tech sector to fight the tide of online extremism, part of a push to intervene well before extremists can organize violence. According to a fact sheet on the new domestic terror plan, the U.S. government will prioritize “increased information sharing with the technology sector,” specifically online platforms where extremism is incubated and organized.

“Continuing to enhance the domestic terrorism–related information offered to the private sector, especially the technology sector, will facilitate more robust efforts outside the government to counter terrorists’ abuse of Internet–based communications platforms to recruit others to engage in violence,” the White House plan states.

In remarks timed with the release of the domestic terror strategy, Attorney General Merrick Garland asserted that coordinating with the tech sector is “particularly important” for interrupting extremists who organize and recruit on online platforms and emphasized plans to share enhanced information on potential domestic terror threats.

In spite of the new initiatives, the Biden administration admits that domestic terrorism recruitment material will inevitably remain available online, particularly on platforms that don’t prioritize its removal — like most social media platforms, prior to January 2021 — and on end-to-end encrypted apps, many of which saw an influx of users when social media companies cracked down on extremism in the U.S. earlier this year.

“Dealing with the supply is therefore necessary but not sufficient: we must address the demand too,” the White House plan states. “Today’s digital age requires an American population that can utilize essential aspects of Internet–based communications platforms while avoiding vulnerability to domestic terrorist recruitment and other harmful content.”

The Biden administration will also address vulnerability to online extremism through digital literacy programs, including “educational materials” and “skills–enhancing online games” designed to inoculate Americans against domestic extremism recruitment efforts, and presumably disinformation and misinformation more broadly.

The plan stops short of naming domestic terror elements like QAnon and the “Stop the Steal” movement specifically, though it acknowledges the range of ways domestic terror can manifest, from small informal groups to organized militias.

A report from the Office of the Director of National Intelligence in March observed the elevated threat to the U.S. that domestic terrorism poses in 2021, noting that domestic extremists leverage mainstream social media sites to recruit new members, organize in-person events and share materials that can lead to violence.

Medium sees more employee exits after CEO publishes ‘culture memo’

In April, Medium CEO Ev Williams wrote a memo to his staff about the company’s shifting culture in the wake of a challenging year.

“A healthy culture brings out the best in people,” he wrote. “They feel psychologically safe voicing their ideas and engaging in debate to find the best answer to any question — knowing that their coworkers are assuming good intent and giving them the benefit of the doubt because they give that in return.”

A few paragraphs later, Williams wrote that while counterperspectives and unpopular opinions are “always encouraged” to help make decisions, “repeated interactions that are nonconstructive, cast doubt, assume bad intent, make unsubstantiated accusations, or otherwise do not contribute to a positive environment have a massive negative impact on the team and working environment.”

He added: “These behaviors are not tolerated.”

The internal memo, obtained and verified by TechCrunch, was published nearly one month after Medium staff’s unionization attempt failed to pass, and roughly one week after Williams announced a pivot of the company’s editorial ambitions to focus less on in-house content and more on user-generated work.

Medium’s editorial team got voluntary payouts as part of the shift, with VP of Editorial Siobhan O’Connor and the entire staff of GEN Magazine stepping away.

However, several current and former employees told TechCrunch that they believe Medium’s mass exodus is tied more to Williams’ manifesto, dubbed “the culture memo,” than a pivot in editorial focus. Since the memo was published, many non-editorial staffers — who would presumably not be impacted by a shift in content priorities — have left the company, including product managers, several designers and dozens of engineers.


Those departing allege that Williams is trying to perform yet another reset of company strategy, at the cost of its most diverse talent. One pull of internal data, which covers engineers, editorial staff, the product team, and a portion of the HR and finance teams, suggests that of the 241 people who started the year at Medium, some 50% are now gone. Medium, which has hired employees to fill some vacancies, denied these metrics, stating that it currently has 179 employees.

Medium said that 52% of departures were white, and that one third of the company is non-white and non-Asian. The first engineer that TechCrunch spoke to said that minorities are overrepresented in the departures at the company. They also added that, when they joined Medium, there were three transgender engineers. All have since left.

“A beloved dictator vibe”

In February, a number of Medium employees — led by the editorial staff — announced plans to organize into a union. The unionization effort was eventually defeated after falling short by one vote, a shortfall that some employees think was due to Medium executives pressuring staff to vote against the union.

The month after the unionization effort failed, Medium announced an editorial pivot. The company offered new positions or voluntary payouts for editorial staff. A number of employees left, which is not uncommon in the aftermath of a tense period such as a failed unionization effort, particularly when a clear, financially safe route out is on offer.

In April, Williams posted the culture memo outlining his view on the company’s purpose and operating principles. In the memo, he writes that “there is no growth without risk-taking and no risk-taking without occasional failure” and that “feedback is a gift, and even tough feedback can and should be delivered with empathy and grace.” The CEO also noted the company’s commitment to diversity, and how adapting to “opportunities or threats is a prerequisite for winning.”

Notably, Medium has gone through a number of editorial strategy changes, dipping in and out of subscriptions, in-house content, and now, leaning on user-generated content and paid commissions.

“Team changes, strategy changes and reorganizations are inevitable. Each person’s adaptivity is a core strength of the company,” the memo reads.

The memo doesn’t explicitly address the unionization attempt, but does talk about how Medium will not tolerate “repeated interactions that are nonconstructive, cast doubt, assume bad intent, make unsubstantiated accusations, or otherwise do not contribute to a positive environment [but] have a massive negative impact on the team and working environment.”

Employees that we spoke to think that Williams’ memo, while internal rather than publicly posted, is reminiscent of statements put out by Coinbase CEO Brian Armstrong and Basecamp CEO Jason Fried, which both banned political discussion at work due to its incendiary or “distracting” nature. While the Medium memo doesn’t wholly ban politics, the first engineer said that the “undertone” of the statement creates a “not safe work environment.” Frustrated employees created a side-Slack to talk about issues at Medium.

In a statement to TechCrunch, Medium said that “many employees said they appreciated the clarity and there were directors and managers involved in shaping it.”

In the month the memo was published, churn at the company tripled compared to the month prior and was 30 times higher than in January, according to an internal data set obtained by TechCrunch.

The second engineer that spoke to TechCrunch left the company last month and said that the memo didn’t have anything “egregious” at first glance.

“It was more of a beloved dictator vibe, of like, your words are vague enough that they’re not enforceable on anything else, and it looks good on paper,” they said. “If you just saw that memo and nothing else, it’s not a Coinbase memo, it’s not a Basecamp memo.”

But, given the timing of the memo, the engineer said their interpretation of Williams’ message was clear.

“[Medium wants] to enforce good vibes and shut down anything that is questioning ‘the mission,’” they said.

Medium’s extreme 

The same engineer thinks that “very few people left because of the editorial pivot.” Instead, the engineer described a history of problematic issues at Medium, with a wave of departures that appear to have been triggered by the memo.

In July 2019, for example, Medium chose to publish a series that included a profile of Trump supporter Joy Villa with the headline “I have never been as prosecuted for being Black or Latina as I have been for supporting Trump.”

When the Latinx community at Medium spoke to leadership about their discomfort with the headline, they claimed that editorial executives did nothing about it until the issue was raised in a public Slack channel. One editor asked anyone who had gone through the immigration process or was a part of the Latinx community to get in a room and explain their side, a moment that felt diminishing to this employee. The headline was only changed after employees aired their qualms in that public channel.

“They think caring is enough,” the employee said. “And that listening is merciful and really caring, and therefore they’re really shocked when that is not enough.”

The third engineer who spoke to TechCrunch joined the company in 2019 because they were looking for a mission-driven company impacting more than just tech. They realized Medium had “deeper issues” during the Black Lives Matter movement last summer.

“There were deeper issues that I just hadn’t heard about because I wasn’t part of them. That just kind of got slid under the rug,” they said, such as the Trump supporter profile. The former employee explained how they learned that HR had ignored a report of an employee saying the N-word during that time, too. Medium said this is false.

“I don’t feel like I needed the memo to really understand their true colors,” they said.

After The Verge and Platformer published a report on Medium’s messy culture and chaotic editorial strategy, the second engineer said that multiple employees who were assumed to be tied to the story were pressured to resign.

“The way I see it, they fought dirty to defeat the union,” the first engineer said. “But it wasn’t a total success because all of these people have decided to leave in the wake of the decision, and that’s the cost. The people who are left basically feel like they have to nod and smile because Medium has made it clear that they don’t want you to bring your full self to work.”

The engineer said that Medium’s culture reckoning is different from Coinbase’s because of Medium’s mission-oriented promise.

“Some companies, like Coinbase, have said that ‘we want people who are not going to bring politics and social issues to work,’ so if you join Coinbase, that’s what you are expecting, and that’s fine,” they said. “But Medium specifically recruited people who care about the world, and justice, and believe in the freedom of speech and transparency.”

The engineer plans to officially resign soon and already has interviews lined up.

“It’s a good job market out there for software engineers, so why would I work for a company that is treating their own people unfairly?”

Twitter bans James O’Keefe of Project Veritas over fake account policy

Twitter has banned right-wing provocateur James O’Keefe, creator of political gotcha video producer Project Veritas, for violating its “platform manipulation and spam policy,” suggesting he was operating multiple accounts in an unsanctioned way. O’Keefe has already announced that he will sue the company for defamation.

The ban, or “permanent suspension” as Twitter calls it, occurred Thursday afternoon. A Twitter representative said the action followed the violation of rules prohibiting “operating fake accounts” and attempting to “artificially amplify or disrupt conversations through the use of multiple accounts.”

This suggests O’Keefe was banned for operating multiple accounts, outside the laissez-faire policy that lets people have a professional and a personal account, and that sort of thing.

But sharp-eyed users noticed that O’Keefe’s last tweet unironically accused reporter Jesse Hicks of impersonation, including an image showing an unredacted phone number supposedly belonging to Hicks. This too may have run afoul of Twitter’s rules about posting personal information, but Twitter declined to comment on this when I asked.

Supporters of O’Keefe say that the company removed his account as retribution for his most recent “exposé” involving surreptitious recordings of a CNN employee admitting the news organization has a political bias. (The person he was talking to had matched with him on Tinder while impersonating a nurse.)

For his part, O’Keefe said he would be suing Twitter for defamation over the allegation that he operated fake accounts. I’ve contacted Project Veritas for more information.

Facebook to test downranking political content in News Feed

After years of optimizing its products for engagement, no matter the costs, Facebook announced today it will “test” changes to its News Feed focused on reducing the distribution of political content. The company qualified that these tests will be temporary, will impact only a small percentage of people, and will run only in select markets, including the U.S., Canada, Brazil, and Indonesia.

The point of the experiments, Facebook says, is to explore a variety of ways it can rank political content in the News Feed using different signals, in order to decide on what approach it may take in the future.

It also notes that COVID-19 information from authoritative health organizations like the CDC and WHO, as well as national and regional health agencies and services, will be exempt from being downranked in the News Feed during these tests. Similarly, content from official government agencies will not be impacted.

The tests may also include a survey component, where Facebook asks impacted users about their experience.

Facebook’s announcement of the tests is meant to sound underwhelming because any large-scale changes would be an admission of guilt, of sorts. Facebook has the capacity to make far greater changes — when it wanted to downrank publisher content, it did so, decimating a number of media businesses along the way. In previous years, it also took harder action against low-quality sites, scrapers, clickbait, spam, and more.

The news of Facebook’s tests comes at a time when people are questioning social media’s influence and direction. A growing number of social media users now believe tech platforms have been playing a role in radicalizing people, as their algorithms promote unbalanced views of the world, isolate people into social media bubbles, and allow dangerous speech and misinformation to go viral.

In a poll this week, reported by Axios, a majority of Americans said they now believe social media radicalizes, with 74% also saying misinformation is an extremely or very serious problem. Another 76% believe social media was at least partially responsible for the Capitol riot, which 7 in 10 think is the result of unchecked extreme behavior online, the report noted.

Meanwhile, a third of Americans regularly get their news from Facebook, according to a study from Pew Research Center, which means they’re often now reading more extreme viewpoints from fringe publishers, a related Pew study had found.

Elsewhere in the world, Facebook has been accused of exacerbating political unrest, including the deadly riots in Indonesia, genocide in Myanmar, the spread of misinformation in Brazil during elections, and more.

Facebook, however, today argues that political content is a small amount of the News Feed (e.g. 6% of what people in the U.S. see) — an attempt to deflect any blame for the state of the world, while positioning the downranking change as just something user feedback demands that Facebook explore.

Microsoft PAC blacklists election objectors and shifts lobbying weight towards progressive organizations

After “pausing” political giving to any politician who voted to overturn the 2020 election, Microsoft has clarified changes to its lobbying policy, doubling down on its original intention and changing gears with an eye towards funding impactful organizations.

Microsoft, along with most other major companies in the tech sector and plenty others, announced a halt to political donations in the chaotic wake of the Capitol riots and subsequent partisan clashes over the legitimacy of the election.

At the time, Microsoft said that it often pauses donations during the transition to a new Congress, but in this case it would not resume them “until after it assesses the implications of last week’s events” and “consult[s] with employees.”

Assessing and consulting can take a long time, especially in matters of allocating cash in politics, but Microsoft seems to have accomplished its goal in relatively short order. In a series of sessions over the last two weeks involving over 300 employees who contribute to the PAC, the company arrived at a new strategy that reflects its priorities.

In short, the PAC is blacklisting any Senator, Representative, government official, or organization that voted for or supported the attempt to overturn the election. Fortunately, there doesn’t seem to be a lot of grey area here, which simplifies the process somewhat. This restriction will remain in place until the 2022 election — which, frighteningly, happens next year.

In fact, as an alternative to donating to individual candidates and politicians in the first place, the PAC will establish a new fund to “support organizations that promote public transparency, campaign finance reform, and voting rights.”

More details on this are forthcoming, but it’s a significant change from direct support of candidates to independent organizations. One hardly knows what a candidate’s fund goes to (Super Bowl ads this time of year), but giving half a million bucks to a group challenging voter suppression and gerrymandering in a hotly contested district can make a big difference. (Work like this on a large scale helped tip Georgia from red to blue, for instance, and it didn’t happen overnight, or for free.)

There’s even a hint of a larger change in the offing, as Microsoft’s communications head Frank X. Shaw suggests in the blog post that “we believe there is an opportunity to learn and work together” with like-minded companies and PACs. If that isn’t a sly invitation to create a coalition of the like-minded, I don’t know what is.

The company also will be changing the name of the PAC to the Microsoft Corporation Voluntary PAC to better communicate that it’s funded by voluntary contributions from employees and stakeholders and isn’t just a big corporate lobbying slush fund.

As we saw around the time of the original “pause,” and indeed with many other actions in the tech industry over the last year, it’s likely that one large company (in this case Microsoft) getting specific with its political moves will trigger more who just didn’t want to be the first to go. It’s difficult to predict exactly what the long-term ramifications of these changes will be (as they are still quite general and tentative) but it seems safe to say that the political funding landscape of the next election period will look quite a bit different from the last one.

Facebook will give academic researchers access to 2020 election ad targeting data

Starting next month, Facebook will open up academic access to a dataset of 1.3 million political and social issue ads, including those that ran between August 3 and November 3, 2020 — Election Day in the U.S.

Facebook’s Ad Library, launched in 2019, offers a searchable database of all ads running on Facebook and Instagram. Implemented after the 2016 Russian election interference fiasco, the database allows researchers and reporters to drill down into ads by topic, company and candidate, displaying data about when an ad ran, who saw it and how much it cost.
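For a sense of how researchers and reporters already pull data out of that public library programmatically, here is a minimal sketch of a query against the Graph API’s ads_archive endpoint. The API version, parameter names and field names shown are assumptions drawn from Facebook’s public Ad Library API documentation and may differ across versions; the deeper targeting data discussed below is not exposed through this public interface.

```python
# Minimal sketch: querying Facebook's public Ad Library via the Graph API's
# "ads_archive" endpoint. Version, parameter and field names are assumptions
# based on the publicly documented Ad Library API and may differ by version;
# a verified developer access token is required. The deeper targeting data
# described in the article is NOT available here -- it is limited to the
# Facebook Open Research & Transparency platform.
import requests

ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"  # hypothetical placeholder
API_URL = "https://graph.facebook.com/v12.0/ads_archive"

params = {
    "access_token": ACCESS_TOKEN,
    "ad_type": "POLITICAL_AND_ISSUE_ADS",  # restrict results to political/issue ads
    "ad_reached_countries": "['US']",      # ads delivered in the U.S.
    "search_terms": "election",            # free-text search over ad creative
    "fields": "page_name,ad_delivery_start_time,ad_delivery_stop_time,spend,impressions",
    "limit": 25,
}

resp = requests.get(API_URL, params=params)
resp.raise_for_status()

for ad in resp.json().get("data", []):
    # spend and impressions come back as ranges (lower/upper bounds), not exact figures
    print(ad.get("page_name"), ad.get("ad_delivery_start_time"), ad.get("spend"))
```

Results are paginated in the usual Graph API style, so walking the full archive means following the `paging.next` URL returned with each page.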

Facebook says the decision to offer a deeper look into ads on the platform comes after feedback from the research community, which specifically requested more information about targeting. Facebook’s extremely granular ad targeting tools are of particular interest to researchers, who will soon have access to data on why certain people saw ads, including location and interests.

“We recognize that understanding the online political advertising landscape is key to protecting elections, and we know we can’t do it alone,” Facebook Product Manager Sarah Clark Schiff wrote in the announcement.

The company’s ad targeting systems have landed it in hot water in the past. In 2016, Facebook disabled a targeting option for “ethnic affinity” in credit, housing and employment-related ad categories following reporting on how those tools could be abused for illegal discrimination. In 2018, the company removed 5,000 additional ad targeting options due to similar potential for discriminatory advertising practices. And the extent to which the Trump campaign sailed into the White House on the strength of its microtargeted Facebook ad operations is still a matter of debate.

Regardless of how you feel about the tools themselves, Facebook’s public-facing ad library has been an invaluable tool for reporters, providing both issue-specific deep dives and an easy at-a-glance view of political spending by party, race and candidate. The new targeting data won’t live on the public Ad Library but will instead be limited to the Facebook Open Research & Transparency platform, which is only accessible by university-linked researchers.

VCs dispense with niceties during Capitol riots: “Never talk to me again”

It was hard not to feel emotional today, as the world watched for more than four hours as rioters stormed into and throughout the Capitol building in Washington to disrupt the certification of the election win of incoming U.S. President-Elect Joe Biden. They’d been encouraged earlier in the afternoon by outgoing President Donald Trump to head to the building and protest what he falsely claimed yet again was a stolen election, a lie he began to spread the evening of the U.S. election in November.

While members of Congress called on Trump to make a statement rebuking the rioters’ actions from their undisclosed locations, he instead encouraged his supporters over Twitter, writing of the “sacred landslide election victory” that was “so unceremoniously & viciously stripped away from great patriots” and later posting a video in which he repeated his lies about a “landslide election that was stolen from us.”

It was the first time in American history that supporters of the losing presidential candidate forcibly disrupted the official counting of electoral votes, as noted earlier in the evening by PBS. And while Trump’s tweets were later deleted by Twitter for “repeated and severe violations of our Civic Integrity policy,” the move was viewed by many as too little and too late, including by Silicon Valley investors, a wide number of whom let loose their fury toward the outgoing administration and its enablers.

A lingering question is whether the ignominious day — one on which a dozen Senate Republicans and dozens more Republican House members had planned to object to the certification of the election results — will begin to polarize people further or whether, following Trump’s departure, some of that fury begins to subside instead.

Some investors, at least, say their anger has always had more to do with basic human decency, which seemed frequently to take a backseat during the Trump administration.

Deena Shakir of Lux Capital used to work for the Obama administration and is transparent about her political perspective on Twitter. But she says of today’s events that they “are not about politics. What we have witnessed is an affront to democracy, an assault on American history, and a gruesome reflection of the divided nation we live in.”

Hunter Walk — who cofounded the venture firm Homebrew and today tweeted, “don’t be putting [Trump son-in-law and White House advisor] Jared Kushner on cap tables when this is all said and done” — echoes the sentiment. “I’m not afraid to have a strong public voice on issues I consider to be urgent and essential human rights questions.”

As for whether the shock of today might make it harder to fund or partner with a team who supported Trump’s ascendency, Walk suggests it won’t. “We fund wonderful entrepreneurs and employ no purity tests on whether they agree with us 100%. I’m certain we’ve backed people who sit to our political left and to our political right – that’s not an issue for us and not an issue for them.”

To the extent that Walk’s public political stance may turn off some talented founders who “would just prefer their investors shut up and write checks,” that’s “ok,” too, says Walk.

“We don’t believe we need to compromise our values in order to be successful.”

Shakir meanwhile suggests that she doesn’t always have the luxury of tuning out politics entirely.

For one thing, she considers those who terrorized the nation’s capital today “angered perpetrators of a jingoistic, supremacist ideology that is not only normalized but actually incited by the highest branch of our government and amplified via social media.”

More, she notes, “Given my focus on healthcare, so much of my own thesis development and so many of my conversations have inevitably been informed by the pandemic, which—for better or worse—has become politicized.”

Try as she might to bifurcate politics from work, it’s futile right now, Shakir says. “These events and policies inform our present and our future, affect the markets that value our companies, and contribute to trends and white spaces.”

Today, she adds, they also “reflect our values as a nation and as human beings.”