Google rolls back SameSite cookie changes to keep essential online services from breaking

Google today announced that it will temporarily roll back the changes it recently made to how its Chrome browser handles cookies in order to ensure that sites that perform essential services like banking, online grocery, government services and healthcare won’t become inaccessible to Chrome users during the current COVID-19 pandemic.

The new SameSite rules, which the company started rolling out to a growing number of Chrome users in recent months, are meant to make it harder for sites to access cookies from third-party sites and hence track a user’s online activity. These new rules are also meant to prevent cross-site request forgery attacks.

Under Google’s new guidance, developers have to explicitly allow their cookies to be read in third-party contexts; otherwise, the browser will prevent third-party sites from accessing them.
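For developers, that opt-in takes the form of the cookie’s SameSite attribute. As a rough sketch (the cookie name and value here are hypothetical), a server that needs its cookie available in cross-site contexts sets `SameSite=None` together with the `Secure` flag, since Chrome only honors `SameSite=None` on cookies sent over HTTPS:

```python
from http.cookies import SimpleCookie

# A cookie with no SameSite attribute is now treated as SameSite=Lax by
# Chrome and withheld from most cross-site requests. To keep a cookie
# readable in third-party contexts, a site must opt in explicitly with
# SameSite=None, which Chrome requires to be paired with Secure.
cookie = SimpleCookie()
cookie["session_id"] = "abc123"            # illustrative value
cookie["session_id"]["samesite"] = "None"  # explicit cross-site opt-in
cookie["session_id"]["secure"] = True      # mandatory with SameSite=None

header = cookie["session_id"].OutputString()
print(header)  # session_id=abc123; SameSite=None; Secure
```

A cookie that omits the attribute entirely falls back to Chrome’s new Lax default, which is what blocks third-party access out of the box.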

Since this is a pretty major change, Google gave developers quite a bit of time to adapt their applications to it. Still, not every site is ready yet and so the Chrome team decided to halt the gradual rollout and stop enforcing these new rules for the time being.

“While most of the web ecosystem was prepared for this change, we want to ensure stability for websites providing essential services including banking, online groceries, government services and healthcare that facilitate our daily life during this time,” writes Google Chrome engineering director Justin Schuh. “As we roll back enforcement, organizations, users and sites should see no disruption.”

A Google spokesperson also told us that the team saw some breakage in sites “that would not normally be considered essential, but with COVID-19 having become more important, we made this decision in an effort to ensure stability during this time.”

The company says it plans to resume its SameSite enforcement over the summer, though the exact timing isn’t yet clear.

Using AI responsibly to fight the coronavirus pandemic

The emergence of the novel coronavirus has left the world in turmoil. COVID-19, the disease caused by the virus, has reached virtually every corner of the world, with the number of cases exceeding a million and the number of deaths more than 50,000 worldwide. It is a situation that will affect us all in one way or another.

With the imposition of lockdowns, limitations of movement, the closure of borders and other measures to contain the virus, the operating environment of law enforcement agencies and those security services tasked with protecting the public from harm has suddenly become ever more complex. They find themselves thrust into the middle of an unparalleled situation, playing a critical role in halting the spread of the virus and preserving public safety and social order in the process. In response to this growing crisis, many of these agencies and entities are turning to AI and related technologies for support in unique and innovative ways. Enhancing surveillance, monitoring and detection capabilities is high on the priority list.

For instance, early in the outbreak, Reuters reported a case in China wherein the authorities relied on facial recognition cameras to track a man from Hangzhou who had traveled in an affected area. Upon his return home, the local police were there to instruct him to self-quarantine or face repercussions. Police in China and Spain have also started to use technology to enforce quarantine, with drones being used to patrol and broadcast audio messages to the public, encouraging them to stay at home. People flying to Hong Kong airport receive monitoring bracelets that alert the authorities if they breach the quarantine by leaving their home.

In the United States, a surveillance company announced that its AI-enhanced thermal cameras can detect fevers, while in Thailand, border officers at airports are already piloting a biometric screening system using fever-detecting cameras.

Isolated cases or the new norm?

With the number of cases, deaths and countries on lockdown increasing at an alarming rate, we can assume that these will not be isolated examples of technological innovation in response to this global crisis. In the coming days, weeks and months of this outbreak, we will most likely see more and more AI use cases come to the fore.

While the application of AI can play an important role in seizing the reins in this crisis, and even safeguard officers and officials from infection, we must not forget that its use can raise very real and serious human rights concerns that can be damaging and undermine the trust placed in government by communities. Human rights, civil liberties and the fundamental principles of law may be exposed or damaged if we do not tread this path with great caution. There may be no turning back if Pandora’s box is opened.

In a public statement on March 19, the monitors for freedom of expression and freedom of the media for the United Nations, the Inter-American Commission for Human Rights and the Representative on Freedom of the Media of the Organization for Security and Co-operation in Europe issued a joint statement on promoting and protecting access to and free flow of information during the pandemic, and specifically took note of the growing use of surveillance technology to track the spread of the coronavirus. They acknowledged that there is a need for active efforts to confront the pandemic, but stressed that “it is also crucial that such tools be limited in use, both in terms of purpose and time, and that individual rights to privacy, non-discrimination, the protection of journalistic sources and other freedoms be rigorously protected.”

This is not an easy task, but a necessary one. So what can we do?

Ways to responsibly use AI to fight the coronavirus pandemic

  1. Data anonymization: While some countries are tracking individual suspected patients and their contacts, Austria, Belgium, Italy and the U.K. are collecting anonymized data to study the movement of people in a more general manner. This option still provides governments with the ability to track the movement of large groups, but minimizes the risk of infringing data privacy rights.
  2. Purpose limitation: Personal data that is collected and processed to track the spread of the coronavirus should not be reused for another purpose. National authorities should seek to ensure that the large amounts of personal and medical data are exclusively used for public health reasons. This is a concept already in force in Europe, within the context of the European Union’s General Data Protection Regulation (GDPR), but it’s time for this to become a global principle for AI.
  3. Knowledge-sharing and open access data: António Guterres, the United Nations Secretary-General, has insisted that “global action and solidarity are crucial,” and that we will not win this fight alone. This is applicable on many levels, even for the use of AI by law enforcement and security services in the fight against COVID-19. These agencies and entities must collaborate with one another and with other key stakeholders in the community, including the public and civil society organizations. AI use cases and data should be shared and transparency promoted.
  4. Time limitation: Although the end of this pandemic seems rather far away at this point in time, it will come to an end. When it does, national authorities will need to scale back their newly acquired monitoring capabilities. As Yuval Noah Harari observed in his recent article, “temporary measures have a nasty habit of outlasting emergencies, especially as there is always a new emergency lurking on the horizon.” We must ensure that these exceptional capabilities are indeed scaled back and do not become the new norm.
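The anonymization idea in point 1 can be made concrete with a small sketch. This is a hypothetical illustration, not any country’s actual scheme — the grid size, threshold and data shape are all assumptions: individual pings are coarsened into grid cells, each user is counted once per cell, and cells holding fewer than k people are suppressed so small groups cannot be singled out.

```python
from collections import Counter

def aggregate_movement(pings, k=5):
    """pings: iterable of (user_id, lat, lon) tuples.
    Returns {grid_cell: user_count}, dropping cells below the threshold k."""
    cells = Counter()
    seen = set()
    for user_id, lat, lon in pings:
        # Coarsen coordinates to ~0.1-degree cells (roughly 10 km).
        cell = (round(lat, 1), round(lon, 1))
        if (user_id, cell) not in seen:  # count each user once per cell
            seen.add((user_id, cell))
            cells[cell] += 1
    # Suppress sparse cells so no small group stands out in the output.
    return {cell: n for cell, n in cells.items() if n >= k}

# Five users in one area, two in another: with k=3 only the first survives.
pings = [(f"u{i}", 50.11, 8.68) for i in range(5)] + [
    ("v1", 48.85, 2.35),
    ("v2", 48.85, 2.35),
]
counts = aggregate_movement(pings, k=3)
```

The output reveals only that a large group moved through a coarse region, never an individual trajectory — which is the trade-off the anonymized national datasets described above are aiming for.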

Within the United Nations system, the United Nations Interregional Crime and Justice Research Institute (UNICRI) is working to advance approaches to AI such as these. It has established a specialized Centre for AI and Robotics in The Hague and is one of the few international actors dedicated to specifically looking at AI vis-à-vis crime prevention and control, criminal justice, rule of law and security. It assists national authorities, in particular law enforcement agencies, to understand the opportunities presented by these technologies and, at the same time, to navigate their potential pitfalls.

Working closely with the International Criminal Police Organization (INTERPOL), UNICRI has set up a global platform for law enforcement, fostering discussion on AI, identifying practical use cases and defining principles for responsible use. Much work has been done through this forum, but it is still early days, and the path ahead is long.

While the COVID-19 pandemic has illustrated several innovative use cases, as well as the urgency for governments to do their utmost to stop the spread of the virus, it is important not to let consideration of fundamental principles, rights and respect for the rule of law be set aside. The positive power and potential of AI is real. It can help those embroiled in fighting this battle to slow the spread of this debilitating disease. It can help save lives. But we must stay vigilant and commit to the safe, ethical and responsible use of AI.

It is essential that, even in times of great crisis, we remain conscious of the duality of AI and strive to advance AI for good.

Collibra nabs another $112.5M at a $2.3B valuation for its big data management platform

GDPR and other data protection and privacy regulations — as well as a significant (and growing) number of data breaches and exposés of companies’ privacy practices — have put a spotlight not just on the vast troves of data that businesses and other organizations hold on us, but also on how they handle it. Today, one of the companies helping them manage those troves in a better and more compliant way is announcing a huge round of funding to continue that work. Collibra, which provides tools to manage, warehouse, store and analyze data troves, is today announcing that it has raised $112.5 million in funding, at a post-money valuation of $2.3 billion.

The funding — a Series F from the looks of it — represents a big bump for the startup, which last year raised $100 million at a valuation of just over $1 billion. This latest round was co-led by ICONIQ Capital, Index Ventures, and Durable Capital Partners LP, with previous investors CapitalG (Google’s growth fund), Battery Ventures, and Dawn Capital also participating.

Collibra, originally a spin-out from Vrije Universiteit in Brussels, Belgium, today works with some 450 enterprises and other large organizations — customers include Adobe, Verizon (which owns TechCrunch), insurer AXA, and a number of healthcare providers. Its products cover a range of services focused around company data, including tools to help customers comply with local data protection policies, store it securely, and run analytics and more.

These are all tools that have long had a place in enterprise big data IT, but they have become increasingly used and in demand, both as data policies have expanded and as the prospects of what can be discovered through big data analytics have become more advanced. With that growth, many companies have realized that they are not in a position to use and store their data in the best possible way, and that is where companies like Collibra step in.

“Most large organizations are in data chaos,” Felix Van de Maele, co-founder and CEO, previously told us. “We help them understand what data they have, where they store it and [understand] whether they are allowed to use it.”

As you would expect with a big IT trend, Collibra is not the only company chasing this opportunity. Competitors include Informatica, IBM, Talend and Egnyte, among a number of others, but the market position of Collibra, and its advanced technology, is what has continued to impress investors.

“Durable Capital Partners invests in innovative companies that have significant potential to shape growing industries and build larger companies,” said Henry Ellenbogen, founder and chief investment officer for Durable Capital Partners LP, in a statement (Ellenbogen is formerly an investment manager at T. Rowe Price, and this is his first investment in Collibra under Durable). “We believe Collibra is a leader in the Data Intelligence category, a space that could have a tremendous impact on global business operations and a space that we expect will continue to grow as data becomes an increasingly critical asset.”

“We have a high degree of conviction in Collibra and the importance of the company’s mission to help organizations benefit from their data,” added Matt Jacobson, general partner at ICONIQ Capital and Collibra board member, in his own statement. “There is an increasing urgency for enterprises to harness their data for strategic business decisions. Collibra empowers organizations to use their data to make critical business decisions, especially in uncertain business environments.”

An EU coalition of techies is backing a ‘privacy-preserving’ standard for COVID-19 contacts tracing

A European coalition of techies and scientists drawn from at least eight countries, and led by Germany’s Fraunhofer Heinrich Hertz Institute for telecoms (HHI), is working on contacts-tracing proximity technology for COVID-19 that’s designed to comply with the region’s strict privacy rules — officially unveiling the effort today.

China-style individual-level location-tracking of people by states via their smartphones, even for a public health purpose, is hard to imagine in Europe — which has a long history of legal protection for individual privacy. However, the coronavirus pandemic is applying pressure to the region’s data protection model, as governments turn to data and mobile technologies for help with tracking the spread of the virus, supporting their public health response and mitigating wider social and economic impacts.

Scores of apps are popping up across Europe aimed at attacking coronavirus from different angles. European privacy not-for-profit, noyb, is keeping an updated list of approaches, both led by governments and private sector projects, to use personal data to combat SARS-CoV-2 — with examples so far including contacts tracing, lockdown or quarantine enforcement and COVID-19 self-assessment.

The efficacy of such apps is unclear — but the demand for tech and data to fuel such efforts is coming from all over the place.

In the UK, the government has been quick to call in tech giants, including Google, Microsoft and Palantir, to help the National Health Service determine where resources need to be sent during the pandemic. The European Commission, meanwhile, has been leaning on regional telcos to hand over user location data to carry out coronavirus tracking — albeit in aggregated and anonymized form.

The newly unveiled Pan-European Privacy-Preserving Proximity Tracing (PEPP-PT) project is a response to the huge spike in demand for citizens’ data generated by the coronavirus pandemic — one intended to offer not just another app, but what’s described as “a fully privacy-preserving approach” to COVID-19 contacts tracing.

The core idea is to leverage smartphone technology to help disrupt the next wave of infections by notifying individuals who have come into close contact with an infected person — via the proxy of their smartphones having been near enough to carry out a Bluetooth handshake. So far so standard. But the coalition behind the effort wants to steer developments in such a way that the EU response to COVID-19 doesn’t drift towards China-style state surveillance of citizens.

While, for the moment, strict quarantine measures remain in place across much of Europe there may be less imperative for governments to rip up the best practice rulebook to intrude on citizens’ privacy, given the majority of people are locked down at home. But the looming question is what happens when restrictions on daily life are lifted?

Contacts tracing — as a way to offer a chance for interventions that can break any new infection chains — is being touted as a key component of preventing a second wave of coronavirus infections by some, with examples such as Singapore’s TraceTogether app being eyed up by regional lawmakers.

Singapore does appear to have had some success in keeping a second wave of infections from turning into a major outbreak, via an aggressive testing and contacts-tracing regime. But what a small island city-state with a population of less than 6M can do isn’t necessarily comparable to what’s achievable across a trading bloc of 27 different nations whose collective population exceeds 500M.

Europe isn’t going to have a single coronavirus tracing app. It’s already got a patchwork. Hence the people behind PEPP-PT offering a set of “standards, technology, and services” to countries and developers to plug into to get a standardized COVID-19 contacts-tracing approach up and running across the bloc.

The other very European flavored piece here is privacy — and privacy law. “Enforcement of data protection, anonymization, GDPR [the EU’s General Data Protection Regulation] compliance, and security” are baked in, is the top-line claim.

“PEPP-PT was explicitly created to adhere to strong European privacy and data protection laws and principles,” the group writes in an online manifesto. “The idea is to make the technology available to as many countries, managers of infectious disease responses, and developers as quickly and as easily as possible.

“The technical mechanisms and standards provided by PEPP-PT fully protect privacy and leverage the possibilities and features of digital technology to maximize speed and real-time capability of any national pandemic response.”

Hans-Christian Boos, one of the project’s co-initiators and the founder of an AI company called Arago, discussed the initiative with German newspaper Der Spiegel, telling it: “We collect no location data, no movement profiles, no contact information and no identifiable features of the end devices.”

The newspaper reports PEPP-PT’s approach means apps aligning to this standard would generate only temporary IDs — to avoid individuals being identified. Two or more smartphones running an app that uses the tech and has Bluetooth enabled when they come into proximity would exchange their respective IDs — saving them locally on the device in an encrypted form, according to the report.

Der Spiegel writes that should a user of the app subsequently be diagnosed with coronavirus their doctor would be able to ask them to transfer the contact list to a central server. The doctor would then be able to use the system to warn affected IDs they have had contact with a person who has since been diagnosed with the virus — meaning those at risk individuals could be proactively tested and/or self-isolate.

On its website PEPP-PT explains the approach thus:

Mode 1
If a user is not tested or has tested negative, the anonymous proximity history remains encrypted on the user’s phone and cannot be viewed or transmitted by anybody. At any point in time, only the proximity history that could be relevant for virus transmission is saved, and earlier history is continuously deleted.

Mode 2
If the user of phone A has been confirmed to be SARS-CoV-2 positive, the health authorities will contact user A and provide a TAN code to the user that ensures potential malware cannot inject incorrect infection information into the PEPP-PT system. The user uses this TAN code to voluntarily provide information to the national trust service that permits the notification of PEPP-PT apps recorded in the proximity history and hence potentially infected. Since this history contains anonymous identifiers, neither person can be aware of the other’s identity.
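To make the anonymous-identifier idea concrete, here is a rough sketch of how rotating ephemeral IDs can work in schemes of this kind. This is not the actual PEPP-PT specification — the key size, epoch scheme and function names are assumptions for illustration: each phone derives short-lived IDs from a secret key, broadcasts them over Bluetooth, and a later match against IDs flagged after a diagnosis reveals only that an encounter happened, not anyone’s identity.

```python
import hashlib
import hmac
import os

def make_ephemeral_id(device_key: bytes, epoch: int) -> bytes:
    """Derive a short-lived broadcast ID from a secret device key and a
    time epoch, so observers cannot link broadcasts across epochs."""
    return hmac.new(device_key, epoch.to_bytes(8, "big"),
                    hashlib.sha256).digest()[:16]

def contacts_at_risk(proximity_history, flagged_ids):
    """proximity_history: IDs this phone heard over Bluetooth;
    flagged_ids: IDs reported to the system after a confirmed diagnosis."""
    return set(proximity_history) & set(flagged_ids)

# Phone B records the rotating IDs it hears from phone A during an encounter.
key_a = os.urandom(32)
heard_by_b = {make_ephemeral_id(key_a, epoch) for epoch in (100, 101)}

# After A is diagnosed and consents to upload, A's recent IDs are flagged.
# B learns only that one ID it saw is at risk — not who phone A belongs to.
at_risk = contacts_at_risk(heard_by_b, [make_ephemeral_id(key_a, 101)])
```

Because the IDs rotate every epoch and carry no device or location information, even the party doing the matching sees only opaque tokens — which is the property the “anonymous identifiers” in Mode 1 and Mode 2 are meant to guarantee.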

Providing further detail of what it envisages as “Country-dependent trust service operation”, it writes: “The anonymous IDs contain encrypted mechanisms to identify the country of each app that uses PEPP-PT. Using that information, anonymous IDs are handled in a country-specific manner.”

On healthcare processing, it suggests: “A process for how to inform and manage exposed contacts can be defined on a country by country basis.”

Among the other features of PEPP-PT’s mechanisms the group lists in its manifesto are:

  • Backend architecture and technology that can be deployed into local IT infrastructure and can handle hundreds of millions of devices and users per country instantly.
  • Managing the partner network of national initiatives and providing APIs for integration of PEPP-PT features and functionalities into national health processes (test, communication, …) and national system processes (health logistics, economy logistics, …) giving many local initiatives a local backbone architecture that enforces GDPR and ensures scalability.
  • Certification Service to test and approve local implementations to be using the PEPP-PT mechanisms as advertised and thus inheriting the privacy and security testing and approval PEPP-PT mechanisms offer.

Having a standardized approach that could be plugged into a variety of apps would allow for contacts tracing to work across borders — i.e. even if different apps are popular in different EU countries — an important consideration for the bloc, which has 27 Member States.

However there may be questions about the robustness of the privacy protection designed into the approach — if, for example, pseudonymized data is centralized on a server that doctors can access there could be a risk of it leaking and being re-identified. And identification of individual device holders would be legally risky.

Europe’s lead data regulator, the EDPS, recently made a point of tweeting to warn an MEP (and former EC digital commissioner) against the legality of applying Singapore-style Bluetooth-powered contacts tracing in the EU — writing: “Please be cautious comparing Singapore examples with European situation. Remember Singapore has a very specific legal regime on identification of device holder.”

A spokesman for the EDPS told us it’s in contact with data protection agencies of the Member States involved in the PEPP-PT project to collect “relevant information”.

“The general principles presented by EDPB on 20 March, and by EDPS on 24 March are still relevant in that context,” the spokesman added — referring to guidance issued by the privacy regulators last month in which they encouraged anonymization and aggregation should Member States want to use mobile location data for monitoring, containing or mitigating the spread of COVID-19. At least in the first instance.

“When it is not possible to only process anonymous data, the ePrivacy Directive enables Member States to introduce legislative measures to safeguard public security (Art. 15),” the EDPB further noted.

“If measures allowing for the processing of non-anonymised location data are introduced, a Member State is obliged to put in place adequate safeguards, such as providing individuals of electronic communication services the right to a judicial remedy.”

We reached out to the HHI with questions about the PEPP-PT project and were referred to Boos — but at the time of writing had been unable to speak to him.

“The PEPP-PT system is being created by a multi-national European team,” the HHI writes in a press release about the effort. “It is an anonymous and privacy-preserving digital contact tracing approach, which is in full compliance with GDPR and can also be used when traveling between countries through an anonymous multi-country exchange mechanism. No personal data, no location, no Mac-Id of any user is stored or transmitted. PEPP-PT is designed to be incorporated in national corona mobile phone apps as a contact tracing functionality and allows for the integration into the processes of national health services. The solution is offered to be shared openly with any country, given the commitment to achieve interoperability so that the anonymous multi-country exchange mechanism remains functional.”

“PEPP-PT’s international team consists of more than 130 members working across more than seven European countries and includes scientists, technologists, and experts from well-known research institutions and companies,” it adds.

“The result of the team’s work will be owned by a non-profit organization so that the technology and standards are available to all. Our priorities are the well being of world citizens today and the development of tools to limit the impact of future pandemics — all while conforming to European norms and standards.”

PEPP-PT says its technology-focused efforts are being financed through donations. Per its website, it has adopted the WHO standards for such financing — to “avoid any external influence”.

Of course for the effort to be useful it relies on EU citizens voluntarily downloading one of the aligned contacts tracing apps — and carrying their smartphone everywhere they go, with Bluetooth enabled.

Without substantial penetration among the region’s smartphone users, it’s questionable how much of an impact this initiative, or any contacts-tracing technology, could have. Although if such tech were able to break even some infection chains, people might argue it’s not wasted effort.

Notably, there are signs Europeans are willing to contribute to a public healthcare cause by doing their bit digitally — such as a self-reporting COVID-19 tracking app which last week racked up 750,000 downloads in the UK in 24 hours.

But, at the same time, contacts tracing apps are facing scepticism over their ability to contribute to the fight against COVID-19. Not everyone carries a smartphone, nor knows how to download an app, for instance. There’s plenty of people who would fall outside such a digital net.

Meanwhile, while there’s clearly been a big scramble across the region, at both government and grassroots level, to mobilize digital technology for a public health emergency cause there’s arguably greater imperative to direct effort and resources at scaling up coronavirus testing programs — an area where most European countries continue to lag.

Germany — where some of the key backers of PEPP-PT are from — is the most notable exception.

What does a pandemic say about the tech we’ve built?

There’s a joke* being reshared on chat apps that takes the form of a multiple choice question — asking who’s the leading force in workplace digital transformation? The red-lined punchline is not the CEO or CTO but: C) COVID-19.

There’s likely more than a grain of truth underpinning the quip. The novel coronavirus is pushing a lot of metaphorical buttons right now. ‘Pause’ buttons for people and industries, as large swathes of the world’s population face quarantine conditions that can resemble house arrest. The majority of offline social and economic activities are suddenly off limits.

Such major pauses in our modern lifestyle may even turn into a full reset, over time. The world as it was, where mobility of people has been all but taken for granted — regardless of the environmental costs of so much commuting and indulged wanderlust — may never return to ‘business as usual’.

If global leadership rises to the occasion, then the coronavirus crisis offers an opportunity to rethink how we structure our societies and economies — to make a shift towards lower-carbon alternatives. After all, how many physical meetings do you really need when digital connectivity is accessible and reliable? As millions more office workers log onto the day job from home, that number suddenly seems vanishingly small.

COVID-19 is clearly strengthening the case for broadband to be a utility — as so much more activity is pushed online. Even social media seems to have a genuine community purpose during a moment of national crisis when many people can only connect remotely, even with their nearest neighbours.

Hence the reports of people stuck at home flocking back to Facebook to sound off in the digital town square. Now that the actual high street is off limits, the vintage social network is experiencing a late second wind.

Facebook understands this sort of higher societal purpose already, of course. Which is why it’s been so proactive about building features that nudge users to ‘mark yourself safe’ during extraordinary events like natural disasters, major accidents and terrorist attacks. (Or indeed why it encouraged politicians to get into bed with its data platform in the first place — no matter the cost to democracy.)

In less fraught times, Facebook’s ‘purpose’ can be loosely summed up as ‘killing time’. But with ever more sinkholes being drilled by the attention economy, that’s a function under ferocious and sustained attack.

Over the years the tech giant has responded by engineering ways to rise back to the top of the social heap — including spying on and buying up competition, or directly cloning rival products. It’s been pulling off this trick, by hook or by crook, for over a decade. Albeit, this time Facebook can’t take any credit for the traffic uptick; a pandemic is nature’s dark pattern design.

What’s most interesting about this virally disrupted moment is how much of the digital technology that’s been built out online over the past two decades could very well have been designed for living through just such a dystopia.

Seen through this lens, VR should be having a major moment. A face computer that swaps out the stuff your eyes can actually see with a choose-your-own-digital-adventure of virtual worlds to explore, all from the comfort of your living room? What problem are you fixing, VR? Well, the conceptual limits of human lockdown in the face of a pandemic quarantine right now, actually…

Virtual reality has never been a compelling proposition vs the rich and textured opportunity of real life, except within very narrow and niche bounds. Yet all of a sudden here we all are — with our horizons drastically narrowed and real-life news that’s ceaselessly harrowing. So it might yet end up as the wry punchline to another multiple choice joke: ‘My next vacation will be: A) Staycation, B) The spare room, C) VR escapism.’

It’s videoconferencing that’s actually having the big moment, though. Turns out even a pandemic can’t make VR go viral. Instead, long lapsed friendships are being rekindled over Zoom group chats or Google Hangouts. And Houseparty — a video chat app — has seen surging downloads as barflies seek out alternative night life with their usual watering-holes shuttered.

Bored celebs are TikToking. Impromptu concerts are being livestreamed from living rooms via Instagram and Facebook Live. All sorts of folks are managing social distancing and the stress of being stuck at home alone (or with family) by distant socializing — signing up to remote book clubs and discos; joining virtual dance parties and exercise sessions from bedrooms. Taking a few classes together. The quiet pub night with friends has morphed seamlessly into a bring-your-own-bottle group video chat.

This is not normal — but nor is it surprising. We’re living in the most extraordinary time. And it seems a very human response to mass disruption and physical separation (not to mention the trauma of an ongoing public health emergency that’s killing thousands of people a day) to reach for even a moving pixel of human comfort. Contactless human contact is better than none at all.

Yet the fact all these tools are already out there, ready and waiting for us to log on and start streaming, should send a dehumanizing chill down society’s backbone.

It underlines quite how much consumer technology is being designed to reprogram how we connect with each other, individually and in groups, in order that uninvited third parties can cut a profit.

Back in the pre-COVID-19 era, a key concern attached to social media was its ability to hook users and encourage passive feed consumption — replacing genuine human contact with voyeuristic screening of friends’ lives. Studies have linked the tech to loneliness and depression. Now that we’re literally unable to go out and meet friends, the loss of human contact is real and stark. So being popular online in a pandemic really isn’t any kind of success metric.

Houseparty, for example, self-describes as a “face to face social network” — yet it’s quite the literal opposite; you’re forgoing face-to-face contact if you’re getting virtually together in app-wrapped form.

The implication of Facebook’s COVID-19 traffic bump, meanwhile, is that the company’s business model thrives on societal disruption and mainstream misery. Which, frankly, we knew already. Data-driven adtech is another way of saying it’s been engineered to spray you with ad-flavored dissatisfaction by spying on what you get up to. The coronavirus just hammers the point home.

The fact we have so many high-tech tools on tap for forging digital connections might feel like amazing serendipity in this crisis — a freemium bonanza for coping with terrible global trauma. But such bounty points to a horrible flip side: It’s the attention economy that’s infectious and insidious. Before ‘normal life’ plunged off a cliff, all this sticky tech was labelled ‘everyday use’, not ‘break out in a global emergency’.

It’s never been clearer how these attention-hogging apps and services are designed to disrupt and monetize us; to embed themselves in our friendships and relationships in a way that’s subtly dehumanizing; re-routing emotion and connections; nudging us to swap in-person socializing for virtualized fuzz that’s designed to be data-mined and monetized by the same middlemen who’ve inserted themselves unasked into our private and social lives.

Captured and recompiled in this way, human connection is reduced to a series of dilute and/or meaningless transactions. The platforms deploy armies of engineers to knob-twiddle and pull strings to maximize ad opportunities, no matter the personal cost.

It’s also no accident that we’re seeing more of the vast and intrusive underpinnings of surveillance capitalism emerge, as the COVID-19 emergency rolls back some of the obfuscation that’s used to shield these business models from mainstream view in more normal times. The trackers are rushing to seize and colonize an opportunistic purpose.

Tech and ad giants are falling over themselves to get involved with offering data or apps for COVID-19 tracking. They’re already in the mass surveillance business, so there has likely never been a better moment than the present pandemic for the big data lobby to press the lie that individuals don’t care about privacy, as governments cry out for tools and resources to help save lives.

First the people-tracking platforms dressed up attacks on human agency as ‘relevant ads’. Now the data industrial complex is spinning police-state levels of mass surveillance as pandemic-busting corporate social responsibility. How quick the wheel turns.

But platforms should be careful what they wish for. Populations that find themselves under house arrest with their phones playing snitch might be just as quick to round on high tech gaolers as they’ve been to sign up for a friendly video chat in these strange and unprecedented times.

Oh and Zoom (and others) — more people might actually read your ‘privacy policy’ now they’ve got so much time to mess about online. And that really is a risk.

*Source is a private Twitter account called @MBA_ish

Maybe we shouldn’t use Zoom after all

Now that we’re all stuck at home thanks to the coronavirus pandemic, video calls have gone from a novelty to a necessity. Zoom, the popular videoconferencing service, seems to be doing better than most and has quickly become one of the most popular options going, if not the most popular.

But should it be?

Zoom’s recent popularity has also shone a spotlight on the company’s security protections and privacy promises. Just today, The Intercept reported that Zoom video calls are not end-to-end encrypted, despite the company’s claims that they are.

And Motherboard reports that Zoom is leaking the email addresses of “at least a few thousand” people because personal addresses are treated as if they belong to the same company.

These are the latest examples in a year the company has spent mopping up after a barrage of headlines examining its practices and misleading marketing. To wit:

  • Apple was forced to step in to secure millions of Macs after a security researcher found Zoom failed to disclose that it installed a secret web server on users’ Macs, which Zoom failed to remove when the client was uninstalled. The researcher, Jonathan Leitschuh, said the web server meant any malicious website could activate the webcam on a Mac with Zoom installed without the user’s permission. Leitschuh declined a bug bounty payout because Zoom wanted him to sign a non-disclosure agreement, which would have prevented him from disclosing details of the bug.
  • Zoom was quietly sending data to Facebook about users’ Zoom habits — even when a user did not have a Facebook account. Motherboard reported that the iOS app notified Facebook when users opened the app, along with details such as the device model, the phone carrier, and more. Zoom removed the code in response, but not fast enough to prevent a class action lawsuit or New York’s attorney general from launching an investigation.
  • Zoom came under fire again for its “attendee tracking” feature, which, when enabled, lets a host check if participants are clicking away from the main Zoom window during a call.
  • A security researcher found that Zoom uses a “shady” technique to install its Mac app without user interaction. “The same tricks that are being used by macOS malware,” the researcher said.
  • On the bright side and to some users’ relief, we reported that it is in fact possible to join a Zoom video call without having to download or use the app. But Zoom’s “dark patterns” don’t make it easy to start a video call using just your browser.
  • Zoom has faced questions over its lack of transparency on law enforcement requests it receives. Access Now, a privacy and rights group, called on Zoom to release the number of requests it receives, just as Amazon, Google, Microsoft and many more tech giants report on a semi-annual basis.
  • Then there’s Zoombombing, where trolls take advantage of open or unprotected meetings and poor default settings to take over screen-sharing and broadcast porn or other explicit material. The FBI this week warned users to adjust their settings to avoid trolls hijacking video calls.
  • And Zoom tightened its privacy policy this week after it was criticized for allowing Zoom to collect information about users’ meetings — like videos, transcripts and shared notes — for advertising.

There are several more privacy-focused alternatives to Zoom, but they all have their pitfalls. FaceTime and WhatsApp are end-to-end encrypted, but FaceTime works only on Apple devices and WhatsApp is limited to just four video callers at a time. A lesser-known video calling platform, Jitsi, is not end-to-end encrypted, but it’s open source — so you can look at the code to make sure there are no backdoors — and it works across all devices and browsers. You can run Jitsi on a server you control for greater privacy.

In fairness, Zoom is not inherently bad and there are many reasons why Zoom is so popular. It’s easy to use, reliable and for the vast majority it’s incredibly convenient.

But Zoom’s misleading claims give users a false sense of security and privacy. Whether it’s hosting a virtual happy hour or a yoga class, or using Zoom for therapy or government cabinet meetings, everyone deserves privacy.

Now more than ever Zoom has a responsibility to its users. For now, Zoom at your own risk.

No proof of a Houseparty breach, but its privacy policy is still gatecrashing your data

Houseparty has been a smashing success with people staying home during the coronavirus pandemic who still want to connect with friends.

The group video chat app — interspersed with games and other bells and whistles — rises above the more mundane Zooms and Hangouts (fun only in their names, otherwise pretty serious tools used by companies, schools and others who just need to work) when it comes to creating engaged leisure time, amid a climate where all of them are seeing a huge surge in growth.

All that looked like it could possibly fall apart for Houseparty and its new owner Epic Games when a series of reports appeared Monday claiming Houseparty was breached, and that malicious hackers were using users’ data to access their accounts on other apps such as Spotify and Netflix.

Houseparty was swift to deny the reports and even went so far as to claim — without evidence — that it was investigating indications that the “breach” was a “paid commercial smear to harm Houseparty,” offering a $1 million reward to whoever could prove its theory.

For now, there is no proof that there was a breach, nor proof that there was a paid smear campaign, and when we reached out to ask Houseparty and Epic about this investigation, a spokesperson said: “We don’t have anything to add here at the moment.”

But that doesn’t mean that Houseparty doesn’t have privacy issues.

As the old saying goes, “if the product is free, you are the product.” In the case of the free app Houseparty, the publisher details a 12,000+ word privacy policy that covers any and all data it might collect when you log on to or use its service, laying out the many ways it might use that data for promotional or commercial purposes.

There are some clear lines in the policy about what it won’t use. For example, while phone numbers might get shared for tech support, with partnerships that you opt into, to link up contacts to talk with and to authenticate you, “we will never share your phone number or the phone numbers of third parties in your contacts with anyone else.”

But beyond that, there are provisions in there that could see Houseparty selling anonymized and other data, leading Ray Walsh of research firm ProPrivacy to describe it as a “privacy nightmare.”

“Anybody who decides to use the Houseparty application to stay in contact during quarantine needs to be aware that the app collects a worrying amount of personal information,” he said. “This includes geolocation data, which could, in theory, be used to map the location of each user. A closer look at Houseparty’s privacy policy reveals that the firm promises to anonymize and aggregate data before it is shared with the third-party affiliates and partners it works with. However, time and time again, researchers have proven that previously anonymized data can be re-identified.”
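The re-identification risk Walsh describes is easy to illustrate. Below is a minimal sketch with entirely invented data (the tokens, grid cells and “Alice” are hypothetical, not drawn from Houseparty’s actual data): an “anonymized” location trace with names stripped can still single out one person once an observer knows just two places that person visited.

```python
from collections import defaultdict

# Hypothetical "anonymized" dataset: real names replaced with opaque tokens,
# locations coarsened to grid cells, timestamps coarsened to the hour.
anonymized = [
    {"token": "a1", "cell": "grid_42", "hour": "08"},
    {"token": "a1", "cell": "grid_87", "hour": "19"},
    {"token": "b2", "cell": "grid_42", "hour": "08"},
]

# Auxiliary knowledge an observer might have: Alice was in grid_42
# at 08:00 and grid_87 at 19:00 (say, her home and her gym).
alice_points = {("grid_42", "08"), ("grid_87", "19")}

# Group the anonymized rows into per-token traces.
traces = defaultdict(set)
for row in anonymized:
    traces[row["token"]].add((row["cell"], row["hour"]))

# Any token whose trace covers both known points is a candidate match.
matches = [token for token, points in traces.items() if alice_points <= points]
print(matches)  # ['a1'] -- a single token matches, so "a1" is Alice
```

With only two quasi-identifying points, one token is uniquely singled out — which is exactly the pattern researchers have repeatedly demonstrated against supposedly anonymized mobility datasets.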

There are ways around this for the proactive. Walsh notes that users can go into the settings to select “private mode” to “lock” rooms they use to stop people from joining unannounced or uninvited; switch locations off; use fake names and birthdates; disconnect all other social apps; and launch the app on iOS with a long press to “sneak into the house” without notifying all your contacts.

But with a consumer app, it’s a longshot to assume that most people, and the younger users who are especially interested in Houseparty, will go through all of these extra steps to secure their information.

A Norwegian school quit using video calls after a naked man ‘guessed’ the meeting link

A school in Norway has stopped using popular video conferencing service Whereby after a naked man apparently “guessed” the link to a video lesson.

According to Norwegian state broadcaster NRK, the man exposed himself in front of several young children over the video call. The theory, according to the report, is that the man guessed the meeting ID and joined the video call.

One expert quoted in the story said some are “looking” for links.

Last year security researchers told TechCrunch that malicious users could access and listen in to Zoom and Webex video meetings by cycling through different permutations of meeting IDs in bulk. The researchers said the flaw worked because many meetings were not protected by a passcode.
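The arithmetic behind that attack is simple enough to sketch. The numbers below are purely illustrative (not any vendor’s actual ID format or traffic), but they show why a short numeric meeting ID with no passcode is enumerable at scale:

```python
# Back-of-the-envelope sketch: how hard is it to stumble on a live,
# unprotected meeting by guessing IDs at random?
# Both figures below are assumptions for illustration only.

id_space = 10**9           # assume a 9-digit numeric meeting ID space
active_meetings = 100_000  # assume this many concurrent passcode-free calls

hit_probability = active_meetings / id_space   # chance a single guess lands on a live call
expected_guesses = id_space / active_meetings  # average guesses until the first hit

print(f"Chance per guess: {hit_probability:.4%}")                        # 0.0100%
print(f"Expected guesses to find a live call: {expected_guesses:,.0f}")  # 10,000
```

Ten thousand guesses is trivial for an automated script cycling through IDs in bulk — which is why the researchers’ finding hinged on meetings lacking passcodes, and why a passcode (which multiplies the search space) is the standard mitigation.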

Schools and workplaces across the world are embracing remote teaching as the number of those infected by the novel coronavirus, which causes the disease known as COVID-19, continues to climb. There are some 523,000 confirmed cases of COVID-19 across the world as of Thursday, according to data provided by Johns Hopkins University. Norway currently has over 3,300 confirmed cases.

More than 80% of the world’s population is said to be on some kind of lockdown to help limit the spread of the coronavirus in an effort to prevent the overrunning of health systems.

The ongoing global lockdown has forced companies to embrace their staff working from home, pushing Zoom to become the go-to video conferencing platform for not only remote workers but also for recreation, like book clubs and happy hours.

An earlier version of this article incorrectly identified the video service as Zoom. The video conferencing service used by the school was Whereby. We regret the error.

Instagram launches Co-Watching of posts during video chat

Now you can scroll Instagram together with friends, turning a typically isolating, passive experience into something more social and active. Today Instagram launched Co-Watching, which lets friends on a video chat or group video chat browse through feed posts one user has Liked or Saved, or that Instagram recommends.

Co-Watching could let people ooh, ahh, joke, and talk about Instagram’s content instead of just consuming it solo and maybe posting it to a chat thread so friends can do the same. That could lead to long usage sessions, incentivize users to collect a great repository of Saved posts to share, and spur more video calls that drag people into the app. TechCrunch first reported Instagram was testing Co-Watching a year ago, so we’ll see if it managed to work out the technical and privacy questions of operating the feature.

The launch comes alongside other COVID-19 responses from Instagram that include:

-Showing a shared Instagram Story featuring all the posts from your network that include the “Stay Home” sticker

-Adding Story stickers that remind people to wash their hands or keep their distance from others

-Adding coronavirus educational info to the top of results for related searches

-Removing unofficial COVID-19 accounts from recommendations, as well as virus related content from Explore if it doesn’t come from a credible health organization

-Expanding the donation sticker to more countries so people can search for and ask friends for contributions to relevant non-profits

These updates build on Instagram’s efforts from two weeks ago which included putting COVID-19 prevention tips atop the feed, listing official health organizations atop search results, and demoting the reach of coronavirus-related content rated false by fact checkers.

But Co-Watching will remain a powerful feature long after the quarantines and social distancing end. The ability to co-view content while browsing social networks has already made screensharing app Squad popular. When Squad launched in January 2019, I suggested that “With Facebook and Snap already sniffing around Squad, it’s quite possible they’ll try to copy it.” Facebook tested a Watch Together feature for viewing Facebook Watch videos inside Messenger back in April. And now here we are with Instagram.

The question is whether Squad’s first-mover advantage and option to screenshare from any app will let it hold its own, or if Instagram Co-Watching will just popularize the concept and send users searching for more flexible options like Squad. “Everyone knows that the content flooding our feeds is a filtered version of reality,” Squad CEO Esther Crawford told me. “The real and interesting stuff goes down in DMs because people are more authentic when they’re 1:1 or in small group conversations.”

Squad, which lets you screenshare anything, including websites and your camera roll, won’t be fully steamrolled. When asked if Instagram would build a full-fledged screensharing feature, Instagram CEO Adam Mosseri says they’re “Not currently working on it . . . screensharing is not at the top of the list for us at Instagram.”

With Co-Watching Instagram users can spill the tea and gossip about posts live and unfiltered over video chat. When people launch a video chat from the Direct inbox or a chat thread, they’ll see a “Posts” button that launches Co-Watching. They’ll be able to pick from their Liked, Saved, or Explore feeds and then reveal it to the video chat, with everyone’s windows lined up beneath the post.

Up to six people can Co-Watch at once on Instagram, consuming feed photos and videos but not IGTV posts. You can share public posts, or private ones that everyone in the chat is allowed to see. If one participant is blocked from viewing a post, it’s ineligible for Co-Watching.

Co-Watching could finally provide an answer to Instagram’s Time Well Spent problem. Research shows how the real danger in social network overuse is passive content consumption like endless solo feed scrolling. It can inspire envy, poor self-esteem, and leave users deflated, especially if the highlights of everyone else’s lives look more interesting than their own day-to-day reality. But active sharing, commenting, and messaging can have a positive effect on well-being, making people feel like they have a stronger support network.

With Co-Watching, Instagram has found a way to turn the one-player experience into a multi-player game. Especially now with everyone stuck at home and unable to crowd around one person’s phone to gab about what they see, there’s a great need for this new feature. One concern is that it could be used for bullying, with people all making fun of someone’s posts.

But in general, the idea of sifting through cute animal photos, dance tutorials, or epic art could take the focus off of the individuals in a video chat. Not having one’s face as the center of attention could make video chat less performative and exhausting. Instead, Co-Watching could let us do apart what we love to do together: just hang out.

Israel passes emergency law to use mobile data for COVID-19 contact tracing

Israel has passed an emergency law to use mobile phone data for tracking people infected with COVID-19 including to identify and quarantine others they have come into contact with and may have infected.

The BBC reports that the emergency law was passed during an overnight sitting of the cabinet, bypassing parliamentary approval.

Israel also said it will step up testing substantially as part of its response to the pandemic crisis.

In a statement posted to Facebook, prime minister Benjamin Netanyahu wrote: “We will dramatically increase the ability to locate and quarantine those who have been infected. Today, we started using digital technology to locate people who have been in contact with those stricken by the Corona. We will inform these people that they must go into quarantine for 14 days. These are expected to be large – even very large – numbers and we will announce this in the coming days. Going into quarantine will not be a recommendation but a requirement and we will enforce it without compromise. This is a critical step in slowing the spread of the epidemic.”

“I have instructed the Health Ministry to significantly increase the number of tests to 3,000 a day at least,” he added. “It is very likely that we will reach a higher figure, even up to 5,000 a day. To the best of my knowledge, relative to population, this is the highest number of tests in the world, even higher than South Korea. In South Korea, there are around 15,000 tests a day for a population five or six times larger than ours.”

On Monday an Israeli parliamentary subcommittee on intelligence and secret services discussed a government request to authorize Israel’s Shin Bet security service to assist in a national campaign to stop the spread of the novel coronavirus — but declined to vote on the request, arguing more time is needed to assess it.

Civil liberties campaigners have warned the move to monitor citizens’ movements sets a dangerous precedent.

According to WHO data, Israel had 200 confirmed cases of the coronavirus as of yesterday morning. Today the country’s health ministry reported cases had risen to 427.

Details of exactly how the tracking will work have not been released — but, per the BBC, the location data of people’s mobile devices will be collected from telcos by Israel’s domestic security agency and shared with health officials.

It also reports the health ministry will be involved in monitoring the location of infected people to ensure they are complying with quarantine rules — saying it can also send text messages to people who have come into contact with someone with COVID-19 to instruct them to self isolate.

In recent days Netanyahu has expressed frustration that Israeli citizens have not been paying enough mind to calls to combat the spread of the virus via voluntary social distancing.

“This is not child’s play. This is not a vacation. This is a matter of life and death,” he wrote on Facebook. “There are many among you who still do not understand the magnitude of the danger. I see the crowds on the beaches, people having fun. They think this is a vacation.”

“According to the instructions that we issued yesterday, I ask you not leave your homes and stay inside as much as possible. At the moment, I say this as a recommendation. It is still not a directive but that can change,” he added.

Since the Israeli government’s intent behind the emergency mobile tracking powers is to combat the spread of COVID-19 by enabling state agencies to identify people whose movements need to be restricted to avoid them passing the virus to others, it seems likely law enforcement agencies will also be involved in enacting the measures.

That will mean citizens’ smartphones being not just a tool of mass surveillance but also a conduit for targeted containment — raising questions about the impact such intrusive measures might have on people’s willingness to carry mobile devices everywhere they go, even during a pandemic.

Yesterday the Wall Street Journal reported that the US government is considering similar location-tracking technology measures in a bid to check the spread of COVID-19 — with discussions ongoing between tech giants, startups and White House officials on measures that could be taken to monitor the disease.

Last week the UK government also held a meeting with tech companies to ask for their help in combating the coronavirus. Per Wired, some tech firms offered to share data with the state to help with contact tracing — although, at the time, the government was not pursuing a strategy of mass restrictions on public movement. It has since shifted position.