A US federal court finds suspicionless searches of phones at the border are illegal

A federal court in Boston has ruled that the government is not allowed to search travelers’ phones and devices at the U.S. border without first having reasonable suspicion of a crime.

That’s a significant victory for civil liberties advocates who have said that the government’s own rules that allow its border agents to search electronic devices at the border are unconstitutional.

The court said that the government’s policies on warrantless searches of devices without reasonable suspicion “violate the Fourth Amendment,” which provides constitutional protections against warrantless searches and seizures.

The case was brought by 11 travelers — ten of whom are U.S. citizens — with support from the American Civil Liberties Union and the Electronic Frontier Foundation. The travelers said border agents searched their smartphones and laptops without a warrant, or any suspicion of wrongdoing or criminal activity, and argued that the government was overreaching its powers.

The border remains a bizarre legal space, where the government asserts powers that it cannot claim against citizens or residents within the United States. The government has long said it doesn’t need a warrant to search devices at the border.

Any data collected by Customs & Border Protection without a warrant can still be shared with federal, state, local and foreign law enforcement.

Esha Bhandari, staff attorney with the ACLU’s Speech, Privacy, and Technology Project, said the ruling “significantly advances” protections under the Fourth Amendment.

“This is a great day for travelers who now can cross the international border without fear that the government will, in the absence of any suspicion, ransack the extraordinarily sensitive information we all carry in our electronic devices,” said Sophia Cope, a senior staff attorney at the EFF.

Millions of travelers arrive in the U.S. every day. Last year, border officials searched 33,000 travelers’ devices — a fourfold increase since 2015 — without any need for reasonable suspicion. In recent months, travelers have been told to inform the government of any social media handles they have, all of which are subject to inspection. Some have even been denied entry to the U.S. for content on their phones shared by other people.

Earlier this year, a federal appeals court ruled it unconstitutional for traffic enforcement officers to use chalk to mark car tires.

A spokesperson for Customs & Border Protection did not immediately comment.

EU-US Privacy Shield passes third Commission ‘health check’ — but litigation looms

The third annual review of the EU-US Privacy Shield data transfer mechanism has once again been nodded through by Europe’s executive.

This despite the EU parliament calling last year for the mechanism to be suspended.

The European Commission also issued US counterparts with a compliance deadline last December — saying the US must appoint a permanent ombudsperson to handle EU citizens’ complaints, as required by the arrangement, and do so by February.

This summer the US Senate finally confirmed Keith Krach — under secretary of state for economic growth, energy, and the environment — in the ombudsperson role.

The Privacy Shield arrangement was struck between EU and US negotiators back in 2016 — a rushed replacement for the prior Safe Harbor data transfer pact, which Europe’s top court struck down in fall 2015 following a legal challenge brought after NSA whistleblower Edward Snowden revealed US government agencies were liberally helping themselves to digital data from Internet companies.

At heart is a fundamental legal clash between EU privacy rights and US national security priorities.

The intent of the Privacy Shield framework is to paper over those cracks by devising enough checks and balances that the Commission can claim it offers adequate protection for EU citizens’ personal data when taken to the US for processing, despite the lack of a commensurate, comprehensive data protection regime there. But critics have argued from the start that the mechanism is flawed.

Even so, around 5,000 companies are now signed up to use Privacy Shield to certify transfers of personal data. So there would be major disruption to businesses were it to go the way of its predecessor — as has looked likely in recent years, since Donald Trump took office as US president.

The Commission remains a staunch defender of Privacy Shield, warts and all, preferring to support data-sharing business as usual rather than offer a pro-active defence of EU citizens’ privacy rights.

To date it has offered little in the way of objection to how the US has implemented Privacy Shield in these annual reviews, despite some glaring flaws and failures (for example, the disgraced political data firm Cambridge Analytica remained a signatory of the framework even after the data misuse scandal blew up).

The Commission did lay down one deadline late last year, regarding the ongoing lack of a permanent ombudsperson. So it can now check that box.

It also notes approvingly today that the final two vacancies on the US’ Privacy and Civil Liberties Oversight Board have been filled, meaning it’s fully staffed for the first time since 2016.

Commenting in a statement, commissioner for justice, consumers and gender equality, Věra Jourová, added: “With around 5,000 participating companies, the Privacy Shield has become a success story. The annual review is an important health check for its functioning. We will continue the digital diplomacy dialogue with our U.S. counterparts to make the Shield stronger, including when it comes to oversight, enforcement and, in a longer-term, to increase convergence of our systems.”

Its press release characterizes US enforcement action related to the Privacy Shield as having “improved” — citing the Federal Trade Commission taking enforcement action in a grand total of seven cases.

It also says vaguely that “an increasing number” of EU individuals are making use of their rights under the Privacy Shield, claiming the relevant redress mechanisms are “functioning well”. (Critics have long suggested the opposite.)

The Commission is recommending further improvements too, though, including that the US expand compliance checks — such as those concerning false claims of participation in the framework.

So presumably there’s a bunch of entirely fake compliance claims going unchecked, as well as actual compliance going under-checked…

“The Commission also expects the Federal Trade Commission to further step up its investigations into compliance with substantive requirements of the Privacy Shield and provide the Commission and the EU data protection authorities with information on ongoing investigations,” the EC adds.

All these annual Commission reviews are just fiddling around the edges, though. The real substantive test for Privacy Shield, which will determine its long-term survival, is looming on the horizon: a judgement expected from Europe’s top court next year.

In July a hearing took place on a key case that’s been dubbed Schrems II. This legal challenge initially targeted Facebook’s use of another EU data transfer mechanism but has been broadened to include a series of legal questions over Privacy Shield — and is now before the Court of Justice of the European Union.

There is also separate litigation directly targeting Privacy Shield, brought by a French digital rights group which argues it’s incompatible with EU law on account of US government mass surveillance practices.

The Commission’s PR notes the pending litigation — writing that this “may also have an impact on the Privacy Shield”. “A hearing took place in July 2019 in case C-311/18 (Schrems II) and, once the Court’s judgement is issued, the Commission will assess its consequences for the Privacy Shield,” it adds.

So, tl;dr, today’s third annual review doesn’t mean Privacy Shield is out of the legal woods.

California’s Privacy Act: What you need to know now

This week California’s attorney general, Xavier Becerra, published draft guidance for enforcing the state’s landmark privacy legislation.

The draft text of the regulations under the California Consumer Privacy Act (CCPA) will undergo a public consultation period, including a number of public hearings, with submissions open until December 6 this year.

The CCPA itself will take effect in the state on January 1, with a further six months’ grace period before enforcement of the law begins.

“The proposed regulations are intended to operationalize the CCPA and provide practical guidance to consumers and businesses subject to the law,” writes the State of California’s Department of Justice in a press release announcing the draft text. “The regulations would address some of the open issues raised by the CCPA and would be subject to enforcement by the Department of Justice with remedies provided under the law.”

Translation: Here’s the extra detail we think is needed to make the law work.

The CCPA was signed into law in June 2018 — enshrining protections for a sub-set of US citizens against their data being collected and sold without their knowledge.

The law requires businesses over a certain user and/or revenue threshold to disclose what personal data they collect, the purposes they intend to use the data for and any third parties they will share it with, as well as requiring that they provide a discrimination-free opt-out from having personal data sold or shared.

Businesses must also comply with consumer requests for their data to be deleted.

Europe’s top court says active consent is needed for tracking cookies

Europe’s top court has ruled that pre-checked consent boxes for dropping cookies are not legally valid.

Consent must be obtained prior to storing or accessing non-essential cookies, such as tracking cookies for targeted advertising. Consent cannot be implied or assumed.

It’s a decision that — at a stroke — plunges websites into legal hot water in Europe if their cookie notices don’t ask for consent first. As many don’t, preferring not to risk their ability to track users for ad targeting.

Now they could be risking a big fine under EU privacy laws if they don’t obtain valid consent for tracking.

Sites that have relied upon opting EU users into ad-tracking cookies in the hopes they’ll just click okay to make the cookie banner go away are in for a rude awakening.

Or, to put it another way, the ruling should put a stop to some, er, ‘creative’ interpretations of the rules around cookies that manage to completely miss the point of the law…

The decision is also likely to influence the ongoing reform of ePrivacy rules — which govern online tracking.

While the outcome of that very heavily lobbied piece of legislation remains to be seen, today’s ruling is clearly a win for privacy.

Planet49 case

The backstory to today’s ruling is that a German court asked the CJEU for a decision in a case relating to a lottery website, Planet49, which had required users to consent to the storage of cookies in order to play a promotional game.

In an earlier opinion an influential advisor to the court also took the view that affirmative action, not simple inaction, is necessary to constitute consent.

Today the CJEU agreed, handing down a final judgement which makes it plain that consent can’t be assumed — it requires an active opt-in from users.

In a punchily brief press release the court writes:

In today’s judgment, the Court decides that the consent which a website user must give to the storage of and access to cookies on his or her equipment is not validly constituted by way of a prechecked checkbox which that user must deselect to refuse his or her consent.

That decision is unaffected by whether or not the information stored or accessed on the user’s equipment is personal data. EU law aims to protect the user from any interference with his or her private life, in particular, from the risk that hidden identifiers and other similar devices enter those users’ terminal equipment without their knowledge.

The Court notes that consent must be specific so that the fact that a user selects the button to participate in a promotional lottery is not sufficient for it to be concluded that the user validly gave his or her consent to the storage of cookies.

Furthermore, according to the Court, the information that the service provider must give to a user includes the duration of the operation of cookies and whether or not third parties may have access to those cookies.

So, to sum up, pre-checked consent boxes (or cookie banners that tell you a cookie has already been dropped and pointlessly invite you to click ‘ok’) aren’t valid under EU law. 

Furthermore, cookie consent can’t be bundled with another purpose (in the Planet49 case the promotional lottery) — at least if that fuzzy signal is being used to stand for consent.
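
To make that concrete, here’s a minimal sketch (in TypeScript) of what an active opt-in flow implies in practice. The element IDs and helper names are hypothetical — this illustrates the principle, not any vetted compliance tooling:

    // Minimal sketch of an active opt-in: no non-essential cookie is written
    // until the user ticks a box (unchecked by default) and confirms.
    // Element IDs and function names here are hypothetical.

    function setCookie(name: string, value: string, maxAgeSeconds: number): void {
      document.cookie =
        `${name}=${encodeURIComponent(value)}; max-age=${maxAgeSeconds}; path=/; SameSite=Lax`;
    }

    function enableAnalytics(): void {
      // Only called after an affirmative act -- never on page load,
      // and never because a pre-checked box was left untouched.
      const id = Math.random().toString(36).slice(2);
      setCookie("analytics_id", id, 60 * 60 * 24 * 30); // 30 days
    }

    const optIn = document.querySelector<HTMLInputElement>("#analytics-opt-in");
    const save = document.querySelector<HTMLButtonElement>("#save-choices");

    save?.addEventListener("click", () => {
      // The checkbox defaults to unchecked, so silence means no tracking.
      if (optIn?.checked) {
        enableAnalytics();
      }
    });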

There’s also an interesting new requirement which looks set to shrink the ability of service operators to obfuscate how persistently they’re tracking Internet users.

For consent to cookies to be legally valid, the court now says, the user must be provided with some specific information on the tracking — namely, how long the cookie will operate and who their data will be shared with. So, er, awkward…

“Extending information requirement to include cookie configuration details is an interesting twist that will provide more information to users,” Dr. Lukasz Olejnik, an independent cybersecurity advisor and research associate at the Center for Technology and Global Affairs at Oxford University, told us.

“Sites will need to be wary to be sure that the user-facing text matches the actually used values of max-age or expires attributes. It is also interesting to wonder if sites will want to provide similar information about other cookie attributes.”
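
One simple way for sites to avoid the mismatch Olejnik warns about is to derive both the user-facing duration text and the cookie’s actual max-age from a single constant. A minimal sketch, with illustrative names:

    // Sketch: one source of truth for the cookie's lifetime, so the consent
    // notice text cannot drift out of sync with the max-age attribute actually
    // set on the cookie. All names are illustrative.

    const ANALYTICS_MAX_AGE_SECONDS = 60 * 60 * 24 * 90; // 90 days

    function describeDuration(seconds: number): string {
      const days = Math.round(seconds / (60 * 60 * 24));
      return `${days} day${days === 1 ? "" : "s"}`;
    }

    // Shown to the user in the consent notice.
    const noticeText =
      `Our analytics cookie is stored for ${describeDuration(ANALYTICS_MAX_AGE_SECONDS)} ` +
      `and is shared with our analytics provider.`;

    // Used when the cookie is actually set, after consent.
    function setAnalyticsCookie(value: string): void {
      document.cookie =
        `analytics_id=${value}; max-age=${ANALYTICS_MAX_AGE_SECONDS}; path=/`;
    }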

Safe to say, there will be some long faces in the ad industry today.

“The Court has made clear that consent should always be manifested in an active manner, and may not be presumed. Therefore, online operators should ensure that they do not collect consent by asking users to unclick a pre-formulated declaration of consent,” said Luca Tosoni, a research fellow in computers and law at the University of Oslo, also commenting on the court ruling.

ePrivacy reform

As we’ve reported before, very many sites and services in Europe have, at best, been paying lip service to EU cookie consent requirements — despite tighter rules coming into force last year under the General Data Protection Regulation (GDPR), which says that consent must be specific, informed and freely given to be a valid legal basis. And despite — more recently — further guidance from DPAs clarifying the rules around cookie consent.

So the CJEU ruling should lift a fair few heads out of the sand.

“Before the entry into force of the GDPR, the conditions for consent were interpreted differently across Europe. Today’s judgment is important as it brings some clarity on what should be considered valid consent under EU data protection law,” Tosoni also told us, saying he expects the ruling to result in changes to many cookie notifications.

“National courts and data protection authorities across the EU will need to follow the Court’s interpretation when assessing whether controllers have validly obtained consent. In turn, this should lead to more harmonization in enforcement across Europe, in particular with regard to cookie notices. Thus, I would expect many operators to change their non-compliant consents to conform with the ruling.”

EU law on cookie consent long predates the GDPR — going back to the prior Data Protection Directive and the still-in-force ePrivacy Directive, Article 5(3) of which specifies that for cookies to be used users must give opt-in consent after being provided with clear and comprehensive information (with only a limited exception for ‘strictly necessary’ cookies).

European legislators have, though, been trying for years to agree on an update to the ePrivacy Directive.

A draft proposal for an ePrivacy Regulation was introduced by the Commission at the start of 2017. But negotiations have been anything but smooth — with a blitz of lobbying from the adtech and telecoms industries pushing against a firm requirement for opt-in consent to tracking.

The CJEU’s clarity that consent is required to store and access cookies pushes in the opposite direction. And that firm legal line protecting individual privacy from background tracking technologies should be harder for legislators to ignore.

“Today’s ruling is likely to have a significant impact on the ongoing negotiations on the ePrivacy Regulation which is set to regulate cookie usage, an issue on which European legislators are struggling to find an agreement,” Tosoni said, adding: “In the past, the Court’s rulings have had an important impact on the development of the GDPR.”

In the meanwhile, the judgement should force some of the more cynical and/or stupid cookie banners to be quietly replaced with something that at least asks for consent.

Cookie walls

That said, the ruling does not resolve all the problems around cookie consent.

Specifically, the court has not waded into the contentious forced consent/cookie wall issue. This is where a site requires consent to advertising cookies as the ‘price’ of accessing the service being sought, with the only other option being to leave.

Earlier this year the Dutch DPA deemed cookie walls to be illegal. But the agency’s interpretation is open to legal challenge. Only the CJEU can have the final word.

In the Planet49 case the court sidestepped the issue — saying the referring court did not ask it to rule on the question of “whether it is compatible with the requirement that consent be ‘freely given’, within the meaning of Article 2(h) of Directive 95/46 and of Article 4(11) and Article 7(4) of Regulation 2016/679, for a user’s consent to the processing of his personal data for advertising purposes to be a prerequisite to that user’s participation in a promotional lottery, as appears to be the case in the main proceedings”.

“In those circumstances, it is not appropriate for the Court to consider that question,” it wrote.

Likely it’s doing so because another case is already set to consider that question. Tosoni says he expects the Orange Romania case — which is pending before the court — to further clarify the requirements of valid consent in the context of it being ‘freely given’.

“Some uncertainty on the requirements of valid consent remains. Indeed, in today’s judgment, the Court has primarily clarified what constitutes unambiguous and specific consent, but the Court has, for example, not clarified what degree of autonomy a data subject should enjoy when choosing whether or not to give consent for the latter to be considered “freely given”,” he said.

“Today’s judgment does not provide an answer on the legality of cookie walls, which require consent to access the underlying service.  The Court found that it was unable to address this point, as the referring German court had not asked the ECJ to assess the legality of making participation in a lottery — the service at issue in the case — subject to giving advertising cookie consent.  Further clarity on this issue may come from the Orange Romania case, which is currently pending before the ECJ.”

We’ve reached out to the IAB Europe for a response to the ruling and to ask what advice it will be issuing to its members. At the time of writing it had not yet responded to these questions. 

America’s largest companies push for federal online privacy laws to circumvent state regulatory efforts

As California moves ahead with what would be the most restrictive online privacy laws in the nation, the chief executives of some of the nation’s largest companies are taking their case to the nation’s capital to plead for federal regulation.

Chief executives at Amazon, AT&T, Dell, Ford, IBM, Qualcomm, Walmart and other leading financial services, manufacturing and technology companies have issued an open letter to congressional leadership, via the pro-industry organization the Business Roundtable, pleading with lawmakers to take action on online privacy.

“Now is the time for Congress to act and ensure that consumers are not faced with confusion about their rights and protections based on a patchwork of inconsistent state laws. Further, as the regulatory landscape becomes increasingly fragmented and more complex, U.S. innovation and global competitiveness in the digital economy are threatened,” the letter says.

The subtext to this call to action is the California privacy regulations that are set to take effect on January 1.

As we noted when the bill was passed last year, there are a few key components of the California legislation, including the following requirements:

  • Businesses must disclose what information they collect, what business purpose they do so for and any third parties they share that data with.

  • Businesses would be required to comply with official consumer requests to delete that data.

  • Consumers can opt out of their data being sold, and businesses can’t retaliate by changing the price or level of service.

  • Businesses can, however, offer “financial incentives” for being allowed to collect data.

  • California authorities are empowered to fine companies for violations.

There’s a reason why companies would push for federal regulation to supersede any initiatives from the states. It is more of a challenge for companies to adhere to a patchwork of different regulatory regimes at the state level. But it’s also true that companies, following the lead of automakers in California, could just adhere to the most stringent requirements, which would clarify any confusion.

Indeed, many of these companies are already complying with strict privacy regulations thanks to the passage of the GDPR in Europe.

UK’s health data guardian sets a firm line for app development using patient data

The UK’s health data watchdog, the National Data Guardian (NDG), has published correspondence between her office and the national privacy watchdog, the Information Commissioner’s Office (ICO), which informed the ICO’s finding in 2017 that a data-sharing arrangement between an NHS Trust and Google-owned DeepMind broke the law.

The exchange was published following a Freedom of Information request by TechCrunch.

In fall 2015 the Royal Free NHS Trust and DeepMind signed a data-sharing agreement which saw the medical records of 1.6 million people quietly passed to the AI company without patients being asked for their consent.

The scope of the data-sharing arrangement — ostensibly to develop a clinical task management app — was only brought to light by investigative journalism. That then triggered regulatory scrutiny — and the eventual finding by the ICO that there was no legal basis for the data to have been transferred in the first place.

Despite that, the app in question, Streams — which does not (currently) contain any AI but uses an NHS algorithm for detecting acute kidney injury — has continued being used in NHS hospitals.

DeepMind has also since announced it plans to transfer its health division to Google. Although — to our knowledge — no NHS trusts have yet signed new contracts for Streams with the ad giant.

In parallel with releasing her historical correspondence with the ICO, Dame Fiona Caldicott, the NDG, has written a blog post in which she articulates a clear regulatory position that the “reasonable expectations” of patients must govern non-direct care uses for people’s health data — rather than healthcare providers relying on whether doctors think developing such and such an app is a great idea.

The ICO had asked for guidance from the NDG on how to apply the common law duty of confidentiality, as part of its investigation into the Royal Free NHS Trust’s data-sharing arrangement with DeepMind for Streams.

In a subsequent audit of Streams that was required by the regulator, the trust’s law firm, Linklaters, argued that a call on whether a duty of confidentiality has been breached should be judged from the point of view of the clinician’s conscience, rather than the patient’s reasonable expectations.

Caldicott writes that she firmly disagrees with that “key argument”.

“It is my firm view that it is the patient’s perspective that is most important when judgements are being made about the use of their confidential information. My letter to the Information Commissioner sets out my thoughts on this matter in some detail,” she says, emphasizing the need for healthcare innovation to respect the trust and confidence of patients and the public.

“I do champion innovative technologies and new treatments that are powered by data. The mainstreaming of emerging fields such as genomics and artificial intelligence offer much promise and will change the face of medicine for patients and health professionals immeasurably… But my belief in innovation is coupled with an equally strong belief that these advancements must be introduced in a way that respects people’s confidentiality and delivers no surprises about how their data is used. In other words, the public’s reasonable expectations must be met.”

“Patients’ reasonable expectations are the touchstone of the common law duty of confidence,” she adds. “Providers who are introducing new, data-driven technologies, or partnering with third parties to help develop and test them, have called for clearer guidance about respecting data protection and confidentiality. I intend to work with the Information Commissioner and others to improve the advice available so that innovation can be undertaken safely: in compliance with the common law and the reasonable expectations of patients.

“The National Data Guardian is currently supporting the Health Research Authority in clarifying and updating guidance on the lawful use of patient data in the development of healthcare technologies.”

We reached out to the Royal Free NHS Trust and DeepMind for comment on the NDG’s opinion. At the time of writing neither had responded.

In parallel, Bloomberg reported this week that DeepMind co-founder, Mustafa Suleyman, is currently on leave from the company. (Suleyman has since tweeted that the break is temporary and for “personal” reasons, to “recharge”, and that he’s “looking forward to being back in the saddle at DeepMind soon”.)

The AI research company recently touted what it couched as a ‘breakthrough’ in predictive healthcare — saying it had developed an AI model for predicting the same condition that the Streams app is intended to alert for. Although the model was built using US data from the Department of Veterans Affairs, which skews overwhelmingly male.

As we wrote at the time, the episode underscores the potential value locked up in NHS data — which offers population-level clinical data that the NHS could use to develop AI models of its own. Indeed, a 2017 government-commissioned review of the life sciences sector called for a strategy to “capture for the UK the value in algorithms generated using NHS data”.

The UK government is also now pushing a ‘tech-first’ approach to NHS service delivery.

Earlier this month the government announced it’s rerouting £250M in public funds for the NHS to set up an artificial intelligence lab that will work to expand the use of AI technologies within the service.

Last fall, health secretary Matt Hancock set out his tech-first vision of future healthcare provision — saying he wanted “healthtech” apps and services to support “preventative, predictive and personalised care”.

So there are certainly growing opportunities for developing digital healthcare solutions to support the UK’s National Health Service.

As well as — now — clearer regulatory guidance that app development that wants to be informed by patient data must first win the trust and confidence of the people it hopes to serve.

Most EU cookie ‘consent’ notices are meaningless or manipulative, study finds

New research into how European consumers interact with the cookie consent mechanisms which have proliferated since a major update to the bloc’s online privacy rules last year casts an unflattering light on widespread manipulation of a system that’s supposed to protect consumer rights.

As Europe’s General Data Protection Regulation (GDPR) came into force in May 2018, bringing in a tough new regime of fines for non-compliance, websites responded by popping up legal disclaimers which signpost visitor tracking activities. Some of these cookie notices even ask for consent to track you.

But many don’t — even now, more than a year later.

The study, which looked at how consumers interact with different designs of cookie pop-ups and how various design choices can nudge and influence people’s privacy choices, also suggests consumers are suffering a degree of confusion about how cookies function, as well as being generally mistrustful of the term ‘cookie’ itself. (With such baked in tricks, who can blame them?)

The researchers conclude that if consent to drop cookies were being collected in a way that’s compliant with the EU’s existing privacy laws, only a tiny fraction of consumers would agree to be tracked.

The paper, which we’ve reviewed in draft ahead of publication, is co-authored by academics at Ruhr-University Bochum, Germany, and the University of Michigan in the US — and entitled: (Un)informed Consent: Studying GDPR Consent Notices in the Field.

The researchers ran a number of studies, gathering ~5,000 cookie notices from screengrabs of leading websites, then compiling a snapshot (derived from a random sub-sample of 1,000) of the different cookie consent mechanisms in play in order to paint a picture of current implementations.

They also worked with a German ecommerce website over a period of four months to study how more than 82,000 unique visitors to the site interacted with various cookie consent designs, which the researchers tweaked in order to explore how different defaults and design choices affected individuals’ privacy choices.

Their industry snapshot of cookie consent notices found that the majority are placed at the bottom of the screen (58%); do not block interaction with the website (93%); and offer no options other than a confirmation button that does nothing (86%). So no choice at all then.

A majority also try to nudge users towards consenting (57%) — such as via ‘dark pattern’ techniques like using a color to highlight the ‘agree’ button (which, if clicked, accepts privacy-unfriendly defaults) vs displaying a much less visible link to ‘more options’, so that pro-privacy choices are buried off screen.

And while they found that nearly all cookie notices (92%) contained a link to the site’s privacy policy, only a minority mentioned the specific purpose of the data collection (39%) or who can access the data (21%).

The GDPR updated the EU’s long-standing digital privacy framework, with key additions including tightening the rules around consent as a legal basis for processing people’s data — which the regulation says must be specific (purpose limited), informed and freely given for consent to be valid.

Even so, since May last year there has been a proliferation of cookie ‘consent’ mechanisms popping up or sliding atop websites that still don’t offer EU visitors the necessary privacy choices, per the research.

“Given the legal requirements for explicit, informed consent, it is obvious that the vast majority of cookie consent notices are not compliant with European privacy law,” the researchers argue.

“Our results show that a reasonable amount of users are willing to engage with consent notices, especially those who want to opt out or do not want to opt in. Unfortunately, current implementations do not respect this and the large majority offers no meaningful choice.”

The researchers also record a large differential in interaction rates with consent notices — of between 5 and 55% — generated by tweaking positions, options, and presets on cookie notices.

This is where consent gets manipulated — to flip visitors’ preference for privacy.

They found that the more choices offered in a cookie notice, the more likely visitors were to decline the use of cookies. (Which is an interesting finding in light of the vendor laundry lists frequently baked into the so-called “transparency and consent framework” which the industry association, the Internet Advertising Bureau (IAB), has pushed as the standard for its members to use to gather GDPR consents.)

“The results show that nudges and pre-selection had a high impact on user decisions, confirming previous work,” the researchers write. “It also shows that the GDPR requirement of privacy by default should be enforced to make sure that consent notices collect explicit consent.”

Here’s a section from the paper discussing what they describe as “the strong impact of nudges and pre-selections”:

Overall the effect size between nudging (as a binary factor) and choice was CV=0.50. For example, in the rather simple case of notices that only asked users to confirm that they will be tracked, more users clicked the “Accept” button in the nudge condition, where it was highlighted (50.8% on mobile, 26.9% on desktop), than in the non-nudging condition where “Accept” was displayed as a text link (39.2% m, 21.1% d). The effect was most visible for the category-and vendor-based notices, where all checkboxes were pre-selected in the nudging condition, while they were not in the privacy-by-default version. On the one hand, the pre-selected versions led around 30% of mobile users and 10% of desktop users to accept all third parties. On the other hand, only a small fraction (< 0.1%) allowed all third parties when given the opt-in choice and around 1 to 4 percent allowed one or more third parties (labeled “other” in 4). None of the visitors with a desktop allowed all categories. Interestingly, the number of non-interacting users was highest on average for the vendor-based condition, although it took up the largest part of any screen since it offered six options to choose from.

The key implication is that fewer than 0.1% of site visitors would freely choose to enable all cookie categories/vendors — i.e. when not being forced to do so by a lack of choice or nudged with manipulative dark patterns (such as pre-selections).

That figure rises only a fraction, to between 1% and 4%, for visitors who would enable some cookie categories in the same privacy-by-default scenario.

“Our results… indicate that the privacy-by-default and purposed-based consent requirements put forth by the GDPR would require websites to use consent notices that would actually lead to less than 0.1 % of active consent for the use of third parties,” they write in conclusion.

They do flag some limitations with the study, pointing out that the dataset behind the 0.1% figure is biased — the nationality of the visitors is not generally representative of public Internet users, and the data comes from a single retail site. But they supplemented their findings with data from a company (Cookiebot) which provides cookie notices as a SaaS — saying its data indicated a higher ‘accept all’ click rate, though still only marginally higher: just 5.6%.

Hence the conclusion that if European web users were given an honest and genuine choice over whether or not they get tracked around the Internet, the overwhelming majority would choose to protect their privacy by rejecting tracking cookies.

This is an important finding because GDPR is unambiguous in stating that if an Internet service is relying on consent as a legal basis to process visitors’ personal data it must obtain consent before processing data (so before a tracking cookie is dropped) — and that consent must be specific, informed and freely given.

Yet, as the study confirms, it really doesn’t take much clicking around the regional Internet to find a gaslighting cookie notice that pops up with a mocking message saying by using this website you’re consenting to your data being processed how the site sees fit — with just a single ‘Ok’ button to affirm your lack of say in the matter.

It’s also all too common to see sites that nudge visitors towards a big, brightly colored ‘click here’ button to accept data processing — squirrelling any opt-outs into complex sub-menus that can sometimes require hundreds of individual clicks to deny consent per vendor.

You can even find websites that gate their content entirely unless or until a user clicks ‘accept’ — aka a cookie wall. (A practice that has recently attracted regulatory intervention.)

Nor can the current mess of cookie notices be blamed on a lack of specific guidance on what a valid and therefore legal cookie consent looks like. At least not any more. Here, for example, is a myth-busting blog which the UK’s Information Commissioner’s Office (ICO) published last month that’s pretty clear on what can and can’t be done with cookies.

For instance on cookie walls the ICO writes: “Using a blanket approach such as this is unlikely to represent valid consent. Statements such as ‘by continuing to use this website you are agreeing to cookies’ is not valid consent under the higher GDPR standard.” (The regulator goes into more detailed advice here.)

While France’s data watchdog, the CNIL, also published its own detailed guidance last month — if you prefer to digest cookie guidance in the language of love and diplomacy.

(Those of you reading TechCrunch back in January 2018 may also remember this sage plain English advice from our GDPR explainer: “Consent requirements for processing personal data are also considerably strengthened under GDPR — meaning lengthy, inscrutable, pre-ticked T&Cs are likely to be unworkable.” So don’t say we didn’t warn you.)

Nor are Europe’s data protection watchdogs lacking in complaints about improper applications of ‘consent’ to justify processing people’s data.

Indeed, ‘forced consent’ was the substance of a series of linked complaints by the pro-privacy NGO noyb, which targeted T&Cs used by Facebook, WhatsApp, Instagram and Google Android as soon as GDPR started being applied in May last year.

While not cookie notice specific, this set of complaints speaks to the same underlying principle — i.e. that EU users must be provided with a specific, informed and free choice when asked to consent to their data being processed. Otherwise the ‘consent’ isn’t valid.

So far Google is the only company to be hit with a penalty as a result of that first wave of consent-related GDPR complaints; France’s data watchdog issued it a $57M fine in January.

But the Irish DPC confirmed to us that three of the 11 open investigations it has into Facebook and its subsidiaries were opened after noyb’s consent-related complaints. (“Each of these investigations are at an advanced stage and we can’t comment any further as these investigations are ongoing,” a spokeswoman told us. So, er, watch that space.)

The problem, where EU cookie consent compliance is concerned, looks to be both a failure of enforcement and a lack of regulatory alignment — the latter as a consequence of the ePrivacy Directive (which most directly concerns cookies) still not being updated, generating confusion (if not outright conflict) with the shiny new GDPR.

However the ICO’s advice on cookies directly addresses claimed inconsistencies between ePrivacy and GDPR, stating plainly that Recital 25 of the former (which states: “Access to specific website content may be made conditional on the well-informed acceptance of a cookie or similar device, if it is used for a legitimate purpose”) does not, in fact, sanction gating your entire website behind an ‘accept or leave’ cookie wall.

Here’s what the ICO says on Recital 25 of the ePrivacy Directive:

  • ‘specific website content’ means that you should not make ‘general access’ subject to conditions requiring users to accept non-essential cookies – you can only limit certain content if the user does not consent;
  • the term ‘legitimate purpose’ refers to facilitating the provision of an information society service – ie, a service the user explicitly requests. This does not include third parties such as analytics services or online advertising;

So no cookie wall; and no partial walls that force a user to agree to ad targeting in order to access the content.

It’s worth pointing out that other types of privacy-friendly online advertising are available with which to monetize visits to a website. (And research suggests targeted ads offer only a tiny premium over non-targeted ads, even as publishers choosing a privacy-hostile ads path must now factor the costs of data protection compliance into their calculations — as well as the cost and risk of massive GDPR fines if their security fails or they’re found to have violated the law.)

Negotiations to replace the now very long-in-the-tooth ePrivacy Directive — with an up-to-date ePrivacy Regulation which properly takes account of the proliferation of Internet messaging and all the ad tracking technologies that have sprung up in the interim — are the subject of very intense lobbying, including from the adtech industry desperate to keep a hold of cookie data. But EU privacy law is clear.

“[Cookie consent]’s definitely broken (and has been for a while). But the GDPR is only partly to blame, it was not intended to fix this specific problem. The uncertainty of the current situation is caused [by] the delay of the ePrivacy regulation that was put on hold (thanks to lobbying),” says Martin Degeling, one of the research paper’s co-authors, when we suggest European Internet users are being subjected to a lot of ‘consent theatre’ (i.e. noisy yet non-compliant cookie notices) — which in turn is causing knock-on problems of consumer mistrust and consent fatigue, working against the core aims of the EU’s data protection framework.

“Consent fatigue and mistrust is definitely a problem,” he agrees. “Users that have experienced that clicking ‘decline’ will likely prevent them from using a site are likely to click ‘accept’ on any other site just because of one bad experience and regardless of what they actually want (which is in most cases: not be tracked).”

“We don’t have strong statistical evidence for that but users reported this in the survey,” he adds, citing a poll the researchers also ran asking site visitors about their privacy choices and general views on cookies. 

Degeling says he and his co-authors are in favor of a consent mechanism that would enable web users to specify their choice at a browser level, rather than the current mess and chaos of perpetual, confusing and often non-compliant per-site pop-ups — though he points out some caveats.

“DNT [Do Not Track] is probably also not GDPR compliant as it only knows one purpose. Nevertheless  something similar would be great,” he tells us. “But I’m not sure if shifting the responsibility to browser vendors to design an interface through which they can obtain consent will lead to the best results for users — the interfaces that we see now, e.g. with regard to cookies, are not a good solution either.

“And the conflict of interest for Google with Chrome are obvious.”
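
For context on why Degeling sees DNT as too coarse: the browser signal carries a single, blanket preference with no notion of purposes. A minimal sketch of reading the (non-standard, since-deprecated) flag:

    // Sketch: reading the Do Not Track flag. It carries exactly one bit of
    // preference -- no per-purpose granularity -- which is one reason it likely
    // falls short of GDPR's requirement for specific, purpose-limited consent.

    function userRequestedDoNotTrack(): boolean {
      // `doNotTrack` is non-standard across engines (and deprecated),
      // so read it defensively rather than relying on lib.dom typings.
      const dnt = (navigator as any).doNotTrack ?? (window as any).doNotTrack;
      return dnt === "1" || dnt === "yes";
    }

    if (userRequestedDoNotTrack()) {
      // A respectful site would skip all non-essential cookies here.
      console.log("DNT is set: suppressing trackers.");
    }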

The EU’s unfortunate regulatory snafu around privacy — in that it now has one modernized, world-class privacy regulation butting up against an outdated directive (whose progress keeps being blocked by vested interests intent on being able to continue steamrollering consumer privacy) — likely goes some way to explaining why Member States’ data watchdogs have generally been loath, so far, to show their teeth where the specific issue of cookie consent is concerned.

At least for an initial period the hope among data protection agencies (DPAs) was likely that ePrivacy would be updated and so they should wait and see.

They have also undoubtedly been providing data processors with time to get their data houses and cookie consents in order. But the frictionless interregnum while GDPR was allowed to ‘bed in’ looks unlikely to last much longer.

Firstly because a law that’s not enforced isn’t worth the paper it’s written on (and EU fundamental rights are a lot older than the GDPR). Secondly, with the ePrivacy update still blocked DPAs have demonstrated they’re not just going to sit on their hands and watch privacy rights be rolled back — hence them putting out guidance that clarifies what GDPR means for cookies. They’re drawing lines in the sand, rather than waiting for ePrivacy to do it (which also guards against the latter being used by lobbyists as a vehicle to try to attack and water down GDPR).

And, thirdly, Europe’s political institutions and policymakers have been dining out on the geopolitical attention their shiny privacy framework (GDPR) has attained.

Much has been made at the highest levels in Europe of being able to point to US counterparts, caught on the hop by ongoing tech privacy and security scandals, while EU policymakers savor the schadenfreude of seeing their US counterparts being forced to ask publicly whether it’s time for America to have its own GDPR.

With its extraterritorial scope, GDPR was always intended to stamp Europe’s rule-making prowess on the global map. EU lawmakers will feel they can comfortably check that box.

However they are also aware the world is watching closely and critically — which makes enforcement a very key piece. It must slot in too. They need the GDPR to work on paper and be seen to be working in practice.

So the current cookie mess is a problematic signal which risks signposting regulatory failure — and that simply isn’t sustainable.

A spokesperson for the European Commission told us it cannot comment on specific research but said: “The protection of personal data is a fundamental right in the European Union and a topic the Juncker commission takes very seriously.”

“The GDPR strengthens the rights of individuals to be in control of the processing of personal data, it reinforces the transparency requirements in particular on the information that is crucial for the individual to make a choice, so that consent is given freely, specific and informed,” the spokesperson added. 

“Cookies, insofar as they are used to identify users, qualify as personal data and are therefore subject to the GDPR. Companies do have a right to process their users’ data as long as they receive consent or if they have a legitimate interest.”

All of which suggests that the movement, when it comes, must come from a reforming adtech industry.

With robust privacy regulation in place the writing is now on the wall for unfettered tracking of Internet users for the kind of high velocity, real-time trading of people’s eyeballs that the ad industry engineered for itself when no one knew what was being done with people’s data.

GDPR has already brought greater transparency. Once Europeans are no longer forced to trade away their privacy it’s clear they’ll vote with their clicks not to be ad-stalked around the Internet too.

The current chaos of non-compliant cookie notices is thus a signpost pointing at an underlying privacy lag — and likely also the last gasp signage of digital business models well past their sell-by-date.

Europe’s top court sharpens guidance for sites using leaky social plug-ins

Europe’s top court has made a ruling that could affect scores of websites that embed the Facebook ‘Like’ button and receive visitors from the region.

The ruling by the Court of Justice of the EU states such sites are jointly responsible for the initial data processing — and must either obtain informed consent from site visitors prior to data being transferred to Facebook, or be able to demonstrate a legitimate interest legal basis for processing this data.

The ruling is significant because, as currently seems to be the case, Facebook’s Like buttons transfer personal data automatically when a webpage loads — without the user even needing to interact with the plug-in. That means websites relying on visitors ‘consenting’ to their data being shared with Facebook will likely need to change how the plug-in functions, to ensure no data is sent to Facebook before visitors are asked whether they want their browsing to be tracked by the adtech giant.
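
One established way to do that — seen in open source projects such as Shariff and various ‘two-click’ button implementations — is to render a local placeholder and only inject Facebook’s script, the step that triggers the transfer, after the visitor actively opts in. A minimal sketch, with hypothetical element IDs:

    // Sketch of the deferred, "two-click" social button pattern: nothing is
    // requested from Facebook's servers until the visitor clicks the local
    // placeholder. The placeholder ID is hypothetical.

    function loadFacebookSdk(): void {
      const script = document.createElement("script");
      script.src = "https://connect.facebook.net/en_US/sdk.js";
      script.async = true;
      // The network request to Facebook only happens on this append --
      // i.e. after, not before, the visitor's affirmative act.
      document.body.appendChild(script);
    }

    const placeholder = document.querySelector<HTMLButtonElement>("#like-placeholder");

    placeholder?.addEventListener("click", () => {
      loadFacebookSdk();
      // A real implementation would then swap the placeholder for the
      // genuine plug-in markup.
    });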

The background to the case is a complaint against online clothes retailer, Fashion ID, by a German consumer protection association, Verbraucherzentrale NRW — which took legal action in 2015 seeking an injunction against Fashion ID’s use of the plug-in which it claimed breached European data protection law.

Like ’em or loathe ’em, Facebook’s ‘Like’ buttons are an impossible-to-miss component of the mainstream web. Though most Internet users are likely unaware that the social plug-ins are used by Facebook to track what other websites they’re visiting for ad targeting purposes.

Last year the company told the UK parliament that between April 9 and April 16 the button had appeared on 8.4M websites, while its Share button social plug-in appeared on 931K sites. (Facebook also admitted to 2.2M instances of another tracking tool it uses to harvest non-Facebook browsing activity — called a Facebook Pixel — being invisibly embedded on third party websites.)

The Fashion ID case predates the introduction of the EU’s updated privacy framework, GDPR, which further toughens the rules around obtaining consent — meaning it must be purpose specific, informed and freely given.

Today’s CJEU decision also follows another ruling a year ago, in a case related to Facebook fan pages, when the court took a broad view of privacy responsibilities around platforms — saying both fan page administrators and host platforms could be data controllers. Though it also said joint controllership does not necessarily imply equal responsibility for each party.

In the latest decision the CJEU has sought to draw some limits on the scope of joint responsibility, finding that a website where the Facebook Like button is embedded cannot be considered a data controller for any subsequent processing, i.e. after the data has been transmitted to Facebook Ireland (the data controller for Facebook’s European users).

The joint responsibility specifically covers the collection and transmission of Facebook Like data to Facebook Ireland.

“It seems, at the outset, impossible that Fashion ID determines the purposes and means of those operations,” the court writes in a press release announcing the decision.

“By contrast, Fashion ID can be considered to be a controller jointly with Facebook Ireland in respect of the operations involving the collection and disclosure by transmission to Facebook Ireland of the data at issue, since it can be concluded (subject to the investigations that it is for the Oberlandesgericht Düsseldorf [German regional court] to carry out) that Fashion ID and Facebook Ireland determine jointly the means and purposes of those operations.”

Responding to the judgement in a statement attributed to its associate general counsel, Jack Gilbert, Facebook told us:

Website plugins are common and important features of the modern Internet. We welcome the clarity that today’s decision brings to both websites and providers of plugins and similar tools. We are carefully reviewing the court’s decision and will work closely with our partners to ensure they can continue to benefit from our social plugins and other business tools in full compliance with the law.

The company said it may make changes to the Like button to ensure websites that use it are able to comply with Europe’s GDPR.

Though it’s not clear what specific changes these could be — for example, whether Facebook will change the code of its social plug-ins to ensure no data is transferred at the point a page loads. (We’ve asked Facebook and will update this report with any response.)

Facebook also points out that other tech giants, such as Twitter and LinkedIn, deploy similar social plug-ins — suggesting the CJEU ruling will apply to other social platforms, as well as to thousands of websites across the EU where these sorts of plug-ins crop up.

“Sites with the button should make sure that they are sufficiently transparent to site visitors, and must make sure that they have a lawful basis for the transfer of the user’s personal data (e.g. if just the user’s IP address and other data stored on the user’s device by Facebook cookies) to Facebook,” Neil Brown, a telecoms, tech and internet lawyer at U.K. law firm Decoded Legal, told TechCrunch.

“If their lawful basis is consent, then they’ll need to get consent before deploying the button for it to be valid — otherwise, they’ll have done the transfer before the visitor has consented.

“If relying on legitimate interests — which might scrape by — then they’ll need to have done a legitimate interests assessment, and kept it on file (against the (admittedly unlikely) day that a regulator asks to see it), and they’ll need to have a mechanism by which a site visitor can object to the transfer.”

“Basically, if organisations are taking on board the recent guidance from the ICO and CNIL on cookie compliance, wrapping in Facebook ‘Like’ and other similar things in with that work would be sensible,” Brown added.

Also commenting on the judgement, Michael Veale, a UK-based researcher in tech and privacy law/policy, said it raises questions about how Facebook will comply with Europe’s data protection framework for any further processing it carries out of the social plug-in data.

“The whole judgement to me leaves open the question ‘on what grounds can Facebook justify further processing of data from their web tracking code?'” he told us. “If they have to provide transparency for this further processing, which would take them out of joint controllership into sole controllership, to whom and when is it provided?

“If they have to demonstrate they would win a legitimate interests test, how will that be affected by the difficulty in delivering that transparency to data subjects?’

“Can Facebook do a backflip and say that for users of their service, their terms of service on their platform justifies the further use of data for which individuals must have separately been made aware of by the website where it was collected?

“The question then quite clearly boils down to non-users, or to users who are effectively non-users to Facebook through effective use of technologies such as Mozilla’s browser tab isolation.”

How far a tracking pixel could be considered a ‘similar device’ to a cookie is another question to consider, he said.

The tracking of non-Facebook users via social plug-ins certainly continues to be a hot-button legal issue for Facebook in Europe — where the company has twice lost in court to Belgium’s privacy watchdog on this issue. (Facebook has continued to appeal.)

Facebook founder Mark Zuckerberg also faced questions about tracking non-users last year, from MEPs in the European Parliament — who pressed him on whether Facebook uses data on non-users for any purposes beyond the security purpose of “keeping bad content out” that he claimed requires Facebook to track everyone on the mainstream Internet.

MEPs also wanted to know how non-users can stop their data being transferred to Facebook. Zuckerberg gave no answer — likely because there’s currently no way for non-users to stop their data being sucked up by Facebook’s servers, short of staying off the mainstream Internet.

David and Goliath: Approaching the ‘deal’

It is a simple question with a complex answer. How does a startup get from zero to execution when negotiating contracts with potential customers that are large enterprises? The 800-pound gorillas. Situations in which your negotiating leverage is limited (often severely so).

As a commercial contracts attorney, I’m often asked by clients about the one right way to approach deals. Many are looking for a cheat sheet of universal terms they should push for in contracts. But there is no one answer.

Deals are not cookie-cutter, and neither are the contracts on which they are built. That said, a basic framework can help provide startups with some grounding to better think about negotiations with large enterprises. The idea is to avoid over-lawyering, and instead approach the discussion with a legally prudent yet deal-centric mindset.

There are generally six overarching considerations as you head into negotiations with large, enterprise organizations.

Lexion raises $4.2M to bring AI to contract management

Contract management isn’t exactly an exciting subject, but it’s a real pain point for many companies. It also lends itself to automation, thanks to recent advances in machine learning and natural language processing. It’s no surprise, then, that we see renewed interest in this space and that investors are putting more money into it. Earlier this week, for example, Icertis raised a $115 million Series E round at a valuation of more than $1 billion. Icertis has been in this business for 10 years, though. On the other end of the spectrum, contract management startup Lexion today announced that it has raised a $4.2 million seed round led by Madrona Venture Group and law firm Wilson Sonsini Goodrich & Rosati, which was also one of the first users of the product.

Lexion was incubated at the Allen Institute for Artificial Intelligence (AI2), one of late Microsoft co-founder Paul Allen’s four scientific research institutes. The company’s co-founder and CEO, Gaurav Oberoi, is a bit of a serial entrepreneur: his first startup, BillMonk, was featured on TechCrunch back in 2006, and his second, Precision Polling, was acquired by SurveyMonkey shortly after it launched. Oberoi founded the company together with former Microsoft research software development engineering lead Emad Elwany and engineering veteran James Baird.

“Gaurav, Emad, and James are just the kind of entrepreneurs we love to back: smart, customer obsessed and attacking a big market with cutting-edge technology,” said Madrona Venture Group managing director Tim Porter. “AI2 is turning out some of the best applied machine learning solutions, and contract management is a perfect example — it’s a huge issue for companies at every size and the demand for visibility into contracts is only increasing as companies face growing regulatory and compliance pressures.”

Contract management is becoming a bit of a crowded space, though, something Oberoi acknowledged. But he argues that Lexion is tackling a different market from many of its competitors.

“We think there’s growing demand and a big opportunity in the mid-market,” he said. “I think similar to how back in the 2000s, Siebel or other companies offered very expensive CRM software and now you have Salesforce — and now Salesforce is the expensive version — and you have this long tail of products in the mid-market. I think the same is happening to contracts. […] We’re working with companies that are as small as post-seed or post-Series A to a publicly traded company.”

Given that it handles plenty of highly confidential information, it’s no surprise that Lexion says it takes security very seriously. “I think [security is] something that all young startups that are selling into business or enterprise in 2019 need to address upfront,” Oberoi said. “We realized, even before we raised funding and got very serious about growing this business, that security has to be part of our DNA and culture from the get-go.” He also noted that every new feature and product iteration at Lexion goes through a security review.

Like most startups at this stage, Lexion plans to invest the new funding into building out its product — and especially its AI engine — and go-to-market and sales strategy.