GDPR enforcement must level up to catch big tech, report warns

A new report by European consumer protection umbrella group Beuc, reflecting on the barriers to effective cross-border enforcement of the EU’s flagship data protection framework, makes awkward reading for regional lawmakers and regulators as they seek to shape the next decade of digital oversight across the bloc.

Beuc’s members filed a series of complaints against Google’s use of location data in November 2018 — but some two years on from raising those privacy concerns, there’s been no resolution of the complaints.

The tech giant continues to make billions in ad revenue, including by processing and monetizing Internet users’ location data. Its lead data protection supervisor under GDPR’s one-stop-shop mechanism for handling cross-border complaints, Ireland’s Data Protection Commission (DPC), did finally open an investigation in February this year.

But it could still be years before Google faces any regulatory action in Europe related to its location tracking.

This is because Ireland’s DPC has yet to issue any cross-border GDPR decisions, some 2.5 years after the regulation started being applied. (Although, as we reported recently, a case related to a Twitter data breach is inching towards a result in the coming days.)

By contrast, France’s data watchdog, the CNIL, was able to complete a GDPR investigation into the transparency of Google’s data processing far more quickly last year.

This summer French courts also confirmed the $57M fine it issued, slapping down Google’s appeal.

But the case predated Google coming under the jurisdiction of the DPC. And Ireland’s data regulator has to deal with a disproportionate number of multinational tech companies, given how many have established their EU base in the country.

The DPC has a major backlog of cross-border cases, with more than 20 GDPR probes involving a number of tech companies including Apple, Facebook/WhatsApp and LinkedIn. (Google has also been under investigation in Ireland over its adtech since 2019.)

This week the EU’s internal market commissioner, Thierry Breton, said regional lawmakers are well aware of enforcement “bottlenecks” in the General Data Protection Regulation (GDPR).

He suggested the Commission has learned lessons from this friction — claiming it will ensure similar concerns don’t affect the future working of a regulatory proposal related to data reuse, which he was publicly introducing.

The Commission wants to create standard conditions for rights-respecting reuse of industrial data across the EU via a new Data Governance Act (DGA), which proposes oversight mechanisms similar to those used for personal data — including national agencies monitoring compliance and a centralized EU steering body (to be called the European Data Innovation Board, mirroring the European Data Protection Board).

The Commission’s ambitious agenda for updating and expanding the EU’s digital rules framework means criticism of GDPR risks taking the shine off the DGA before the ink has dried on the proposal document — putting pressure on lawmakers to find creative ways to unblock GDPR’s enforcement “bottleneck”. (Creative because national agencies are responsible for day-to-day oversight, and Member States are responsible for resourcing DPAs.)

In an initial GDPR review this summer, the Commission praised the regulation as a “modern and horizontal piece of legislation” and a “global reference point” — claiming it’s served as a point of inspiration for California’s CCPA and other emerging digital privacy frameworks around the world.

But it also conceded GDPR enforcement is lacking.

The best answer to this concern “will be a decision from the Irish data protection authority about important cases”, the EU’s justice commissioner, Didier Reynders, said in June.

Five months later European citizens are still waiting.

Beuc’s report — titled The long and winding road: Two years of the GDPR: A cross-border data protection case from a consumer perspective — details the procedural obstacles its member organizations have faced in seeking a decision on the original complaints, which were filed with a variety of DPAs around the EU.

This includes concerns about the Irish DPC making unnecessary “information and admissibility checks”, as well as rejecting complaints brought by an interested organization on the grounds that it lacks a mandate under Irish law, which does not allow for third party redress (yet the Dutch consumer organization had filed the complaint under Dutch law, which does…).

The report also queries why the DPC chose to open an own volition enquiry into Google’s location data activities (rather than a complaint-led enquiry) — which Beuc says risks a further delay to reaching a decision on the complaints themselves.

It further points out that the DPC’s probe of Google only looks at activity since February 2020, not November 2018 when the complaints were filed — meaning there’s a chunk of Google’s location data processing that’s not even being investigated yet.

It notes that three of its member organizations involved in the Google complaints had considered applying for a judicial review of the DPC’s decision (NB: others have resorted to that route) — but they decided not to proceed in part because of the significant legal costs it would have entailed.

The report also points out the inherent imbalance of GDPR’s one-stop-shop mechanism shifting the administration of complaints to the location of companies under investigation — arguing they therefore benefit from “easier access to justice” (vs the ordinary consumer faced with undertaking legal proceedings in a different country and (likely) language).

“If the lead authority is in a country with tradition in ‘common law’, like Ireland, things can become even more complex and costly,” Beuc’s report further notes.

Another issue it raises is the overarching one of rights complainants having to fight what it dubs ‘a moving target’ — given well-resourced tech companies can leverage regulatory delays to (superficially) tweak practices, greasing continued abuse with misleading PR campaigns. (Something Beuc accuses Google of doing.)

DPAs must “adapt their enforcement approach to intervene more rapidly and directly”, it concludes.

“Over two years have passed since the GDPR became applicable; we have now reached a turning point. The GDPR must finally show its strength and become a catalyst for urgently needed changes in business practices,” Beuc goes on in a summary of its recommendations. “Our members’ experience, and that of other civil society organisations, reveals a series of obstacles that significantly hamper the effective application of the GDPR and the correct functioning of its enforcement system.

BEUC recommends to the relevant EU and national authorities to make a comprehensive and joint effort to ensure the swift enforcement of the rules and improve the position of data subjects and their representing organisations, particularly in the framework of cross-border enforcement cases.”

We reached out to the Commission and the Irish DPC with questions about the report. But at the time of writing neither had responded. We’ve also asked Google for comment.

Beuc earlier sent a list of eight recommendations for “efficient” GDPR enforcement to the Commission in May.

Europe sets out the rules of the road for its data reuse plan

European Union lawmakers have laid out a major legislative proposal today to encourage the reuse of industrial data across the Single Market by creating a standardized framework of trusted tools and techniques to ensure what they describe as “secure and privacy-compliant conditions” for sharing data.

Enabling a network of trusted and neutral data intermediaries, and an oversight regime comprised of national monitoring authorities and a pan-EU coordinating body, are core components of the plan.

The move follows the European Commission’s data strategy announcement in February, when it said it wanted to boost data reuse to support a new generation of data-driven services powered by data-hungry artificial intelligence, as well as encouraging the notion of using ‘tech for good’ by enabling “more data and good quality data” to fuel innovation for the common good (like better disease diagnostics) and improve public services.

The wider context is that personal data is already regulated in the bloc (such as under the General Data Protection Regulation, GDPR), which restricts reuse, while commercial considerations can limit how industrial data is shared.

The EU’s executive believes harmonized requirements that set technical and/or legal conditions for data reuse are needed to foster legal certainty and trust — delivered via a framework that promises to maintain rights and protections, and thus get more data usefully flowing.

The Commission sees major business benefits flowing from the proposed data governance regime. “Businesses, both small and large, will benefit from new business opportunities as well as from a reduction in costs for acquiring, integrating and processing data, from lower barriers to enter markets, and from a reduction in time-to-market for novel products and services,” it writes in a press release.

It has further data related proposals incoming in 2021, in addition to a package of digital services legislation it’s due to lay out early next month — as part of a wider reboot of industrial strategy which prioritises digitalization and a green new deal.

All legislative components of the strategy will need to gain the backing of the European Council and parliament so there’s a long road ahead for implementing the plan.

Data Governance Act

EU lawmakers often talk in shorthand about the data strategy being intended to encourage the sharing and reuse of “industrial data” — although the Data Governance Act (DGA) unveiled today has a wider remit.

The Commission envisages the framework enabling the sharing of data that’s subject to data protection legislation — which means personal data, where privacy considerations may (currently) restrain reuse — as well as industrial data subject to intellectual property, or which contains trade secrets or other commercially sensitive information (and is thus not typically shared by its creators, primarily for commercial reasons).

In a press conference on the data governance proposals, internal market commissioner Thierry Breton floated the notion of “data altruism” — saying the Commission wants to provide citizens with an organized way to share their own personal data for a common/public good, such as aiding research into rare diseases or helping cities map mobility for purposes like monitoring urban air quality.

“Through personal data spaces, which are novel personal information management tools and services, Europeans will gain more control over their data and decide on a detailed level who will get access to their data and for what purpose,” the Commission writes in a Q&A on the proposal.

It’s planning a public register where entities will be able to register as a “data altruism organisation” — provided they have a not-for-profit character; meet transparency requirements; and implement certain safeguards to “protect the rights and interests of citizens and companies” — with the aim of providing “maximum trust with minimum administrative burden”, as it puts it.

The DGA envisages different tools, techniques and requirements governing how public sector bodies share data vs private companies.

For public sector bodies there may be technical requirements (such as encryption or anonymization) attached to the data itself or further processing limitations (such as requiring it to take place in “dedicated infrastructures operated and supervised by the public sector”), as well as legally binding confidentiality agreements that must be signed by the reuser.

“Whenever data is being transferred to a reuser, mechanisms will be in place that ensure compliance with the GDPR and preserve the commercial confidentiality of the data,” the Commission’s PR says.

To encourage businesses to get on board with pooling their own data-sets — for the promise of a collective economic upside via access to bigger volumes of pooled data — the plan is for regulated data intermediaries/marketplaces to provide “neutral” data-sharing services, acting as the “trusted” go-between/repository so data can flow between businesses.

“To ensure this neutrality, the data-sharing intermediary cannot exchange the data for its own interest (e.g. by selling it to another company or using it to develop their own product based on this data) and will have to comply with strict requirements to ensure this neutrality,” the Commission writes on this.

Under the plan, intermediaries’ compliance with data handling requirements would be monitored by public authorities at a national level.

But the Commission is also proposing the creation of a new pan-EU body, called the European Data Innovation Board, that would try to knit together best practice across Member States — in what looks like a mirror of the steering/coordinating role undertaken by the European Data Protection Board (which links up the EU’s patchwork of data protection supervisory authorities).

“These data brokers or intermediaries that will provide for data sharing will do that in a way that your rights are protected and that you have choices,” said EVP Margrethe Vestager, who heads up the bloc’s digital strategy, also speaking at today’s press conference.

“So that you can also have personal data spaces where your data is managed. Because, initially, when you ask people they say well actually we do want to share but we don’t really know how to do it. And this is not only the technicalities — it’s also the legal certainty that’s missing. And this proposal will provide that,” she added.

Data localization requirements — or not?

The commissioners faced a number of questions over the hot button issue of international data transfers.

Breton was asked whether the DGA will include any data localization requirements. He responded by saying — essentially — that the rules will bake in a series of conditions which, depending on the data itself and the intended destination, may mean that storing and processing the data in the EU is the only viable option.

“On data localization — what we do is to set a GDPR-type of approach, through adequacy decisions and standard contractual clauses for only sensitive data through a cascading of conditions to allow the international transfer under conditions and in full respect of the protected nature of the data. That’s really the philosophy behind it,” Breton said. “And of course for highly sensitive data [such as] in the public health domain it is necessary to be able to set further conditions, depending on the sensitivity, otherwise… Member States will not share them.”

“For instance it could be possible to limit the reuse of this data into public secure infrastructures so that companies will come to use the data but not keep them. It could be also about restricting the number of access in third countries, restricting the possibility to further transfer the data and if necessary also prohibiting the transfer to a third country,” he went on, adding that such conditions would be “in full respect” of the EU’s WTO obligations.

In a section of its Q&A that deals with data localization requirements, the Commission similarly dances around the question, writing: “There is no obligation to store and process data in the EU. Nobody will be prohibited from dealing with the partner of their choice. At the same time, the EU must ensure that any access to EU citizens’ personal data and certain sensitive data is in compliance with its values and legislative framework.”

At the presser, Breton also noted that companies that want to gain access to EU data that’s been made available for reuse will need to have legal representation in the region. “This is important of course to ensure the enforceability of the rules we are setting,” he said. “It is very important for us — maybe not for other continents but for us — to be fully compliant.”

The commissioners also faced questions about how the planned data reuse rules would be enforced — given ongoing criticism over the lack of uniformly vigorous enforcement of Europe’s data protection framework, GDPR.

“No rule is any good if not enforced,” agreed Vestager. “What we are suggesting here is that if you have a data sharing service provider and they have notified themselves it’s then up to the authority with whom they have notified actually to monitor and to supervise the compliance with the different things that they have to live up to in order to preserve the protection of these legitimate interests — could be business confidentiality, could be intellectual property rights.

“This is a thing that we will keep on working on also in the future proposals that are upcoming — the Digital Services Act and the Digital Markets Act — but here you have sort of a precursor that the ones who receive the notification in Member States they will also have to supervise that things are actually in order.”

Also responding on the enforcement point, Breton suggested enforcement would be baked in up front, such as by careful control of who could become a data reuse broker.

“[Firstly] we are putting forward common rules and harmonized rules… We are creating a large internal market for data. The second thing is that we are asking Member States to create specific authorities to monitor. The third thing is that we will ensure coherence and enforcement through the European Data Innovation Board,” he said. “Just to give you an example… enforcement is embedded. To be a data broker you will need to fulfil a certain number of obligations and if you fulfil these obligations you can be a neutral data broker — if you don’t

Alongside the DGA, the Commission also announced an Intellectual Property Action Plan.

Vestager said this aims to build on the EU’s existing IP framework with a number of supportive actions — including financial support for SMEs involved in the Horizon Europe R&D program to file patents.

The Commission is also considering whether to reform the framework for filing standards-essential patents. But in the short term, Vestager said, it would aim to encourage industry to engage in forums aimed at reducing litigation.

“One example could be that the Commission could set up an independent system of third party essentiality checks in view of improving legal certainty and reducing litigation costs,” she added of the potential reform, noting that protecting IP is an important component of the bloc’s industrial strategy.

Australia’s spy agencies caught collecting COVID-19 app data

Australia’s intelligence agencies have been caught “incidentally” collecting data from the country’s COVIDSafe contact tracing app during the first six months of its launch, a government watchdog has found.

The report, published Monday by the Australian government’s inspector general for the intelligence community, which oversees the government’s spy and eavesdropping agencies, said the app data was scooped up “in the course of the lawful collection of other data.”

But the watchdog said that there was “no evidence” that any agency “decrypted, accessed or used any COVID app data.”

Incidental collection is a common term used by spies to describe data that was not deliberately targeted but collected as part of a wider collection effort. This kind of collection isn’t accidental, but rather a consequence of, for example, spy agencies tapping into fiber optic cables, which carry an enormous firehose of data. An Australian government spokesperson told one outlet, which first reported the news, that incidental collection can also happen as a result of the “execution of warrants.”

The report did not say when the incidental collection stopped, but noted that the agencies were “taking active steps to ensure compliance” with the law, and that the data would be “deleted as soon as practicable,” without setting a firm date.

For some, the fear that a government spy agency could access COVID-19 contact tracing data was the worst possible outcome.

Since the start of the COVID-19 pandemic, countries — and states in places like the U.S. — have rushed to build contact tracing apps to help prevent the spread of the virus. But these apps vary wildly in terms of functionality and privacy.

Most have adopted the more privacy-friendly approach of using Bluetooth to trace contact with people who have the virus. Many have chosen to implement the Apple-Google system, which hundreds of academics have backed. But others, like Israel and Pakistan, are using more privacy-invasive techniques, like tracking location data, which governments can also use to monitor a person’s whereabouts. In Israel’s case, the tracking was so controversial that the courts shut it down.

Australia’s intelligence watchdog did not say specifically what data was collected by the spy agencies. The app uses Bluetooth and not location data, but the app requires the user to upload some personal information — like their name, age, postal code, and phone number — to allow the government’s health department to contact those who may have come into contact with an infected person.

Australia has seen more than 27,800 confirmed coronavirus cases and over 900 deaths since the start of the pandemic.

Brexit’s data compliance burden could cost UK firms up to £1.6BN, says think tank

An analysis of the total cost to UK businesses if the country fails to gain an adequacy agreement from the European Commission once it leaves the bloc at the end of the year — creating barriers to inbound data flows from the EU — suggests the price in pure compliance terms could be between £1BN and £1.6BN.

The assessment of the economic impacts if the UK is deemed a third country under EU data rules has been carried out by the New Economics Foundation (NEF) think tank and UCL’s European Institute research hub — with the researchers conducting interviews with over 60 legal professionals, data protection officers, business representatives, and academics from the UK and EU.

They estimate that the average compliance cost for an affected micro business will be £3,000; £10,000 for a small business; £19,555 for a medium business; and £162,790 for a large business.

“This extra cost stems from the additional compliance obligations – such as setting up standard contractual clauses (SCCs) – on companies that want to continue transferring data from the EU to the UK,” they write in the report. “We believe our modelling is a relatively conservative estimate as it is underpinned by moderate assumptions about the firm-level cost and number of companies affected.”

An adequacy agreement refers to a status that can be conferred on a country outside the European Economic Area (as the UK will be once the Brexit transition is over) — if the EU’s executive deems the levels of data protection in the country are essentially equivalent to what’s provided by European law.

The UK has said it wants to gain an adequacy agreement with the EU as it works on implementing the 2016 referendum vote to leave the bloc. But there are doubts over its chances of obtaining the coveted status — not least because of surveillance powers enshrined in UK law since the 2013 Snowden disclosures (which revealed the extent of Western governments’ snooping on digital data flows).

Broad powers that sanction UK state agencies’ digital surveillance have faced a number of legal challenges under UK and EU law.

The government has also signalled an intention to ‘liberalize’ domestic data laws as it leaves the EU — writing in a national data strategy published in September that it wants to ensure data is not “inappropriately constrained” by regulations “so that it can be used to its full potential”.

But any moves to denude the UK’s data protection standards risk an ‘inadequate’ finding by the Commission.

Europe’s top court, meanwhile, has set a clear line that governments cannot use national security to bypass general principles of EU law, such as proportionality and respect for privacy.

Another major — and highly pertinent — ruling by the CJEU this summer invalidated an adequacy status the Commission had previously conferred on the US, striking down the EU-US Privacy Shield transatlantic data transfer mechanism. It does not bode well for the UK’s chances of adequacy.

The court also made it clear that the most used alternative for international transfers (a legal tool called Standard Contractual Clauses, aka SCCs) must face proactive scrutiny from EU regulators when data is flowing to third countries where citizens’ information could be at risk.

The thousands of companies that had been relying on Privacy Shield to rubberstamp their EU to US data flows are now scrambling for alternatives on a case by case basis — with vastly inflated legal risk, complexity and administration requirements.

The same may be true in very short order for scores of UK-based data controllers that want to continue being able to receive inbound data flows from users in the EU after the end of the Brexit transition.

Earlier this month the European Data Protection Board (EDPB) put out 38 pages of guidance for those trying to navigate new legal uncertainty around SCCs — in which it warned there may be situations where no supplementary measures will suffice to ensure adequate protection for a specific transfer.

The solution in such a case might require relocation of the data processing to a site within the EU, the EDPB said.

“Although the UK has high standards of data protection via the Data Protection Act 2018, which enacted the General Data Protection Regulation (GDPR) in UK law, an EU adequacy decision is not guaranteed,” the NEF/UCL report warns. “Potential EU concerns with UK national security, surveillance and human rights frameworks, as well as a future trade deal with the US, render adequacy uncertain. Furthermore, EU-UK data flows are at the whim of the wider Brexit process and negotiations.”

Per their analysis, if the UK does not get an adequacy decision, businesses will face an increased risk of GDPR fines as compliance requirements grow.

The General Data Protection Regulation sanctions financial penalties for violations of the framework that can scale up to 4% of an entity’s global annual turnover or €20M, whichever is greater.
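That “whichever is greater” cap is a simple maximum of two terms. A minimal sketch of the arithmetic (the function name and euro-denominated turnover input are illustrative, not anything from the report):

```python
def gdpr_max_fine(global_annual_turnover_eur: float) -> float:
    """Upper bound on a GDPR administrative fine for the most serious
    violations: 4% of global annual turnover or EUR 20 million,
    whichever is greater."""
    return max(0.04 * global_annual_turnover_eur, 20_000_000)

# For a firm with EUR 1 billion in global annual turnover,
# the 4% term dominates, so the cap is EUR 40 million:
print(gdpr_max_fine(1_000_000_000))  # 40000000.0

# For a smaller firm with EUR 100 million turnover, 4% would be
# only EUR 4 million, so the EUR 20 million floor applies instead:
print(gdpr_max_fine(100_000_000))  # 20000000
```

The flat €20M floor is what gives the regulation teeth against smaller entities, while the percentage term is what scales the exposure for the multinationals discussed above.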

The report also predicts a reduction in EU-UK trade, especially digital trade; reduced investment (both domestic and international); and the relocation of business functions, infrastructure, and personnel outside the UK.

The researchers argue that more research is needed to support a wider macroeconomic assessment of the value of data flows and adequacy decisions — saying there’s a paucity of research on “the value of data flows and adequacy decisions in general” — before adding: “EU-UK data flows are a crucial enabler for thousands of businesses. These flows underpin core business operations and activities which add significant value. This is not just a digital tech sector issue – the whole economy relies on data flows.”

The report makes a number of recommendations — including urging the UK government to make “relevant data and modelling tools” available to support empirical research on the social and economic impacts of data protection, digital trade, and the value of data flows to help shape better public policy and debate.

It also calls for the government to set aside funds for struggling UK SMEs to help them with the costs of complying with Brexit’s legal data burden.

“Our report concludes that no adequacy decision has the potential to be a contributing factor which undermines the competitiveness of key UK services and digital technology sectors, which have performed extremely strongly in recent years. Although we do not want to exaggerate the impacts — and no adequacy decision is far from economic armageddon — this outcome would not be ideal,” they add.

You can read the full report here.

Digital marketing firms file UK competition complaint against Google’s Privacy Sandbox

Google’s push to phase out third party tracking cookies — aka its ‘Privacy Sandbox’ initiative — is facing a competition challenge in Europe. A coalition of digital marketing companies announced today that it’s filed a complaint with the UK’s Competition and Markets Authority (CMA), calling for the regulator to block implementation of the Sandbox.

The coalition wants Google’s phasing out of third party tracking cookies to be put on ice — preventing the Sandbox from launching in early 2021 — to give regulators time to devise what it dubs “long term competitive remedies to mitigate [Google’s dominance]”.

“[Our] letter is asking for the introduction of Privacy Sandbox to be delayed until such measures are put in place,” they write in a press release.

The group, which is badging itself as Marketers for an Open Web (MOW), says it’s comprised of “businesses in the online ecosystem who share a concern that Google is threatening the open web model that is vital to the functioning of a free and competitive media and online economy”.

A link on MOW’s website to a list of “members” was not functioning at the time of writing. But, per Companies House, the entity was incorporated on September 18, 2020 — listing James Roswell, CEO and co-founder of UK mobile marketing company, 51 Degrees, as its sole director.

The CMA confirmed to us that it’s received MOW’s complaint, adding that some of the coalition’s concerns reflect issues identified in a detailed review of the online ad market it published this summer.

However, it has not yet taken a decision on whether or not to investigate.

“We can confirm we have received a complaint regarding Google raising certain concerns, some of which relate to those we identified in our online platforms and digital advertising market study,” said the CMA spokesperson. “We take the matters raised in the complaint very seriously, and will assess them carefully with a view to deciding whether to open a formal investigation under the Competition Act.

“If the urgency of the concerns requires us to intervene swiftly, we will also assess whether to impose interim measures to order the suspension of any suspected anti-competitive conduct pending the outcome of a full investigation.”

In its final report on the online ad market, the CMA concluded that the market power of Google and Facebook is now so great that a new regulatory approach — and a dedicated oversight body — is needed to address what it summarized as “wide ranging and self reinforcing” concerns.

The regulator chose not to take any enforcement action at that point, however — preferring to wait for the UK government to come forward with pro-competition legislation.

In its statement today, the CMA makes it clear it could still choose to act on related competition concerns if it feels an imperative to do so — including potentially blocking the launch of Privacy Sandbox to allow time for a full investigation — while it waits for legislators to come up with a regulatory framework. Though, again, it has not made any decisions yet.

Reached for a response to the MOW complaint, Google sent us this statement — attributed to a spokesperson:

The ad-supported web is at risk if digital advertising practices don’t evolve to reflect people’s changing expectations around how data is collected and used. That’s why Google introduced the Privacy Sandbox, an open initiative built in collaboration with the industry, to provide strong privacy for users while also supporting publishers.

Also commenting in a statement, MOW’s director Roswell said: “The concept of the open web is based on a decentralised, standards-based environment that is not under the control of any single commercial organisation. This model is vital to the health of a free and independent media, to a competitive digital business environment and to the freedom and choice of all web users. Privacy Sandbox creates new, Google-owned standards and is an irreversible step towards a Google-owned ‘walled garden’ web where they control how businesses and users interact online.”

The group’s complaint follows a similar one filed in France last month (via Reuters) — albeit, in that case, targeting privacy changes incoming to Apple’s smartphone platform that are also set to limit advertisers’ access to an iPhone-specific tracking ID that’s generated for that purpose (the IDFA).

Apple has said the incoming changes — which it recently delayed until early next year — will give users “greater control over whether or not they want to allow apps to track them by linking their information with data from third parties for the purpose of advertising, or sharing their information with data brokers”. But four online ad associations — IAB France, MMAF, SRI and UDECAM — bringing the complaint to France’s competition regulator argue Apple is abusing its market power to distort competition.

The move by the online ad industry to get European competition regulators to delay Apple’s and Google’s privacy squeeze on third party ad tracking is taking place at the same time as industry players band together to try to accelerate development of their own replacement for tracking cookies — announcing a joint effort called PRAM (Partnership for Responsible Addressable Media) this summer to “advance and protect critical functionalities like customization and analytics for digital media and advertising, while safeguarding privacy and improving consumer experience”, as they put it.

The adtech industry now appears to be coalescing behind a cookie replacement proposal called Unified ID 2.0 (UID2).

A document detailing the proposal which had been posted to the public Internet — but was taken down after a privacy researcher drew attention to it — suggests they want to put in place a centralized system for tracking Internet users that’s based on personal data such as an email address or phone number.

“UID2 is based on authenticated PII (e.g. email, phone) that can be created and managed by constituents across advertising ecosystem, including Advertisers, Publishers, DSPs, SSPs,” runs a short outline of the proposal in the paper, which is authored by two people from a Demand Side Platform called The Trade Desk that’s proposing to build the tech but then hand it off to an “independent and non-partial entity” to manage.

One component of the UID2 proposal is a “Unified ID Service” that the paper says would apply a salt-and-hash process to the PII to generate a UID2, encrypt that value to create a UID2 Token, and provision login requests from publishers to access the token.

The other component is a user-facing website, described as a “transparency & consent service”, to handle user requests for data, UID2 logouts and so on.
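The salt-and-hash mechanics outlined in the proposal can be sketched in a few lines. To be clear, this is an illustrative reconstruction from the public outline, not The Trade Desk’s actual implementation: the salt value, key handling and function names here are all hypothetical, and HMAC stands in for whatever encryption the service would really use.

```python
import hashlib
import hmac
import secrets

# Hypothetical sketch of the UID2 flow described in the proposal:
# normalise the PII, salt and hash it to derive a stable identifier,
# then wrap that identifier in an opaque token handed to publishers.

SALT = b"system-level-salt"           # in the proposal, managed centrally
TOKEN_KEY = secrets.token_bytes(32)   # key for producing the opaque token

def derive_uid2(email: str) -> str:
    """Salt and hash normalised PII (here, an email) into the raw UID2."""
    normalised = email.strip().lower().encode("utf-8")
    return hashlib.sha256(SALT + normalised).hexdigest()

def make_uid2_token(uid2: str) -> str:
    """Produce the UID2 Token (HMAC stands in for encryption in this sketch)."""
    return hmac.new(TOKEN_KEY, uid2.encode("utf-8"), hashlib.sha256).hexdigest()

uid2 = derive_uid2("Jane.Doe@example.com")
token = make_uid2_token(uid2)
```

The sketch makes the privacy concern concrete: anyone holding the same salt can regenerate the same UID2 from the same email address, which is why who controls the PII-to-UID2 mapping matters so much.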

However the proposal by the online ad industry to centralize Internet users’ identity by attaching it to hashed pieces of actual personal data — and with a self-regulating “Trusted Ads Ecosystem” slated to be controlling the mapping of PII to UID2 — seems unlikely to assuage the self-same privacy concerns fuelling the demise of tracking cookies in the first place (to put it mildly).

Trusting the mass surveillance industry to self regulate a centralized ID system for Internet users is for the birds.

But adtech players are clearly hoping they can buy themselves enough time to cobble together a self-serving cookie alternative — and sell it to regulators as a competition remedy. (Their parallel bet is they can buy off inactive privacy regulators with dubious claims of ‘transparency and consent’.)

So it will certainly be interesting to see whether the adtech industry succeeds in forcing competition regulators to stand in the way of platform-level privacy reforms, while pulling off a major reorg and rebranding exercise of privacy-hostile tracking operations.

In a counter move this month, European privacy campaign group, noyb, filed two complaints against Apple for not obtaining consent from users to create and store the IDFA on their devices.

So that’s one bit of strategic pushback.

Real-time bidding, meanwhile, remains under regulatory scrutiny in Europe — with huge questions over the lawfulness of its processing of Internet users’ personal data. Privacy campaigners are also now challenging data protection regulators over their failure to act on those long-standing complaints.

A flagship online ad industry tool for gathering web users’ consent to tracking is also under attack and looks to be facing imminent action under the bloc’s General Data Protection Regulation (GDPR).

Last month an investigation by Belgium’s data protection agency found the IAB Europe’s so-called Transparency and Consent Framework (TCF) didn’t offer either — failing to meet the GDPR standard for transparency, fairness and accountability, and the lawfulness of data processing. Enforcement action is expected in early 2021.

A bug meant Twitter Fleets could still be seen after they disappear

Twitter is the latest social media site to let users experiment with posting disappearing content. Fleets, as Twitter calls them, allow its mobile users to post short stories, such as photos or videos with overlaid text, that are set to vanish after 24 hours.

But a bug meant that fleets weren’t deleting properly and could still be accessed long after 24 hours had expired. Details of the bug were posted in a series of tweets on Saturday, less than a week after the feature launched.

The bug effectively allowed anyone to access and download a user’s fleets without triggering a notification that the user’s fleet had been read and by whom. The implication is that this bug could be abused to archive a user’s fleets after they expire.

The bug was found using an app designed to interact with Twitter’s back-end systems via its developer API. The server returned a list of fleets, each with its own direct URL, which when opened in a browser would load the fleet as an image or a video. But even after the 24 hours elapsed, the server would still return links to fleets that had already disappeared from view in the Twitter app.

When reached, a Twitter spokesperson said a fix was on the way. “We’re aware of a bug accessible through a technical workaround where some Fleets media URLs may be accessible after 24 hours. We are working on a fix that should be rolled out shortly.”

While Twitter said the fix means fleets should now expire properly, it acknowledged it won’t delete a fleet from its servers for up to 30 days, and that it may hold onto fleets for longer if they violate its rules. We checked that we could still load fleets from their direct URLs even after they expired.

Fleet with caution.

Apple’s IDFA gets targeted in strategic EU privacy complaints

A unique device identifier that Apple assigns to each iPhone for third parties to track users for ad targeting — aka the IDFA (Identifier for Advertisers) — is itself now the target of two new complaints filed by European privacy campaign not-for-profit, noyb.

The complaints, lodged with German and Spanish data protection authorities, contend that Apple’s setting of the IDFA breaches regional privacy laws on digital tracking because iOS users are not asked for their consent for the initial storage of the identifier.

noyb is also objecting to third parties being able to access the IDFA without prior consent, with one of its complainants writing that they were never asked for consent for third-party access yet found several apps had shared their IDFA with Facebook (per their off-Facebook activity page).

We’ve reached out to the data protection agencies in question for comment.

While Apple isn’t the typical target for digital privacy campaigners, given it makes most of its money selling hardware and software instead of profiling users for ad targeting, as adtech giants like Facebook and Google do, its marketing rhetoric around taking special care over user privacy can look awkward when set against the existence of an Identifier for Advertisers baked into its hardware.

In the European Union there’s a specific legal dimension to this awkwardness — as existing laws require explicit consent from users to (non-essential) tracking. noyb’s complaints cite Article 5(3) of the EU’s ePrivacy Directive which mandates that users must be asked for consent to the storage of ad tracking technologies such as cookies. (And noyb argues the IDFA is just like a tracking cookie but for iPhones.)

Europe’s top court further strengthened the requirement last year when it made it clear that consent for non-essential tracking must be obtained prior to storing or accessing the trackers. The CJEU also ruled that such consent cannot be implied or assumed — such as by the use of pre-checked ‘consent’ boxes.

In a press release about the complaints, noyb’s Stefano Rossetti, a privacy lawyer, writes: “EU law protects our devices from external tracking. Tracking is only allowed if users explicitly consent to it. This very simple rule applies regardless of the tracking technology used. While Apple introduced functions in their browser to block cookies, it places similar codes in its phones, without any consent by the user. This is a clear breach of EU privacy laws.”

Apple has long controlled how third parties serving apps on its iOS platform can use the IDFA, wielding the stick of ejection from its App Store to drive their compliance with its rules.

Recently, though, it has gone further — telling advertisers this summer they will soon have to offer users an opt-out from ad tracking in a move billed as increasing privacy controls for iOS users — although Apple delayed implementation of the policy until early next year after facing anger from advertisers over the plan. But the idea is there will be a toggle in iOS 14 that users need to flip on before a third party app gets to access the IDFA to track iPhone users’ in-app activity for ad targeting.

However noyb’s complaint focuses on Apple’s setting of the IDFA in the first place, arguing that since the pseudonymised identifier constitutes personal data under EU law, Apple needs to get permission before creating and storing it on a user’s device.

“The IDFA is like a ‘digital license plate’. Every action of the user can be linked to the ‘license plate’ and used to build a rich profile about the user. Such profile can later be used to target personalised advertisements, in-app purchases, promotions etc. When compared to traditional internet tracking IDs, the IDFA is simply a ‘tracking ID in a mobile phone’ instead of a tracking ID in a browser cookie,” noyb writes in one complaint, noting that Apple’s privacy policy does not specify the legal basis it uses to “place and process” the IDFA.

noyb also argues that Apple’s planned changes to how the IDFA gets accessed — trailed as incoming in early 2021 — don’t go far enough.

“These changes seem to restrict the use of the IDFA for third parties (but not for Apple itself),” it writes. “Just like when an app requests access to the camera or microphone, the plans foresee a new dialog that asks the user if an app should be able to access the IDFA. However, the initial storage of the IDFA and Apple’s use of it will still be done without the users’ consent and therefore in breach of EU law. It is unclear when and if these changes will be implemented by the company.”

We reached out to Apple for comment on noyb’s complaints but at the time of writing an Apple spokesman said it did not have an on-the-record statement. The spokesman did tell us that Apple itself does not use unique customer identifiers for advertising.

In a separate but related recent development, last month publishers and advertisers in France filed an antitrust complaint against the iPhone maker over its plan to require opt-in consent for accessing the IDFA — with the coalition contending the move amounts to an abuse of market power.

Apple responded to the antitrust complaint in a statement that said: “With iOS 14, we’re giving users the choice whether or not they want to allow apps to track them by linking their information with data from third parties for the purpose of advertising, or sharing their information with data brokers.”

“We believe privacy is a fundamental human right and support the European Union’s leadership in protecting privacy with strong laws such as the GDPR (General Data Protection Regulation),” Apple added then.

That antitrust complaint may explain why noyb has decided to file its own strategic complaints against Apple’s IDFA. Simply put, if no tracker ID can be created — because an iOS user refuses to give consent — there’s less surface area for advertisers to try to litigate against privacy by claiming tracking is a competitive right.

“We believe that Apple violated the law before, now and after these changes,” said Rossetti in another statement. “With our complaints we want to enforce a simple principle: trackers are illegal, unless a user freely consents. The IDFA should not only be restricted, but permanently deleted. Smartphones are the most intimate device for most people and they must be tracker-free by default.”

Another interesting component of the noyb complaints is they’re being filed under the ePrivacy Directive, rather than under Europe’s (newer) General Data Protection Regulation. This means noyb is able to target them to specific EU data protection agencies, rather than having complaints funnelled back to Ireland’s DPC — under the GDPR’s one-stop-shop mechanism for handling cross-border cases.

Its hope is this route will result in swifter regulatory action. “These cases are based on the ‘old’ cookie law and do not trigger the cooperation mechanism of the GDPR. In other words, we are trying to avoid endless procedures like the ones we are facing in Ireland,” added Rossetti.

Apple responds to Gatekeeper issue with upcoming fixes

Apple has updated a documentation page detailing the company’s next steps to prevent last week’s Gatekeeper bug from happening again, as Rene Ritchie spotted. The company plans to implement the fixes over the next year.

Apple had a difficult launch day last week: after releasing macOS Big Sur, a major update for macOS, the company suffered from server-side issues.

Third-party apps failed to launch as your Mac couldn't check the developer certificate of the app. That feature, called Gatekeeper, makes sure that you didn't download a malware app that disguises itself as a legit app. If the certificate doesn’t match, macOS prevents the app launch.

Many have been concerned about the privacy implications of the security feature. Does Apple log every app you launch on your Mac to gain competitive insights on app usage?

It turns out it's easy to answer that question as the server doesn't mandate encryption. Jacopo Jannone intercepted an unencrypted network request and found out that Apple is not secretly spying on you. Gatekeeper really does what it says it does.

“We have never combined data from these checks with information about Apple users or their devices. We do not use data from these checks to learn what individual users are launching or running on their devices,” the company wrote.

But Apple is going a step further by detailing its next steps. The company stopped logging IP addresses on its servers last week, as it doesn’t need to store this data for Gatekeeper to work.

“These security checks have never included the user’s Apple ID or the identity of their device. To further protect privacy, we have stopped logging IP addresses associated with Developer ID certificate checks, and we will ensure that any collected IP addresses are removed from logs,” Apple writes.

Finally, Apple is overhauling the design of the network request and adding a user-facing opt-out option.

“In addition, over the next year we will introduce several changes to our security checks:

  • A new encrypted protocol for Developer ID certificate revocation checks
  • Strong protections against server failure
  • A new preference for users to opt out of these security protections”

Gifting a gadget? Check its creep factor on Mozilla’s ‘Privacy not included’ list of shame

Buying someone a gadget is a time-honored tradition, but these days it can be particularly fraught, considering you may buy them a fitness tracker that also monitors emotions, or a doorbell that snitches to the cops. Mozilla has put together a helpful list of popular gadgets with ratings on just how creepy they are.

“Privacy not included” has become an annual tradition for the internet rights advocate, and this year has an especially solid crop of creepy devices, given the uptick in smart speakers, smart security cameras and smart litterboxes.

On the “creepy” end of the spectrum is… pretty much everything by Amazon except the Kindle. The devices in question send tons of data to Amazon by design, of course, but Mozilla feels the company hasn’t yet earned the trust to make that sort of thing acceptable. Facebook’s Portal earns a creepy spot for a similar reason.

Image Credits: Mozilla

Some random gadgets like a smart coffee maker and Moleskine smart notebook get creepy ratings because they don’t give the kinds of assurances about data and security that any company collecting that information should give. That sort of thing is common in smart gadgets — they may not be fundamentally creepy, but the company that makes them reserves the right to make it creepy at any time.

On the other end of the spectrum, Withings earns points for its smart devices with reasonable privacy policies and security. Non-Ring smart doorbells get good marks, and Garmin’s smart watches too.

These are informal rankings based on the potential for abuse or exposure of your data, so a good rating doesn’t mean a device is perfectly safe or private. If you’re buying one of these things, it’s best to immediately go through the settings and preferences and disable anything that smells invasive or creepy. You can always enable features again, but once you’ve put your data out there, it’s hard to get it back.

Check out the rest of the list here.

Europe puts out advice on fixing international data transfers that’s cold comfort for Facebook

Following the landmark CJEU ‘Schrems II’ ruling in July, which invalidated the four-year-old EU-US Privacy Shield, European data protection regulators have today published 38 pages of guidance for businesses stuck trying to navigate the uncertainty around how to (legally) transfer personal data out of the European Union.

The European Data Protection Board’s (EDPB) recommendations focus on measures data controllers might be able to put in place to supplement the use of another transfer mechanism, so-called Standard Contractual Clauses (SCCs), to ensure they comply with the bloc’s General Data Protection Regulation (GDPR).

Unlike Privacy Shield, SCCs were not struck down by the court but their use remains clouded with legal uncertainty. The court made it clear SCCs can only be relied upon for international transfers if the safety of EU citizens’ data can be guaranteed. It also said EU regulators have a duty to intervene when they suspect data is flowing to a location where it will not be safe — meaning options for data transfers out of the EU have both reduced in number and increased in complexity.

One company that’s said it’s waiting for the EDPB guidance is Facebook. It’s already faced a preliminary order to stop transferring EU users’ data to the US. It petitioned the Irish courts to obtain a stay as it seeks a judicial review of its data protection regulator’s process. It has also brought out its lobbying big guns — former UK deputy PM and ex-MEP Nick Clegg — to try to pressure EU lawmakers over the issue.

Most likely the tech giant is hoping for a ‘Privacy Shield 2.0‘ to be cobbled together and slapped into place to paper over the gap between EU fundamental rights and US surveillance law.

But the Commission has warned there won’t be a quick fix this time.

Changes to US surveillance law are slated as necessary — which means zero chance of anything happening before the Biden administration takes the reins next year. So the legal uncertainty around EU-US transfers is set to stretch well into next year at a minimum. (Politico suggests a new data deal isn’t likely in the first half of 2021.)

In the meanwhile, legal challenges to ongoing EU-US transfers are stacking up — at the same time as EU regulators know they have a legal duty to intervene when data is at risk.

“Standard contractual clauses and other transfer tools mentioned under Article 46 GDPR do not operate in a vacuum,” the EDPB warns in an executive summary. “The Court states that controllers or processors, acting as exporters, are responsible for verifying, on a case-by-case basis and, where appropriate, in collaboration with the importer in the third country, if the law or practice of the third country impinges on the effectiveness of the appropriate safeguards contained in the Article 46 GDPR transfer tools.

“In those cases, the Court still leaves open the possibility for exporters to implement supplementary measures that fill these gaps in the protection and bring it up to the level required by EU law. The Court does not specify which measures these could be. However, the Court underlines that exporters will need to identify them on a case-by-case basis. This is in line with the principle of accountability of Article 5.2 GDPR, which requires controllers to be responsible for, and be able to demonstrate compliance with the GDPR principles relating to processing of personal data.”

The EDPB’s recommendations set out a series of steps for data exporters to take as they go through the complex task of determining whether their particular transfer can play nice with EU data protection law.

Six steps but no one-size-fits-all fix

The basic overview of the process it’s advising:

  • Step 1: Map all intended international transfers.
  • Step 2: Verify the transfer tools you want to use.
  • Step 3: Assess whether there’s anything in the law or practice of the destination third country which “may impinge on the effectiveness of the appropriate safeguards of the transfer tools you are relying on, in the context of your specific transfer”, as it puts it.
  • Step 4: Identify and adopt supplementary measures to bring the level of protection up to ‘essential equivalence’ with EU law.
  • Step 5: Take any formal procedural steps required to adopt the supplementary measures.
  • Step 6: Periodically re-evaluate the level of data protection and monitor any relevant developments.

In short, this is going to involve both a lot of work — and ongoing work. tl;dr: Your duty to watch over the safety of European users’ data is never done.

Moreover, the EDPB makes it clear that there very well may not be any supplementary measures to cover a particular transfer in legal glory.

“You may ultimately find that no supplementary measure can ensure an essentially equivalent level of protection for your specific transfer,” it warns. “In those cases where no supplementary measure is suitable, you must avoid, suspend or terminate the transfer to avoid compromising the level of protection of the personal data. You should also conduct this assessment of supplementary measures with due diligence and document it.”

In instances where supplementary measures could suffice the EDPB says they may have “a contractual, technical or organisational nature” — or, indeed, a combination of some or all of those.

“Combining diverse measures in a way that they support and build on each other may enhance the level of protection and may therefore contribute to reaching EU standards,” it suggests.

However it also goes on to state fairly plainly that technical measures are likely to be the most robust tool against the threat posed by foreign surveillance. But that in turn means there are necessarily limits on the business models that can tap in — anyone wanting to decrypt and process data for themselves in the US, for instance, (hi Facebook!) isn’t going to find much comfort here.

The guidance goes on to include some sample scenarios where it suggests supplementary measures might suffice to render an international transfer legal.

Such as:

  • Data storage in a third country where there’s no access to decrypted data at the destination and the keys are held by the data exporter (or by a trusted entity in the EEA, or in a third country considered to have an adequate level of protection for data).
  • The transfer of pseudonymised data, so individuals can no longer be identified (which means ensuring the data cannot be re-identified).
  • End-to-end encrypted data transiting third countries via encrypted transfer (again, the data must not be able to be decrypted in a jurisdiction that lacks adequate protection; the EDPB also specifies that the existence of any ‘backdoors’ in hardware or software must have been ruled out, although it’s not clear how that could be done).
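The pseudonymisation scenario can be roughly illustrated: before export, the data exporter replaces direct identifiers with keyed hashes, keeping the secret key inside the EEA so the importer cannot reverse the mapping. This is a minimal sketch under those assumptions, not EDPB-endorsed code; the field names and key handling are invented for illustration.

```python
import hashlib
import hmac
import secrets

# The pseudonymisation key never leaves the exporter (EEA side).
EXPORTER_KEY = secrets.token_bytes(32)

def pseudonymise(record: dict, id_fields=("email", "phone")) -> dict:
    """Replace direct identifiers with keyed hashes before transfer.

    Unlike a plain (unkeyed) hash, an importer without EXPORTER_KEY
    cannot re-identify individuals even by brute-forcing common
    email addresses or phone numbers.
    """
    out = dict(record)
    for field in id_fields:
        if field in out:
            digest = hmac.new(EXPORTER_KEY, out[field].encode("utf-8"),
                              hashlib.sha256)
            out[field] = digest.hexdigest()
    return out

record = {"email": "jane@example.com", "purchase": "book"}
exported = pseudonymise(record)
```

The design point is that the keyed mapping is only reversible (or re-linkable) by whoever holds the key, which is why the EDPB’s scenario turns on the exporter, not the importer, retaining it.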

Another section of the document discusses scenarios in which no effective supplementary measures could be found — such as transfers to cloud service providers (or similar) which require access to the data in the clear and where “the power granted to public authorities of the recipient country to access the transferred data goes beyond what is necessary and proportionate in a democratic society”.

Again, this is a bit of the document that looks very bad for Facebook.

“The EDPB is, considering the current state of the art, incapable of envisioning an effective technical measure to prevent that access from infringing on data subject rights,” it writes on that, adding that it “does not rule out that further technological development may offer measures that achieve the intended business purposes, without requiring access in the clear”.

“In the given scenarios, where unencrypted personal data is technically necessary for the provision of the service by the processor, transport encryption and data-at-rest encryption even taken together, do not constitute a supplementary measure that ensures an essentially equivalent level of protection if the data importer is in possession of the cryptographic keys,” the EDPB further notes.

It also makes it clear that supplementary contractual clauses aren’t any kind of get-out on this front — so, no, Facebook can’t stick a clause in its SCCs that defuses FISA 702 — with the EDPB writing: “Contractual measures will not be able to rule out the application of the legislation of a third country which does not meet the EDPB European Essential Guarantees standard in those cases in which the legislation obliges importers to comply with the orders to disclose data they receive from public authorities.”

The EDPB does discuss examples of potential clauses data exporters could use to supplement SCCs, depending on the specifics of their data flow situation — alongside specifying “conditions for effectiveness” (or ineffectiveness in many cases, really). And, again, there’s cold comfort here for those wanting to process personal data in the US (or another third country) while it remains at risk from state surveillance.

“The exporter could add annexes to the contract with information that the importer would provide, based on its best efforts, on the access to data by public authorities, including in the field of intelligence provided the legislation complies with the EDPB European Essential Guarantees, in the destination country. This might help the data exporter to meet its obligation to document its assessment of the level of protection in the third country,” the EDPB suggests in one example from a section of the guidance discussing transparency obligations.

However the point of such a clause would be for the data exporter to put up-front conditions on an importer to make it easier for them to avoid getting into a risky contract in the first place — or help them with suspending/terminating a contract if a risk is determined — rather than providing any kind of legal sticking plaster for mass surveillance. Aka: “This obligation can however neither justify the importer’s disclosure of personal data nor give rise to the expectation that there will be no further access requests,” as the EDPB warns.

Another example discussed in the document is the viability of adding clauses to try to get the importer to certify there’s no backdoors in their systems which could put the data at risk.

However the EDPB warns this may just be useless, writing: “The existence of legislation or government policies preventing importers from disclosing this information may render this clause ineffective.” So the example may simply be included to kneecap dodgy legal advice suggesting contract clauses are a panacea for US surveillance overreach.

The EDPB’s full guidance can be found here.

We’ve also reached out to Facebook to ask what next steps it’ll be taking over its EU-US data transfers in light of the EDPB guidance and will update this report with any response. Update: Facebook has now sent this statement: “The CJEU ruled that Standard Contractual Clauses are a valid legal mechanism for the transfer of data from the EU, including to the US. We note that new guidelines on supplementary measures have been submitted for consultation and, like many other companies, will be reviewing them carefully.”