After the recent second anniversary of the GDPR, one fundamental question remains: to whom does the GDPR apply?

By Gail Crawford, Ulrich Wuermeling, and Calum Docherty

Last month marked the second anniversary of the General Data Protection Regulation (GDPR), but its territorial reach is still hotly debated. This blog post takes a detailed look at the final guidelines on the territorial scope of the GDPR, which the European Data Protection Board (the EDPB) published on 12 November 2019 following public consultation on its draft guidelines dated 23 November 2018 (the Guidelines).

The Guidelines contain several helpful clarifications around when the GDPR applies to controllers and processors of personal data. At the same time, however, the Guidelines leave lingering ambiguity as to when and to what extent the GDPR applies, particularly for multinationals.

Participants in the research process must identify which data sets constitute personal data to ensure compliance with the GDPR.

By Frances Stocks Allen and Mihail Krepchev

The UK Medical Research Council (MRC) has published a useful guidance note on the identifiability, anonymisation, and pseudonymisation of personal data in the context of research activities (the Guidance). The Guidance reminds research organisations that the General Data Protection Regulation (GDPR) applies to health data used in research and contains a number of recommendations that participants in the research process, particularly clinical trial sponsors, should bear in mind. The Guidance has been developed with the participation of the UK privacy regulator, the Information Commissioner’s Office (ICO).
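The distinction between pseudonymised and anonymised data matters in practice: pseudonymised data can still be re-linked to individuals by whoever holds the key, so it remains personal data under the GDPR, whereas irreversibly anonymised data falls outside its scope. As a minimal sketch of one common pseudonymisation technique, keyed hashing (this example is illustrative only and not drawn from the Guidance; the identifiers and key are hypothetical):

```python
import hashlib
import hmac

def pseudonymise(participant_id: str, secret_key: bytes) -> str:
    """Replace a direct identifier with a keyed hash (a pseudonym).

    Because the key holder can re-link pseudonyms to individuals,
    pseudonymised data remains personal data under the GDPR.
    """
    return hmac.new(secret_key, participant_id.encode(), hashlib.sha256).hexdigest()

# Hypothetical key held by, e.g., a clinical trial sponsor.
key = b"example-secret-held-by-the-sponsor"

# The same identifier and key always yield the same pseudonym, so records
# can be linked across data sets without exposing the raw identifier.
p1 = pseudonymise("patient-0042", key)
p2 = pseudonymise("patient-0042", key)
assert p1 == p2
```

Deleting the key after use does not by itself anonymise the data if the original identifiers still exist elsewhere and re-identification remains reasonably likely.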

“Business as usual” for UK-EU data protection transition in 2020.  

By Gail E. Crawford and Susan Mann

On 29 January 2020, the EU Parliament approved the UK Withdrawal Agreement, following the UK Parliament's ratification via the European Union (Withdrawal Agreement) Act 2020 on 23 January 2020 (Withdrawal Agreement). The Withdrawal Agreement maintains the UK's pre-Brexit position and clarifies that the GDPR continues to apply in the UK during the transition period (between 1 February 2020 and 31 December 2020, or any extension agreed by the UK and the EU), allowing both sides to negotiate the future data protection relationship. The ICO has confirmed that the GDPR will continue to apply, and that during the transition it will be "business as usual".

At the end of the transition period, the provisions of the EU GDPR will be incorporated directly into UK law as a "UK GDPR", which will sit alongside the current UK Data Protection Act 2018. From that point, the EU GDPR and the UK GDPR will exist in parallel, with technical amendments made to the text so that it works in a UK-only context.

Despite progress, the online advertising industry and UK regulators are still at odds over the “legitimate interest” definition under the GDPR.

By Olga Phillips and Elizabeth Purcell

Following publication of the UK Information Commissioner's Office's (ICO's) report on adtech and real time bidding in June 2019, the ICO has been working closely with the online advertising industry, with the aim of improving data protection practices by the end of the year.

Simon McDougall, the ICO's Executive Director for Technology Policy and Innovation, reportedly stated at the recent AdTech London event that the ICO has made progress with the industry, including through workshops with Google and the Interactive Advertising Bureau Europe (IAB), both of which were featured in the June report. However, McDougall noted that there is still "a very big difference" in how the online advertising industry and the ICO view the "legitimate interest" legal basis for processing personal data under the General Data Protection Regulation (GDPR). The ICO has yet to be convinced by the use cases in which the industry seeks to rely on the legitimate interest basis.

China’s PCPPIC protects children’s personal information in much the same way as COPPA and the GDPR, but with a few differences.

By Wei-Chun (Lex) Kuo, Weina (Grace) Gao, and Cheng-Ling Chen

On August 22, 2019, the Cyberspace Administration of China (CAC) released a new data privacy regulation related to children, the Provisions on Cyber Protection of Personal Information of Children (儿童个人信息网络保护规定) (PCPPIC). The regulation will come into effect on October 1, 2019, and will apply within the People's Republic of China (PRC). The PCPPIC's stated purpose is "protecting the security of children's personal information and promoting the healthy growth of children in the PRC." In 29 Articles, the PCPPIC sets forth high-level requirements for the collection, storage, use, transfer, and disclosure of the personal information of children within PRC territory.

Recent action by the Hamburg authority may present implications for companies regulated by a lead data protection supervisory authority in Europe.

By Fiona Maclean, Tim Wybitul, Joachim Grittmann, Wolf Böhm, Isabelle Brams, and Amy Smyth

A German supervisory authority has initiated an investigation into Google's speech recognition practices and language assistant technologies, which are integrated into its Google Assistant product. More specifically, the Hamburg supervisory authority opened proceedings with the intention to "prohibit Google from carrying out corresponding evaluations by employees or third parties for a period of three months. This is intended to protect the personal rights of those concerned for the time being."

This blog post analyzes the proceedings against Google in Germany, in the context of a recent trend elsewhere in Europe of transferring cases to lead authorities, and the impact on other companies regulated by a lead supervisory authority. The proceedings against Google might be resolved amicably, but they still raise substantial questions over the powers of supervisory authorities under the cooperation and consistency mechanism of the GDPR.

Following in the footsteps of the CNIL and the ICO, the Berlin DPA will impose a multimillion-euro fine for breach of the GDPR.

By Tim Wybitul, Joachim Grittmann, Ulrich Wuermeling, Wolf-Tassilo Böhm, and Isabelle Brams

The Berlin Data Protection Authority (Berlin DPA) recently announced that it will issue a multimillion-euro fine for breach of the EU's General Data Protection Regulation (GDPR), a significant step change in its GDPR enforcement approach. The Berlin DPA's most significant penalty to date comprised two fines on a single company totaling €200,000. In that case, as with the latest announcement, the Berlin DPA has not yet named the affected company. The announcement also continues a trend, started by the French Data Protection Authority (CNIL) and followed by the UK Information Commissioner's Office (ICO), of data protection authorities beginning to show their teeth in GDPR enforcement.

If adopted effectively, the PCPD's Ethical Accountability Framework should help organizations to demonstrate and enhance trust with individuals.

By Kieran Donovan

In October 2018, Hong Kong's Privacy Commissioner for Personal Data (PCPD) presented the findings of an inquiry into the ethics of data processing, commissioned by the PCPD with the help of the Information Accountability Foundation (IAF). The result of the inquiry, published as the Ethical Accountability Framework, provides an "instruction manual" for processing data in an ethical and accountable manner.

Following on the heels of the PCPD’s report, the Hong Kong Monetary Authority (HKMA) issued a Circular titled Use of Personal Data in Fintech Development, encouraging authorized institutions (AIs) to adopt the PCPD’s Ethical Accountability Framework.

The guidance provides general requirements for obtaining valid consent and details conditions under which audience management cookies may be exempt.

By Myria Saarinen and Camille Dorval

On 4 July 2019, one day after the UK Information Commissioner’s Office (ICO) published new guidance on cookies, the French Data Protection Authority (CNIL) released its own new guidance (Guidance). A corrective version followed on 19 July 2019.

The Guidance clarifies "consent" under Article 82 of the French Data Protection Act (Article 82). Article 82 implements the ePrivacy Directive's cookie rules and forms the foundation of the French rules requiring organizations that place non-essential cookies to provide "clear and complete" information to users and to obtain their consent to the use of cookies.

The proposals would grant consumers increasing rights to require providers to share access to their data directly with chosen third parties.

By Alain Traill and Gail Crawford

The UK government has released a consultation advocating the introduction of sweeping new requirements for service providers to share with third parties both consumer data (upon request) and data regarding their own products and services. The proposals, released on 11 June 2019 by the Department for Business, Energy and Industrial Strategy (BEIS) in its Smart Data report and consultation, are indicative of a wider drive toward requiring companies to free up access to the data they hold. The drivers behind this include a desire to increase competition, foster the growth of data-driven services, and improve consumer choice.

The proposals follow the introduction of a range of sector-specific initiatives in the UK and are part of a concerted government focus on digital strategy, as evidenced by the government's recent white paper on Regulation for the Fourth Industrial Revolution, as well as the National Data Strategy introduced last year.