In cooperation with other EU supervisory authorities, the CNIL has imposed a €250,000 fine on an online retailer for GDPR infringements.

By Myria Saarinen and Charlotte Guerin

Founded in 2006 and headquartered in France, Spartoo SAS (Spartoo) is one of the leaders of the European online shoe retail market. On 31 May 2018, a week after the entry into application of the GDPR, the French Data Protection Authority (the CNIL) launched an on-site investigation of Spartoo in cooperation with other EU supervisory authorities. The CNIL eventually handed down its decision on 28 July 2020, imposing a €250,000 fine on Spartoo for the infringement of four different provisions of the GDPR. Spartoo may appeal the CNIL’s decision within two months. The decision illustrates how the GDPR’s “one-stop shop” mechanism can operate, and also offers online retailers and other businesses insight into what to expect from GDPR enforcement in practice.

The Council decision contains useful considerations and clarifications on the “one-stop shop” mechanism, transparency obligations, and consent for targeted advertising.

By Myria Saarinen and Camille Dorval

On 19 June 2020, France’s highest administrative court (the Council) handed down its decision on the appeal filed by Google LLC (Google) against the French Data Protection Authority’s (CNIL’s) decision of 21 January 2019, which imposed a €50 million fine on Google for failing to comply with its transparency obligations and to lawfully process personal data on the basis of valid consent, with respect to the operating system for Android mobile terminals.

After the recent two-year anniversary of the GDPR, one fundamental question remains — who does the GDPR apply to?

By Gail Crawford, Ulrich Wuermeling, and Calum Docherty

Last month marked the two-year anniversary of the General Data Protection Regulation (GDPR), but its territorial reach is still hotly debated. This blog post takes a detailed look at the final guidelines on the territorial scope of the GDPR, which the European Data Protection Board (the EDPB) published on 12 November 2019 following public consultation of its draft guidelines dated 23 November 2018 (the Guidelines).

The Guidelines contain several helpful clarifications around when the GDPR applies to controllers and processors of personal data. At the same time, however, the Guidelines still present latent ambiguity as to when and to what extent the GDPR applies, particularly for multinationals.

Participants in the research process must identify which data sets constitute personal data to ensure compliance with the GDPR.

By Frances Stocks Allen and Mihail Krepchev

The UK Medical Research Council (MRC) has published a useful guidance note on the identifiability, anonymisation, and pseudonymisation of personal data in the context of research activities (the Guidance). The Guidance reminds research organisations that the General Data Protection Regulation (GDPR) applies to health data used in research and contains a number of recommendations that participants in the research process, particularly clinical trial sponsors, should bear in mind. The Guidance has been developed with the participation of the UK privacy regulator, the Information Commissioner’s Office (ICO).

“Business as usual” for UK-EU data protection transition in 2020.  

By Gail E. Crawford and Susan Mann

On 29 January 2020, the EU Parliament approved the UK Withdrawal Agreement, after the UK Parliament’s ratification via the EU Withdrawal Act 2020 on 23 January 2020 (Withdrawal Agreement). The Withdrawal Agreement maintains the UK’s pre-Brexit position and clarifies that the GDPR continues to apply in the UK during the transition period (between 1 February 2020 and 31 December 2020, or any extension agreed by the UK and the EU), allowing both sides to negotiate the future data protection relationship. The ICO confirmed that the GDPR will continue to apply, and that during the transition it will be “business as usual”.

At the end of the transition period, the provisions of the EU GDPR will be incorporated directly into UK law as a “UK GDPR”, which will sit alongside the current UK Data Protection Act 2018. The current EU GDPR will then exist in parallel with the UK GDPR. The incorporated text includes technical amendments to the current GDPR, so that it will work in a UK-only context.

Despite progress, the online advertising industry and UK regulators are still at odds over the “legitimate interest” definition under the GDPR.

By Olga Phillips and Elizabeth Purcell

Following publication of the UK Information Commissioner’s Office’s (ICO’s) report on adtech and real time bidding in June 2019, the ICO has been working closely with the online advertising industry to improve data protection practices by the end of the year.

Simon McDougall, the ICO’s Executive Director for Technology Policy and Innovation, reportedly stated at the recent AdTech London event that the ICO has made progress with the industry, including through workshops with Google and the Interactive Advertising Bureau Europe (IAB), both of which featured in the June report. However, McDougall noted that there is still “a very big difference” in how the online advertising industry and the ICO view the “legitimate interest” legal basis for processing personal data under the General Data Protection Regulation (GDPR). The ICO remains unconvinced by the use cases in which the industry is seeking to rely on the legitimate interest basis.

China’s PCPPIC protects children’s personal information in much the same way as COPPA and the GDPR, but with a few differences.

By Wei-Chun (Lex) Kuo, Weina (Grace) Gao, and Cheng-Ling Chen

On August 22, 2019, the Cyberspace Administration of China (CAC) released a new data privacy regulation related to children, the Provisions on Cyber Protection of Personal Information of Children (儿童个人信息网络保护规定) (PCPPIC). The regulation will come into effect on October 1, 2019, and will apply within the People’s Republic of China (PRC). The PCPPIC’s stated purpose is “protecting the security of children’s personal information and promoting the healthy growth of children in the PRC.” In 29 Articles, the PCPPIC sets forth high-level requirements for the collection, storage, use, transfer, and disclosure of the personal information of children within PRC territory.

Recent action by the Hamburg authority may present implications for companies regulated by a lead data protection supervisory authority in Europe.

By Fiona Maclean, Tim Wybitul, Joachim Grittmann, Wolf Böhm, Isabelle Brams, and Amy Smyth

A German supervisory authority has initiated an investigation into Google’s speech recognition practices and language assistant technologies, which are integrated into its Google Assistant product. More specifically, the Hamburg supervisory authority opened proceedings with the intention to “prohibit Google from carrying out corresponding evaluations by employees or third parties for a period of three months. This is intended to protect the personal rights of those concerned for the time being.”

This blog post analyzes the procedure against Google in Germany, in the context of recent trends elsewhere in Europe to transfer cases to lead authorities, and the impact for other companies regulated by a lead supervisory authority. The proceedings against Google might be resolved amicably, but still raise substantial questions over the powers of supervisory authorities under the cooperation and consistency mechanism of the GDPR.

Following in the footsteps of the CNIL and the ICO, the Berlin DPA will impose a multimillion-euro fine for breach of the GDPR.

By Tim Wybitul, Joachim Grittmann, Ulrich Wuermeling, Wolf-Tassilo Böhm, and Isabelle Brams

The Berlin Data Protection Authority (Berlin DPA) recently announced that it will issue a multimillion-euro fine for breach of the EU’s General Data Protection Regulation (GDPR), a significant step change in its GDPR enforcement approach. The Berlin DPA’s most significant penalty to date comprised two fines on a single company totaling €200,000; in that case, as with the latest announcement, the Berlin DPA has not named the affected company. The announcement also continues a trend, started by the French Data Protection Authority (CNIL) and followed by the UK Information Commissioner’s Office (ICO), of data protection authorities beginning to show their teeth in GDPR enforcement.

If adopted effectively, the PCPD’s Ethical Accountability Framework should help organizations to demonstrate and enhance trust with individuals.

By Kieran Donovan

In October 2018, Hong Kong’s Privacy Commissioner for Personal Data (PCPD) presented the findings of an inquiry into the ethics of data processing, commissioned by the PCPD with the help of the Information Accountability Foundation (IAF). The result of the inquiry, published as the Ethical Accountability Framework, provides an “instruction manual” for processing data in an ethical and accountable manner.

Following on the heels of the PCPD’s report, the Hong Kong Monetary Authority (HKMA) issued a Circular titled Use of Personal Data in Fintech Development, encouraging authorized institutions (AIs) to adopt the PCPD’s Ethical Accountability Framework.