Global Privacy & Security Compliance Law Blog

“Hacking” Warrants: A Question of Procedure or Substance?

Posted in Legislative & Regulatory Developments

By Serrin Turner

Typically, the process for amending the Federal Rules of Criminal Procedure is a sleepy affair. Proposed amendments wend their way through a series of judicial committees and, if approved by the Supreme Court, take effect automatically by the end of the year. Theoretically, Congress may choose to intervene and block the change – but it does so rarely. This year, however, a proposed amendment has caught the congressional eye.

Over the past several days, legislators in both the Senate and the House of Representatives have introduced legislation to block a proposed change to Rule 41 of the Federal Rules of Criminal Procedure, which regulates the issuance of search warrants in federal criminal investigations. Law enforcement already uses Rule 41 routinely to obtain warrants to search computers recovered from physical premises or otherwise taken into law enforcement custody. The proposed amendment addresses a different scenario: when law enforcement has identified a computer being used to perpetrate a crime but cannot determine where it is located. With the proliferation of anonymizing technologies used by hackers and other criminals operating on the Internet, this fact pattern is increasingly common. The rule change under consideration would enable law enforcement to obtain a warrant in such circumstances to search the target computer “remotely” – that is, by hacking into it.

The Countdown to the General Data Protection Regulation in Europe Has Begun

Posted in Legislative & Regulatory Developments, Privacy

By Gail Crawford and Lore Leitner

Today, after more than four years of debate, the General Data Protection Regulation (GDPR, or the Regulation) enters into force. The GDPR will introduce a rigorous, far-reaching privacy framework for businesses that operate, target customers or monitor individuals in the EU. The Regulation sets out a suite of new obligations and substantial fines for noncompliance. Businesses need to act now to ensure that they are ready for when the Regulation becomes enforceable after the expiry of a two-year transition period, i.e., from 25 May 2018.

Will this affect your business? What’s next? For a detailed look at the likely impact of the GDPR, read our client alert: Europe Counts Down to the General Data Protection Regulation

Are Changes in Store for the Stored Communications Act?

Posted in Legislative & Regulatory Developments, Privacy

By Serrin Turner

Last week saw action on two fronts regarding the Stored Communications Act (SCA) – the US federal statute regulating government searches of online accounts in criminal investigations. In Congress, a proposal to reform the SCA advanced in the House; and in the courts, Microsoft sued to challenge a provision of the SCA as unconstitutional. Although the reform bill has been portrayed as a major piece of privacy legislation, the version now under consideration is quite modest and would not substantially change how the SCA is applied in practice. However, the Microsoft lawsuit, if successful, could significantly reshape and restrict how the SCA is used by law enforcement.

What is the Stored Communications Act?

The SCA sets forth the procedures by which US law enforcement authorities can compel electronic communications service providers to disclose the contents of (and other records pertaining to) user accounts. While the SCA is applied most often in the context of email accounts, it applies equally to social-networking accounts, cloud-storage accounts, web-hosting accounts, and any other type of account where a user may store electronic communications. Like everyone else, criminals are increasingly communicating over the Internet, and as a result the SCA is now routinely used by law enforcement to obtain the contents of online accounts used by criminal suspects to communicate and do business.

Analysis of the FCC’s Proposed Broadband Privacy Regulations

Posted in Privacy

By Amanda Potter and Alex Stout

As we highlighted in a post last month, the FCC has proposed sweeping new privacy rules on broadband providers. Since our last post, the FCC has released its proposal in the form of a Notice of Proposed Rulemaking. This proposal would institute new customer privacy and data breach rules on broadband providers and follows the Commission’s landmark Open Internet proceeding, in which the Commission imposed common-carrier telecommunications rules on broadband. The public has until May 27 to submit initial comments and June 27 to submit reply comments.

While the proposal includes updates to existing FCC rules, the focus is on broadband providers. The proposed rules would expressly exclude providers of “edge services” (like search engines, video streaming, and mobile applications), reasoning that consumers can readily avoid edge services and that broadband providers act as “gateways” that could potentially track consumers across the Internet.

The proposed rules would cover two categories of information. First, the rules would apply to “customer proprietary network information” (CPNI), a type of data defined by Section 222 of the Communications Act to include a customer’s technical usage or billing data. For broadband, the FCC proposes to include, at minimum, Internet service plan and pricing, geo-location data, MAC address, Device ID, IP address, and traffic statistics. Second, the rules would protect personally identifiable information (PII). The FCC only recently began to use the term PII, which it defines here.

Recent Amendments to the Russian Personal Data Protection Legislation: The Right to be Forgotten

Posted in Legislative & Regulatory Developments, Privacy

By Mikhail Turetsky, Ksenia Koroleva and Lore Leitner

On July 13, 2015, the Russian President signed Federal Law No. 264-FZ (the Law), which introduced a range of amendments into Russian legislation (the Amendments). In particular, the principle of the “right to be forgotten”, a concept not previously recognized under Russian law, came into effect on January 1, 2016.


The Law introduced the right for individuals to request that search engine operators delete links to certain information relating to the individuals from searches run on the individuals’ names or surnames. The Law applies only to individuals and does not mention legal entities.

Privacy Shield is on its Way

Posted in Privacy

By Ulrich Wuermeling, Jennifer Archie & Lore Leitner

On March 17, 2016, the Civil Liberties Committee convened to discuss whether the Privacy Shield framework that will replace Safe Harbor provides adequate protection to the data of EU citizens. A number of experts were questioned, including the US lead negotiator, the EU Data Protection Supervisor, members of the Article 29 Working Party and Max Schrems, whose court case against Facebook led to Safe Harbor’s downfall.

The meeting of the Civil Liberties Committee follows on from the European Commission’s publication last month of the legal texts that will form the basis of the EU-US Privacy Shield and a Communication summarizing the action taken to rebuild trust in the data flows from the EU to the US. The European Commission also made public a draft “adequacy decision” establishing that the safeguards provided under the Privacy Shield are equivalent to the EU data protection standards. The documents provide a better idea of the substance and structure of the Privacy Shield, announced by the European Commission on February 2, 2016, and confirm the US commitment to ensuring that there will be no indiscriminate mass surveillance by its national security authorities.

Focus areas of the Privacy Shield

From the material made public, the new framework focuses on four areas.

FCC Proposes Broad Privacy Regulations for Broadband Providers

Posted in Privacy

By Matt Murchison and Alex Stout

Last week, the FCC announced that Chairman Tom Wheeler had circulated a Notice of Proposed Rulemaking (NPRM) on implementing Section 222’s privacy obligations for broadband providers. Section 222’s requirements were originally crafted for telephone companies, and were first applied to broadband providers as part of the 2015 Open Internet Order, which reclassified broadband providers as telecommunications carriers. However, the FCC expressly forbore from applying to broadband providers the rules it had adopted over the years implementing Section 222 in the telephone context. The upcoming NPRM, which the full Commission will vote on at its March 31 Open Meeting, will, for the first time, propose specific requirements implementing Section 222’s privacy obligations in the broadband context.

The FCC’s fact sheet about the NPRM reiterates the three guiding principles that the Chairman has identified in recent weeks—choice, transparency, and security—and provides some new details on the specific proposals under consideration.

Proposal of EU-US Privacy Shield Leaves Businesses in State of Uncertainty

Posted in Privacy, Security

By Ulrich Wuermeling, Gail Crawford and Jennifer Archie

Earlier this week, the European Commission announced that a “political” agreement has been reached on a new framework for data flows from the EU to the US. The announcement highlights a few changes from the old Safe Harbor regime, such as more direct and active oversight by US regulators, more stringent privacy protections, and establishing an ombudsman at the State Department for EU citizens who wish to complain about data protection matters. However, as a legal and compliance matter, US companies that previously relied upon Safe Harbor to transfer EU data take significant compliance risk if they do nothing in anticipation of the newly branded EU-US Privacy Shield framework being formally approved, given that it is not yet documented and will be subject to review by the EU data protection supervisory authorities in the so-called Article 29 Working Party, as well as representatives of the Member States and the European Parliament.

Political Agreement on European Data Protection Regulation

Posted in Legislative & Regulatory Developments, Privacy

By Ulrich Wuermeling

A political compromise has been reached on the new European Data Protection Regulation. On December 15, 2015, the negotiators in the so-called “informal trilogue” between the Council, the Parliament and the European Commission closed the final issues. Meanwhile, the Luxembourg Presidency informed the LIBE-Committee of the Parliament as well as the Permanent Representatives Committee of the Member States about the outcome. The LIBE-Committee will review the final changes on December 17, 2015, but the aim is not to request further changes. If the text is acceptable to the Parliament and the Council, the formal votes in the so-called early second reading will take place in early 2016 and the new Regulation will come into force in early 2018.

In the last Trilogue meeting, agreement was reached on the following issues that had remained on the table until the eleventh hour:

  1. High requirements for valid consent

Consent has to be given by a “clear affirmative action establishing a freely given, specific, informed and unambiguous indication of the data subject’s agreement.” In relation to special types of data (such as health data), the consent needs to be “explicit.” The Parliament wanted every type of consent to be “explicit,” but the Council asked for a lower threshold. However, silence, pre-ticked boxes or inactivity will not be considered valid consent. A statement in the recitals of the Regulation clarifying that browser settings could constitute consent was deleted in the final round of negotiations. The text also addresses whether the provision of services can be made dependent on consent. The Council thought this was acceptable if the service could reasonably be obtained elsewhere. However, the Parliament did not agree on this clarification to the Recitals. The relevant Article states only that, in considering whether consent is freely given, one should take “into account” whether the consent is conditional on the provision of a service. Consent for different purposes should be separated out in appropriate cases.

Overall, the Regulation sets out onerous requirements for valid consent, and businesses will have to reconsider the risks involved in trying to request such consent from data subjects. Existing consents which do not comply with the new requirements will become invalid when the Regulation becomes applicable in 2018.

  2. Broad exemptions for archiving, scientific, historical and statistical purposes

The Regulation can potentially hinder archiving or scientific, historical and statistical activities. The negotiators agreed that a number of exemptions with respect to purpose limitation, legal grounds and transparency should apply in these areas. With respect to archiving, these exemptions will only apply if carried out in the public interest. For the other areas, such public interest is not required. This outcome gives hope for “big data,” because it often fits under the categories of scientific, historical or statistical activities. For special types of data (such as health data), national laws will have to provide further safeguards and Member States are allowed to maintain and introduce further conditions. These national laws will probably lead to an uneven European playing field especially for big data in the health sector.

While the negotiators added a sentence stating that the national provisions should not hamper the free flow of data within the Union, such hampering would seem unavoidable in practice, given that providers have to comply with the laws of each country in which they operate. Given that the Regulation provides for national flexibility in many areas, it will require a high degree of discipline by the Member States to avoid a negative impact on the envisaged harmonization.

  3. Age limit for children’s protection is inconsistent

In the Trilogue, the negotiators had already agreed that children must be 16 years old to give valid consent without parental approval. In the last meeting, concerns were raised that this age limit was inconsistent with age limits in individual Member States. As a compromise, the parties agreed that, generally, children under 16 are not allowed to provide consent without parental approval, but Member States are permitted to reduce the age limit to 13 years. This is another example of the Regulation producing inconsistent rules across Europe by providing flexibility for national laws.

  4. No impact assessment for biometric data

The new Regulation requires data protection impact assessments if data processing is likely to result in a high risk to the rights and freedoms of individuals. The large-scale processing of biometric data was mentioned as an example of such high-risk data processing. However, in the last meeting of the Trilogue, the parties agreed to delete biometric data from the list of examples, given the extended use of biometric data for identification purposes.

  5. Standardized icons to allow easy transparency

The Parliament originally proposed icons to be used by businesses in order to provide more transparency to data subjects. The Council feared that the proposed icons would probably cause more confusion than clarification. As a compromise, the parties have agreed that the European Commission should be empowered to introduce icons through delegated acts. It remains to be seen whether the Commission will be able to design icons suitable for consumers.

MEPs Agree to Europe’s First-Ever EU Cybersecurity Law

Posted in Legislative & Regulatory Developments, Security

By Gail Crawford and Andrea Stout

On December 7, members of the European Parliament (MEPs) and the Luxembourg Presidency of the EU Council of Ministers provisionally agreed to the text of the long-awaited network and information security directive, also known as the cybersecurity directive (the Directive).

While the text of the proposed Directive has yet to be released publicly, press releases indicate that the Directive will introduce new requirements for certain organizations to implement security measures to protect against cybersecurity attacks. Organizations caught by the Directive will also be required to report security breaches to the national authorities – a requirement currently only imposed upon telecommunications operators.

In addition, member states will be required to adopt cybersecurity policies and to designate a national authority for the implementation and enforcement of the Directive. Many countries, including the UK, have already introduced Computer Emergency Response Teams (CERTs) to manage and prepare for cyber security incidents. The Directive also aims to encourage cooperation between competent authorities to enable coordinated information exchanges and detection/response plans.

These requirements come as part of the broader EU cyber security strategy introduced in 2013 when the Directive was first proposed. The aim of the strategy is to ensure that critical businesses meet minimum standards for network and information security and to encourage member states to coordinate regarding their cyber defense efforts.

Who does it apply to?

The European Parliament has announced that the Directive will apply to both “operators of essential services,” such as those operating in the fields of energy, transport, banking, financial markets, health and water supply, and “some internet service providers,” such as those hosting online marketplaces (specifically naming eBay and Amazon), search engines and cloud service providers. Notably absent from this list are “social networks,” such as Facebook, and “application stores,” which do not appear to be caught by the current version of the Directive but were mentioned in an earlier draft.

One of the most debated topics is which businesses will be caught by the Directive’s obligations, and whether internet service providers will be subject to the same requirements as those providing critical infrastructure services. Internet service providers such as Google and Cisco have lobbied to be left out of the Directive, stating that they do not provide critical services to society and hoping to avoid the extra security compliance costs likely to be incurred following the implementation of the Directive.

Following a breakthrough in the negotiations in June 2015, a two-tier approach to compliance was agreed: companies providing digital services would be subject to a different set of less onerous, “light touch” requirements than those providing essential services, such as in the banking, energy and transport fields. Micro and small digital companies will be exempt from compliance with the Directive.

What’s the concern?

The statistics about cybersecurity breaches are often staggering. PwC, in connection with the Department for Business, Innovation and Skills (BIS), conducted a survey of businesses in 2014 and found that 73% of large organisations and 45% of small ones had suffered a security breach resulting from an infection by viruses or malicious software in the last year. However, there is currently little open discussion about breaches, despite the vast numbers of businesses that claim in surveys, such as the PwC report above, to be affected. One of the aims of the Directive is to mandate the reporting of any security incidents having a significant impact on critical services (including the number of users affected, the duration of the incident, etc.) to the dedicated national CERTs, with the hope that sharing information will enable organizations to improve their security and work together to mitigate the impact of attacks. Earlier drafts have proposed defining an “incident having a significant impact” as “an incident affecting the security and continuity of an information network or system that leads to the major disruption of vital economic or societal functions.”

What does it mean for businesses?

Whilst the agreed draft will not be released until December 18, 2015, it is clear that businesses providing critical services and some internet service providers will be required to meet a minimum standard of protection to defend against cyber-attacks. Businesses captured by the Directive will also be required to report security breaches to the national authority. The reporting obligation will apply in addition to similar data breach notification obligations under applicable data protection laws.

We expect more details of the requirements for businesses to emerge following publication of the agreed text and specifics to be negotiated following the implementation of the Directive into local law. Earlier drafts indicate that member states will be required to determine which measures businesses will need to adopt to ensure that they “take appropriate and proportionate technical and organizational measures to detect and effectively manage the risks posed to the security of the networks and information systems which they control and use in their operations… those measures shall ensure a level of security appropriate to the risk presented.” This is a similarly vague statement to that contained in the European Data Protection Directive.

Next steps

The Directive is not likely to be effective for another two years. The Luxembourg Presidency has announced that its aim is for the agreed text to be presented to the Council Committee of Permanent Representatives for approval by December 18, 2015. The text will also need to be formally approved by the European Parliament’s Internal Market Committee. To conclude the procedure, formal adoption by both the Council and the Parliament is required. Following publication in the EU Official Journal, the Directive will then officially enter into force, allowing member states 21 months to implement the legislation into local law and six more months to identify the operators of essential services who will be subject to the more onerous security requirements.

We will provide further details once the final text of the Directive is published.