What is it?

In 1995, with the Data Protection Directive (DPD), the EU adopted an ambitious set of data security and privacy rules. The Directive included requirements to obtain consumer opt-in, limit the amount of data collected, allow correction and erasure of personal data on request, and force organisations to erase data that was no longer relevant.

The EU was one of the first to take many privacy principles — more familiar to us today as Privacy by Design (PbD) — and turn them into real-world data security laws and policies. The DPD had an advanced definition of personally identifiable information, referred to as personal data, which is the data ultimately protected by the law. In the DPD, personal data could cover both standard identifiers (name, address, phone number) and Internet-era handles.
As the years have passed, with further interpretations by the regulators and court rulings from the EU Court of Justice (ECJ), the original DPD was extended to cover cloud providers, erasure of data on the Internet, and, in the case of the US, an additional framework — the EU-US Safe Harbour — covering the export of data outside the EU.
However, the DPD soon began to creak. Part of the reason was that the Directive gave EU countries the power to create their own laws based on the DPD and then to interpret them, so differences began to emerge. While the DPD provided a solid foundation, it was not equipped to handle the explosion in data collection and storage, and it did not specifically address the world of cloud processing, which fell into a regulatory grey area.

The new General Data Protection Regulation (GDPR), which will replace the DPD, was approved in April 2016. It will provide a uniform law across the EU and address many of the shortcomings of the DPD. Companies have up to two years, until the GDPR goes into effect in May 2018, to prepare and become compliant.
The GDPR will add requirements for documenting IT procedures, performing risk assessments under certain conditions, notifying consumers and authorities when there is a breach, and strengthening rules for data minimization. For companies that collect data on EU citizens over the Internet without having a formal presence in a country, the GDPR's concept of "extra-territoriality" means the GDPR will apply to them as well.
Finally, the GDPR will contain a significant financial sting for noncompliance: fines are tiered, with some violations capped at 2% of a company's global revenue and more serious lapses at 4%, or a fixed maximum of €20 million, whichever is higher.
Ultimately, the message for companies that fall under the GDPR is that awareness of your data — where sensitive data is stored, who is accessing it, and who should be accessing it — will become even more critical.

Data Protection Act and the DPD

The EU’s Data Protection Directive can be traced to the 1980s. At that time the European Commission decided to formalise ideas on privacy — as a fundamental right — through a single set of data security rules to replace what was then a patchwork of country-by-country rules.

The result was the DPD, which was adopted in 1995. While it did not achieve its goal of unifying data rules — more on that below — it did point the way towards the EU's approach to data security. Since the new GDPR borrows heavily from the DPD — both terminology and principles — let's take a brief look at some of the more significant aspects of the Directive.
The DPD introduces three important concepts that relate to consumers and their data, and the collection and processing of that data.

In the DPD, personal data means information “relating to an identified or identifiable natural person,” known as the data subject. By an identifiable person, they mean anyone “who can be identified, directly or indirectly, in particular by reference to an identification number or to one or more factors.”

While obvious identifiers such as phone numbers, addresses, and account numbers are encompassed by this, the definition is flexible enough — anything that relates to the person — to also account for data not foreseen by the DPD's writers: for example, email and IP addresses, biometric information, and even facial images. The DPD attempted to create a future-proof definition rather than a static list of individual identifiers, which was unusual at the time.

Besides defining personal data, the DPD also introduces the important terms data controller and data processor that are used throughout the law.
A data controller is anyone who determines the "purposes and means of processing of the personal data." It's another way of saying the controller is the company or organization that makes the decisions about initially accepting data from the data subject.

A data processor is then anyone who processes data for the controller. The DPD specifically includes storage as a processing function, so that takes into account centralized databases owned by third parties.
Putting all this together, the DPD places rules on protecting personal data as it's collected by data controllers and passed to data processors. The DPD's requirements were specifically written to cover controllers; data processors, though, are obligated to protect personal data through legal contracts with the controller.

The key point is that the DPD recognises these two functions, and that organisations can use outside processors. The Directive anticipated the rise of external databases and, in a way, the cloud itself — though the cloud is far better addressed in the new GDPR.


With that as background, it's now easier to understand the more specific requirements of the Directive. The Directive is based on a foundation of seven principles (listed below), which are reflected in the Directive's article 6.

1. Fairness – Process data "fairly and lawfully."
2. Specific purpose – Ensure that data is processed and stored "for specified, explicit, and legitimate purposes and not further processed in a way incompatible with those purposes."
3. Restricted – Ensure that data is "adequate and relevant, and not excessive in relation to" the purposes for which it is collected.
4. Accurate – Ensure that data is "accurate and, where necessary, kept up-to-date," so that "every reasonable step [is] taken to ensure" errors are "erased or rectified."
5. Destroyed when obsolete – Maintain personal data "no longer than necessary" for the purposes for which the data were collected and processed.
6. Security – Process data with adequate "security" (a "controller must implement appropriate technical and organizational measures to protect personal data against…destruction or…loss, alteration, unauthorised disclosure or access").
7. Automated processing – "Decision[s]" from data processing cannot be "based solely on automated processing of data" that "evaluate[s] personal aspects."

These should look somewhat familiar, as they are related to Privacy by Design (PbD), and both are based on older ideas from the Organisation for Economic Cooperation and Development (OECD) privacy guidelines. In any case, the GDPR retains these principles and further extends and expands on them.
These principles are the basis for specific DPD articles. Let's look at three very significant ones.

Data subjects are given "the right to obtain from the controller…as appropriate the rectification, erasure or blocking of data the processing of which does not comply with the provisions of this Directive, in particular because of the incomplete or inaccurate nature of the data."

So, under the DPD, consumers really do have a right to erase (and correct) data — though the rule only applies to controllers. Over the years, additional court rulings extended the erasure rules to processors and, more specifically, to cloud search engines.

The DPD puts additional obligations on the controller by requiring that personal data is “adequate, relevant and not excessive in relation to the purposes for which they are collected” and then erased when the data is no longer necessary.

While securing data should be an essential part of a law that starts with the words “data protection”, the DPD was still vague on this subject.
The DPD acts as a kind of template: EU countries are supposed to "transpose" its rules into specific national legislation, and a country's local data protection authority (DPA) then enforces the law. This opened the door to diverging interpretations and enforcement patterns, depending on where the data controller was located.

ADAPTING THE DPD

Until recently, data security regulations were 'back-page' news. Certainly, when the DPD was first introduced in the late 1990s, it was mostly of interest to attorneys and compliance officers, and only secondarily to IT executives. Enter the Internet, the data storage revolution, and ubiquitous consumer devices.

The result? An exponential year-to-year growth in the amount of information being stored and accessible from consumer devices.
The first problem the DPD structure had in dealing with these changes was that each country had some leeway in interpreting how the core rules would apply.
For example, the Internet era gave birth to a whole new source of electronic identifiers: email and IP addresses, online handles, and biometric data. But are they personal data? In many EU countries, these basic electronic identifiers were protected, but some DPAs did not consider them personal data.

Other differences emerged regarding data transfers, making a few countries far more desirable locations for company headquarters and data centres.
Multinationals could soon cherry-pick where they located their headquarters, essentially subverting the DPD's goal of providing a uniform data security law.
With the rise of the cloud and massive amounts of processing and storage available on-demand, questions also came up about its legal status. Recall that the DPD is focused on data controllers.
Is the cloud a data controller or processor?
In 2012, the EU's Article 29 Working Party, responsible for advising on DPD issues, provided guidance: companies that use the cloud are controllers, since they direct how the cloud provider should handle the data. The cloud provider, therefore, is a processor.

Now everything else falls into place. As a processor, the cloud service must have a contract in place with the controller per the DPD.
The Working Party added that cloud customers should not accept boilerplate contracts from the cloud provider. Instead, contracts between the parties should include certain minimal DPD data security and right-of-access clauses — for example, a request by the controller to delete consumer data had to be honoured by the cloud provider.
But again, individual DPAs were free to interpret and come up with their own contract terms.
Further issues involved search engine providers, who, as cloud-based data processors, would also be required to delete data on demand — in their case, search results. Only very recently was this resolved, after a lengthy court process.

Per the EU Court of Justice, there is effectively a “right to be forgotten” in the current DPD. Interestingly, this right has an extraterritorial nature — personal data of EU citizens can be deleted even when the data processor is not located in an EU country.
Of course, it would have been far more straightforward if the DPD had more explicit language on data processors and erasure rights, and the member countries had less leeway to interpret the rules. This would all soon change.

NEW EU GENERAL DATA PROTECTION REGULATION

Realising the old data security law had to be revamped, the EU Commission started the process of creating new legislation in 2012. The primary goal was a single law covering all EU countries and a "one-stop shop" approach to enforcement through a single data authority. The result is the General Data Protection Regulation, which will go into effect in 2018.

The GDPR is not a complete rewrite of the DPD; instead, it enhances the existing Directive. However, there are some new requirements, most significantly for breach notification and more extensive documentation.

WHAT TO KEEP?

Let’s first take up what’s been revised and clarified by the GDPR.
First, the GDPR keeps the DPD’s definition of personal data. However, there’s additional language that takes into account what’s known as quasi-identifiers. These are one or more pieces of information — location and online identifiers are specifically mentioned — that with additional external data references can be used to pinpoint a person.
The GDPR puts in place more specific obligations on processors, and therefore the cloud, effectively saying that the cloud provider must protect the security of data given to it by the controller. The GDPR also adds the ability of a consumer to directly sue a processor for damages — under the DPD, only the controller could be held liable.
Article 5 (principles related to personal data processing) essentially echoes the DPD article 6's minimization requirements: personal data must be "adequate, relevant, and not excessive in relation to the purposes for which they are processed". There's language that says personal data can't be kept longer than necessary for the original reason it was collected. It also says the data controller is ultimately responsible for the security and processing of the data.

Article 25 (data protection by design and by default) further enshrines PbD ideas. The article is more explicit about data retention limits and minimization, in that you must set limits on data (duration, access) by default, and it gives the Commission the power to lay down more specific technical regulations at a later time.
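As an illustration of what "by default" could mean in practice, here is a sketch of collection settings where the restrictive choice is always the starting point. The field names and values are our own assumptions, not anything prescribed by Article 25:

```python
from dataclasses import dataclass

@dataclass
class CollectionPolicy:
    """Privacy-by-default settings: the restrictive value is the default."""
    retention_days: int = 90                  # data expires unless a longer period is justified
    fields_collected: tuple = ("email",)      # collect the minimum needed for the purpose
    shared_with_third_parties: bool = False   # sharing is opt-in, never assumed
    accessible_roles: tuple = ("support",)    # narrow access by default

# Widening any setting becomes a deliberate, documentable decision:
marketing = CollectionPolicy(retention_days=365,
                             fields_collected=("email", "postcode"))
```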

WHAT TO ADD?

Article 30 (records of processing activities) adds new requirements for controllers and processors to document their operations. Most importantly, there are now rules for categorizing the types of data collected by controllers, recording the recipients to which the data is disclosed, and indicating the time limits before the personal data is erased.
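In practice, a record of processing activities can be captured as one structured entry per activity. The sketch below loosely follows the items listed above; the field names and the example entry are our own illustration, not statutory wording:

```python
from dataclasses import dataclass

@dataclass
class ProcessingRecord:
    """One Article 30-style record of a processing activity (illustrative)."""
    activity: str              # e.g. "newsletter signup"
    purpose: str               # why the data is processed
    data_categories: list      # types of personal data involved
    recipients: list           # to whom the data is disclosed
    retention_limit_days: int  # time limit before erasure
    security_measures: str     # general description of safeguards

newsletter = ProcessingRecord(
    activity="newsletter signup",
    purpose="sending product updates with consent",
    data_categories=["email address", "name"],
    recipients=["email delivery provider"],
    retention_limit_days=730,
    security_measures="encryption at rest, role-based access",
)
```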
Article 35 calls for data protection impact assessments (DPIAs) before the controller initiates new services or products involving the data subject’s health, economic situation, location, and personal preferences — and more specifically data related to race, sex life, and infectious diseases. The DPIAs are meant to protect the data subject’s privacy by, among other restrictions, forcing the controller to describe what security measures will be put in place.

The new breach notification rule has probably received the most attention in the media. Prior to the GDPR, only telecom and ISP service providers had to report breaches, within 24 hours, under the e-Privacy Directive.
Modelled on this earlier Directive, the GDPR's article 33 says that controllers must tell the supervisory authority the nature of the breach, the categories of data and number of data subjects affected, and the measures taken to mitigate the breach.
Controllers are required to notify the appropriate supervisory authority of a personal data breach within 72 hours (at the latest) of learning about the exposure, if it results in a risk to consumers. Even if the exposure is not serious enough to require notification, the company must still record it internally.
Per the GDPR, accidental or unlawful destruction, loss, alteration, unauthorised disclosure of, or access to personal data is considered a breach.
Article 33 also requires that a data processor has to notify the controller of a breach upon discovery.

Article 34 adds that data subjects must also be told about the breach, but only if it results in a high risk to their "rights and freedoms". If a company has encrypted the data or taken other security measures that render the data unreadable, it won't have to inform the subjects.
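Pulling articles 33 and 34 together, the notification duties can be sketched roughly as follows. The risk categories, parameter names, and thresholds are our simplification of the rules, not statutory text:

```python
from datetime import datetime, timedelta

SUPERVISOR_DEADLINE = timedelta(hours=72)  # article 33: notify within 72 hours

def notification_duties(discovered_at, risk_to_subjects, data_unreadable):
    """Return which notifications a controller owes for a breach (sketch).

    risk_to_subjects: "none", "risk", or "high"
    data_unreadable:  True if encryption etc. renders the data unintelligible
    """
    duties = ["record internally"]  # always keep an internal record
    if risk_to_subjects != "none":
        deadline = discovered_at + SUPERVISOR_DEADLINE
        duties.append(f"notify supervisory authority by {deadline:%Y-%m-%d %H:%M}")
    # Article 34: tell the data subjects only on high risk, and not if the
    # data was rendered unreadable (e.g. properly encrypted).
    if risk_to_subjects == "high" and not data_unreadable:
        duties.append("notify affected data subjects")
    return duties

print(notification_duties(datetime(2018, 6, 1, 9, 0), "high", data_unreadable=False))
```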
Article 17 (right to erasure and right to be forgotten) strengthens the DPD's existing rules on erasure and then adds the controversial right to be forgotten. The article explicitly states that if the data subject withdraws consent, the personal data should be erased without delay.

There's now also language that forces the controller, "taking account of available technology and the cost of implementation", to take reasonable steps to inform third parties of a request to delete personal data that has been made public. This is the new right to be forgotten. It means that a social media service that publishes a subscriber's personal data to the Web would have to try to remove not only the initial information, but also contact other web sites that may have copied it. This would not be an easy process! And by the way, it won't apply if the processing is necessary for "freedom of expression".

Finally, a requirement that has received less attention but has important implications is the new principle of extraterritoriality described in Article 3 (territorial scope). It says that even if a company doesn't have a physical presence in the EU but collects data about EU data subjects — for example, through a web site — then all the requirements of the GDPR apply.
This is a very controversial idea especially in terms of how it might be enforced. As we pointed out earlier, it has already been applied in the more limited case of search engine processors under the existing DPD.
CONCLUSIONS AND RECOMMENDATIONS: GDPR COMPLIANCE CONSIDERATIONS
Going into the final negotiations that began in 2015, the parties — the EU Council, Parliament, and Commission — still had differences on some key issues. These included the GDPR fine structure, data protection officers (DPOs), and breach notification reporting. We've already covered the breach rules, so let's take up the other two.

The GDPR has a tiered fine structure. Article 83 (general conditions for imposing administrative fines) says that a company can be fined up to 2% of global revenue for not having its records in order (article 30), not notifying the supervising authority and data subject about a breach (articles 33, 34), or not conducting impact assessments (article 35).

More serious infringements merit fines of up to 4% of global revenue. This tier includes violations of basic principles related to data security (article 5) and of the conditions for consumer consent (article 7) — essentially violations of the core Privacy by Design concepts of the law.
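In rough arithmetic terms, the two tiers cap out as below. This is a simplification: the final text pairs the 2% tier with a fixed €10 million maximum and the 4% tier with €20 million, whichever is higher, and actual fines are set case by case by the supervisory authority:

```python
def max_fine_eur(global_revenue_eur, serious=False):
    """Upper bound of an Article 83 fine: a percentage of global revenue
    or a fixed amount, whichever is higher (simplified illustration)."""
    if serious:   # e.g. basic principles or consent (articles 5, 7)
        return max(0.04 * global_revenue_eur, 20_000_000)
    else:         # e.g. records, breach notification, DPIAs (articles 30, 33-35)
        return max(0.02 * global_revenue_eur, 10_000_000)

# A multinational with EUR 5bn in global revenue:
print(max_fine_eur(5_000_000_000, serious=True))   # 200000000.0
print(max_fine_eur(5_000_000_000, serious=False))  # 100000000.0
```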
Since the EU GDPR rules apply to both data controllers and processors (that is, "the cloud"), the huge cloud providers are not off the hook when it comes to GDPR fines.
Coming into the negotiations, there were also differences over whether companies had to appoint a data protection officer who would be responsible for advising on and monitoring GDPR compliance, as well as representing the company when contacting the supervising authority.
With the final GDPR, many companies will likely need a data protection officer, or DPO (article 37). If the core activities of a company involve "regular and systematic monitoring of data subjects on a large scale", or large-scale processing of "special categories" of data — racial or ethnic origin, political opinions, religious or philosophical beliefs, biometric data, health or sex life, or sexual orientation — then it is required to have a DPO.
In general, some room is carved out for micro, small, and medium-sized businesses in the GDPR. Most companies with fewer than 250 employees will likely not need to have a DPO, keep records, notify a supervising authority about a breach, or carry out a DPIA.
For EU companies, and their US and other foreign subsidiaries, currently under the existing DPD, the new GDPR will be viewed as an evolution of the existing regulations — although the breach notification, the new documentation requirements, and the steep fines will mean that they have to up their compliance game.
For companies, particularly US ones, caught in the extraterritoriality net, the GDPR will come as something of a shock. This is especially true for web-based services that are not regulated under existing US financial or medical data security laws.
For companies with existing IT data security standards in place — SANS 20, PCI DSS, ISO 27001, or NIST 800-53 — compliance with the EU's GDPR should be readily achievable.
Our overall recommendation is that any company affected by the new law should focus on the following points:
Data classification — Know where personal data is stored on your systems, especially in unstructured formats in documents, presentations, and spreadsheets. This is critical both for protecting the data and for following through on requests to correct and erase personal data (a starting sketch follows this list).
Metadata — With the GDPR's requirements for limiting data retention, you'll need basic information on when the data was collected and for what purpose. Personal data residing in IT systems should be periodically reviewed to see whether it still needs to be kept.
Governance — With data security by design and default now the law, companies should focus on data governance basics. For unstructured data, this should include understanding who is accessing personal data in the corporate file system, who should be authorized to access it, and limiting file permissions based on employees' actual roles, i.e., role-based access controls.
Monitoring — The breach notification requirement places a new burden on data controllers. Under the GDPR, the IT security mantra should be "always be monitoring". You'll need to spot unusual access patterns against files containing personal data, and promptly report an exposure to the local data authority (see the monitoring sketch below). Failure to do so can lead to enormous fines, particularly for multinationals with large global revenues.
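As a starting point for the classification and metadata points above, here is a minimal sketch that scans a file tree for common personal-data patterns and flags files past a retention limit. The patterns, the path, and the retention period are illustrative assumptions only:

```python
import os
import re
import time

# Illustrative patterns for common identifiers found in unstructured files.
PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "phone": re.compile(r"\b\+?\d[\d \-]{8,}\d\b"),
}
RETENTION_DAYS = 730  # hypothetical retention limit for this data category

def scan_tree(root):
    """Flag files that contain personal-data patterns or exceed retention."""
    findings = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            try:
                with open(path, errors="ignore") as fh:
                    text = fh.read()
            except OSError:
                continue  # unreadable file: skip, or log it in a real tool
            hits = [label for label, rx in PATTERNS.items() if rx.search(text)]
            age_days = (time.time() - os.path.getmtime(path)) / 86400
            if hits:
                findings.append((path, hits, age_days > RETENTION_DAYS))
    return findings

for path, hits, stale in scan_tree("/shares/finance"):  # hypothetical share
    print(f"{path}: {hits}{' — past retention limit' if stale else ''}")
```

A real deployment would need far more robust detection (file formats, false positives) and would track when and why data was collected, not just file modification times.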
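For the monitoring point, one very simple heuristic is to flag users whose access volume against personal-data files jumps well above their own historical baseline. The event format, threshold factor, and minimum-event cutoff below are our assumptions, not a prescribed method:

```python
from collections import Counter
from statistics import mean

def flag_unusual_access(daily_counts, today_counts, factor=5, min_events=20):
    """Flag users whose file-access count today far exceeds their baseline.

    daily_counts: {user: [events_per_day, ...]} historical access counts
    today_counts: {user: events_today} against files holding personal data
    """
    alerts = []
    for user, today in today_counts.items():
        history = daily_counts.get(user, [])
        baseline = mean(history) if history else 0
        if today >= min_events and today > factor * max(baseline, 1):
            alerts.append((user, today, baseline))
    return alerts  # feed these into the incident process behind article 33

history = {"alice": [12, 9, 15, 11], "bob": [3, 2, 4, 3]}
today = Counter({"alice": 14, "bob": 160})  # bob suddenly reads 160 files
print(flag_unusual_access(history, today))  # only bob is flagged
```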

For more detail, see the ICO's guide to the GDPR.
