The General Data Protection Regulation (GDPR) is a set of rules that dictate how organizations should collect, store and dispose of the personal data of EU residents. It will supersede the Data Protection Directive (1995) when it goes into effect on May 25, 2018.
The high fines that may be imposed on noncompliant organizations add urgency to the regulations – the maximum fine being €20,000,000, or “4% of the total worldwide annual turnover of the preceding financial year, whichever is higher” (GDPR, Article 83(5)).
Since the regulation was finalized in April 2016, there has been an explosion of content produced explaining what it is, its impact on different industries and roles, and how to prepare for its arrival. So in this article, we will not focus on what GDPR is (if you’d like to learn more, the Wikipedia article is a great place to start).
The focus instead will be on the importance of documenting your organization’s data processes and architecture with a GDPR documentation tool. Accurate documentation of the systems and processes handling user data will help you identify areas where you may or may not be compliant with GDPR, and in the future, serve as a reference in case of audit.
The new privacy regulations will apply to any company that processes EU user data, even if that processing or data storage occurs outside of the EU, as explained by Andy Green of data security company Varonis:
“Under the old rules in the Data Protection Directive (DPD), there was some wiggle room that allowed data collectors to escape having to follow the regulations. A common practice was for service or app providers to keep their data processing outside the EU.
The idea was that if the main processing and servers weren’t located in the EU zone, then the rules didn’t apply.
Companies such as Google, Facebook, and other social networking companies were following this approach.”
Under the new regulations, however, any company, regardless of where it or its servers are physically located, will be liable for user data that originates in the EU. This significantly broadens the scope of affected organizations.
Moving towards GDPR audit readiness requires a thorough understanding of your organization’s systems and processes that collect, process, transfer or store personal data – not an easy undertaking, especially for large organizations with complex technical landscapes. Having comprehensive, accurate documentation of your systems will enable you to reduce time spent searching for answers, give you a more holistic overview of the entire organization, and provide a clear auditing trail.
Here are three scenarios in which up-to-date, comprehensive documentation can make GDPR audit readiness significantly easier:
Data minimization – the practice of limiting personal data collected to the bare minimum required for the purpose – is explicitly required in the regulation, as seen in Article 5(1)(c):
“The personal data shall be adequate, relevant and limited to what is necessary in relation to the purposes for which they are processed (‘data minimisation’).”
To determine exactly what personal data is required for a purpose, it’s helpful to look at the higher-level context. For example, if a web application requests a user’s physical address during the payment process, it’s important to look at which systems that address is transferred to and which business requirements it satisfies (both to determine whether the information is truly necessary and, if so, to have documentation demonstrating that in case of a data audit).
Ultimately, data controllers are responsible for justifying why each piece of personal data is collected. As you document all the personal data that you currently collect with a GDPR tool, anything that cannot be tied directly to an explicit business requirement should be considered a candidate for removal.
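The bookkeeping behind that review can be as simple as a mapping from each collected field to the business requirement that justifies it. A minimal sketch (the field names and requirements below are hypothetical examples, not taken from any real system):

```python
# Map each piece of personal data collected to the business
# requirement that justifies it (all names are hypothetical).
collected_fields = {
    "email": "account login and order confirmations",
    "physical_address": "shipping of physical goods",
    "date_of_birth": None,   # no documented requirement
    "phone_number": None,    # no documented requirement
}

# Fields with no documented business requirement are candidates
# for removal under the data-minimization principle.
removal_candidates = [
    field for field, requirement in collected_fields.items()
    if requirement is None
]
print(removal_candidates)  # → ['date_of_birth', 'phone_number']
```

Whether this lives in a script, a spreadsheet, or a documentation tool matters less than keeping the mapping complete and current.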
The responsibilities and liabilities of data controllers and processors are significantly expanded under GDPR. One new responsibility is the requirement that data controllers must “be able to demonstrate…compliance with the [principles relating to processing of personal data]” (Article 5(2)). Among those principles that must be demonstrated are data minimization and the secure processing of data.
No specific requirements are given as to how to demonstrate those principles; the regulation stipulates that it is the controller’s responsibility to “implement appropriate technical and organisational measures to ensure and to be able to demonstrate that processing is performed in accordance with [the GDPR]” (Article 24(1)). So, the good and bad news is: you have total freedom to choose the correct tools and processes to demonstrate compliance.
The importance of systems documentation is obvious here – you cannot demonstrate something that you do not know. To be prepared for an audit, you should have a single source of truth that documents all systems and processes that touch personal data, as well as the business requirements that demonstrate the need for that data. It is equally important to create and maintain a culture in which that documentation stays updated, whether through manual effort, automation, or a combination of the two. A GDPR documentation tool like Ardoq can provide that source of truth to build a maintenance culture around.
GDPR gives broad rights to users who want their data deleted from an organization’s systems (the so-called “right to be forgotten”):
“The data subject shall have the right to obtain from the controller the erasure of personal data concerning him or her without undue delay and the controller shall have the obligation to erase personal data without undue delay”. (Article 17(1))
Deleting a user’s data can be straightforward, but in larger organizations that data may be spread across tens or even hundreds of systems. Having documentation of those systems and the data that flows between them simplifies the process of deleting all instances of a user’s data. And again, the documentation serves as a reference for your systems and data-deletion processes in case of an audit.
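Documented data flows naturally form a graph: an edge from system A to system B means personal data is transferred from A to B. Finding every system that may hold a copy of a user’s data then reduces to a traversal from the point of entry. A minimal sketch, with entirely hypothetical system names:

```python
from collections import deque

# Documented data flows between systems (all names hypothetical):
# an edge A -> B means personal data flows from A to B.
data_flows = {
    "web_app": ["crm", "billing"],
    "crm": ["email_marketing", "data_warehouse"],
    "billing": ["data_warehouse"],
    "email_marketing": [],
    "data_warehouse": ["analytics"],
    "analytics": [],
}

def systems_holding_user_data(entry_point):
    """Breadth-first traversal of the documented data flows,
    returning every system that may hold a copy of the data."""
    seen = {entry_point}
    queue = deque([entry_point])
    while queue:
        system = queue.popleft()
        for downstream in data_flows.get(system, []):
            if downstream not in seen:
                seen.add(downstream)
                queue.append(downstream)
    return sorted(seen)

print(systems_holding_user_data("web_app"))
```

With an accurate flow map, an erasure request becomes a checklist produced by this traversal rather than a hunt through institutional memory.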
GDPR introduces a strict timeline for disclosure of data breaches to Data Protection Authorities (DPAs):
“In the case of a personal data breach, the controller shall without undue delay and, where feasible, not later than 72 hours after having become aware of it, notify the personal data breach to the supervisory authority competent in accordance with Article 55, unless the personal data breach is unlikely to result in a risk to the rights and freedoms of natural persons. Where the notification to the supervisory authority is not made within 72 hours, it shall be accompanied by reasons for the delay.” (Article 33(1))
As international law firm White & Case notes, this timeframe will create new challenges for organizations:
“The GDPR’s 72 hour deadline for reporting data breaches to DPAs is likely to prove extremely challenging. In most cases, the amount that an organisation knows about the extent and causes of a data breach develops substantially in the first couple of weeks after the breach is discovered. It will be extremely difficult for organisations to ascertain whether or not a data breach poses a high risk (and therefore needs to be notified) within that timeframe.”
To increase your chances of meeting that 72-hour deadline, you’ll need to be able to quickly access documentation of the system that suffered the breach, then look at dependent systems that may also be affected. Having all that information up to date and in one place allows you to focus on the important work: uncovering exactly what happened to the data and how.
In order to gain a complete overview of your organization and be confident in the quality of your documentation, you need a GDPR tool that is easy to update, effective and flexible. To prevent documentation from getting out-of-date and fragmented across multiple tools, PowerPoint files and text documents, it should be collaborative and available across the organization.
That’s our vision for Ardoq. Ardoq is a platform for creating semantic documentation of systems and processes. Unlike traditional architecture documentation tools, it focuses on data first and collaboration across functional teams.
We designed Ardoq to be easy to use so that anyone in an organization can contribute, and flexible enough that users can model virtually any system, process, or other data. Our mission is to connect knowledge across the organization from business to IT. We believe that this provides the perfect foundation to support GDPR documentation and provide the knowledge necessary to be prepared for an audit.
In order to improve the quality of documentation we have taken two unique approaches:
Ardoq empowers everyone in your organization to document their own processes and systems, and gives them the ability to create custom models that reflect their reality – resulting in more up-to-date and accurate information.
Some documentation has to be done manually. For the rest, Ardoq offers a REST API that can be used to integrate with almost any system you may have. We also offer integrations with Excel, Maven, Swagger, JIRA and Docker so you can quickly get your data from those tools into Ardoq.
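As a rough illustration of what such an integration script might look like, here is a sketch that assembles a JSON payload describing one system and prepares an authenticated POST request. The URL, token, and payload fields below are assumptions for illustration only, not Ardoq’s actual API schema – consult the official API documentation for the real endpoints and fields:

```python
import json
import urllib.request

# Hypothetical endpoint and token -- placeholders, not real values.
API_URL = "https://app.example.com/api/component"
API_TOKEN = "your-api-token"

def build_payload(name, workspace_id, description):
    """Assemble a JSON-serializable payload describing one system
    (field names are assumed for illustration)."""
    return {
        "name": name,
        "rootWorkspace": workspace_id,
        "description": description,
    }

payload = build_payload(
    "billing-service",
    "ws-123",
    "Stores invoices containing customer addresses",
)

request = urllib.request.Request(
    API_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {API_TOKEN}",
        "Content-Type": "application/json",
    },
)
# urllib.request.urlopen(request)  # left commented: no live endpoint here
```

Run on a schedule (or from CI), a script like this keeps machine-readable system inventories in sync with the documentation instead of relying on manual updates.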
Another tedious activity that is automated in Ardoq is drawing diagrams – not only are they created automatically, they’re also updated whenever the underlying data changes. You can also visually explore your documentation by clicking through it, as seen in the embedded Ardoq presentation below.
Ardoq is a flexible tool and can be used for many different use cases within your organization. Read through some of the other use cases below, or start an Ardoq trial now.
An international telecom customer uses Ardoq as a central hub for their IT asset management documentation.
Fjordkraft uses Ardoq to document requirements and acceptance criteria, and to create and track JIRA issues related to them.
Gard uses Ardoq to map its application landscape and system integrations, capturing all interdependencies between systems in order to reduce risk during change processes.
Hafslund uses Ardoq to visually explore their data pipeline documentation, helping them connect key metrics to the data sources they come from.