“In an almost direct clash of intentions, the GDPR has effectively banned the use of blockchain technology in Europe because of its immutable nature.” – Forbes
The current perception is that blockchain is not compatible with GDPR. We disagree, and explain why none of the concerns identified by legal scholars and industry stakeholders is likely to pose real problems for blockchain applications. Our conclusion is that GDPR is well able to regulate this new technology.
The current conception amongst industry stakeholders is that blockchain (BC) is not compatible with GDPR, resulting in a call for urgent revision right after GDPR came into force. The concerns are fed by public statements of Jan-Philipp Albrecht (the MEP responsible for coordinating the Parliament’s input for GDPR) that BC “probably cannot be used for the processing of personal data,” and by the CNIL cautioning in draft guidelines that public BC may not be the most appropriate technology for the processing of personal data and that priority should be given to other processing solutions that can achieve the same purpose. As the same results can currently always be achieved with another solution, this is a difficult standard to meet. The above conception is reinforced by recent in-depth publications on the data protection aspects of BC, which indeed paint a grim picture: the characteristics of public BC would be “on a collision course” and “profoundly incompatible at a conceptual level” with GDPR.
What are the issues?
GDPR requires identification of a central ‘controller’ who is responsible for compliance with GDPR, while a public BC decentralizes the storage and processing of personal data, as a result of which there is no such central point of control. For lack of a better alternative, the authors conclude that all ‘nodes’ involved in operating a BC qualify as controllers under GDPR, raising enforcement and jurisdictional issues that make it impossible for individuals to enforce their rights. The transparency and immutability of a public BC would further not sit well with the principles of data confidentiality, data minimization, and data accuracy, or with the rights of individuals to rectification and erasure of their data.
A different perspective
We disagree with this analysis. The main reason is that the authors focus on the shortcomings of the initial public (Bitcoin) BC, while many new types of permissioned private and consortium BC have already been developed that significantly diverge from the original, permissionless public BC. In fact, these types of permissioned BC have been developed exactly in response to the shortcomings of public BC. The authors further consider the data processing implications of BC as if this technology itself constitutes a data processing activity for which a controller has to be identified. Controllership is, however, decided based on a specific use or deployment of a certain technology. BC, like the internet, is a general purpose technology (GPT) that is subsequently deployed by actors for a certain purpose in a specific context. We will explain below why none of the identified issues has hampered application of the GDPR to the internet, and why they are equally unlikely to pose problems for BC applications.
The conclusion is that GDPR is well able to regulate this new technology. This does not, however, mean that BC will thus be suitable for all use and deployment cases.
Intermediaries will not become obsolete
We consider it highly unlikely that BC will make intermediaries obsolete; rather, new intermediaries will replace the current ones. The BC revolution is well described by the World Economic Forum (2017 report), indicating that where the last decades brought us the internet of information, we are now witnessing the rise of the internet of value, whereby we can send money and soon any form of digitized value – from stocks and bonds to intellectual property – directly and safely between us.
As BC is about value (rather than just ‘information’), and therefore about whether someone does or does not own money, stocks, or houses (as evidenced by the BC), the participants will insist that their stakes be safeguarded before the BC will be trusted. The prediction, therefore, is that, whenever BC applications are built for evidence and transfer of value, there will always be a set of governance rules reflecting the terms agreed by the participants of the ecosystem to regulate their relationship. The first examples indeed show new entities being set up, mostly as consortia (often including or funded by incumbents, such as banks), which are in charge of the governance of the BC platform, as well as separate entities operating a BC application on top of the BC platform for specific ecosystems. These BC are permissioned, in the sense that they implement membership rules that determine which parties have read or read/write authorization. To avoid jurisdictional and enforcement disputes, these rules will specify who the responsible entity is, as well as a choice of law and forum. The jurisdiction and enforcement issues raised by the authors, therefore, are likely not a realistic reflection of how these issues will be encountered in practice. The controller issue is solved because, in any event, this central entity deciding on the purposes and means of the BC platform will qualify as the controller under GDPR. The entities operating the BC application on top of the BC platform will also qualify as controllers in their own right (potentially jointly with the controller of the BC platform).
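The membership rules described above can be illustrated with a minimal sketch. The following Python model is our own illustration, not the API of any particular BC platform: a governing entity (e.g. the consortium) admits parties with either read or read/write authorization, and the ledger enforces those rules on every access.

```python
from enum import Enum, auto


class Permission(Enum):
    READ = auto()
    READ_WRITE = auto()


class PermissionedLedger:
    """Toy permissioned ledger: membership rules decide who may read or write."""

    def __init__(self) -> None:
        self._members: dict[str, Permission] = {}
        self._entries: list[str] = []

    def admit(self, party: str, permission: Permission) -> None:
        # The governing entity (e.g. the consortium) decides who joins
        # and with which authorization level.
        self._members[party] = permission

    def write(self, party: str, entry: str) -> None:
        if self._members.get(party) is not Permission.READ_WRITE:
            raise PermissionError(f"{party} has no write authorization")
        self._entries.append(entry)

    def read(self, party: str) -> list[str]:
        if party not in self._members:
            raise PermissionError(f"{party} is not a member")
        return list(self._entries)


ledger = PermissionedLedger()
ledger.admit("BankA", Permission.READ_WRITE)
ledger.admit("Auditor", Permission.READ)
ledger.write("BankA", "transfer: 100 EUR from X to Y")
print(ledger.read("Auditor"))  # the auditor may read, but any write attempt fails
```

The point of the sketch is simply that, unlike a permissionless public BC, access is gated by rules set by an identifiable governing entity – the same entity that, under GDPR, decides on purposes and means and thus qualifies as controller.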
We here recall that early predictions in respect of the internet foresaw similar enforcement and jurisdictional issues. Every encounter of consumers in cyberspace would raise the possibility that diverse laws would apply and multiple courts would have jurisdiction, and a myriad of court cases was predicted. Contrary to these early expectations, there have been only isolated court cases dealing with online cross-border consumer disputes. One explanation is that stakeholders quickly found practical work-arounds in the form of contractual self-regulatory systems. Examples are the use of credit cards for online payments, which bring their own dispute resolution system, and the emergence of large intermediaries like eBay, which was at first regulated only by the ratings and reviews consumers could post, but later introduced full-fledged dispute resolution. Here too, the old intermediaries (retailers) were replaced by new intermediaries, again generating the trust required to do business. In fact, it is fair to say that there is very little happening on the internet that is not governed by some form of contract. The use of websites is governed by their website terms of use, online purchases are governed by purchase terms, access to the internet is governed by the terms and conditions of ISPs, app stores have their own Terms & Conditions (“T&Cs”), search functionality is governed by the T&Cs of the provider of the search engine, etc. As happened with the internet, it is a justified expectation that the stakeholders involved in BC will implement their own contractual self-regulatory mechanisms to ensure adequate dispute resolution.
GDPR applies to the use of a technology, not the technology itself
The authors try to determine controllership in respect of BC technology at large, which would indeed raise the identified issues. Controllership is, however, decided based on a specific use or deployment of a certain technology. BC, like the internet, is a GPT that is subsequently deployed by actors for a certain purpose in a specific context. None of the issues raised by the authors have hampered the development of the internet, for the simple reason that controllership is not decided based on the technical level of operation of the relevant technology, but is based on who deploys this technology for a certain purpose. For example, a website owner uses the internet to offer its website. It is the website owner who qualifies as the controller in respect of the processing of any personal data via the website and not the operator of the technical infrastructure.
GDPR does not impose requirements on designers of technology
GDPR includes an obligation for the controller to set up its data processing operations on the basis of privacy-by-design (Article 25 GDPR). GDPR does not impose this requirement on the providers of the software and infrastructure used to process personal data. As a consequence, individual controllers need to expressly instruct each of their technology suppliers to provide software and infrastructure that incorporate privacy-by-design in order to meet their own controller obligations.
Although this indirect manner of regulating seems inefficient, the reality is that technology developers often cannot foresee all possible deployments of their technology, and therefore cannot implement all requirements into their product from the outset. Design issues often become apparent, and are addressed, only in the feedback loop from users, customers, or society at large once the technology is deployed in practice. Too-strict upfront design requirements (in the form of standards) may hamper innovation, and may even lead to “widespread adoption of inferior technology” (as explained in this report of the World Economic Forum). In the words of Brian Behlendorf (Executive Director of Hyperledger at the Linux Foundation):
“The space is still so young that the desire for standards, while well-placed, runs the risk of hardening projects that have just come out of the lab,” and “we need to avoid making serious architectural decisions that first become legacy and then become a hindrance.”
GDPR is, just like its predecessor, technology agnostic (see Recital 15), in the sense that it provides for general data protection principles and requirements but does not prescribe any technology or technical manner in which these principles and requirements should be implemented. As BC is an emerging technology still in its infancy, GDPR works exactly as intended, challenging developers to find creative ways to develop the technology such that the impact on individuals’ privacy is mitigated and the basic principles of GDPR can be complied with. That this may take several development cycles is fully understood. The authors’ conclusion that GDPR is thus unable to embrace this new technology misses the point that GDPR is intended to provide guidance on how to develop new technology in the first place. In Part 2 of this series, we will discuss how the transparency and immutability issues raised by BC can be addressed by implementing innovative privacy-by-design measures.
Read Part 2 of this post here.
This blog is a summary version of a full publication by Lokke Moerel, published in European Review of Private Law 6-2019, pp. 825–852, and in The Cambridge Handbook of Smart Contracts, Blockchain Technology and Digital Platforms (September 2019).