EU Data Protection Laws Are Not Fit For Purpose: They Undermine the Very Autonomy of the Individuals They Set Out to Protect

The European Union is supposed to have the strongest data protection laws in the world. So why do privacy violations continue to make the headlines? I believe that the lack of material privacy compliance is due not to a lack of enforcement, but to a fundamental flaw in our European data protection laws. Our laws are supposed to ensure people’s autonomy by providing choices about how their data is collected and used. In a world driven by artificial intelligence, however, we can no longer understand what is happening to our data, and the concept of free choice is undermined by the very technology our laws aim to protect us against. The underlying logic of data-processing operations, and the purposes for which they are used, have become so complex that they can only be described by means of intricate privacy policies that are simply not comprehensible to the average citizen. Further, organizations meet information and consent requirements in ways that discourage individuals from expressing their true preferences, so people often feel forced to simply click “OK” to obtain access to services.

Our data protection laws have resulted in what Prof. Corien Prins and I have named mechanical proceduralism (read here), whereby organizations go through the mechanics of notice and consent without any reflection on whether the relevant use of data is legitimate in the first place. In other words, the current preoccupation with what is legal distracts us from asking what is legitimate to do with data. We even see this reflected in the highest EU court having to decide whether a pre-ticked box constitutes consent (surprise: it does not). Privacy legislation needs to regain its role of determining what is and what is not permissible. Instead of a legal system based on consent, we need to re-think the social contract for our digital society by having the difficult discussion about where the red lines for data use should lie, rather than shifting responsibility for a fair digital society onto individuals, asking them to make choices they cannot fully comprehend.

Read our client alert.