Data protection laws, such as the EU’s General Data Protection Regulation (GDPR), establish a comprehensive framework of individual rights intended to give data subjects greater control over the processing of their personal data. Together, these data protection rights aim to empower individuals while restricting how organizations use and process that data.
The Promise of Control
In practice, however, the experience of control often falls short of expectations. This is not necessarily due to bad faith or unwillingness to comply, but to legal, operational, and technical limitations. Even where controllers act in full compliance, both the regulatory design of the GDPR and the architecture of modern data-processing systems shape and constrain the practical effectiveness of data subject rights, potentially leaving all parties dissatisfied. The result is a recurring gap between legally conferred rights and the technical realities within which those rights must function.
For example, the right to be forgotten (right to erasure) is frequently subject to statutory retention periods, contractual obligations, or legitimate interests that justify continued storage. Consequently, erasure may not result in the disappearance of data in the way data subjects intuitively expect. In addition to these substantive legal constraints, exercising the right typically requires identity verification to prevent fraud, often through the submission of additional identifying information. Paradoxically, deletion may therefore require further disclosure to the very organization from which erasure is sought, making disclosure a precondition for being forgotten. Moreover, the erasure request itself must typically be logged for compliance purposes. In seeking deletion, the data subject creates a new record, a record of having asked to disappear. The path to being forgotten thus ends in continued retention.
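The paradox above can be made concrete with a small sketch. Suppose a controller wants to log that an erasure request occurred (as compliance typically requires) while retaining as little identifying data as possible, for instance by storing only a salted hash of the requester's identity. The function and field names here are hypothetical, chosen for illustration; the point is that even this minimized entry is a new record created by the act of asking to be forgotten.

```python
import hashlib
import secrets
from datetime import datetime, timezone

def log_erasure_request(requester_email: str, ledger: list) -> str:
    """Record that an erasure request was made, while minimizing the
    personal data retained in the compliance log itself."""
    # A random salt prevents trivial reversal of the hash via a
    # dictionary attack on known email addresses.
    salt = secrets.token_hex(16)
    pseudonym = hashlib.sha256((salt + requester_email).encode()).hexdigest()
    # Note: even this pseudonymized entry is a new record that exists
    # only because the data subject asked to disappear.
    ledger.append({
        "event": "erasure_requested",
        "requester": pseudonym,  # pseudonymized reference, not the raw identity
        "timestamp": datetime.now(timezone.utc).isoformat(),
    })
    return pseudonym

ledger: list = []
log_erasure_request("alice@example.com", ledger)
```

Even under this data-minimizing design, the log entry persists after the underlying data is erased, illustrating why the path to being forgotten ends in continued retention.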
Control, in this sense, operates through intermediaries rather than directly. Comparable substantive and procedural limitations likewise affect the exercise of other data subject rights. As a result, formal compliance with the GDPR’s requirements does not necessarily translate into meaningful control from the data subject’s perspective, revealing a structural misalignment between the GDPR’s normative commitment to control and its practical operation.
Control as Framed by the GDPR
Recital 7 of the GDPR, titled “Framework Based on Control and Certainty,” states that “natural persons should have control of their own personal data” and that “legal and practical certainty for natural persons, economic operators and public authorities should be enhanced.” While the recital emphasizes both individual control and organizational certainty, it does not define the normative content or practical scope of “control”. By linking individual control with legal certainty, the recital situates control as one objective within a broader regulatory framework rather than as an independently operational concept.
This structural understanding is also reinforced by Recital 11 of the GDPR, which provides that effective protection of personal data throughout the EU requires strengthened data subject rights, clearly defined obligations for those who process personal data, and equivalent powers of monitoring and sanction across the Member States, thus placing individual rights within an enforcement and accountability structure.
Article 25 of the GDPR translates this objective into a design obligation requiring controllers to implement technical and organizational measures that embed data protection principles into processing. However, it does not specify the conditions required for its effective realization. Responsibility for system architecture remains with controllers, reflecting the GDPR’s decision to regulate those who determine the purposes and means of processing.
The wording of Recitals 7 and 11 and of Article 25 makes clear that the GDPR was not designed to grant individuals technical authority over data systems, but to regulate the conduct of controllers. Control is therefore exercised through legally enforceable rights directed against controllers, embedded in an accountability framework supported by documentation requirements, supervisory oversight, and sanctions. This framework governs controller conduct and requires the embedding of data protection principles without redistributing technical authority. It reflects the GDPR’s regulatory model, which assumes relatively centralized and identifiable controllers capable of being supervised and held accountable by supervisory authorities.
Within this structure, privacy rights constrain organizational conduct but do not transfer technical authority to data subjects. Verification and enforcement remain largely dependent on controller compliance and regulatory intervention. The GDPR therefore confers control in the form of legally enforceable claims against controllers, not as direct, user-enforceable powers over data-processing systems. As a result, data subjects must rely on controller disclosure and compliance, with limited capacity to independently verify outcomes.
Notwithstanding the GDPR’s express commitment to enhancing individual control, its rights, bound by substantive, procedural, and technical constraints, often fail to translate into meaningful practical autonomy. As a result, effective control is largely determined by system architecture, specifically by how technical capabilities to access, delete, retain, and verify personal data are distributed between controllers and data subjects. As data systems become more decentralized and interdependent, the divergence between legally conferred control and technically enabled control becomes more visible.
A closer examination of individual data subject rights illustrates how this architectural constraint operates in practice.
GDPR Control Rights in Practice
The Right of Access (Article 15 GDPR) allows data subjects to obtain confirmation of processing and access to their personal data. In practice, this results in disclosure without independent verification. The right provides no technical means to confirm that a controller’s response is complete or accurate. It cannot reveal undisclosed copies, internal inferences, or downstream sharing.
The Right to Rectification (Article 16 GDPR) allows data subjects to correct inaccurate personal data, but rectification does not alter underlying system architectures in which the information circulates. There is no technical guarantee that corrections are consistently implemented across backups, logs, or derived datasets.
The Right to Erasure (Article 17 GDPR) allows data subjects to request deletion of their personal data, but deletion constitutes a commitment to future non-use rather than technical destruction. The scope of this right is legally constrained, as controllers may be required to retain certain data due to statutory retention obligations, contractual duties, or other legal exceptions. Erasure is a legal state change, not a cryptographic event.
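The contrast between a legal state change and a cryptographic event can be illustrated with "crypto-shredding": if each record is encrypted under its own key, destroying the key renders the ciphertext permanently unreadable, even in copies the controller no longer tracks. The class below is a minimal sketch of this idea, using a one-time pad purely for illustration; it is not a production design, and the names are invented for this example.

```python
import secrets

class CryptoShredStore:
    """Per-record encryption where deleting the key makes the stored
    ciphertext permanently unreadable ("crypto-shredding")."""

    def __init__(self):
        self._ciphertexts = {}  # record_id -> bytes (may live on in backups)
        self._keys = {}         # record_id -> bytes (the single erasure point)

    def put(self, record_id: str, plaintext: bytes) -> None:
        # One-time pad: a fresh random key as long as the message,
        # XORed byte by byte (illustrative, not a practical cipher).
        key = secrets.token_bytes(len(plaintext))
        cipher = bytes(p ^ k for p, k in zip(plaintext, key))
        self._ciphertexts[record_id] = cipher
        self._keys[record_id] = key

    def get(self, record_id: str) -> bytes:
        key = self._keys[record_id]
        cipher = self._ciphertexts[record_id]
        return bytes(c ^ k for c, k in zip(cipher, key))

    def erase(self, record_id: str) -> None:
        # The ciphertext may persist in backups or logs, but without
        # the key it is unrecoverable.
        del self._keys[record_id]

store = CryptoShredStore()
store.put("subject-42", b"name=Alice")
store.erase("subject-42")
```

After `erase`, the ciphertext still exists in storage, yet the data is effectively gone: erasure here is enforced by the architecture rather than promised by policy. Whether such designs satisfy Article 17 in a given context remains a legal question, but they show that deletion can be brought closer to a technical event.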
The Right to Restriction of Processing (Article 18 GDPR) allows data subjects to limit certain uses of their data, but restriction operates as policy rather than as a technical barrier. The effectiveness of restriction therefore depends on organizational compliance, not architectural constraint.
The Right to Data Portability (Article 20 GDPR) allows data subjects to receive their data in a structured, commonly used, and machine-readable format, but copying data does not transfer control. Portability duplicates data. It does not reallocate authority.
The Right to Object (Article 21 GDPR) allows data subjects to oppose certain forms of processing, but legal objection does not provide a technical stop. Objection is a legal instruction, not a system-level interruption. Its force derives from enforceability, not from embedded technical safeguards.
Rights Related to Automated Decision-Making (Article 22 GDPR) provide data subjects with protections such as the right to human intervention in certain cases. These safeguards regulate outcomes rather than system design and leave the underlying technical architecture largely intact.
Rights Without Technical Authority
The organizational and accountability-based model of control embedded in the GDPR does not fully reflect the operational logic of modern data systems. While the GDPR defines control through legal obligations, duties of transparency, and supervisory oversight, the practical capacity to access, retain, or restrict data is shaped by the configuration of data-processing systems. This structural distinction explains why formal compliance does not necessarily amount to meaningful control in practice. Recognizing this gap is essential to understanding the limits of data protection frameworks centered on individual rights.
The tension becomes particularly visible in complex and highly distributed processing environments. Data is routinely replicated across services, jurisdictions, and vendors, complicating efforts to maintain comprehensive oversight of access and use. In such systems, rights that depend on accurate disclosure and faithful enforcement by the controller cannot reliably guarantee outcomes.
At the same time, questions of control are increasingly shaped by how data-processing systems are configured in practice. As we have discussed in previous blogs, certain technical measures, such as end-to-end encryption or user-managed credentials, show that access restrictions can, in some contexts, be enforced directly through system design rather than relying exclusively on legal obligations or subsequent oversight. Where such arrangements are feasible, describing “user control” only in procedural or regulatory terms appears incomplete, as it maintains an emphasis on centralized accountability even where authority over data can be embedded directly in system architecture. In this sense, defining control primarily as a right exercised against controllers does not capture how authority over data may be allocated by design.
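A minimal sketch can illustrate how access restrictions are enforced by design rather than by policy: in a client-side encryption arrangement, the user's key never leaves their device, so the service provider stores only ciphertext it cannot read. The keystream construction below is a toy built from a hash function for the sake of a self-contained example; it is not a vetted cipher, lacks authentication, and all names are invented for illustration.

```python
import hashlib
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a pseudorandom keystream from a user-held key
    (illustrative counter-mode construction, not a vetted cipher)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

class Server:
    """The service provider: stores blobs it cannot read."""
    def __init__(self):
        self.blobs = {}
    def upload(self, blob_id: str, blob) -> None:
        self.blobs[blob_id] = blob
    def download(self, blob_id: str):
        return self.blobs[blob_id]

# Client side: the key is generated and kept on the user's device.
user_key = secrets.token_bytes(32)
nonce = secrets.token_bytes(16)
plaintext = b"medical record"
cipher = bytes(p ^ k for p, k in
               zip(plaintext, keystream(user_key, nonce, len(plaintext))))

server = Server()
server.upload("doc1", (nonce, cipher))

# Only the key holder can recover the plaintext from what the server stores.
n, c = server.download("doc1")
recovered = bytes(b ^ k for b, k in zip(c, keystream(user_key, n, len(c))))
```

Here the server's inability to read the data is not a promise subject to oversight but a property of the system: access control is allocated by architecture, which is the sense of "control by design" discussed above.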
From Legal Control to Technical Control
The limitations described above do not imply that GDPR rights are ineffective or unnecessary. Rather, they show that legal control and technical control address different problems. GDPR rights remain essential for accountability, redress, and organizational oversight, but they cannot, on their own, provide verifiable control in complex, distributed systems.
If privacy is to be more than a matter of policy compliance, it must be reflected in system design.
This discussion reflects a consistent position in Least Authority’s work: Legal and policy frameworks can create obligations, but system architecture ultimately determines what can be enforced and verified. Privacy is not simply declared through regulation, but provided by technical design. Viewing GDPR “control” rights through this lens helps explain why procedural compliance may coexist with persistent dissatisfaction, and why technical design plays a central role in making privacy real, not merely promised.
Written by: Dr. Dorothee Landgraf