Why is the “Right to be Forgotten” still hard to implement?

POSTED ON APRIL 06, 2026 BY DATA SECURE

Introduction

The Right to be Forgotten, or Right to Erasure (RTBF), is one of the most widely discussed ideas in data protection law. At a glance it seems simple: if an individual no longer wants an organisation to keep their personal data, that organisation should delete it. The idea behind the right was to allow individuals to reclaim control over their personal data and remove outdated or harmful information from continued processing. However, its implementation is far more difficult than the phrase suggests. Article 17 of the General Data Protection Regulation (GDPR) grants a data subject the right to erasure of his or her personal data by the data controller without undue delay, but it does not create an unlimited right to demand deletion in every case.

The challenge for organisations lies in assessing whether a request meets the legal criteria, whether any exceptions apply, and whether the data can actually be located and erased across complex systems. This means the right to be forgotten is not merely a matter of regulatory compliance; it is also a technical and operational challenge. The problem lies in the fact that modern organisations are designed to store, duplicate, analyse, and preserve data, while the GDPR sometimes requires them to do the opposite: to ‘forget’.

What does the law say?

The GDPR sets out when a data subject is entitled to erasure; the right applies only when specific grounds are met. An individual may be entitled to erasure where the data is no longer necessary for the purpose for which it was collected, where consent is withdrawn and there is no other lawful basis for processing, where the person objects and there are no overriding legitimate grounds, where the processing is unlawful, or where erasure is required by law.

But the law does not treat this right as absolute. Article 17(3) of the GDPR lists exceptions, including where processing is necessary for freedom of expression and information, compliance with a legal obligation, public health purposes, archiving or research in the public interest, or the establishment, exercise or defence of legal claims. A right-to-be-forgotten request therefore does not result in automatic deletion of data; rather, it warrants a legal analysis.

Legal complexities behind RTBF requests


When an erasure request arrives, an organisation cannot simply press a delete button. It must determine whether one of the Article 17 grounds actually applies. Even if one does, the organisation must then decide whether one of the exceptions prevents erasure. A deletion request thus requires the organisation to make a ‘legal judgement’ call: the onus is on it to interpret context, apply exceptions, and justify its decision. For example, an employee may request the deletion of his data from the organisation; however, the employer may need to retain certain records to comply with labour or tax laws. In cases like these, the individual’s interest in exercising the right is real, but it does not automatically override the organisation’s obligations.

Search engine lawsuits and the public understanding of RTBF

The general understanding of RTBF was shaped less by the routine deletion of personal information than by search engine case law. Although these cases brought the right to the forefront, they also made it easy to misunderstand.

In the Google Spain case, it was held that a search engine operator can be treated as a data controller in relation to the personal data it indexes and displays in search results, especially where results appear following a search on a person’s name, and that individuals may ask search engines to remove links from name-based search results where the information is “inadequate, irrelevant or no longer relevant, or excessive” in light of the purposes of the processing and the time that has elapsed. This remedy is called delisting. Delisting is not ‘deletion’: the CNIL (Commission nationale de l'informatique et des libertés, the French DPA) explains the distinction clearly, stating that the content containing a person’s personal data remains available on the source website; the search engine simply makes it less visible in its results, without it ceasing to exist online.

The phrase “right to be forgotten” often suggests a right to remove information entirely, whereas in reality it concerns a narrower remedy about discoverability through search engines. This gap between public expectation and legal reality makes implementation more difficult from the outset, because organisations are often responding to a concept that individuals understand more broadly than the law supports.

Why are erasure requests difficult to execute in practice?


Even when erasure is legally required, the challenge remains significant, because personal data in an organisation is rarely maintained in a single centralised environment. Rather, it is distributed across customer databases, support systems, payment platforms, marketing tools, internal communications, analytics environments, cloud storage, and archived records. Regulatory guidance reflects this operational reality: the EDPB’s coordinated enforcement action (CEA, 2025) found recurring problems around internal procedures, data mapping, retention periods, backup deletion, and the application of exceptions.

Before data can be forgotten or erased, the organisation must identify the full range of systems in which it appears. This may be manageable in organisations with simple environments, but in large organisations with legacy infrastructure, multiple vendors, or fragmented governance, it can be materially difficult.

The second challenge is that digital systems are typically designed for resilience, continuity, and auditability. Data entering the organisation is therefore replicated, stored in logs, and preserved in backups and disaster recovery environments. These steps are sensible from an engineering and security perspective, but they complicate erasure: a deletion carried out in the live systems does not ensure that all copies have been removed from the wider technical environment.
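One engineering pattern sometimes used to address the backup problem is an erasure ledger (or “tombstone” list): deletions are recorded durably, so that whenever a backup is restored, previously erased subjects are filtered out again. The sketch below is a minimal illustration under assumed names; the class and identifiers are hypothetical, not a reference to any specific product.

```python
from datetime import datetime, timezone

class ErasureLedger:
    """Hypothetical ledger that records erasures so they survive backup restores."""

    def __init__(self):
        # subject_id -> UTC timestamp when erasure was carried out
        self._tombstones = {}

    def record_erasure(self, subject_id):
        """Remember that this subject's data must stay deleted."""
        self._tombstones[subject_id] = datetime.now(timezone.utc)

    def reapply_after_restore(self, records):
        """Drop any restored record belonging to an already-erased subject."""
        return [r for r in records if r["subject_id"] not in self._tombstones]

ledger = ErasureLedger()
ledger.record_erasure("user-123")

# A backup restore brings back an old copy of the erased subject's record.
restored = [{"subject_id": "user-123"}, {"subject_id": "user-456"}]
clean = ledger.reapply_after_restore(restored)  # only user-456 survives
```

The design choice here is that the ledger itself must be retained (and protected), which is why regulators accept that backup deletion may be deferred, provided the data is put “beyond use” until the backup cycles out.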

Why identifying data is more complex than it appears?

Individuals are rarely represented consistently across an organisation’s infrastructure. The same person may appear under an email address in one system, an account number in another, a device identifier in a third, and free-text references in support records or internal correspondence. Effective erasure requires all of these fragmented identifiers to be reconciled accurately. If relevant instances are missed, the deletion remains incomplete.
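The reconciliation step can be pictured as resolving one subject to all of their identifiers, then scanning each system for records keyed on the identifier that system uses. The sketch below is purely illustrative: the system inventory, identity graph, and field names are all assumptions for the example.

```python
# Hypothetical inventory: each system keys the same person differently.
SYSTEMS = {
    "crm":       {"key": "email",      "records": [{"email": "a@example.com", "name": "A"}]},
    "billing":   {"key": "account_id", "records": [{"account_id": "ACC-9", "plan": "pro"}]},
    "analytics": {"key": "device_id",  "records": [{"device_id": "dev-7", "events": 42}]},
}

# Assumed identity graph linking a person's fragmented identifiers together.
IDENTITY_GRAPH = {
    "a@example.com": {"email": "a@example.com", "account_id": "ACC-9", "device_id": "dev-7"},
}

def locate_personal_data(email):
    """Resolve all identifiers for a subject, then find every record holding them."""
    ids = IDENTITY_GRAPH.get(email, {"email": email})
    hits = []
    for system, cfg in SYSTEMS.items():
        wanted = ids.get(cfg["key"])
        for record in cfg["records"]:
            if wanted is not None and record.get(cfg["key"]) == wanted:
                hits.append((system, record))
    return hits

found = locate_personal_data("a@example.com")
```

If the identity graph is missing a link (say, the device identifier was never tied back to the account), the corresponding records are silently skipped, which is exactly how real-world erasures end up incomplete.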

The practice of erasure becomes more complicated when an organisation holds not only the data provided by the individual directly, but also data it has generated about that person in the course of its operations (e.g. profiles, scores, labels, and inferences) that can be linked back to a specific person. The Information Commissioner’s Office (ICO) makes this clear in its guidance, noting that inferred characteristics and scores can still be personal data when attached to identifiable individuals. So, when an organisation receives a right-to-be-forgotten request, it may be required to delete derived outputs such as behavioural segments, fraud indicators, or preference profiles. Not all such outputs must automatically be erased, especially if Article 17 exceptions apply, but organisations cannot assume that inferred or generated data sits outside the erasure analysis simply because the individual did not provide it themselves.

Why does AI increase the complexity further?


Artificial intelligence introduces an additional layer of complexity. In traditional environments, erasure means deleting records, files, logs, or database entries. But in an AI system, captured personal data may also be used for purposes such as training or fine-tuning model behaviour. This raises the question: what does erasure of personal data look like if the data has already influenced a model?

The European Data Protection Supervisor has identified ‘machine unlearning’ as an important aspect for AI models in terms of auditability and verifiability: demonstrating that data has been removed from a model, or that its impact has been mitigated. This introduces broader compliance challenges. AI-based systems will increasingly require organisations to consider not only the source data, but also the downstream effect of that data on model outputs and system behaviour.

Employing AI models therefore does not reduce the challenges of the right to be forgotten; it extends them. Where businesses increasingly use data to train, optimise, and automate systems, the practical meaning of ‘forgetting’ becomes harder to define and harder to evidence.

What do organisations need to focus on to comply with Right to be Forgotten requests?

  1. Ensure proper data mapping: as an organisation it is essential to have all your data mapped; in simple terms, you cannot erase what you cannot find. Missing or incomplete data maps were among the primary reasons for poor implementation of RTBF that the EDPB identified in its CEA 2025. One of its suggestions was to map personal data and storage locations, so that organisations know exactly where to search when they receive an erasure request.
  2. Create a documented erasure workflow: the right to erasure is like any other right granted under the GDPR, with a response period capped at one month. Many organisations in the EDPB’s survey were found either to have an incomplete or irregular erasure workflow, or to have no internal procedure for handling data erasure at all. Without a properly documented workflow, organisations will struggle to handle requests consistently.
  3. Set and enforce clear retention periods: a problem organisations often overlook is the storage of data without properly defined retention periods. The GDPR states that personal data should not be kept indefinitely but must be retained only for a period linked to the purpose for which it was collected. With a clearly defined retention practice it becomes easier to identify what should be deleted, what may still lawfully be kept, and how erasure requests can be handled in a structured and credible way.
  4. Make deletion work across backups, processors and recipients: the ICO and CNIL have emphasised that compliance with RTBF requires more than simply deleting the ‘live’ data from a controller’s main systems. Erasure must be carried through, so far as possible, across the wider data environment, including internal databases, inferred datasets, departmental records, backup systems, third-party processors, and other recipients to whom the personal data has been disclosed. Where deletion is not possible, the organisation should rely on anonymisation techniques to ensure that the data can no longer be linked back to the individual; since anonymised data is no longer personal data, it falls outside the scope of RTBF.
  5. Train staff and keep records of decisions: as mentioned earlier in this blog, the right to be forgotten is not absolute. When responding to a request, organisations have to assess whether the right applies, whether any exceptions apply, and how to document the outcome. The EDPB’s recent report found many organisations across Europe lacking here. Organisations should therefore have specific policies in place to assess requests against the different exception scenarios, and should keep records of the decision made on each request: documentation helps show that requests were assessed carefully, handled consistently, and resolved on a defensible legal basis, demonstrating accountability if any future conflicts arise.

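Steps 2 and 5 above can be sketched as a single assessment routine: check for an Article 17 ground, check for an Article 17(3) exception, and log the decision either way. This is a minimal illustration, assuming simplified string labels for the legal grounds and exceptions; a real workflow would involve case-by-case legal analysis, not a lookup table.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class ErasureRequest:
    subject_id: str
    ground: str        # claimed basis, e.g. "consent_withdrawn"
    received: date     # start of the one-month response clock

# Illustrative labels only; the legal tests are far richer than set membership.
ARTICLE_17_GROUNDS = {
    "no_longer_necessary", "consent_withdrawn", "objection",
    "unlawful_processing", "legal_obligation_to_erase",
}
ARTICLE_17_3_EXCEPTIONS = {
    "freedom_of_expression", "legal_obligation", "public_health",
    "archiving_research", "legal_claims",
}

decision_log = []  # accountability record, one entry per request

def assess(request, applicable_exceptions):
    """Decide erase/refuse and document the decision for accountability."""
    blocking = applicable_exceptions & ARTICLE_17_3_EXCEPTIONS
    if request.ground not in ARTICLE_17_GROUNDS:
        outcome = "refuse: no Article 17 ground established"
    elif blocking:
        outcome = f"refuse: exception applies ({sorted(blocking)[0]})"
    else:
        outcome = "erase"
    decision_log.append(
        {"subject": request.subject_id, "ground": request.ground, "outcome": outcome}
    )
    return outcome
```

The point of the log is the accountability principle: even a refusal is defensible later only if the reasoning at the time was recorded.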
Conclusion

The right to be forgotten is better understood not as a guarantee of disappearance, but as a test of how far organisations can actually limit their own ability to retain and reuse personal data. The discussion in this blog shows that erasure is shaped by legal conditions, practical constraints, fragmented systems, duplicated storage, inferred data, and the growing role of AI. For that reason, the right matters not simply because it allows deletion in some cases, but because it exposes the gap between what the law promises and what digital systems are often designed to do. Privacy law seeks to make forgetting possible whereas modern data systems are often built to make remembering easier. The challenge of the right to be forgotten lies in trying to reconcile those two realities.

We at Data Secure (DATA SECURE - Data Privacy Automation Solution) can help you understand Privacy and Trust while lawfully processing personal data, and provide Privacy Training and Awareness sessions to increase the privacy quotient of your organisation.

We can design and implement RoPA, DPIA and PIA assessments to meet compliance requirements and mitigate risks under legal and regulatory privacy frameworks across the globe, especially conforming to the GDPR, UK DPA 2018, CCPA, and India's Digital Personal Data Protection Act 2023. For more details, kindly visit DPO India – Your Outsourced DPO Partner in 2025 (dpo-india.com).

For any demo/presentation of solutions on Data Privacy and Privacy Management as per EU GDPR, CCPA, CPRA or India DPDP Act 2023 and Secure Email transmission, kindly write to us at info@datasecure.ind.in or dpo@dpo-india.com.

For downloading the various Global Privacy Laws kindly visit the Resources page of DPO India - Your Outsourced DPO Partner in 2025

We serve as a comprehensive resource on the Digital Personal Data Protection Act, 2023 (Digital Personal Data Protection Act 2023 & Draft DPDP Rules 2025), India's landmark legislation on digital personal data protection. We provide access to the full text of the Act, the Draft DPDP Rules 2025, and detailed breakdowns of each chapter, covering topics such as data fiduciary obligations, rights of data principals, and the establishment of the Data Protection Board of India. For more details, kindly visit DPDP Act 2023 – Digital Personal Data Protection Act 2023 & Draft DPDP Rules 2025

We provide in-depth solutions and content on AI Risk Assessment and compliance, privacy regulations, and emerging industry trends. Our goal is to establish a credible platform that keeps businesses and professionals informed while also paving the way for future services in AI and privacy assessments. To Know More, Kindly Visit – Your Trusted Partner in AI Risk Assessment and Privacy Compliance | AI-Nexus