Does Data Retention Prevent Crime? A Critical Analysis in Light of Union Law, Fundamental Rights, and Alternative Policy Models

Executive Summary

This study provides a critical analysis of the effectiveness of general and indiscriminate data retention within the European Union (EU) with regard to crime prevention. Particularly in the context of criminal proceedings, it becomes evident that data retention does not deliver the expected security gains, while simultaneously posing significant threats to EU fundamental rights such as the right to privacy, freedom of expression, and the protection of private life.

Key Findings:

- Weak Legal Foundation: Comprehensive data retention contradicts primary EU law, as confirmed by the jurisprudence of the Court of Justice of the European Union (CJEU) (Digital Rights Ireland, Tele2 Sverige).
- Limited Effectiveness: Independent research indicates that it does not improve crime clearance rates.
- Interference with Fundamental Rights: Journalism, activism, and political opposition are affected by chilling effects.
- Increasing Economic Burden: Smaller providers bear excessive costs, and cybersecurity risks are on the rise.
- Technological Inadequacy: In the age of IoT, 5G, and artificial intelligence (AI), data volumes are exploding, making general storage models increasingly intrusive and uncontrollable.

Key Recommendations to the Commission:

- General forms of data retention should be strictly avoided.
- Member States should promote targeted models with time limits and independent judicial authorization.
- The right to encryption and anonymity should be respected, and users' digital privacy strengthened.
- Supportive infrastructures and policies should be developed to alleviate the burden on small and medium-sized enterprises, with particular consideration for economic impacts.

1. Introduction

With ongoing digitalization, methods of combating crime have also evolved. One such method is data retention, which has become widespread under the pretext of counter-terrorism. Nevertheless, these general and indiscriminate practices pose significant risks to individual freedoms and often fail to achieve the intended security objectives. This study, prepared as part of the European Commission's Impact Assessment process, thoroughly examines the necessity, proportionality, impact on EU fundamental rights, utility for criminal justice, and economic costs of current data retention policies. Furthermore, it analyzes their future viability in the age of new technologies.
2. Methodology

The study is based on a qualitative content analysis. Key sources include:

- Jurisprudence of the Court of Justice of the European Union (CJEU): the Digital Rights Ireland and Tele2 Sverige decisions.
- European Court of Human Rights (ECtHR): the Big Brother Watch v. the United Kingdom decision.
- Independent reports, such as those from ENISA (European Union Agency for Cybersecurity) and the European Parliament.
- National case studies, such as the Turkish ByLock case.

Additionally, statistical data and economic cost analyses were included to evaluate both the effectiveness and the social and technical consequences of data retention.

3. Literature Review

Research on data retention can be broadly categorized into three areas:

3.1 Legal Approaches

Binns (2018) meticulously analyzes the incompatibility of these practices with the right to privacy. De Hert & Poullet (2013) address the legitimacy of such measures in light of EU fundamental rights.

3.2 Effectiveness Assessment

Hoofnagle et al. (2012) demonstrate that data retention measures introduced under the Patriot Act in the USA showed no measurable effect. ENISA (2020) highlights the technical and financial burdens faced by small providers.

3.3 Political and Societal Impacts

Lyon (2018) links these policy forms to the emergence of a "surveillance society." Zuboff (2019) exposes how platforms commercially exploit personal data, a phenomenon she describes as "surveillance capitalism."

4. Data Retention and EU Law: Necessity and Proportionality

4.1 Necessity Test

In the view of the Court of Justice of the European Union (CJEU), general data retention practices do not pass the necessity test. No clear evidence of their suitability for combating terrorism or serious crime has been provided to date.

4.2 Proportionality Test

The principle of proportionality is violated because:

- All citizens are indiscriminately affected.
- No suspicion is required.
- The retention period is excessive (up to two years).
- No prior authorization by independent courts is mandated.

5. Utility for Criminal Justice

5.1 Presumption of Innocence

This policy encourages "fishing expeditions," which in turn undermine the presumption of innocence.

5.2 Example: Turkey – ByLock Case

Millions of individuals came under suspicion without concrete evidence, based solely on their use of an app (ByLock); mere presence in the metadata was sufficient.

6. Economic and Technical Costs

6.1 Impact on Service Providers

ENISA (2020) notes that small providers, in particular, face disproportionate financial pressure.

6.2 Cybersecurity Risks

The inability to securely store sensitive data leads to:

- Massive data breaches.
- Increased risks to public safety.
- Loss of trust in digital systems.

7. Future Outlook: IoT, 5G, and Artificial Intelligence

The explosive increase in data volumes due to IoT, 5G, and artificial intelligence (AI) renders traditional storage models unsuitable. AI today goes beyond mere analysis: it can derive new correlations that further endanger EU fundamental rights.

7.1 New Risks Posed by AI and Mass Data Analysis

- Automated Profiling and Discrimination: AI models learn from historical data. If these data contain systematic biases (e.g., association with a criminal offense based solely on the use of an app such as ByLock, which can lead to collective stigmatization), such biases can be automatically reproduced and discriminatory practices intensified, unjustly targeting groups or individuals.
- False Positives and Weakening of the Presumption of Innocence: Statistically relevant correlations can be misleading or oversimplified. For example, a model might falsely identify a user group as statistically linked to "suspicious" activity based on the use of a particular app, even if there is no concrete individual evidence. This undermines the presumption of innocence; the sketch after this list makes the underlying arithmetic concrete.
- Opacity and Lack of Transparency: AI systems often operate as black boxes whose decision-making processes are neither explicit nor easily traceable. This makes it difficult for affected individuals to ascertain the reasons for surveillance measures or other decisions concerning them, or to defend themselves effectively against such decisions, thereby impairing the right to an effective remedy.
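The base-rate problem behind the second risk can be shown with a few lines of arithmetic. The following minimal sketch uses purely hypothetical figures; the population size, offender prevalence, and classifier accuracy are assumptions chosen for illustration, not findings of this study:

```python
# Minimal sketch of the base-rate problem in mass screening.
# All figures are hypothetical assumptions, not empirical estimates.

population = 450_000_000        # assumption: roughly the EU population
prevalence = 1 / 100_000        # assumption: 1 in 100,000 is an actual offender
sensitivity = 0.99              # assumption: the model detects 99% of true cases
false_positive_rate = 0.001     # assumption: it flags 0.1% of innocent users

offenders = population * prevalence
true_positives = offenders * sensitivity
false_positives = (population - offenders) * false_positive_rate

# Precision: the probability that a flagged person is actually an offender.
precision = true_positives / (true_positives + false_positives)

print(f"people flagged   : {true_positives + false_positives:,.0f}")
print(f"actual offenders : {true_positives:,.0f}")
print(f"precision        : {precision:.2%}")
```

Under these assumptions, fewer than one in a hundred flagged persons is an actual offender; the remaining 99 percent are innocent people drawn into suspicion, which is precisely how mass screening erodes the presumption of innocence.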

7.2 Lack of Adaptation of Current Policy to New Technologies

Existing data retention rules are not equipped to handle the rapidly increasing data streams from IoT, the speed of 5G, or the predictive capabilities of AI. As a result, general storage models become unmanageable and the risk of misuse increases; a rough illustration of the scale involved follows below. Without independent judicial control, massive risks of abuse threaten to undermine social justice and EU fundamental rights.
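To give a sense of that scale, the following back-of-envelope sketch estimates the metadata volume a general retention obligation could accumulate. Every figure in it (records per device, record size, device count) is a hypothetical assumption, not an empirical estimate:

```python
# Back-of-envelope sketch of metadata volume under general retention.
# Every figure is a hypothetical assumption chosen for illustration.

records_per_device_per_day = 2_000   # assumption: always-on IoT devices
bytes_per_record = 200               # assumption: size of one metadata record
devices = 1_000_000_000              # assumption: connected devices in scope
retention_days = 2 * 365             # the two-year ceiling criticized in 4.2

total_bytes = (records_per_device_per_day * bytes_per_record
               * devices * retention_days)

print(f"retained metadata: {total_bytes / 1e15:,.0f} petabytes")
```

Even under these deliberately modest assumptions, roughly 290 petabytes of sensitive metadata would have to be stored, indexed, and secured at all times, which underlines both the cost burden described in Section 6.1 and the breach risks described in Section 6.2.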

Conclusion to Chapter 7

In developing future AI-supported methods of combating crime, data collection and analysis must be subjected not only to technical but also to strict ethical and legal limits. Otherwise, general data retention combined with AI could lead to a structure that contradicts the values of democratic societies, violates human dignity, and opens the door to arbitrary interventions.

8. Conclusions and Recommendations

General data retention possesses neither a stable legal basis nor demonstrable effectiveness. It directly interferes with the most fundamental EU rights.

Specific Recommendations to the Commission:

- The Commission should determine that these practices are incompatible with primary Union law, as confirmed by CJEU jurisprudence.
- Member States should receive clear guidelines promoting targeted, proportionate models that are limited to serious crimes and subject to independent judicial control.
- Frameworks for the protection of encryption, anonymity, and digital privacy must be strengthened.
- Small and medium-sized service providers burdened by the requirements should be relieved through technical and financial support.

Bibliography

Binns, R. (2018). Algorithmic Accountability and Transparency in the EU GDPR. Philosophy & Technology, 31(2), 211–233.
De Hert, P., & Poullet, Y. (2013). The Data Retention Directive: The Ghost that Should Not Walk. Computer Law & Security Review, 29(6), 673–683.
Hoofnagle, C. J., et al. (2012). How Different is Privacy Law in Europe vs. the US? Berkeley Technology Law Journal, 28(2), 411–454.
Lyon, D. (2018). The Culture of Surveillance: Watching as a Way of Life. Polity Press.
Zuboff, S. (2019). The Age of Surveillance Capitalism. PublicAffairs.
European Court of Human Rights (ECtHR) (2021). Big Brother Watch and Others v. the United Kingdom, Application no. 58170/13.
Court of Justice of the European Union (CJEU) (2014). Digital Rights Ireland Ltd v. Minister for Communications, Marine and Natural Resources and Others, Case C-293/12.
Court of Justice of the European Union (CJEU) (2016). Tele2 Sverige AB v. Post- och telestyrelsen and Secretary of State for the Home Department v. Tom Watson and Others, Joined Cases C-203/15 and C-698/15.
ENISA (2020). Data Retention Practices in Europe. European Union Agency for Cybersecurity.
European Parliament (2019). Privacy and Data Protection in Law Enforcement.