Data Protection Considerations for the Paris Olympic and Paralympic Games

With several million spectators and thousands of athletes expected to flock to France for the upcoming Olympic and Paralympic Games, the strict security measures planned for the event are being scrutinised by the National Commission for Information Technology and Civil Liberties (CNIL), the French data protection authority. This includes verifying the use of QR codes, access permissions, and advanced camera technologies. CNIL is also focusing on the commercial side, particularly data collected through ticketing services. Given the scale of the event and the number of potential data recipients, CNIL will check compliance with legal requirements, including the accuracy of the data collected, who receives it, and the security measures in place.

CNIL Privacy Concerns

CNIL, the French data protection authority, has expressed significant concerns regarding data protection during the Olympic and Paralympic Games. CNIL will assess the collection of commercial data, particularly concerning ticketing, to ensure the privacy of the millions of spectators and athletes. This involves verifying the nature of shared information, data recipients, and the adequacy of security measures. 

Furthermore, CNIL will focus on safeguarding minors' privacy on online platforms, such as social networks and gaming platforms, by checking age control mechanisms and data minimisation practices. Loyalty programs and the digitalisation of sales receipts in the retail sector are also under scrutiny for potential data misuse and compliance with GDPR and French Data Protection Act regulations. Lastly, CNIL emphasises the importance of respecting individuals' right of access to their data, which will be subject to thorough examination in coordination with European authorities.

AI Surveillance

To strengthen security, the French government plans to use Artificial Intelligence (AI)-driven video surveillance at the Olympic Games in Paris this summer. However, as with any expansion of surveillance, the plan is raising concerns about privacy and civil liberties.

AI-powered surveillance employs sophisticated algorithms to analyse real-time video feeds from surveillance cameras. Developed by companies like Videtics, Orange Business, ChapsVision, and Wintics, these algorithms are trained to identify predefined "events" or abnormal behaviour, such as crowd surges, abandoned objects, or the presence of weapons. Human intervention is crucial, as alerts generated by the system require human assessment before action is taken.

Despite the potential benefits of enhancing security and preventing incidents like the 1996 Atlanta bombing or the 2016 Nice truck attack, AI surveillance raises significant privacy concerns. Rights campaigners caution against the potential erosion of civil liberties, especially considering the broad scope of surveillance permitted under the new legislation.

While the law prohibits facial recognition in most cases, concerns remain. Although companies like Wintics say their algorithms are not designed for facial recognition, there is apprehension that such technology could be integrated in the future. Amnesty International France highlights the need for robust legal safeguards to prevent misuse of facial recognition technology.

Privacy Concerns

The proposal to implement algorithmic video surveillance in publicly accessible areas poses a serious threat to civic freedoms and democratic values. The mere presence of such surveillance can have a chilling effect on fundamental rights, such as the right to freedom of assembly, association, and expression. Biometric surveillance takes away individuals' expectation of anonymity in public spaces and diminishes their willingness to exercise their civic freedoms due to fear of identification, profiling, or persecution.

Expanding surveillance to include activities deemed as "atypical," such as begging or stationary assemblies, risks stigmatising and discriminating against marginalised groups. Evidence suggests that surveillance technologies disproportionately harm these communities, leading to over-policing and structural discrimination within the criminal justice system.

Critically, the proposed legislation maintains that algorithmic surveillance systems do not process biometric data. However, these systems inherently capture and analyse individuals' physiological features and behaviours, which constitutes biometric identification under EU data protection law. This raises concerns about mass biometric surveillance and the potential for invasive categorisation based on individuals' biometric features.

As discussions around AI surveillance continue, it is essential to support democratic values and protect fundamental rights, particularly during high-profile events like the Olympics. Striking a balance between security measures and privacy protections is crucial to ensuring a safe and rights-respecting environment for all participants and spectators.

To address these concerns, France's Interior Ministry has established an evaluation committee comprising officials from the administrative court, CNIL (France's privacy watchdog), lawmakers, and a mayor. This committee will oversee the deployment of AI surveillance during the trial period, ensuring that civil liberties are safeguarded.

How Can Gerrish Legal Help?

Gerrish Legal is a dynamic digital law firm. We pride ourselves on giving high-quality and expert legal advice to our valued clients. We specialise in many aspects of digital law such as GDPR, data privacy, digital and technology law, commercial law, and intellectual property. 

We give companies the support they need to run their businesses successfully and confidently, complying with legal regulations without the burden of keeping up with ever-changing digital requirements.


We are here to help you. Get in contact with us today for more information.
