Part 1: EdTech - Privacy Implications
In 2019, EdTech made up 4% of companies operating in the digital sector, and the industry is projected to grow by 7% annually.
Schools across the UK spend close to £900 million each year on EdTech. But what do we really know about the privacy implications of this fast-growing industry?
In this first part of our two-part article, we look at the use of AI and surveillance in education. In the second part, we look at case studies from around Europe and set out some practical considerations.
The Use of Surveillance and AI in Education
Education is changing. New AI systems allow students to take control of their own learning and can be used to recommend curricula suited to their abilities and interests. Children are also increasingly being taught how to use technology in preparation for the future. Some even predict that educators could one day be replaced by technology.
Additionally, analytic surveillance cameras running AI-based software allow schools to monitor students based on their age, gender, clothing and facial characteristics. Such systems make it possible to review where students have moved around the school and to search through camera feeds to identify individuals.
Schools can also employ online surveillance. Automated, 24-hour-a-day surveillance systems can continually scan school emails, documents and chat messages and alert school officials to any concerning phrases.
These new AI and surveillance systems collect personal and biometric data, meaning that the General Data Protection Regulation (EU) 2016/679 (GDPR) will apply. So, what privacy implications does this bring for school students, and is it really possible to use this tech in a fair and lawful way?
Children’s Data
The GDPR explicitly states that processing personal data relating to children requires specific protection since children might be less aware of the risks.
The age of privacy consent
It is also important for providers of EdTech to understand the different ages of privacy consent, which can range from 16 down to as low as 13, depending on the services offered and the country where consent is sought. Providers should therefore implement appropriate measures, including obtaining parental or guardian consent where required.
IT and Information security
There must be appropriate technical and organisational measures in place to safeguard the privacy rights of children. The UK ICO recommends a privacy-by-design approach, organising procedures with the protection of children's rights in mind from the very outset. Transparency, awareness and fairness are key.
It might be feasible to create AI systems and new school apps in this way, so long as the rights of the child are kept at the forefront of creators’ minds.
However, using surveillance to record attendance must be justified by showing that it is necessary, proportionate and fair. This makes surveillance systems inherently difficult to justify when it comes to children: CCTV is constant, and AI systems gather massive amounts of data, not only the data required for their purpose.
Sensitive Data: Surveillance Systems
In addition, video surveillance that captures biometric data, such as facial recognition tracking, must be used with caution, since it collects physical and behavioural characteristics. The GDPR treats biometric data as a "special category" of personal data, which means that processing it is prohibited unless a specific exception, such as explicit consent, applies, and additional protection is required. Data controllers must conduct a data protection impact assessment to evaluate whether the processing would result in a high risk to the rights and freedoms of the individuals concerned, and may be required to consult the relevant data protection authority where a high risk remains.
Obtaining Consent?
The GDPR allows for consent as a basis of processing personal data. However, this will only be appropriate if schools are truly able to give children (or their parents or guardians) informed choice and control over how the data is used.
The ICO has outlined that obtaining consent should not be used as a way to avoid your obligations under the GDPR: simply obtaining consent will not necessarily mean that processing is fair.
Consent under the GDPR must be informed and freely given, without any imbalance of power. In an educational setting, students are in a dependent position in relation to the school, given factors such as grades and future career options. In this context, it is difficult to see how consent could ever be freely given, in light of the significant imbalance between the controller and the data subject.
There is also the fact that children may not fully understand the privacy implications of data processing or the rights to which they are entitled. If they do not understand the effect that AI and CCTV could have on those rights, they cannot meaningfully consent to processing.
Of course, it is always important to look at live cases when considering a compliance plan, as ongoing or recent enforcement action can help companies rolling out EdTech tools and surveillance to avoid common pitfalls.
Stay tuned next week for the second article in this two-part series, where we take a look at cases from around Europe and provide some practical guidance.
In the meantime, if you have any questions relating to EdTech, or data privacy in general, please don't hesitate to get in touch!
Article by Lily Morrison @ Gerrish Legal, April 2020