ChatGPT, the AI-powered chatbot with hundreds of millions of users, has taken the world by storm. You will no doubt have heard about this innovative platform, developed by OpenAI and released in November 2022.
Use of this extremely powerful platform needs to be carefully considered, however, due to several legal risks involved with the technology.
Chief among these risks is data protection: the use of ChatGPT raises a number of challenges under data privacy law.
It is currently difficult to say whether ChatGPT complies with the UK GDPR, and the question is being heavily debated in the legal and tech industries. Numerous privacy concerns have recently been voiced about the platform, so this is an area which requires careful consideration by businesses seeking to use the technology.
So what are the key issues to consider from a UK GDPR perspective when using ChatGPT?
ChatGPT relies upon an enormous dataset of text scraped from the internet and uses various categories of data, such as training data, testing data, and input and output data.
It is possible, and in some instances highly likely, that these categories of data include both personal data and highly sensitive special category personal data. Children's data may also be gathered, which is likewise considered high risk.
Where personal data is processed in order to provide this service, the UK GDPR rules will apply.
Some of the key privacy issues that are relevant to ChatGPT include the following:
- How is personal data used by ChatGPT and what is the role of the parties involved in the relevant ChatGPT project?
- What is the lawful basis for ChatGPT processing personal data?
- What types of data protection records need to be kept in relation to data processed via ChatGPT, and has a Data Protection Impact Assessment been carried out where necessary?
- ChatGPT relies on the vast collection and storage of data for its training and operation – how can this comply with the UK GDPR principles of data minimisation, accuracy and retention?
- How are individuals given information about how their personal data is used by ChatGPT, particularly where personal data is derived from third party sources? Transparency about the use of personal data is a key principle of the UK GDPR.
- How can individuals exercise their data subject rights under UK GDPR, in connection with use of their personal data by ChatGPT?
In addition, several other privacy considerations will apply. These are complicated issues and, in practice, may be very difficult for businesses to address when using ChatGPT. We await further guidance from the ICO on how far ChatGPT itself complies with the UK GDPR, and the ICO's view on use of the platform.
Important message from the UK Data Protection Regulator
The ICO (the UK data protection regulator) recently published guidance on the key factors to consider when developing generative AI systems.
The ICO has stated that the following questions should be asked when using or developing generative AI:
- What is your lawful basis for processing personal data?
- Are you a controller, joint controller or a processor?
- Have you prepared a Data Protection Impact Assessment?
- How will you ensure transparency?
- How will you mitigate security risks?
- How will you limit unnecessary processing?
- How will you comply with individual rights requests?
- Will you use generative AI to make solely automated decisions?
The ICO has stated in its blog:
‘As the data protection regulator, we will be asking these questions of organisations that are developing or using generative AI. We will act where organisations are not following the law and considering the impact on individuals.’
Therefore, it is vital to consider UK GDPR compliance when using the platform and ensure it is used lawfully and in accordance with data protection law rules.
Interestingly, the Italian data protection authority temporarily banned the use of ChatGPT in March 2023 due to privacy concerns, deeming that ChatGPT breached the GDPR rules (for example, because it had no lawful basis for processing personal data and failed to notify users of its data collection practices). The ban was recently lifted after OpenAI implemented changes to address the concerns raised by the Italian regulator. However, this was not the only regulatory concern raised about the platform: several other countries are investigating the extent of its compliance with data protection laws, and in April 2023 the European Data Protection Board launched a ChatGPT task force in response to increasing privacy concerns amongst regulators.
The key takeaway is that, for UK-based data controllers and processors, UK GDPR compliance is compulsory, not optional: any activities using the ChatGPT platform must always be carried out in compliance with data protection law.
It is highly advisable for businesses planning to use the platform to carry out a thorough Data Protection Impact Assessment, to consider how personal data would be used by ChatGPT and to investigate the privacy risks involved before deciding whether to proceed.
This is a fast-evolving, vast and complex legal topic, and this article covers only some of the key issues to consider. Please contact our data protection law team if you require advice on using ChatGPT in your business.
We are continuing to closely monitor developments in this area and recommend that businesses keep up to date with the developing legal landscape around use of artificial intelligence and compliance with data protection laws.