ChatGPT & data protection: What you need to know! [2024]


The artificial intelligence ChatGPT opens up many new possibilities for companies as well as for the self-employed and private individuals. ChatGPT can answer simple and complex questions, plan a birthday party or write you a Python script. Companies have recognized this potential, too.

In companies, ChatGPT can be used, for example, to create texts for flyers, websites or social media accounts. But it can also answer general questions that previously had to be laboriously googled.

Artificial intelligence is a new and constantly improving technology. Like many new technologies, however, ChatGPT runs up against existing law. New technologies must therefore be designed and used in compliance with that law - for online services, this applies in particular to data protection law.

We explain exactly what this means for you in the following article. After that, you'll know what to look out for from a data protection perspective when using ChatGPT!

ChatGPT and data protection

The General Data Protection Regulation (GDPR) regulates the processing of personal data (data that identifies or can identify a person) in the EU. Such processing also takes place whenever data is entered into ChatGPT, so the strict requirements to which the processing of personal data is bound must be met.

Transparent processing

The principles of processing personal data are listed in Art. 5 GDPR. These serve to ensure the lawful and appropriate handling of personal data.

One of these data protection principles for the processing of personal data is transparent processing. The principle of transparency requires that all information on the processing of data and the way AI functions is easily accessible, understandable and written in clear and simple language.

Due to the non-public algorithm and other complex processes that take place in the background with ChatGPT, such informed, transparent processing is not possible.

💡 Note:

The ChatGPT algorithm - i.e. the way ChatGPT works - is not transparent. As a result, the data protection principle of transparent processing of personal data cannot be complied with when using ChatGPT.

Commercial use

For commercial use, OpenAI provides paying customers with access via a programming interface (API) rather than via the ChatGPT web interface. For this purpose, the operator OpenAI LP offers a data processing agreement (DPA; German: AVV) in accordance with Art. 28 GDPR, which regulates the processing and data transfer between OpenAI and the customer.
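In practice, the programming interface mentioned above is a plain HTTPS endpoint. As an illustrative sketch - the endpoint, the model name `gpt-4o-mini` and the payload shape reflect OpenAI's publicly documented REST API at the time of writing and may change - the following Python snippet assembles such a request without sending it:

```python
import json

# Assumption: OpenAI's public chat completions endpoint; subject to change.
API_URL = "https://api.openai.com/v1/chat/completions"

def build_chat_request(prompt: str, api_key: str, model: str = "gpt-4o-mini"):
    """Assemble the URL, headers and JSON body of an API call without sending it."""
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return API_URL, headers, json.dumps(body)

# Example: prepare a request for a flyer text (the key here is a placeholder).
url, headers, payload = build_chat_request("Draft a flyer text.", api_key="sk-...")
```

Separating request construction from sending, as above, also makes it easy to insert a redaction or logging step before any data leaves the company.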

However, the legal situation with regard to responsibilities has not been clearly settled. It is unclear whether this is a processing-on-behalf relationship or whether there is joint or separate controllership. The concern arises from the fact that OpenAI processes the data not only for training purposes, but presumably also for advertising purposes. OpenAI thus also pursues its own interests, which could conflict with the requirements for commissioned processing pursuant to Art. 28 para. 3 sentence 2 lit. a) GDPR.

ChatGPT confirms that data will be used for training and other purposes

According to Art. 26 para. 1 sentence 1 GDPR, joint controllership exists if two or more controllers jointly determine the purposes and means of the processing.

However, it is unlikely that both conditions are met, as the company using the API probably has little influence on the processing at OpenAI and the means of processing are not jointly determined.

Since OpenAI only offers its own DPA, the legal situation with regard to subprocessing is also unclear. The company using the API could potentially act as a processor for its own customers, which would make OpenAI a sub-processor. This aspect would need to be clarified transparently in the DPA between the company and its customers.

💡 Note:

Since OpenAI uses the data for its own internal training purposes, joint controllership presumably arises between OpenAI and the company using ChatGPT. This would make it necessary to conclude a joint controllership agreement in accordance with Art. 26 GDPR.

Data transmission

Articles 44 et seq. GDPR set out specific requirements for the transfer of personal data to third countries. The United States of America is classified as an unsafe third country, which means that the receiving company must either be certified under the EU-U.S. Data Privacy Framework, or the standard contractual clauses pursuant to Art. 46 para. 2 lit. c GDPR must be agreed alongside the data processing contract, together with additional security measures.

The OpenAI API is offered by the US company OpenAI LP, which is why data is transferred to the USA every time it is entered.

OpenAI is not certified under the new Data Privacy Framework between the USA and the EU, which is why the transfer is governed by the standard contractual clauses (SCCs). These also require a Transfer Impact Assessment (TIA) to be carried out for the transfer and security to be ensured through appropriate technical and organizational measures.

💡 Note:

ChatGPT is a US-American tool which, like Microsoft or Google tools, transfers personal data to third countries. As a rule, prefer tools, software or even hosters that have their headquarters in Europe - or better still, Germany - and are therefore directly subject to the General Data Protection Regulation.

Dealing with AI

As a matter of principle, personal data or other sensitive data should not be entered into artificial intelligence systems such as ChatGPT. Since all inputs could be reused as training data, there is a possibility that the data entered could later be passed on to third parties by the AI.

The OpenAI API offers more security in this respect, as the processing of the data is defined in the DPA. If the OpenAI API is used with the transfer of personal data, the data subjects whose data is transferred must be informed accordingly in accordance with Art. 13 GDPR.

For all projects that use artificial intelligence, a data protection impact assessment (DPIA) should also be carried out. A data protection impact assessment is a tool for identifying, describing, assessing and minimizing risks to the rights and freedoms of natural persons with regard to the processing of their personal data.

According to Art. 35 para. 1 GDPR, a data protection impact assessment must always be carried out if the processing of the data is likely to result in a high risk to the rights and freedoms of data subjects. To determine this, a risk analysis must be carried out before data processing begins. If it is determined that the planned processing is likely to result in a high risk for data subjects, a DPIA must be carried out.
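The risk analysis described above can be sketched as a simple pre-check. The criteria and the any-match threshold below are illustrative assumptions for a first screening, not an exhaustive legal test and not legal advice:

```python
# Criteria that typically indicate a "high risk" within the meaning of
# Art. 35 para. 1 GDPR. Assumption: this simplified list is illustrative
# only; supervisory authorities publish their own blacklists.
HIGH_RISK_CRITERIA = {
    "new_technology": "Use of new technologies such as AI",
    "large_scale": "Large-scale processing of personal data",
    "sensitive_data": "Special categories of data (Art. 9 GDPR)",
    "systematic_monitoring": "Systematic monitoring of data subjects",
}

def dpia_required(project_flags: set) -> bool:
    """Simplified screening: a DPIA is indicated if any criterion applies."""
    return bool(project_flags & HIGH_RISK_CRITERIA.keys())

# Example: a ChatGPT integration involves a new technology, so the
# pre-check flags the project for a full DPIA.
needs_dpia = dpia_required({"new_technology"})
```

Such a checklist only decides whether a full DPIA must follow; the DPIA itself remains a documented assessment of the specific processing.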

💡 Note:

Screenshot: ChatGPT's answer when asked whether sensitive data should be entered in ChatGPT.

No personal data, such as name, address, e-mail address or IP address, or other sensitive data should be entered in ChatGPT. It cannot be guaranteed that this data will not be passed on or processed further.
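One practical safeguard is to redact obvious personal data before a prompt leaves the company. The following minimal Python sketch masks e-mail and IPv4 addresses; the regex patterns are deliberately simple assumptions and will miss many PII formats, so a real deployment would need a far more thorough detection step:

```python
import re

# Deliberately simple patterns for the most common formats only (assumption:
# this is a first line of defense, not a complete PII detector).
EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
IPV4 = re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b")

def redact_pii(text: str) -> str:
    """Mask e-mail addresses and IPv4 addresses before sending a prompt."""
    text = EMAIL.sub("[EMAIL]", text)
    text = IPV4.sub("[IP]", text)
    return text

prompt = "Write to max.mustermann@example.com, server 192.168.0.12 is down."
safe_prompt = redact_pii(prompt)
# safe_prompt: "Write to [EMAIL], server [IP] is down."
```

Names and postal addresses cannot be caught reliably with regular expressions at all, which is another reason to keep personal data out of prompts in the first place.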

Conclusion on data protection & ChatGPT

Entering personal data into ChatGPT is not advisable for companies. Even if the use of inputs for training purposes is deactivated in the settings, it cannot be determined with certainty how exactly the data is processed.

The OpenAI API offers more security in this case, as certain guarantees are provided by the DPA and the standard contractual clauses. However, even then it is not transparent how exactly data is processed, which is why entering personal data or confidential company data should still be avoided.

When using artificial intelligence, it is recommended that data protection is taken into account from the very beginning of implementation. Company guidelines, transparent documentation, the implementation of a DPIA and employee training are important components of the secure and lawful use of artificial intelligence.

DEUDAT GmbH
As an owner-managed company, the management has more than 25 years of expertise in the field of data protection and information security. A team of experienced professionals and lawyers specializes in providing comprehensive advice on data protection and information security issues to organizations regardless of their industry, size or focus. The aim is to help you introduce, implement and maintain an appropriate level of data protection. We will show you how the associated challenges can be mastered securely and efficiently with the appropriate expertise and a suitable organizational structure. The main focus is not only on solving problems and minimizing risks, but also on working with customers to develop tools that contribute to success. The company acts as a strong partner that goes far beyond simply solving problems. The company's guiding principle is: simple, secure, good advice.

