Artificial Intelligence: How Data Protection Works

Sandra Dury, Managing Director, DURY CONSULT

Artificial intelligence (AI) has been on everyone’s radar, at least since ChatGPT became such a hit. But the use of AI in the professional world far predates the advent of ChatGPT. AI has been developing steadily for several years now and is gradually finding its way into many industries.

AI can take over work or make it easier. However, data protection should always be taken into account, because AI generally works with and is trained on data. In this article, we give you tips on how to implement AI successfully, securely, and legally.

What can AI already do for companies?

Artificial intelligence already facilitates and automates numerous tasks that would normally require human intelligence, such as:

  • automated invoicing and quotation processing,
  • composing texts and text modules (e.g., for emails),
  • programming software and applications,
  • collecting and evaluating data (e.g., applicants’ CVs),
  • translating between languages.

Even though AI is getting better and better at these tasks, mistakes can still occur, and the technology cannot yet replace human thinking.

Comply with GDPR and Data Protection Laws

When using artificial intelligence, compliance with data protection law is crucial. As soon as AI processes personal data, the relevant data protection regulations must be observed. Careful preparation and knowledge of the applicable legal requirements are therefore essential before a company deploys AI.

The following data protection requirements must be observed:

  1. Data subjects must be informed that their data is being processed through the use of AI (transparency).
  2. Data subjects may object to this use, or to the processing of their data as a whole, at any time (right to object).
  3. Even when AI is used, processing must be kept to the minimum necessary (data minimisation; a sketch of what this can look like in practice follows this list).
  4. Processing of personal data must always serve a specific and legitimate purpose; AI may not store or process data without such a purpose (purpose limitation).
  5. The AI must comply with the applicable deletion periods for data.
  6. If data is transferred within the group or to third parties, there must be a legal basis for the transfer. If data is transferred abroad, an adequate level of data protection must be ensured in the third country.
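
What data minimisation can look like in practice: the short sketch below, in Python, strips obvious personal identifiers out of free text before it is handed to an external AI service, so that only the minimised version ever leaves the company. The patterns and the redact and prepare_prompt helpers are illustrative assumptions rather than a finished solution; real projects usually need more robust detection of personal data than two regular expressions.

    import re

    # Illustrative patterns for two common personal identifiers
    # (assumption: e-mail addresses and phone numbers are what needs to be caught here).
    EMAIL = re.compile(r"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}")
    PHONE = re.compile(r"\+?\d[\d ()/-]{6,}\d")

    def redact(text: str) -> str:
        """Replace personal identifiers with neutral placeholders (data minimisation)."""
        text = EMAIL.sub("[email]", text)
        return PHONE.sub("[phone]", text)

    def prepare_prompt(raw_text: str) -> str:
        # Only the minimised text is handed to the external AI service;
        # the original stays inside the company.
        return redact(raw_text)

    if __name__ == "__main__":
        sample = "Please draft a reply to jane.doe@example.com, tel. +352 123 456 789."
        print(prepare_prompt(sample))
        # Prints: Please draft a reply to [email], tel. [phone].

Pseudonymising input in this way supports both data minimisation and purpose limitation, because the AI service never receives more personal data than the task requires.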

Defining a data protection strategy

Almost every company is affected by data protection law. Companies should therefore already have an effective, well-thought-out data protection strategy in place, which can now also cover the use of AI.

Such a strategy should establish internal mechanisms for implementing the GDPR’s requirements. This means, for example, that deletion deadlines for data are adhered to automatically and that a record of processing activities is maintained and kept up to date.
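
As an illustration of the first of these mechanisms, here is a minimal sketch, again in Python, of how automatic adherence to deletion deadlines could be modelled: each record carries a category, each category has a retention period, and anything older than its period is deleted. The categories, retention periods and the delete_record placeholder are hypothetical; the real values and procedures follow from the record of processing activities and the applicable legal bases.

    from datetime import date, timedelta

    # Hypothetical retention periods per record category, in days.
    # The real values come from the record of processing activities.
    RETENTION_DAYS = {
        "applicant_cv": 180,
        "support_ticket": 365,
    }

    def is_expired(category: str, created_on: date, today: date) -> bool:
        """Return True once a record has exceeded its retention period."""
        return today - created_on > timedelta(days=RETENTION_DAYS[category])

    def delete_record(record: dict) -> None:
        # Placeholder: a real system would erase the data and log the erasure.
        print(f"Deleting record {record['id']} ({record['category']})")

    def purge(records: list, today: date) -> list:
        """Delete expired records and return those still within their deadline."""
        kept = []
        for record in records:
            if is_expired(record["category"], record["created_on"], today):
                delete_record(record)
            else:
                kept.append(record)
        return kept

    if __name__ == "__main__":
        records = [
            {"id": 1, "category": "applicant_cv", "created_on": date(2023, 5, 1)},
            {"id": 2, "category": "support_ticket", "created_on": date(2023, 12, 1)},
        ]
        remaining = purge(records, today=date(2024, 1, 1))
        print([r["id"] for r in remaining])  # Prints: [2]

In a real deployment, a job like this would run on a schedule and log every erasure, so that compliance with the deletion deadlines can be demonstrated.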

Before using AI, it is important to check whether its use is actually necessary, whether it meets the requirements for data minimisation, purpose limitation and so on, and what risks it entails (risk analysis). Only then should it be deployed.

Clarify Responsibilities

Before using AI in the company, responsibilities and roles should be clarified. There are various options here:

  • Commissioned processing: As a rule, the company acts as the controller (the client) and the AI service provider as the processor. A data processing agreement must then be concluded in accordance with Art. 28 GDPR. In this constellation, it is not the processor but the client who is responsible under data protection law. The processor is bound by the client’s instructions and may not use the data for its own purposes.
  • Joint responsibility: The AI provider and the company are jointly responsible for the data processing. As a rule, there is no relationship of instruction or dependency between them. According to Art. 26 GDPR, an agreement is required that defines how the data protection obligations are divided between the parties.
  • Separate responsibility: The provider alone is responsible for its processing of the data, while the company is responsible for what it enters into the AI. Each party must fulfil its own data protection obligations.

How to make the use of AI in the company a success

The first step before implementing AI in a company should always be a team discussion and an assessment from a data protection perspective. Not every piece of software meets data protection requirements, and only software that does can be used in a legally compliant way. Check carefully, therefore, whether using AI with personal data actually makes work processes better and more effective, and weigh this benefit against the data protection risk.


This article was first published in Silicon Luxembourg magazine.
