
Artificial intelligence in the financial sector

A legislative procedure is underway in the European Union to regulate the creation, marketing and use of artificial intelligence systems. Below are the key issues that may affect entrepreneurs in the financial industry who plan to use AI technologies in their business.

A risk-based approach

In recent years, more and more market sectors have become subject to regulations built on a risk-assessment principle. This approach is already familiar from personal data protection (GDPR) and from anti-money laundering and counter-terrorist financing (AML/CTF) regulations. The AI Act uses the same method: EU lawmakers have proposed four risk levels for artificial intelligence systems:

  • minimal,
  • limited,
  • high, and
  • unacceptable.

Depending on the risk level assigned to a given system, corresponding obligations will apply to the development, implementation and use of that system.

High-risk AI in the financial sector

The EU legislature has decided to assign a high level of risk to the following AI-based systems, among others:

  • artificial intelligence systems intended to be used for remote biometric identification of individuals “in real time” and “post factum”,
  • artificial intelligence systems intended to be used to assess the creditworthiness of individuals or to determine their credit score.

Remote biometric identification is a common solution in remote customer onboarding in the financial sector. Where modern systems of this type use artificial intelligence technology, they will be covered by the AI Act, which will regulate not only the creation of such systems but also the rules for using them.

Customers’ creditworthiness is also examined using IT tools. It is common to use systems that, by processing available data on a given customer, provide information on the customer’s credit history, repayment capacity or creditworthiness. Incorporating AI solutions into such systems will make them subject to the regulations of the discussed legislation.


The Artificial Intelligence Act will regulate the creation, marketing and use of artificial intelligence systems. This means that not only the developers but also the users of these systems will be required to take a cautious business approach. Entrepreneurs in the fintech industry will be required to verify whether IT systems planned for deployment use AI technologies and, if so, to carry out appropriate assessments and implement measures to safeguard the use of such tools.

Author: Alicja Mruczkiewicz, team leader, DKP Legal
