Balancing Risks and Rewards: ESMA’s Analysis of AI in Securities Markets
On 1 February 2023 the European Securities and Markets Authority (ESMA) published a report on the use of artificial intelligence by financial market participants in the European Union. The report identifies the challenges and risks associated with the development of artificial intelligence technology and presents regulatory proposals to enhance investor protection and safeguard the stability of financial markets.
Artificial intelligence is already present in financial markets and is being used to (i) analyse data, (ii) forecast prices, (iii) manage risk and (iv) make investment decisions. In response to these trends, ESMA highlights the following risks in its report:
Lack of transparency and accountability for decisions – ESMA highlights the need for information to be made available about AI algorithms and models: how they work, and how their effects and impact on the market are monitored. This is because complex AI systems can make decisions that are opaque and incomprehensible to financial market participants.
Unreliability of AI algorithms – another challenge associated with the use of AI in finance is the possibility of unreliable algorithms. If an algorithm is poorly designed or misapplied, it can produce inaccurate results that affect the stability of financial markets. In addition, AI algorithms may exhibit inappropriate behaviour or ethically problematic bias, which creates additional risks for financial market participants.
Having defined the risks, ESMA identifies solutions that should counter them:
Establishing national standards and regulations – ESMA’s report highlights the need for more robust regulation and oversight of the use of AI in finance. Regulation should address issues such as transparency and accountability for AI-based activities, privacy and data protection, and the supervision and auditing of algorithms. ESMA suggests that national supervisory authorities should also extend their oversight to the implementation of AI by financial institutions.
Appropriate adaptation of risk management procedures – the report highlights the need for specific procedures to control the risks associated with the use of AI. It points to the need for mechanisms to identify and assess those risks and to monitor the decision-making processes carried out by artificial intelligence.
One of ESMA’s proposed solutions is the introduction of dedicated AI committees, which would be responsible for monitoring decision-making processes and assessing the risks associated with the use of AI. These committees would have access to information about the algorithms and models used by financial market operators and would be able to verify their effectiveness and impact on the market.
ESMA also proposes the introduction of specific tools to control the risks associated with the use of artificial intelligence, such as tools to identify and eliminate errors and tools to detect market manipulation. The report also highlights the need to train financial institutions’ staff in the use of artificial intelligence and in the effective use of risk control tools.
It is also worth noting that risk control procedures should be introduced in all financial institutions that use artificial intelligence in financial markets. The ESMA report stresses that financial institutions should be responsible for assessing the risks associated with the use of artificial intelligence and for putting in place appropriate risk management procedures.
In conclusion, robust risk control procedures are crucial both for investor safety and for the stability of financial markets.