The use of artificial intelligence in the financial sector: ESMA’s guidelines for responsible use

Advances in technology have driven the adoption of artificial intelligence (AI) in many industries, and the financial sector is no exception. AI is expected to offer considerable potential, particularly in investment advice and portfolio management. But with these opportunities come risks. The European Securities and Markets Authority (ESMA) has examined the issue and published its findings in the Public Statement: On the use of Artificial Intelligence (AI) in the provision of retail investment services (https://www.esma.europa.eu/sites/default/files/2024-05/ESMA35-335435667-5924__Public_Statement_on_AI_and_investment_services.pdf). In it, ESMA addresses the challenges and provides guidance to help investment firms ensure that their use of AI is compliant and client-oriented.

Opportunities and risks of using AI in the financial sector

AI is used in various areas of the financial sector, for example in chatbots and virtual assistants that support clients, or in complex data analyses that inform investment decisions. Other applications can be found in compliance tasks, risk monitoring and fraud detection. Automating these processes promises greater efficiency and a broader basis for decision-making. However, the use of AI also carries risks: non-transparent decisions, data protection concerns and the danger of incorrect results due to inadequate input data are just some of the challenges that need to be overcome.

The content of the public statement

To counter these risks and protect the interests of clients, ESMA has issued new guidelines as part of the Public Statement. These guidelines are intended to ensure that the use of AI complies with the MiFID II conduct of business requirements.

ESMA sets out specific requirements for the use of AI in connection with investment decisions. When selecting the data that forms the input of such models, investment firms should ensure that the data is:

  • relevant,
  • sufficient, and
  • representative.

When training such models, it is important to ensure that accurate, comprehensive and sufficiently broad data sets are used.
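
To illustrate how such data requirements might be operationalised, the following Python sketch runs a few basic quality checks on a training data set before it is used. It is only an illustration under assumptions of our own: the helper check_dataset, the column names and the thresholds are hypothetical and are not prescribed by ESMA.

    # Illustrative only: basic data-quality checks a firm might run on a
    # training data set. Thresholds and column names are hypothetical assumptions.
    import pandas as pd

    def check_dataset(df: pd.DataFrame, required_columns: list,
                      min_rows: int = 10_000, max_missing_ratio: float = 0.05) -> dict:
        """Return a dictionary of simple data-quality indicators."""
        results = {
            # "relevant": the fields the model actually needs are present
            "required_columns_present": all(c in df.columns for c in required_columns),
            # "sufficient": enough observations to train on (threshold is an assumption)
            "sufficient_rows": len(df) >= min_rows,
            # completeness: the share of missing values stays below a tolerance
            "missing_ratio_ok": float(df.isna().mean().max()) <= max_missing_ratio,
        }
        results["all_checks_passed"] = all(results.values())
        return results

    # Hypothetical usage with a client-portfolio data set
    # df = pd.read_csv("portfolio_history.csv")
    # print(check_dataset(df, required_columns=["client_id", "asset_class", "return"]))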

ESMA points out that clients should be explicitly made aware of the use of AI. Employees and management should be trained in dealing with AI risks.

ESMA also requires thorough testing and review of the AI systems in use, in order to measure the performance of those systems and the impact of their use on the performance of the firm's offering. Comparisons with index developments or with the investment firm's own historical data are a suitable means of doing so. Data protection regulations must be strictly adhered to, and client complaints relating to the use of AI must be carefully documented.
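
Such a comparison with a benchmark can be expressed very simply in code. The sketch below is our own illustration in Python, not part of the ESMA statement; the function excess_return and the assumed daily data frequency are hypothetical. It computes the annualised excess return of an AI-driven strategy over an index.

    # Illustrative only: annualised excess return of a strategy over a benchmark.
    import numpy as np

    def excess_return(strategy_returns: np.ndarray, benchmark_returns: np.ndarray) -> float:
        """Annualised excess return, assuming daily return series of equal length."""
        strategy_growth = np.prod(1 + strategy_returns)    # growth of 1 unit invested
        benchmark_growth = np.prod(1 + benchmark_returns)
        years = len(strategy_returns) / 252                # 252 trading days per year (assumption)
        return strategy_growth ** (1 / years) - benchmark_growth ** (1 / years)

    # Hypothetical usage with simulated daily returns
    rng = np.random.default_rng(0)
    strategy = rng.normal(0.0004, 0.01, 252)
    benchmark = rng.normal(0.0003, 0.01, 252)
    print(f"Annualised excess return: {excess_return(strategy, benchmark):.2%}")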

Implementation of the guidelines in practice

The Public Statement makes it clear that the use of AI places special demands on internal processes. For internal models (or LLMs adapted to the specific requirements of an investment firm), it must be clearly documented for what purpose the models are used and on what data basis they were trained. In addition, the performance of the models must be monitored on an ongoing basis, and clients must be told whether and for what purpose AI is used; a simple record of this kind is sketched below.
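
As an illustration only (the class ModelRecord and its fields are our own assumptions, not a format prescribed by ESMA), such documentation could be kept as a structured record per model, covering purpose, training data basis, client disclosure and ongoing monitoring.

    # Illustrative only: a per-model documentation record. Field names are hypothetical.
    from dataclasses import dataclass, field
    from datetime import date

    @dataclass
    class ModelRecord:
        name: str                  # internal identifier of the model
        purpose: str               # what the model is used for
        training_data_basis: str   # description of the data the model was trained on
        client_disclosure: str     # how clients are informed about the use of AI
        performance_log: list = field(default_factory=list)  # ongoing monitoring entries

        def log_performance(self, as_of: date, metric: str, value: float) -> None:
            """Append a dated monitoring entry."""
            self.performance_log.append({"date": as_of, "metric": metric, "value": value})

    # Hypothetical usage
    record = ModelRecord(
        name="portfolio-suggestion-llm",
        purpose="Generating draft portfolio proposals for advisers",
        training_data_basis="Anonymised client portfolios and public market data",
        client_disclosure="Disclosed in the pre-contractual client information",
    )
    record.log_performance(date(2024, 6, 30), "annualised_excess_return", 0.012)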

Conclusion: Responsible use of AI in the financial sector

With its new guidelines, ESMA has specified the requirements for the use of AI. The publication of the Public Statement makes it clear, first of all, that investment firms may use AI. However, special requirements apply that differ significantly from those that applied previously. ESMA's requirements should not be viewed in isolation, but must be considered in conjunction with the provisions of the AI Act in particular.


