
Course on SupTech and RegTech

June 27 - 29, 2022
Videoconference

 


The Course on SupTech and RegTech was held from June 27 to 29, 2022, for representatives of CEMLA member institutions working on these topics. Most presentations were delivered by FNA specialists, joined by guest speakers.

Welcome remarks

Dr. Isela-Elizabeth Téllez-León, Director of Financial Markets Infrastructures, CEMLA

Dr. Elizabeth Téllez delivered the opening remarks, emphasizing the transformation under way in financial systems and the broader economy, which can be analyzed through new technologies. She then shared CEMLA's progress on financial technologies, beginning with its program to sponsor regular collaboration on the implications of the potential use of new financial technologies across the regional community of central banks, with this course serving as one channel for building that capacity. Dr. Elizabeth Téllez underlined the value of the practical sessions in prompting participants to ask how these technologies could be applied in their own central banks. Finally, she stressed that the objective of the course was to generate practical knowledge on these topics.

 

 Day 1

REGTECH & SUPTECH - TRENDS & DEVELOPMENTS IN 2022

Session 1. State of the art of RegTech & SupTech: Overview of new data sources and applications

This session was given by Dr. Kimmo Soramäki, founder and CEO of FNA. He first introduced FNA, a software company specializing in network analysis and simulation for payment systems, serving mainly central banks, government authorities, commercial banks and financial market infrastructures. He then reviewed the evolution of terminology in the area: FinTech is technology that facilitates retail financial services in new ways; RegTech is technology that helps banks and FMIs comply with regulatory requirements; SupTech is technology that helps authorities monitor financial markets; and TechFin refers to technology giants, such as Apple, entering the financial services market. He noted that his talk would focus on RegTech and SupTech and how they relate.

Dr. Kimmo Soramäki explained how SupTech can reduce the burden on supervised firms, enable proactive monitoring, provide better risk insight and lead to better resource allocation. He noted that opportunities keep growing thanks to a significant technological leap in recent years: it is now possible to monitor in real time and to analyze granular rather than aggregated data, which leads to better predictions. He then spoke about the importance of moving from qualitative to quantitative analysis, for example with agent-based models or machine learning, and mentioned that reports are now much more visual, so the communication of analysis has improved considerably.

Continuing his presentation, Dr. Kimmo Soramäki highlighted the main components of SupTech: quality data from different sources and data science capabilities combined with specialized domain knowledge, with a focus on closing the gap between research and deployment, because great research is of little use if it cannot be put into practice. He described the usual SupTech workflow: first, obtain the data; second, clean and transform it and resolve entities (for example, deciding how to classify a bank that is a subsidiary of another); third, identify the resulting groups correctly; fourth, analyze the data and visualize the results; and finally, automate the analysis in real time and send alerts to those responsible.
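The workflow described above can be sketched in a few lines of Python. This is a minimal illustration, not FNA's actual pipeline; the entity names, exposures and alert threshold are invented.

```python
# A minimal sketch of the five-step SupTech workflow described above:
# ingest -> clean & resolve entities -> group -> analyze -> alert.
# All names and thresholds are hypothetical illustrations.

RAW_REPORTS = [
    {"entity": "Bank A ", "parent": None, "exposure": 120.0},
    {"entity": "bank a subsidiary", "parent": "Bank A", "exposure": 80.0},
    {"entity": "Bank B", "parent": None, "exposure": 40.0},
]

def clean(report):
    # Step 2: normalize names so the same entity is always spelled one way.
    report = dict(report)
    report["entity"] = report["entity"].strip().title()
    return report

def resolve_group(report):
    # Step 3: a subsidiary is attributed to its parent group.
    return report["parent"] or report["entity"]

def analyze(reports, threshold=100.0):
    # Step 4: aggregate exposures per group.
    totals = {}
    for r in map(clean, reports):
        g = resolve_group(r)
        totals[g] = totals.get(g, 0.0) + r["exposure"]
    # Step 5: flag groups that breach the (hypothetical) threshold.
    alerts = [g for g, total in totals.items() if total > threshold]
    return totals, alerts

totals, alerts = analyze(RAW_REPORTS)
print(totals)   # "Bank A" aggregates the parent plus its subsidiary
print(alerts)   # groups breaching the threshold
```

In a real deployment, the final step would run continuously against live data and push the alerts to the responsible supervisors rather than print them.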

Dr. Kimmo Soramäki went on to discuss applications of SupTech, which include understanding the dynamic movement of credit risk in retail and corporate portfolios based on granular borrower data; understanding the interconnections and evolving exposures of derivatives markets and exchange dynamics based on OTC trade repository (TR) data; and anomaly detection and early-warning signals based on payments data. He added that SupTech can also be applied to macroprudential analysis, financial crime, and liquidity and crisis management; for example, if a bank is in trouble, the first thing to do is review its payments.

To end the session, there was a discussion on how to implement SupTech in a central bank. Dr. Kimmo Soramäki stressed that it should be a multi-year project, although it can deliver results within a few months, and explained that it is best done in conjunction with a private company that already has experience and similar solutions deployed, so the bank does not have to build everything from scratch.

Participants also asked how to prioritize the projects in which a central bank should invest. Dr. Kimmo Soramäki replied that it is necessary to know the bank's strategic priorities and to contrast what is already available against what the bank wants to build, to avoid redoing from scratch what has already been done.

 

Session 2. Tips for designing an effective SupTech roadmap

This session was given by Joanne Horgan, Chief Innovation Officer at Vizor Software, who began by discussing the differences between FinTech, RegTech and SupTech: the first seeks to improve and automate the delivery and use of financial services for consumers; the second, to improve regulatory compliance and manage risk; and the third, to enable proactive monitoring and improve reporting, compliance and oversight of regulations. In short, she explained, FinTech seeks to increase profits, RegTech to reduce costs and risks, and SupTech to increase efficiency and reduce regulatory burden. She also discussed the risks involved in doing these tasks manually.

She then described SupTech and RegTech as two sides of the same coin, with SupTech focused on regulators and RegTech on the industry. Supervisors, she said, want to be more proactive, move closer to real time, and work with more granular data, as the industry moves toward more descriptive data analysis supported by artificial intelligence and machine learning. She also stressed the importance of bringing solutions to market, since there is a gap between development in this area and actual deployment, and noted that solutions should be considered for financial institutions of various sizes.

Joanne continued by describing problems in the current state of regulatory reporting: institutions must cooperate with one another on issues that are not their core business, and regulatory oversight and reporting are very expensive. In this context, it was discussed whether the current system really works, since implementation is very costly while the benefits are minimal.

Joanne also spoke about current trends, which include increasing the granularity, frequency and interoperability of requirements, increasing the insight obtained from the data, and standardizing report schemas. She then explained the importance of choosing technology that solves the problem at hand, not the other way around, and the need for collaboration even between competitors.

 

Hands-on exercise 1. Integrating Alternative & Payments Data into Supervisory Analysis

This session was given by Lubos Pernis, head of SupTech development at FNA, who presented the FNA platform, where payment systems can be visualized, analyzed and simulated. He explained the components of the platform and began his hands-on session on how to use it.

Lubos explained the importance of granular data, meaning records collected at the level of each instrument of a financial institution, and commented that if banks have access to such specific data, they can obtain better results. He noted that FNA's platform contains granular data on many companies: for any of them, the user can see the banks to which the company is exposed and which sectors are vulnerable in case of failure. He also showed how to view the connections between organizations related to the company and the impact it has on them, as well as a timeline of the company with markers at special events.

Hands-on Exercise 2. News Data in Focus - Overview of key operational and conduct risks & Taxonomy of data for risk analytics

Lubos Pernis explained that the idea is to predict which banks behave anomalously and to understand why. For this, each bank is given an anomaly rating. He then explained that the system's algorithms are trained on normal days and tested on anomalous days to calculate these ratings.
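The train-on-normal-days idea described above can be sketched with a simple statistical stand-in for FNA's algorithms: fit summary statistics on "normal" days, then rate a new day by how far it deviates. The data and the flagging threshold below are invented for illustration.

```python
# A hedged sketch of the train-on-normal / score-anomalous approach:
# fit statistics on normal days, then rate new days by their deviation.
from statistics import mean, stdev

normal_days = [100, 98, 103, 101, 99, 102, 100]   # e.g. daily payment volumes
mu, sigma = mean(normal_days), stdev(normal_days)

def anomaly_rating(value):
    # z-score: distance from the normal-day average, in standard deviations.
    return abs(value - mu) / sigma

print(anomaly_rating(101))      # an ordinary day -> low rating
print(anomaly_rating(150))      # a stressed day -> high rating
print(anomaly_rating(150) > 3)  # flagged as anomalous under a z > 3 rule
```

A production system would replace the z-score with a trained model, but the principle is the same: the rating measures distance from behavior learned on normal days.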

In the question session, participants asked how to adapt the model after real-world changes. Lubos commented that there is no simple answer: the first step is to check whether the model works in each particular case, and if there is an important event to analyze, the model is retrained. He explained that if it were retrained on every case, there would be many false positives, in addition to those that normally occur.

 

Day 2 

NEW FRAMEWORKS AND TECHNOLOGIES FOR RISK ANALYTICS

Session 1. LegalTech & smart contracts - what central bankers need to know

The session was given by Professor Patrick McCarty, founder of McCarty Financial LLC and lecturer at Georgetown University Law Center and at the Catholic University of America Columbus School of Law. He first defined smart contracts, stressing that they are self-executing computer code that runs without human intervention. He commented that they gained relevance with blockchain, since that technology has smart contracts embedded at its core, particularly after the launch of the Ethereum blockchain in 2015. He then described the three types that exist: legal smart contracts, which are legally binding; DAOs (decentralized autonomous organizations), which belong to blockchain communities with their own rules; and application logic contracts, which are smart contracts linked to other smart contracts.

To highlight their importance, he noted that more than one million smart contracts exist. A major problem, however, is that they are hackable, as in the notorious cases of Ronin and Wormhole, in March and February 2022, respectively. He also discussed an assurance problem: if a person loses money through a mistake or an attack, there is no guarantee of recovery, and if someone loses the passwords to their assets, those assets are lost forever.

He went on to talk about how banks have accelerated their operations using smart contracts, and how central banks are very interested in them in the context of cryptocurrencies and DeFi, although this also raises many questions and problems. Central banks, he noted, seek to regulate these markets to protect consumers, as there is currently a high risk of fraud and theft. To conclude his presentation, Professor Patrick said that smart contracts can make many operations, such as making a loan, much simpler and faster, but that the retail client must be protected.

In the discussion with participants, Professor Patrick expressed concern about the increase in hacks, and in particular large hacks, since one would expect these to decrease as the technology advances. He also said he would have expected these attacks to lead to greater regulation, as he cannot envision a system in which mechanisms that can harm consumers operate without any accountability. He stressed that products are often launched without being security-tested.

 

Session 2. SupTech for Financial Crime Analytics

This session was given by Brandon Smith, Director of Homeland Security Solutions at FNA. He began by discussing the factors that motivate an attack, which are mainly financial and political, and highlighted the increase in cyberattacks in recent years, which have cost central banks millions, making this a very important problem. He also spoke about the growth, in both number and magnitude, of money laundering, another serious problem, while stressing that institutions such as the G20 are paying increasing attention to the issue. Money laundering schemes, he noted, are becoming more and more complex and therefore harder to detect. He then described the mechanisms FNA uses to detect fraud, commenting that finding abnormal behavior can be framed as a classification problem using machine learning algorithms, particularly network-based supervised algorithms. For example, one signal that a transaction is anomalous is a very long distance between its participants, which implies a large chain of intermediaries.
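The "long chain of intermediaries" red flag mentioned above can be illustrated with a small transaction network. This is a toy sketch, not FNA's method; the entities and the hop threshold are hypothetical.

```python
# Flagging transfers whose sender and receiver are many hops apart in a
# directed transaction network. Entities and threshold are invented.
import networkx as nx

G = nx.DiGraph()
# Normal, direct settlement between two banks.
G.add_edge("Bank A", "Bank B")
# A layered chain of shell intermediaries between X and Y.
chain = ["X", "Shell 1", "Shell 2", "Shell 3", "Shell 4", "Y"]
G.add_edges_from(zip(chain, chain[1:]))

def suspicious(sender, receiver, max_hops=3):
    # Flag transfers whose shortest path exceeds the hop threshold.
    return nx.shortest_path_length(G, sender, receiver) > max_hops

print(suspicious("Bank A", "Bank B"))  # False: direct link
print(suspicious("X", "Y"))            # True: five intermediated hops
```

In practice such a distance check would be one feature among many fed into a supervised classifier, rather than a rule on its own.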

In the discussion with participants, he highlighted the advantages of network-based algorithms: one way to detect fraud is to predict links between entities and compare them with observed behavior, flagging cases where a link is predicted but does not occur, or where a link occurs but was not predicted. This prediction is modeled with a neural network.
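The link-prediction idea above can be sketched with a simple similarity score standing in for the neural network mentioned in the talk: score how "expected" a link is, then treat observed transfers with very low scores as worth reviewing. The graph and the interpretation thresholds are invented.

```python
# Link prediction as a fraud signal: compare expected links against observed
# ones. A Jaccard similarity score substitutes here for the neural network
# mentioned in the session; the graph is hypothetical.
import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ("A", "B"), ("A", "C"), ("B", "C"),   # a tight cluster
    ("A", "F"), ("B", "F"),               # F trades with the cluster
    ("D", "E"),                           # an isolated pair
])

def link_score(u, v):
    # Jaccard coefficient: shared neighbours / union of neighbours.
    _, _, score = next(nx.jaccard_coefficient(G, [(u, v)]))
    return score

# F and C share all their counterparties, so a link is strongly expected:
print(link_score("F", "C"))
# An observed transfer between D and A would contradict the model:
print(link_score("D", "A"))  # 0.0 -> unexpected link, worth reviewing
```

Both the missing-but-expected and present-but-unexpected cases fall out of the same score: high score with no observed link, or observed link with near-zero score.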

 

Exercise 1. Anomaly detections and their applications

To give a practical demonstration, Brandon explained that visualizing the anomalies in a database can be very complex, since at first the network looks like a "hairball". He started from a database in which anomalies had to be detected and progressively filtered the resulting network down to the connections most likely to be fraudulent, ending with a very small network containing all the agents involved in the possible fraud. To conclude, he recommended combining the anomalies found by FNA's algorithms with other databases to contrast or link the results, since third-party data can fill the gaps in an operation.

MODELING THE IMPACT AND IMPLICATIONS OF CBDCs AND PRIVATELY ISSUED CRYPTOASSETS

Session 1. Key differences and overlaps between CBDCs and privately issued cryptoassets

The session was given by Dr. Carlos León, Director of the FMI and Digital Currency Program at FNA, who began with the taxonomy of money. First comes central bank money, which includes cash, reserves and CBDCs, all of which are liabilities of the central bank. Next is commercial bank money, such as balances in digital accounts. Then there is electronic money, based on tokens and held mainly in electronic wallets for peer-to-peer transfers; Dr. Carlos stressed that one possible advantage of this type of money is increased financial inclusion, recalling that all these types of money are regulated. There are, however, two other types of money that are not regulated: cryptocurrencies, which are peer-to-peer but have no institution backing their value and are therefore very volatile; and stablecoins, which arose from the attempt to combine cryptocurrencies with electronic money and are backed by cash, other cryptocurrencies or other mechanisms, although there is no certainty about these backings.

Continuing his presentation, Dr. Carlos warned that CBDCs can affect commercial banks if those banks cease to be intermediaries, and that great care must be taken with data collection, which can be used for social engineering, citing China as an example. In contrast, cryptoassets are independent of governments and more efficient, but their high volatility prevents their use for buying goods, so they are not really a "currency". He concluded the section by explaining that stablecoins do maintain more stability, but there is no guarantee of their backing.

He then showed a diagram prepared by FNA to determine what type of distributed technology a CBDC requires in each case; for example, issuers must ask who it is aimed at (retail or wholesale users). Three types of ledger are contemplated: one completely controlled by the central bank; one the central bank manages with commercial banks participating; and one the central bank only approves but commercial banks control. He also raised the question of what the obstacle to financial inclusion is in each case: if the issue is lack of internet access, there is no point in launching a CBDC that also requires internet access. Another question Dr. Carlos posed concerned the motivations for implementing a CBDC, which may be to counter the gradual disuse of cash or to preserve monetary sovereignty against the entry of private digital currencies. He mentioned that cash has many properties, such as resilience, universal access and privacy, that a CBDC would ideally preserve.

He then explained the need for realistic simulations of CBDCs so that central banks can test their designs, presenting FNA's agent-based simulation tool. In most cases the commercial bank is an intermediary between the central bank and final users, and each country's laws and mechanisms can be implemented in the simulation. The model's outputs are the rate of CBDC adoption and at what stage people would use it, the composition of consumer wealth, and the diffusion of payment instruments. On the banks' side, the simulation returns the degree of disintermediation, the balance sheet, and an analysis of specific scenarios.
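To give a flavor of what an adoption simulation involves, here is a toy agent-based sketch in the spirit of the tool described above. It is far simpler than FNA's simulator: agents hold one payment instrument and imitate a random peer each period, and every parameter (population size, periods, imitation probability, initial mix) is invented.

```python
# A toy agent-based sketch of payment-instrument diffusion: agents holding
# cash, deposits or CBDC copy a random peer with some probability each
# period. All parameters are hypothetical illustrations.
import random

random.seed(42)

N, PERIODS, IMITATE_P = 200, 50, 0.3
# Initial mix: mostly deposits and cash, a few early CBDC adopters.
agents = ["deposit"] * 120 + ["cash"] * 70 + ["cbdc"] * 10

for _ in range(PERIODS):
    for i in range(N):
        if random.random() < IMITATE_P:
            peer = random.randrange(N)
            agents[i] = agents[peer]   # adopt a random peer's instrument

shares = {k: agents.count(k) / N for k in ("deposit", "cash", "cbdc")}
print(shares)  # instrument mix after 50 periods of peer imitation
```

A real model would add heterogeneous agents, bank balance sheets and policy levers; the value of the ABM approach is that such scenario-specific rules can be layered onto exactly this kind of loop.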

 

Session 2. Impact of challenger FMIs on payments and settlements

Continuing from the previous session, Dr. Carlos explained that the difference between traditional FMIs and a new type of institution, called challenger FMIs, is that the latter use new technologies, such as distributed ledgers, to improve the efficiency, supervision and regulation of payment systems. CBDCs are one example of their applications, although Dr. Carlos stressed that the purpose of CBDCs is to complement cash, not to replace it, not even in the long term. Nevertheless, the impact of challenger FMIs is that supervision and regulation will have to adapt to changing payment systems, using new tools and mechanisms oriented toward an international vision.

In the discussion with participants, the practical difference between CBDCs and cryptocurrencies was discussed, highlighting that the latter are not a good common currency due to their volatility; they are therefore not competitors, since they target different markets. There were also comments on the high reputational risk to a central bank if its CBDC fails: if VISA fails for 30 minutes it generates chaos, and CBDCs are not exempt from this. He concluded by noting the importance of running a CBDC on a channel separate from traditional banks and services, so that if those fail, the CBDC keeps operating.

 

Day 3

GRANULAR SUPERVISORY & PAYMENTS DATA IN FOCUS

Session 1. Taxonomy of granular supervisory data and its applications. Tips for integrating payments data into supervisory analysis

The session was delivered by Perttu Korhonen, Associate Director of Financial Analysis and Innovation at FNA. He began by explaining that supervisors often lack the capacity to monitor all financial institutions manually, so automating these functions is very important, enabling a supervisor to know in real time when something relevant happens and to ignore irrelevant information. He stressed that finance is an area everyone agrees needs good regulation, to protect banks and institutions as well as customers. Supervision, he noted, must be risk-based: producing regular reports, conducting day-to-day monitoring and making on-site visits for risk assessments. Data analysis is important in all of these, but it is not the only element; in this context, SupTech helps make the process both more efficient and more effective.

Perttu stressed that supervisors are knowledge workers: they take data and facts, and when these are given proper context and the relevant is separated from the irrelevant, they become information; the important information is then retained, explicitly and tacitly, generating knowledge. He mentioned that many steps in the passage from data to information are repetitive and can be automated, so that much effort and time currently wasted can be put to better use.

He then talked about their program to transform information: first facilitating access and reducing data-manipulation time, then making it easier to give data context, and finally retaining the knowledge generated by regulators, so that when they retire or change employers that knowledge is not completely lost. To develop SupTech, he commented, banks must have a good idea of what they have and where they want to go, and from there they can start developing the framework. He mentioned that no bank fails out of nowhere; there will be indications that something is wrong, underlining the importance of a long-term vision in SupTech.

He outlined a vision of SupTech in 2030: automated information management on a single platform that continuously learns from its users, allowing each person's strengths to be leveraged and their weaknesses compensated by learning from others. He stressed the importance of the platform retaining knowledge after people leave, without neglecting that people still have a role and that staff development benefits everyone. He concluded by remarking that in the 2030 SupTech, information will not be scattered across many places but held on the platform, accessible to everyone.

 

Hands-on exercise 1. SupTech & RegTech applications for macro- and micro-prudential analysis, systemic stresstesting, and payments oversight

It was given by Ivana Ruffini, General Director of Advanced Analytics at FNA, who began by highlighting the impact of FinTech, particularly non-associated banks, and that it’s increasing, just as legal patents in associated technology have been increasing. Then began the explanation of network analysis in simulations, with which traditional models can be improved, and which can facilitate the entry of granular data, apply different rules and behaviors and show side effects, tertiary effects, quaternary effects, etc.

Ivana noted that historical data are necessary, but that it is even more important to simulate scenarios that have not occurred, for example a more severe pandemic or a longer war in Ukraine. She also gave concrete use cases, such as designing payment systems, evaluating changes in the environment, validating a system, and running stress and scenario tests.

Continuing her presentation, Ivana discussed agent-based models (ABMs), which make it possible to observe how agents respond to changes in the system, to prepare scenarios, and to obtain results from the agents' possible responses. For example, to see the impact of a change in liquidity, it is not necessary to know why it would happen, only what might happen if it does, so as to be prepared: the reasons for a scenario need not be explained, only the consequences to be evaluated. She showed that the advantages of ABMs are the ability to model complex behaviors flexibly and realistically, while the disadvantages are the long development time and high sensitivity to model assumptions.

 

Hands-on exercise 2. New Opportunities for FMIs Design & Oversight 

Ivana continued by giving the most important considerations for analyzing an RTGS system: it consumes a lot of cash, liquidity is not free, and payment delays can be expensive and lead to liquidity hoarding. She also mentioned that technological innovations such as blockchain do not solve the liquidity problem, and that cooperation does not arise naturally; for all these reasons, RTGS systems are complex and difficult to analyze. She also discussed network connectivity, that is, how many participants are directly connected, and how liquidity is distributed, since it is almost certainly not distributed optimally.
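The liquidity-versus-delay trade-off described above can be shown with a toy settlement loop: with ample opening liquidity every payment settles on arrival, while with scarce liquidity payments queue and accumulate delay. The payment schedule, balances and incoming-funds rate are all invented.

```python
# A toy RTGS illustration: less opening liquidity -> more queued payments
# and more accumulated delay. All figures are hypothetical.
from collections import deque

payments = [40, 30, 50, 20]        # outgoing payments, in order of arrival

def settle(opening_liquidity, incoming_each_tick=30):
    balance, queue, delays = opening_liquidity, deque(payments), 0
    while queue:
        while queue and queue[0] <= balance:
            balance -= queue.popleft()   # settle immediately if funded
        if queue:                         # otherwise wait one tick for
            delays += len(queue)          # incoming funds, counting one
            balance += incoming_each_tick # unit of delay per queued payment
    return delays

print(settle(140))  # ample liquidity: everything settles at once, 0 delay
print(settle(40))   # scarce liquidity: payments queue and wait
```

Even this tiny model shows why liquidity is not free: holding less of it pushes costs into delay, which is the tension real RTGS simulations are built to quantify.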

 

Hands-on exercise 3. Examples of Alternative Data Analytics through SupTech

This session was given by Katerina Rigana, FinTech Business Designer at FNA, who began by explaining the use case to be analyzed on the FNA platform: studying the causal effects of one currency on another through their connections in the foreign-exchange (Forex) market. She explained that a clustering model was used in which nodes were grouped into the communities they belonged to, with link weights representing the size of the effect between them. She then walked through how to use the FNA platform to reach the conclusions of the analysis, ending with a question session on the operation and the results obtained.
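The community-detection idea described above can be sketched as follows: currencies as nodes, estimated spill-over strength as edge weights, and a modularity-based clustering to find groups of closely linked currencies. The weights here are invented, and greedy modularity maximization stands in for the model actually used on the platform.

```python
# Community detection on a weighted currency network: strongly linked
# currencies cluster together. Edge weights are hypothetical.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

G = nx.Graph()
G.add_weighted_edges_from([
    ("EUR", "CHF", 0.9), ("EUR", "GBP", 0.8), ("CHF", "GBP", 0.7),  # Europe
    ("MXN", "BRL", 0.8), ("MXN", "COP", 0.9), ("BRL", "COP", 0.6),  # LatAm
    ("EUR", "MXN", 0.1),                       # weak cross-cluster link
])

communities = greedy_modularity_communities(G, weight="weight")
for c in communities:
    print(sorted(c))  # each line is one detected currency community
```

The strong intra-group weights pull each triangle into its own community, while the weak EUR-MXN link is not enough to merge them.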

 

Remaining resilient during SupTech innovation: How to protect the data in a central bank

The session was delivered by Manit Sahib, former Head of Penetration Testing at the Bank of England and current Director of Global Intelligence at Picnic Security. He began by highlighting the importance and impact of cyberattacks today, and then explained the three most common attacks: phishing, usually by email; supply-chain attacks, in which attackers infiltrate through an independent partner of the organization with weaker security; and ransomware attacks, which prevent users from accessing their information. He noted that one in four information leaks is due to phishing and that 62% of intruders enter a system through the supply chain. Manit also stressed that even large companies are vulnerable, such as Facebook, which has been hacked twice in recent years.

He then asked why central banks would be attacked, and gave four reasons: monetary gain; national espionage, to steal information or intellectual property; damaging the reputation of the bank or country; and opening a breach in the economy. Manit cited the central bank of Bangladesh, which suffered estimated losses of more than USD 81 million. He stressed the importance of staying ahead on security issues because attacks are escalating.

He then outlined how to build cyber resilience, starting with knowing what an attack can do, how to respond and how to recover, applied to the bank's people, processes and technology. The Bank of England created a framework for intelligence-led cybersecurity testing, called CBEST, which tests the cyber resilience of firms' business services by mimicking the actions of real attackers, to better understand weaknesses and vulnerabilities so the firm can take action. Manit stressed that each company operates differently, so it will be attacked differently. Potential attackers range from inexperienced individuals to activists, organized crime and, finally, nation-states; central banks are targeted by the latter two.

In the discussion with participants, it was noted that regulations or sanctions cannot provide effective protection, since attacks can come from anywhere in the world. He also spoke about motives, explaining that they vary by origin: an attack may come from a country seeking to spy and steal information or resources, or, as in the case of North Korea, be motivated exclusively by money.

 

End of the event

Dr. Isela-Elizabeth Téllez-León, Director of Financial Markets Infrastructures, CEMLA

Dr. Elizabeth Téllez gave the closing speech, thanking the participation of both the panelists and the attendees.