Daoud Abdel Hadi from Eastnets discusses how AI can bring peace of mind to financial institutions by adding context to the business analysis process
Why are false positives still a major issue in the current age of data? Unfortunately, high false positive rates remain a reality for financial institutions using transaction monitoring for financial crime. Unsurprisingly, AI-based monitoring has dominated the conversation around false positive reduction.
This is due to its ability to harness historical data and make more precise predictions than its counterpart, rules-based monitoring. However, despite AI’s potential to be leaps ahead of rules in terms of precision, AI-based predictions are just as prone to poor optimisation.
A balancing act
With the unrelenting increase in financial crime over the past few years, financial institutions have been under enormous pressure to put financial crime risk at the forefront of their concerns. Regardless of whether financial institutions adopt a rule-based or AI-based approach to transaction monitoring, both need careful tuning or else risk facing the consequences of a badly optimised system.
A monitoring system that is too strict inevitably leads to higher false positive rates, which are both time-consuming and operationally costly to work through. On the other hand, a system that is too lenient can miss true positives, exposing banks to heavy penalties and reputational damage. Financial institutions can avoid both outcomes by optimising their transaction monitoring systems from the get-go.
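To make that trade-off concrete, here is a minimal sketch, using entirely made-up transactions and thresholds, of how tightening or loosening a single amount threshold shifts the balance between false positives and missed true positives:

```python
# Hypothetical sketch: a single amount threshold trades false positives
# against missed detections on labelled historical transactions.
# All figures and field layouts here are illustrative assumptions.

transactions = [
    # (amount, is_truly_suspicious)
    (500, False), (1200, False), (9500, True), (300, False),
    (15000, True), (7000, False), (11000, False), (20000, True),
]

def evaluate_threshold(txns, threshold):
    """Alert on any transaction at or above the threshold."""
    false_positives = sum(1 for amt, bad in txns if amt >= threshold and not bad)
    missed = sum(1 for amt, bad in txns if amt < threshold and bad)
    return false_positives, missed

for threshold in (1000, 10000, 25000):
    fp, missed = evaluate_threshold(transactions, threshold)
    print(f"threshold={threshold}: {fp} false positives, {missed} missed")
```

On this toy data, a low threshold drowns analysts in false positives, a high one lets true positives through, and the middle ground still misses something; this is exactly the tension that tuning against real historical data is meant to resolve.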
Striking that balance between strict and lenient monitoring requires redefining how business analysis is conducted and approaching things in a more data-driven, evidence-based manner.
Business analysis and its shortcomings
Business analysis is a critical phase in the development of any financial crime prevention system. It is an exercise of risk assessment conducted prior to implementation, which provides an opportunity to assess the financial institution’s vulnerabilities to financial crime and develop strategies to mitigate those risks.
This check is typically performed by business analysts with domain expertise in collaboration with the financial institution’s compliance team. The aim is to identify the red flags that pose the greatest risks to the bank and its customers. Once the risks have been identified, both parties agree on how to monitor customers and their transactions by defining customer segments, scenarios, AI monitoring methods, and more.
So why do transaction monitoring systems perform sub-optimally even after business analysis?
This can be attributed to the lack of data analysis during this phase. Key decisions such as how to segment customers or what thresholds to apply for different scenarios are typically decided by the compliance team according to their risk appetite.
In many cases, the analysis behind those decisions is far from sufficient, if it happens at all, which can have a devastating impact on performance down the line. In addition, business analysts' lack of access to data makes it difficult for them to provide meaningful recommendations, which would be very valuable given their extensive knowledge of the solution itself.
Data provides context, and without it business analysis is akin to playing golf in the dark: it is difficult to gauge how the monitoring system will perform. Instead of approaching things blindly, banks can leverage their existing data and work with business analysts on a more evidence-based approach, tailoring scenarios to their existing customers, transactions, and patterns of behaviour for a more bespoke, optimised solution.
Evidence-based business analysis
With the abundance of data available today, there is no excuse for badly optimised solutions, especially as the benefits of data-driven business analysis have already begun to show.
In one case study, a bank suffering from over 100k false positives a week engaged in a two-week business analysis with a financial crime prevention expert and managed to reduce false positives by 80%, even without the use of AI. With just a sample of the bank’s data, analysts were able to quickly identify where customer transaction behaviour conflicted with the implemented scenarios, and recalibrated the scenarios and segments accordingly.
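The kind of recalibration described above can be illustrated with a small sketch. The segment names, figures, and the choice of a mean-plus-three-sigma cut-off are all assumptions for illustration, not the actual method used in the case study; the point is that each segment's threshold is derived from that segment's own historical behaviour rather than a single flat limit:

```python
# Illustrative per-segment threshold calibration: each segment gets a
# threshold based on its own historical amounts. Data is made up.
import statistics

historical_amounts = {
    "retail":    [120, 80, 450, 300, 95, 600, 210, 150],
    "corporate": [12000, 30000, 8000, 25000, 40000, 15000],
}

def calibrate_threshold(amounts, z=3.0):
    """Flag amounts more than z standard deviations above the segment mean."""
    mean = statistics.mean(amounts)
    stdev = statistics.pstdev(amounts)
    return mean + z * stdev

thresholds = {seg: calibrate_threshold(a) for seg, a in historical_amounts.items()}
print(thresholds)
```

A flat threshold tuned for retail customers would flood analysts with alerts on routine corporate payments; calibrating per segment is one simple way to let the data set the boundary instead.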
To speed up the analysis phase, lightweight data analysis tools that can be easily installed anywhere have been developed. These tools bring the bank’s data to life using a combination of visualisations and its own built-in AI and statistical models.
The advantage of leveraging AI and statistical methods within these tools is their ability to simulate scenarios and models against historical data and assess their performance in terms of the quantity and quality of detections, estimating precisely how a particular configuration will behave in the long term.
By understanding customer transaction patterns and behaviours, analysts can derive meaningful segments, simulate scenarios with different thresholds, and identify redundant overlapping scenarios, all before going to production. Taking this a step further, sophisticated AI anomaly detection models can help banks identify blind spots by revealing hidden risky patterns that would have otherwise been undetected by rules, giving banks the opportunity to introduce additional controls.
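As a minimal stand-in for the anomaly-detection idea, the sketch below flags customers whose weekly transaction count sits far outside the population norm, a pattern an amount-based rule would never surface. Real deployments would use far richer models; the data and the two-sigma cut-off are assumptions for illustration:

```python
# Toy blind-spot detection: an unusually high transaction *frequency*
# (e.g. structuring via many small payments) evades amount thresholds
# but stands out statistically. All data here is made up.
import statistics

weekly_tx_counts = {
    "cust_a": 12, "cust_b": 9, "cust_c": 11, "cust_d": 10,
    "cust_e": 95,  # burst of small transactions, invisible to amount rules
    "cust_f": 8,
}

counts = list(weekly_tx_counts.values())
mean, stdev = statistics.mean(counts), statistics.pstdev(counts)

# Flag anyone more than two standard deviations from the population mean.
outliers = [c for c, n in weekly_tx_counts.items() if abs(n - mean) > 2 * stdev]
print(outliers)
```

Surfacing such outliers gives the compliance team a concrete reason to add a frequency-based control alongside the existing amount-based scenarios.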
Ultimately, the aim is to gather evidence that a particular solution actually works as expected on the available data before deployment. And while this exercise takes slightly longer than standard business analysis, it gives the bank peace of mind that the implemented solution is effective, and sets workload expectations by simulating the number of detections that will be generated.
One logistical challenge of conducting such an exercise revolves around data access between the bank and analysts. Cloud computing makes this process simple.
The bank’s IT and security teams set up a secure, private server on their own network with a copy of their data on it, similar to a UAT environment. Analysts then supply IT with the files containing their analysis tools, which are installed on the server. The vendor’s and the bank’s security teams set up private access to that server so analysts can perform their analysis without the data ever leaving the bank’s premises.
Minimising false positives with data analysis
Financial institutions have for a long time been plagued by high false positive rates, much of which can be significantly reduced by following better practices during the business analysis phase.
Business analysis is designed to identify financial crime risks and develop strategies to mitigate them, yet conducting it without data analysis is counterproductive: it can lead to badly optimised segments, scenarios and models, the very cause of false positives. Adopting a more evidence-based approach to business analysis, one that involves analysing historical data, allows banks to simulate and tune transaction monitoring systems before deployment, ensuring things operate as expected from the very beginning.
About the author
Daoud Abdel Hadi
Lead Data Scientist - Eastnets
Solving problems using data has been a part of my life for the past six years. I have the privilege of using my data science skills to tackle real-world issues such as fraud, money laundering, and terrorist financing by any means necessary, whether it’s machine learning, graph theory or simple rules.
The original article was shared by AI Magazine.