Financial services companies have long been early and enthusiastic adopters of data science, and they have contributed much to pushing both the art and science of the discipline to the leading edge.
Insurance, banking, and investment companies all ingest vast amounts of data across hundreds or even thousands of data points to build models designed to maximize growth and minimize risk. As a result, there’s been an explosion in data science tools, technologies, and processes over the past ten years.
At Domino, we’re proud to have played our part in driving data science forward by partnering with many of the top financial institutions in the world, such as Moody’s Analytics, BNP Paribas Cardif, Allstate, S&P Global, and Lloyds Banking Group. As a result, we’ve had the opportunity to witness firsthand the best practices these companies have developed.
These companies and others across the financial services industry are using machine learning, artificial intelligence, and robotic process automation to disrupt the operational status quo and to create a competitive advantage. To help you better understand their impact, let’s look at how these new tools and techniques are being used, along with challenges to watch out for during implementation.
Descriptive and predictive analytics have a long history of success within financial services. They help organizations understand the potential risk of a credit customer, assess the performance of existing assets, or simulate the potential of new assets acquired through mergers and acquisitions.
The exponential growth of the underlying datasets used by financial services means that, in many cases, regression-based risk models are no longer powerful enough to deliver the highest levels of accuracy. This has led to substantial investment in machine learning technologies that enable institutions to work across multivariate, high-volume data sources.
However, many organizations struggle to see a return on their investment, either because of the difficulty of deploying meaningful models into production or because underlying business processes are never changed to support the models once they are deployed. One of the primary challenges in leveraging machine learning is tracking the full development history of a model and being transparent about the why and how, as much as the what, of each prediction.
Another challenge is the need for specialized capabilities from both professional data scientists and new systems to enable machine learning tasks to run efficiently. The move towards GPU and APU processing technologies means that many businesses cannot keep pace with the needs of their data science teams, who require the freedom to explore and experiment with these new technologies.
To be successful in deploying machine learning models, institutions need to combine expert domain knowledge and high levels of capability in data science with a platform that makes it easier to explain and describe how models are built, what data has been used, and why any given decision was made by the algorithm.
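To make the explainability point concrete, here is a minimal sketch, in plain Python, of what "why a decision was made" can look like for a simple linear risk score. The feature names, weights, and threshold are purely illustrative assumptions, not any institution's actual model; real credit models are far more complex, but the principle of reporting per-feature contributions alongside the decision is the same.

```python
# Hypothetical sketch: explaining a linear credit-risk score by
# reporting each feature's contribution to the final decision.
# Feature names, weights, and the threshold are illustrative only.

WEIGHTS = {
    "debt_to_income": -2.0,   # a higher ratio lowers the score
    "years_of_history": 0.5,  # a longer credit history raises it
    "late_payments": -1.5,    # each late payment lowers it
}
BIAS = 1.0
THRESHOLD = 0.0  # score >= threshold -> approve

def score_with_explanation(applicant: dict) -> tuple[bool, dict]:
    """Return the approve/decline decision plus per-feature contributions."""
    contributions = {
        name: WEIGHTS[name] * applicant[name] for name in WEIGHTS
    }
    total = BIAS + sum(contributions.values())
    return total >= THRESHOLD, contributions

approved, why = score_with_explanation(
    {"debt_to_income": 0.4, "years_of_history": 6, "late_payments": 1}
)
# The `why` dict makes the decision auditable: each entry shows how much
# a single input pushed the score up or down.
```

A platform acting as a system of record would capture this kind of explanation, together with the training data and model version, for every decision that reaches a customer or a regulator.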
Robotic process automation (RPA) technologies enable businesses to automate many manual and menial processes that otherwise require human intervention. For financial services organizations, RPA has lowered the cost of processing documents by verifying that all criteria and critical information have been entered, while automating coding processes for expenses and income on the balance sheet.
Improved capabilities in natural language processing (NLP) mean that machines can read and interpret the meaning of words within documents, or even from transcribed phone calls. Along with improvements in decisioning, this makes RPA more capable than ever of automating certain human intelligence tasks. In successful implementations, RPA leads to lower failure rates in data entry, lower overhead costs for processing documentation and loan applications, and more complete customer records.
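The document-completeness checks described above can be as simple as pattern matching before a record is routed onward. The following is a minimal sketch, assuming hypothetical field names and formats for a loan application; a production system would use far richer NLP, but the control flow, flag what is missing rather than fail silently, is the essence of the RPA win.

```python
import re

# Hypothetical sketch of an RPA-style completeness check: verify that a
# loan-application document contains every required field before it is
# routed onward. Field names and patterns are illustrative only.

REQUIRED_FIELDS = {
    "applicant_name": re.compile(r"Name:\s*\S+"),
    "annual_income": re.compile(r"Income:\s*\$?[\d,]+"),
    "loan_amount": re.compile(r"Loan Amount:\s*\$?[\d,]+"),
}

def missing_fields(document: str) -> list[str]:
    """Return the names of required fields not found in the document."""
    return [
        name for name, pattern in REQUIRED_FIELDS.items()
        if not pattern.search(document)
    ]

doc = "Name: J. Doe\nIncome: $82,000\n"  # no loan amount entered
# missing_fields(doc) flags the gap so a human (or a downstream bot)
# can request the missing information instead of the record failing later.
```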
A clear risk of RPA is that it becomes a bandage that masks broken processes or systems integrations. When regulation and compliance require processes and systems to be updated, failing to distinguish between automating human processes that work as expected and using RPA to avoid proper systems integration or an underlying business-process fix can lead to massive costs.
By focusing on clear end-to-end business process mapping, you can ensure that RPA isn't used to paper over broken processes or underlying systems-integration and software issues. This helps your business achieve better overall outcomes while mitigating potential future risks.
Digital assistants, chatbots, and next-best-action systems are the most common examples of artificial intelligence (AI) use cases within financial institutions. AI is best described as a combination of decisioning systems like machine learning, process automation technologies, and new digital interfaces. When combined, they provide the potential for full automation of sequential tasks that would otherwise require human intervention.
The potential for providing a full banking teller-style experience to a consumer in their homes through the use of an AI app is compelling. It allows the consumer instant access to the information and services they require while lowering the overhead costs of servicing that customer.
However, many solutions have become overhyped within the market, leading to failures in deployment. Deploying these systems into organizations can be challenging and complex, in no small part due to the need for extensive rework to existing business processes and underlying systems of record.
For an AI solution to be useful, it must be able to act on behalf of the customer. This often requires microservices that can do things like open a new account, transfer money from account to account, or provide a summary of expenses on command. In many cases, businesses put a smart-looking digital front end over the top of poorly performing legacy systems and processes, expecting a silver bullet solution. As anyone who has ever been stuck in an endless loop with a chatbot can tell you, this leads to a frustrating customer experience, which is exactly the problem you were trying to solve in the first place.
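To illustrate what "acting on behalf of the customer" requires, here is a minimal sketch of the kind of small, well-defined service an AI assistant would need to call. The class and method names are hypothetical; a real service would add authentication, idempotency keys, and persistence, but the point stands: without a backend capable of actually moving money, the chatbot is just a front end.

```python
# Hypothetical sketch of a money-transfer microservice an AI assistant
# could call. Account names and balances are illustrative only.

class InsufficientFunds(Exception):
    """Raised when a transfer would overdraw the source account."""

class AccountService:
    def __init__(self, balances: dict[str, int]):
        # Balances are held in integer cents to avoid floating-point error.
        self._balances = dict(balances)

    def transfer(self, src: str, dst: str, amount_cents: int) -> None:
        """Move funds between accounts, or fail without side effects."""
        if self._balances[src] < amount_cents:
            raise InsufficientFunds(src)
        self._balances[src] -= amount_cents
        self._balances[dst] += amount_cents

    def balance(self, account: str) -> int:
        return self._balances[account]

svc = AccountService({"checking": 10_000, "savings": 2_500})
svc.transfer("checking", "savings", 4_000)
# A failed transfer raises before any balance changes, so the assistant
# can report the problem to the customer rather than leave money in limbo.
```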
The disruptive potential of these new techniques and technologies is unquestionable. The ability to provide a better and more compelling customer experience, to be able to instantly assess full risk profiles of potential loan candidates, or to automate many day-to-day tasks involved in processing large volumes of transactions can have significant bottom-line benefits for financial services operations.
But managing the development of these new digital assets requires a new approach to ensure that they're transparent, explainable, and reproducible. The need for a platform to act as a system of record across the entire development process has become clear. A platform should provide a common place for specialists to collaborate, develop, and deploy their systems into production. It can also give stakeholders a place to seek clarity on how these systems behave and on individual cases where a system has made a decision.
Clearly defining the prerequisites for leveraging these technologies is essential to a successful deployment. This includes identifying the right mix of talented professionals, finding the right platforms to enable effective and efficient operations, and embedding change management into the principles of the organization to ensure adoption of these new digital assets. As the Chief Data and Analytics Officer for Allstate, Eric Huls, puts it,
“We are weaving fact-based decision-making into the fabric of the organization to not just improve how we operate, but transform the industry itself.”