
# Banks Are Changing Their Expectations and Next-Gen Data Standards – Here’s How


Financial Services (FinServ) firms are working in earnest to drive greater insights from data, advancing AI/ML initiatives and digitizing their organizations. IDG’s “2018 State of Digital Business Transformation” reports that 93% of FinServ firms either plan to adopt or have already adopted a digital-first business strategy. The majority of organizations surveyed plan to spend a significant portion of their IT budgets on these initiatives. Perhaps unsurprisingly, major priorities center on big data, artificial intelligence, machine learning, software-defined storage, and cloud.

For IT, this has meant a decade of supporting legacy systems while weaving in a wave of new technology to meet changing data collection, monitoring, analysis, and reporting needs. That means supporting and augmenting mission-critical systems such as mainframes and data warehouses. It’s been reported that 92 of the world’s top 100 banks still rely on mainframes. Beyond the cost of maintaining these systems or migrating away from them, they run core services such as online reporting and transactions, ATMs, payments, and risk management – all essential to business operations – hence, mission-critical. The security, scale, performance, and resiliency of legacy systems have long been strong arguments against change. Nevertheless, banks have to adopt new data tools to keep pace with the changing ecosystem in which they operate and to remain competitive.

So over the last eight to ten years, projects like Hadoop became critical tools for organizations trying to make sense of the massive amounts of data pouring into their businesses. However, as highlighted in the Wall Street Journal’s recent article on the Cloudera and Hortonworks merger, these technologies have “lacked key enterprise-grade features, not least security.” Why would a bank adopt a new ecosystem of data tools, yet give it a pass on the mission-critical capabilities it demands of its legacy systems? At the same time, doing nothing diminishes agility and forfeits the benefits of next-gen technology.

The past several years have seen banks experimenting with different IT models. Bolting on more new technology isn’t solving the dilemma. Building entirely from the ground up, meanwhile, is extremely costly and time-consuming, and such efforts have not fully delivered the expected revenue results. For example, State Street’s Beacon project is cutting IT costs, yet with spending now beyond its $550 million projections, is it actually advancing the digitization the bank sought? The bank’s recent $2.6 billion acquisition of Charles River looks like a costly exercise to compensate for the application and digital lift the project was expected to deliver but did not.

PwC’s recent report, Financial Services Technology 2020 and Beyond, is one piece of research that outlines the rough blueprint banks are following to tackle this very challenge. PwC calls it an “integration fabric”; for data, the market knows this as a data fabric. MapR is an example of a true data fabric: a data platform that couples enterprise-grade operational resiliency with massive scale, performance, and security. These foundational underpinnings support integrated AI, automation, hybrid cloud, and containerization for bank-wide mission-critical systems. The new technical components enable digital lending systems, real-time machine learning for credit card fraud prevention, and advanced client analytics platforms.
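
To make the fraud-prevention example concrete, here is a minimal sketch of the operational pattern such a platform supports: scoring each card transaction against a pre-trained model as it streams through the system. This is an illustration, not MapR’s implementation; the topic names, feature fields, and `fraud_model.pkl` artifact are hypothetical, and it assumes the kafka-python client (MapR Event Store exposes a Kafka-compatible API).

```python
# Hypothetical sketch: real-time fraud scoring on a transaction stream.
# Assumes a Kafka-compatible event store and a pre-trained scikit-learn
# model saved as fraud_model.pkl; topic and field names are illustrative.
import json
import pickle

from kafka import KafkaConsumer, KafkaProducer

with open("fraud_model.pkl", "rb") as f:
    model = pickle.load(f)  # e.g., a scikit-learn classifier

consumer = KafkaConsumer(
    "card-transactions",                # hypothetical input topic
    bootstrap_servers="broker:9092",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)
producer = KafkaProducer(
    bootstrap_servers="broker:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

for msg in consumer:
    txn = msg.value
    features = [[txn["amount"], txn["merchant_risk"], txn["velocity_1h"]]]
    score = model.predict_proba(features)[0][1]  # probability of fraud
    # Route risky transactions for review/decline without leaving the
    # operational path; everything else proceeds to authorization.
    topic = "fraud-alerts" if score > 0.9 else "approved-transactions"
    producer.send(topic, {**txn, "fraud_score": float(score)})
```

The point of the pattern is that the model scores transactions inside the operational flow itself, rather than in a separate batch system that reports fraud after the fact.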

TransUnion calls out Prama in its 2017 10-K as an example of a new solution for its growth strategy. Prama gives customers on-demand, 24/7 access to massive, depersonalized datasets and key analytics. MapR underpins Prama because TransUnion needed a big data platform with high availability, no single point of failure, and extreme performance for client-facing, self-service analytics. TransUnion sees so much value in Prama that the company also states in the same 10-K: “In order to more effectively address these opportunities, we have redeployed and reallocated our sales resources to focus either on new customer opportunities or on selling additional services and solutions to existing customers.”

Embedding analytics into decision-making and workflows is the next step for banks. “Attention to the last mile” is the term McKinsey uses in “Smarter Analytics for Banks” to explain how banks are missing the full value of analytics. McKinsey calls out the “significant technical and production engineering challenges” associated with this last mile.

This approach is what we call operational analytics – critical business decision-making (automated, adaptive, and real-time) embedded in operational transaction activity. It is where next-gen applications combine the immediacy of operational applications with the insights of analytical workloads. It means continuous analytics, automated actions, and rapid response, with historical and real-time data integrated in a single, unified platform. The intelligence is enabled by concepts like the rendezvous architecture, in which multiple candidate models score the same input stream and a rendezvous step selects which result to return – the machinery of machine learning logistics. These are topics that FinServ firms need to explore more deeply and apply going forward.
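
As a rough illustration of the rendezvous idea, the sketch below scores one request with several candidate models in parallel and returns the most-preferred result that arrives within a deadline. The model functions, preference ranking, and 50 ms deadline are all hypothetical; a production deployment would run each model as an independent consumer of the same input stream rather than as in-process threads.

```python
# Hypothetical sketch of the rendezvous pattern: several candidate models
# all score the same request, and a rendezvous step picks which answer to
# return. The models here are stand-in functions; in a real deployment
# each would be an independent consumer of the same input stream.
from concurrent.futures import ThreadPoolExecutor, wait

def primary_model(txn):      # current production model (stand-in)
    return {"model": "primary", "score": 0.12}

def challenger_model(txn):   # newer model under evaluation (stand-in)
    return {"model": "challenger", "score": 0.08}

MODELS = [primary_model, challenger_model]
PREFERENCE = ["challenger", "primary"]  # hypothetical ranking of answers
DEADLINE_S = 0.05                       # an answer must arrive in 50 ms

pool = ThreadPoolExecutor(max_workers=len(MODELS))

def rendezvous_score(txn):
    futures = [pool.submit(model, txn) for model in MODELS]
    done, _ = wait(futures, timeout=DEADLINE_S)
    results = {f.result()["model"]: f.result() for f in done}
    # Return the most-preferred model that met the deadline, so a slow or
    # failing model never blocks the operational decision.
    for name in PREFERENCE:
        if name in results:
            return results[name]
    raise RuntimeError("no model responded within the deadline")

print(rendezvous_score({"amount": 250.0}))
```

The design point is that model evaluation and model selection are decoupled: new models can be added to the pool and compared against live traffic without touching the decision path, which is exactly the kind of machine learning logistics operational analytics depends on.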