What financial services firms need to consider when managing security through the Covid-19 (coronavirus) pandemic


Credit markets are zero bound and equity markets stumbled into a technical bear market this week. Volatility has returned in a big way. With it, inefficiencies have appeared, given a contagion drastically different from any previously dealt with. The market is seeing inefficiencies not only in monetary policy impacts on supply shocks and market information uncertainty, but also in financial services' operational response and threat exposure. This presents a ripe opportunity for bad actors during global uncertainty. Several considerations financial services firms should weigh now and in the future include contending with misinformation, elevated business email compromise activity, work-from-home scenarios, and business continuity planning (BCP).

 

Market-Moving [Mis-] Information

As with other worldwide issues, over-hyped and/or fake news can cause markets to swing wildly on speculation, as demonstrated by the NYSE 15-minute “cool-down” Monday. Expect to see more purposeful market manipulation tied to pandemics, from reports of false hope fueling pump-and-dump schemes on small pharma stocks, to exaggerated death rates in economically volatile countries, to conspiracy theories on the ulterior cause and/or motives of this outbreak. Motivations for market-moving events will also vary, from pure monetary gain to economic disruption by a nation-state adversary. Tread lightly with the information you receive, and trade prudently until the market volatility subsides.

 

Business email compromise

As information shock transmission takes place, communication and information transparency get worse. Amidst rising confusion, bad actor activity increases. We are seeing the continued success of BEC (business email compromise) in financial services, and the pandemic offers a prime vehicle for compromise.

 

Work from home

Two specific areas of concern with work-from-home scenarios are non-critical personnel and highly regulated, infrastructure-heavy groups such as advisors and traders. Beyond the internet capacity issues now being tested, firms need to ensure that any local or cloud email, collaboration, and productivity suites running over personal and unsecured lines or lackluster VPNs have higher-order levels of protection.

Traders, as an example, present a more unusual situation: whether they have the right infrastructure to conduct business remotely is being questioned should quarantining come to it. Capabilities have improved, such as web-based Bloomberg terminals and trade order and execution management systems.

Italy provides the best backdrop for contending with a mass quarantine scenario. Many of Italy's trading rooms in the Lombardy region are working remotely. This is a complicated situation to navigate, as regulators must offer flexibility, and liquidity could be hampered without full trading operations should markets react more violently. Once remote, supervision can be compromised, opening the door to potential risks. Do banks have recorded-communication compliance capabilities in these scenarios? This question also extends to wealth advisors and brokers.

 

BCP

Lastly, the virus is pressing firms to ramp up their business continuity plans. Financial markets have drastically improved their capabilities since 9/11, other influenza outbreaks (swine and avian), SARS, and ongoing compliance requirements. The FCA, for example, has issued a statement on Covid-19 (coronavirus) highlighting its expectation that firms are prepared. The question arises as to how business continuity planning will have to change going forward to address the scenarios discussed above, as well as others.

As this event has given us an unprecedented new planning scenario, there is no clear-cut answer right now as to exactly how planning will change. However, much like the guaranteed changes to come from financial services regulators and in the supply chain, fiscal, and public safety response, we will see cybersecurity measures change as well.

Navigating the Financial Services Hype of AI and Machine Learning


This is a piece I created jointly with Instinet and wanted to share here. As one of the largest global institutional agency brokers, Instinet is not only big, it is an icon in the industry. They consider themselves the original fintech, and for good reason: they gave birth to electronic trading for equities in 1969, and it has reshaped markets ever since.

Instinet works tirelessly to improve its broker service, its trading, and its research. For research, a vital component being added is big data; this means helping clients understand what it is and how it can help. This piece aims to do just that – help clients understand more about the work of big data and give them the “edge.”

The financial services industry is awash in stories about machine learning and artificial intelligence (AI) applications. Definitions vary widely and discussions are often vague, so it can be hard to determine what is real and what is “spin.”

This report was co-authored to educate the reader on the following:

  • The 4Vs of big data as they relate to the financial services industry
  • Challenges related to big data and how they can be overcome
  • How big data can benefit trading and investment firms
  • A glossary of terms that may be useful in deciphering the jargon

The report hyperlink: Navigating the Hype

There is other great research from Instinet that is also well worth checking out under new and notable: https://www.instinet.com/

Banks Are Changing Their Expectations and Next-Gen Data Standards – Here’s How


Financial Services (FinServ) firms are working in earnest to drive greater insights from data, advancing AI/ML initiatives and digitizing their organizations. IDG's “2018 State of Digital Business Transformation” highlights that 93% of FinServ firms have plans to adopt or have already adopted a digital-first business strategy. The majority of organizations surveyed plan to spend a significant portion of their IT budgets on these initiatives. Perhaps unsurprisingly, major priorities are centered on big data, artificial intelligence, machine learning, software-defined storage, and cloud.

 

For IT, this has meant a decade of supporting legacy systems while meshing in a wave of new technology for changing data collection, monitoring, analysis, and reporting needs. This means supporting and augmenting mission-critical systems; major examples would be mainframes or data warehouses. It's been reported that 92 of the world's top 100 banks still rely on mainframes. Beyond the cost of keeping or migrating away from these systems, they manage core services such as online reporting and transactions, ATMs, payments, and risk management – all essential to business operations – hence, mission-critical. The reliance on legacy systems' security, scale, performance, and resiliency has been a strong argument against change. Nevertheless, banks have to apply new data tools to keep pace with the changing ecosystem in which their businesses operate and to remain competitive.

So over the last eight to ten years, projects like Hadoop became critical tools for organizations trying to make sense of massive amounts of data pouring into their businesses. However, as highlighted in the Wall Street Journal's recent article on the Cloudera and Hortonworks merger, it's known that these technologies have “lacked key enterprise-grade features, not least security.” Why would a bank leverage a new ecosystem of data tools, yet give it a pass on requiring the same level of mission-critical capabilities its legacy systems require? At the same time, doing nothing diminishes agility and the ability to harness next-gen tech.

The past several years have witnessed banks experimenting with different IT models. Bolting on more new technology isn't solving the dilemma. Building entirely from the ground up is extremely costly and time-consuming, and such efforts are not bearing the results fully expected on the revenue side. For example, State Street's Beacon project is cutting IT costs, yet as the bank has spent beyond its $550 million projections, is it accretive to the very digitization sought? The bank's recent $2.6 billion acquisition of Charles River appears to be a costly exercise to supplement the lack of application and digital lift expected from the business.

PwC's recent report, Financial Services Technology 2020 and Beyond, is one piece of research that outlines the rough blueprint banks are following to tackle this very challenge. They call it an “integration fabric” – for data, the market knows this as a data fabric. MapR is an example of a true data fabric. This means a data platform that couples enterprise-grade operational resiliency with massive scale, performance, and security. These foundational underpinnings drive integrated AI, automation, hybrid cloud, and containerization functionality for bank-wide mission-critical systems. The new technical components allow for digital lending systems, real-time machine learning credit card fraud prevention systems, or advanced analytics client platforms.

TransUnion calls out Prama in its 2017 10-K as an example of a new solution for its growth strategy. Prama provides customers with on-demand, 24/7 access to massive, depersonalized datasets and key analytics. MapR underpins Prama because TransUnion needed a high-availability, no-single-point-of-failure, extreme-performance big data platform for client-facing, self-service analytics. TransUnion sees so much value in Prama that it also states in the 10-K: “In order to more effectively address these opportunities, we have redeployed and reallocated our sales resources to focus either on new customer opportunities or on selling additional services and solutions to existing customers.”

Embedding analytics into decision-making and workflows is the next step for banks. “Attention to the last mile” is the term McKinsey uses to explain how banks are missing the full value of analytics in “Smarter Analytics for Banks.” McKinsey calls out the “significant technical and production engineering challenges” that are associated with this last mile.

The approach is what we call operational analytics – critical business decision-making (automated, adaptive, and real-time) embedded in operational transaction activity. It is where next-gen applications combine the immediacy of operational applications with the insights of analytical workloads. It means continuous analytics, automated actions, and rapid response with the integration of historical and real-time data in a single, unified platform. The intelligence is enabled by concepts like the rendezvous architecture, which enables machine learning logistics. These are topics that FinServ firms need to explore more deeply and apply going forward.
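To make the rendezvous idea concrete, here is a minimal sketch in plain Python (the models, their latencies, and the budget are invented for illustration, and this is not any vendor's actual implementation): every request is scored by several models in parallel, the answer from the most-preferred model that responds within the latency budget is used, and a simple default acts as the fallback.

```python
# Rendezvous-style scoring sketch: each request goes to all candidate models;
# the highest-priority model that answers within the latency budget wins.
# Model functions, latencies, and the budget below are hypothetical.
from concurrent.futures import ThreadPoolExecutor, wait
import time

def primary_model(txn):          # stand-in for a heavyweight model
    time.sleep(0.03)
    return ("primary", txn["amount"] > 900)

def canary_model(txn):           # simpler, always-fast baseline
    time.sleep(0.005)
    return ("canary", txn["amount"] > 1200)

MODELS = [primary_model, canary_model]   # ordered by preference
LATENCY_BUDGET_S = 0.02                  # illustrative SLA

def score(txn):
    with ThreadPoolExecutor(max_workers=len(MODELS)) as pool:
        futures = {pool.submit(m, txn): rank for rank, m in enumerate(MODELS)}
        done, _ = wait(futures, timeout=LATENCY_BUDGET_S)
        finished = sorted(done, key=lambda f: futures[f])    # prefer primary
        if finished:
            return finished[0].result()
        return ("default", False)        # nothing answered within budget

print(score({"id": "t-1", "amount": 1000}))
```

In this toy run the heavyweight model misses the budget, so the canary's answer is used; the same pattern also lets new models run silently alongside the champion before being promoted.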

Banks, advisors and investment firms failing to find the big ‘windows’ of client needs


Clients and compliance are pushing financials to find them

There is an onslaught of ways financials are actively rethinking how they engage with and add relevance for their customers throughout their financial journeys. When, where, and how are financial firms aptly finding the ‘windows’ of relevance in their customers’ financial lives? The windows I’m referring to are the few times that call for a financial firm to offer a solution to a need. While there are many immediate needs, payments being a large one, key products are largely focused on major life events: college loans, mortgages, equity lines, retirement, insurance, etc. How should banks proceed?

One way is being at the forefront of communications and interaction with their clients. The ABA's (American Bankers Association) banking channel survey found that 26% of consumers use mobile for banking, and collectively 73% of all consumers use digital channels (internet, mobile, and ATM) versus the branch. The idea of relationship banking, while still important, is only one facet of what is needed.

This highlights why the data-driven, omni-channel, branchless, kiosk, and advisor models are all being examined and rethought to support client-centric digitalization strategies. Certainly, there are lots of ways that these channels can help, but are they adding up to find the major windows or financial moments in a person's financial path? I don't think the industry is remotely close to unifying these touchpoints to make this happen. There is a gross mismatch between client behaviors and the financial marketing and sales efforts meant to sync with them.

Further challenges for banks are coming from regulations such as the General Data Protection Regulation (GDPR) and the Revised Payment Services Directive (PSD2) in the European Union, which are ushering in new data access policies. These mandates are rewriting the exclusive control financial institutions have over their customers' financial data. This opens the door for fintechs and larger digital natives (Google, Facebook) to offer financial services. Additionally, these tech companies have exponentially more depth into a consumer's behavioral profile and life journey – Facebook doesn't have to guess when you're likely having a baby or looking to remodel your home.

Let's look at one window. It's now been three years since my 10-year-old opened his first bank account. We laud his savings progress, and he looks forward to the bank's youth account incentives (achieving higher balances could land you a McDonald's gift card) and birthday rewards (another $5 deposited as a gift). One looks at this as a great customer-centric offering that helps – and it is. However, my view is that the before, during, and after events were missed. During the process of opening the account, I was never asked about my own bank account – no teasers or incentives have come in the mail or email, even though they should know I do not bank with them. I've also got a little over seven years to get college savings in order. Let's not forget my son may also need a car during high school, which may require some lending assistance.

The net is a youth program, additional net new accounts/deposits, a financial planning/college 529 solution, and perhaps a future car loan. That's three to four products that I don't even know if the bank offers, and I've not been asked about them once. And we frequent the branch to make his deposits, which is an extremely uncommon destination for us or, as the numbers suggest, anyone these days. This represents one of many extremely common stories across the consumer financial market. The customer journey and lifecycle are all but dependent on the consumer making the outreach into a very confusing landscape of options. Digitally native retailers are good examples of firms that are moving rapidly toward making sure that the most appropriate offering is served to you – removing barriers that exist between you, your life, and what you may need or seek.

The payments and credit industries are good financial segments working hard to remove the friction of exchanging funds for anything you need. They are also using the data to inform other stakeholders, such as merchant clients who can determine the best windows of opportunity to offer deals and incentives. Yes, they deal in a much more real-time, data-heavy environment, and my example is really about these slow windows that open and close on a bank. Though the ‘crumbs make the loaf,’ and the combination of larger product insights with recent transactional histories tells a more complete story or offers the signals before these larger windows open. The best examples are from retail. A classic one is Target's ability to understand consumption patterns that signal when a major event is near. They were too good, in fact, if we remember the 2012 case of how Target figured out a teen girl was pregnant before her father did. They can see the larger windows on the horizon, whereas banks seem devoid of any future digital data crystal ball.

These windows are critical to growing a client relationship. A recent Bain article highlighted how consumers in the UK change their primary bank only once every 15 to 20 years on average, based on data from the UK's Competition and Markets Authority. Yet, while an individual will own 8-12 financial products, only a quarter of those are with the same bank. It is no secret that cross-selling at the right time has a major profit impact in financial services.

Getting there requires the assistance of more data-rich repositories, access, analytics and automation features to assist the customer facing teller, lending agent or bank/branch marketing manager. When, where, and what to emphasize isn’t exactly rocket science for the basket of major consumer financial product staples. Sifting and stitching together the views to see those windows is.

Top 10 Big Data Trends in 2017 for Financial Services


In the past year, the big data pendulum for financial services has officially swung from passing fad or experiment to large deployments. That puts a somewhat different slant on big data trends when compared to the 2016 trends.

The question of big data hype versus reality has finally been put to rest for banks. This is further supported by the upward trending spend on big data solutions in financials. Bear in mind that the trends and themes discussed here apply to next-generation infrastructure solutions and quantitative approaches that have come to market over the past seven to ten years. The past year squarely put the ROI and value generated from these efforts into question. This is not only evidenced by discussion from management consulting reports, but also by the change in CIO and CDO positions at many banks, as these financial executives have been increasingly scrutinized after several years of spending.

While the hype has subsided, banks appreciate that big data is not a panacea for all woes. Of course, big data remains vital to reducing overall IT operating costs and delivering advanced data capabilities.

As we anticipated last year, risk, compliance, and marketing are the focus for major banks globally. Risk and compliance naturally continue to be most pressing for SIFIs or G-SIBs, as major mandates tied to heightened capital controls, risk oversight, and data management are coming into effect. Examples include Basel III, FRTB (Fundamental Review of the Trading Book), and MiFID II (Markets in Financial Instruments Directive).

The one theme that remains consistent is that banks are wringing costs from legacy reporting and storage, and are offloading analytic workloads and utilizing “next generation” big data solutions.

By looking at the overall market, we’ve identified what we believe are the key trends expected to reverberate through 2017 in the financial services industry:

1. Banks march to the public cloud

What was once deemed principally unrealistic for banks is now a reality. Bank adoption of cloud solutions remains strong, and is occurring at banks in a variety of ways. Although private clouds have dominated the past several years, in the coming year we'll see a marked increase in large test projects and a hardening of hybrid cloud environments.

The goal is to gain a clearer view of what a manageable path for large public cloud adoption looks like for financial services. This is not only true for storage and compute, but also for agile applications development. Banks will need to crack the code for big data management and to master nuances of updating, synchronizing, and governing data assets to effectively solve this.

2. Fraud and financial crimes take center stage

The 2016 Wells Fargo account scandal only highlighted how stakes are raised for fraud each year, with 2017 being no exception. Fines are climbing, and sanctions compliance demands have forced banks to increase their transaction monitoring, KYC compliance (Know Your Customer), and money laundering detection and prevention efforts. Regulatory agencies will also increase their scrutiny of business practices and investigation of potential financial crimes.

We also expect announcements to be made concerning a more formal risk assessment for financial services firms, especially with the incoming US administration. Data management and new-generation analytics are key tools to improve the detection of fraud and criminal activity. Therefore, expect risk data aggregation, model risk, and data analytics to be the focus for banks.

3. Financial data governance improving though still limiting adoption

As discussed last year, solutions for big data governance have improved. However, the conversation for banks regarding big data is still largely focused on centralization efforts and data lakes. Bank CDOs—whose importance is growing and who are largely spearheading data governance initiatives—will be more focused on gaining operational and business lift from “lake” projects.

CDOs are becoming less likely to put big data governance processes in their own bucket, and are instead incorporating big data governance into overall bank plans. There is a fair amount of consensus among banking CDOs that there are no single solutions or tools that enable data governance within data lakes. The focus will be on adopting solutions that manage critical aspects (lineage, wrangling, prep, quality) of the overall governance practice to realize greater usage of data lake environments.

4. Risk and compliance debate shifts to open + commercial tech

As banks look to leverage big data solutions as systems of record for risk and compliance over the next year, the open source versus proprietary debate will shift. Banks are realizing that an open source-only approach is proving technically more difficult than expected as bank activity and requirements grow. Further, the business models of vendors who rely solely on pure open source are under scrutiny because the true costs of the solutions they provide, especially at scale, are poorly understood.

This is not just a vendor question. Bank business and IT jobs are on the line. Some large financial services IT initiative decisions will begin to change the nature of how these environments are deployed. We will see large bank risk and compliance deployments that consist of a combination of both open source and proprietary solutions.

5. Converged applications to integrate historical and real-time financial data

Financial firms such as institutional traders, payments providers, and credit card providers have always stored historical data, and increasingly they are analyzing and making decisions based on these data stores. These firms have also made substantial improvements over the last decade to real-time environments. The integration of historical and real-time data, which leverages larger data volumes, is where banks are putting more focus. Further, the integration of operational and analytical systems is also taking place, putting transactional data in line with analytical or modeling data for vastly improved efficiency and faster time to market.

For example, trading algorithms manage volumes of real-time data very well, though too often these systems don't go far enough. They often don't tie in other historical, timed data sources to inform a trader of past customer behavior and trading patterns. Yet this is commonplace in e-commerce, with recommendation engines integrating real-time shopping and historical customer activity.

In terms of cards or payments, there is a strong drive to accelerate real-time digital payments. Here too, leveraging historical data – which could be measured in seconds, hours, or days – can add the needed context to improve customer engagement, or provide more detailed data to other payments parties, thereby improving security and fraud efforts. The ability to combine real-time operational data and historical data analytics into new converged application models using a microservices development approach will become more common over the coming year.
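As a rough sketch of that convergence (plain Python; the profile store, field names, and thresholds are invented for illustration), a real-time payment event is enriched with aggregates drawn from a historical store before a decision is made:

```python
# Converging historical and real-time data: enrich a live payment event with
# aggregates from a (hypothetical) historical profile store, then decide.
from statistics import mean

# Stand-in for a table on the data platform: per-card daily spend history.
HISTORY = {
    "card-42": [52.10, 38.75, 61.00, 45.20, 49.90],
}

def enrich(event):
    past = HISTORY.get(event["card"], [])
    avg = mean(past) if past else 0.0
    ratio = event["amount"] / avg if avg else None
    return {**event, "avg_daily_spend": avg, "spend_ratio": ratio}

def decide(enriched):
    # The real-time amount is judged in its historical context.
    if enriched["spend_ratio"] is not None and enriched["spend_ratio"] > 10:
        return "hold for review"
    return "approve"

event = {"card": "card-42", "amount": 980.00}
print(decide(enrich(event)))    # roughly 20x normal daily spend -> review
```

The same enrichment step could just as easily feed a next-best-offer decision as a fraud hold; the point is that the historical context travels with the live event.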

6. Financial services turning to IoT and streaming

IoT was a large theme in big data circles this past year, and while obvious for some industries (i.e., P&C and multi-line insurance carriers or manufacturers), it was not an obvious choice for financials. The basic applications for the financial sector revolved around mobile or ATMs, and those remain important for this year. Additionally, we expect discussions to expand on how “things” will involve spending or payment activities that could be value-added for the consumer. However, the fixation on physical assets (cars, fridges, etc.) caused many to stumble a bit on its applications in banking or capital markets. Re-examining the underlying aspect of IoT – ‘things’ sending and receiving data or streaming information from the ‘edge’ in a bi-directional manner – is giving rise to a host of use cases as the definition expands.

In the financial sector, streaming data can expand the speed, access, and ubiquity of market data throughout a financial firm’s trade lifecycle—lowering costs and usage, notably in the middle and back office. Expect the discussion to evolve on how banks define what “edges” are as they explore how to utilize event streaming data on a massive scale. Also, expect to see a more rapid uptake from banks who will deploy streaming solutions that will enable continuous computing in financial risk, monitoring, or transactions.
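A minimal sketch of what continuous computing on an event stream can look like (plain Python; the tick feed and window size are invented, and a production system would consume from a message bus such as Kafka at far larger scale):

```python
# Continuous computing sketch: keep a sliding window over incoming price
# ticks and emit a rolling volatility estimate as each event arrives.
from collections import deque
from statistics import stdev

WINDOW = 5
window = deque(maxlen=WINDOW)

def on_tick(price):
    window.append(price)
    if len(window) == WINDOW:
        prices = list(window)
        returns = [(b - a) / a for a, b in zip(prices, prices[1:])]
        print(f"price={price:.2f} rolling_vol={stdev(returns):.4f}")

for tick in [100.0, 100.4, 99.9, 100.7, 101.2, 100.8, 101.5]:
    on_tick(tick)
```

Swap the print for an alert or a downstream write and the same loop becomes a continuous monitoring job.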

7. Big data and blockchain move forward in financial services

Financial services saw some of the greatest interest in blockchain technology over the past year. This technology is still in its early stages, and there’s still a lot of confusion about its usage and future in financial services. What are the legalities of contracts in a private vs. public blockchain? How is it different or more advantageous when compared to current settlement or clearing activities? What are the benefits, if any, in working with big data platforms? Many emerging technologies pose the same issues, yet financials will forge ahead in 2017.

Distributed ledgers offer an intriguing way to manage transactions, especially where workflows are repetitive. Ripe areas to explore are how blockchains may enhance big data security, blockchain analytics, or immutable compliance archives for transactional areas. State Street is one financial powerhouse that has indicated its commitment to going live with blockchain POCs in 2017 in areas such as syndicated loans, securities lending, and collateral management.

Expect to see greater exploration and dialogue over the next year as to how blockchains will converge with big data platforms along with the testing of blockchain technology itself. This is still very much an early game.

8. Operational automation a source of disruption

Over the past few years, the financial services sector has come to understand how automation can disrupt services in areas such as portfolio management. The basic element of automation utilizes machine learning, which trains on data to improve the algorithms that make automated decisions on how to handle incoming data and queries. These tasks are vastly improved by leveraging larger amounts of data.

Many financial services firms are just getting underway with broader machine learning usage. To date, it has been mainly a niche activity in areas such as automated trading systems and high frequency trading.

Simple automated tasks that can assist middle and back office operations such as loan underwriting, reconciliation, and risk model development hold tremendous value across segments. While perhaps not seen as bleeding edge AI examples, they represent solid opportunities for financial firms. Expect wider adoption from banks pushing in this direction over the coming year as big data platform adoption grows.
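As an illustrative sketch of that kind of automation (a toy underwriting router in Python with scikit-learn; the features, training data, and thresholds are all made up), a model trained on historical outcomes auto-routes incoming applications and leaves the edge cases for human review:

```python
# Toy loan-underwriting automation: train on historical outcomes, then
# auto-route new applications. Not a production credit model.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Historical applications: [debt_to_income, credit_score / 1000]; 1 = repaid
X_hist = np.array([[0.10, 0.78], [0.45, 0.52], [0.20, 0.70],
                   [0.55, 0.48], [0.15, 0.81], [0.60, 0.40]])
y_hist = np.array([1, 0, 1, 0, 1, 0])

# Light regularization so this tiny toy dataset separates cleanly.
model = LogisticRegression(C=100.0).fit(X_hist, y_hist)

def route(application):
    p_repay = model.predict_proba([application])[0][1]
    if p_repay >= 0.80:
        return "auto-approve"
    if p_repay <= 0.30:
        return "auto-decline"
    return "manual review"       # edge cases still go to a human

print(route([0.18, 0.75]))   # resembles past repaid loans
print(route([0.50, 0.50]))   # weaker profile: likely declined or reviewed
```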

9. Predictive analytics and data science skills shortage

Machine learning also facilitates predictive analytics, which will continue to receive considerable attention as banks with more mature data science groups push analytical boundaries. Not only are large and varied data sets being used to develop and improve predictive models, but real-time and historical data are being used as well, as explained above. Deeper internal skill sets will be needed to help apply predictive techniques that enhance existing human and descriptive analytical efforts.

Risk management (i.e., underwriting and credit modeling) has been and will continue to be one of the strongest areas of predictive use. Attention and adoption over the coming year will be in cybersecurity and other security initiatives, as well as customer activity. We expect that data scientists and data engineers will become important members of quantitative and risk assessment teams, and that downmarket firms may struggle to fill open headcount.

10. The utility shift is well underway

Utility-based shared service models have seen substantial growth over the past several years in financial markets, but expect to see more strategic opportunities develop beyond the sell-side. Technologies such as blockchain, big data, advanced analytics, machine learning, and event streaming will increasingly underpin the next generation of data-driven processes. This will create opportunities to advance market structure, resiliency, and efficiency objectives, which serve as key areas for financial utilities.

Symphony is a great early example on the technical communications side, and R3CEV/blockchain represents newer groups formed to facilitate data management. Buy-side, wealth management, and payments providers can look to the sell-side for examples of how to drive out costs with consortium-led utility models.

Conclusion

2017 will mark a particularly critical year for vendors and financial firms alike to partner together and ensure that large deployments are successful. This will usher in more opportunity to impact business productivity, and increase the pervasiveness and value of big data.

The coming year will continue to see a strong uptake in core use cases for the financial sector. This includes not only the IT-focused use cases like warehouse offload, storage, and reporting, but also line of business applications such as risk and marketing.

Fundamentals are the focus of big data projects for the coming year for a few reasons. First, banks, wealth management firms, asset managers, and insurance firms are non-digital natives, so the process of conversion is longer. Big data platforms require a fair amount of behavioral change—tools, functions, quality, and usage need agreement across business and technology groups. Without consensus, progress is hampered.

Next, security for financials is and must be at the forefront of big data efforts to enable the expansion of new projects. If data is the new currency of banks, they must accordingly treat it as a monetary asset.

Finally, while the pool of next-generation data scientists and developers is growing, banks still need to compete aggressively for talent. The supply of this type of talent thins even more as you go down market.

While big data adoption in financials is still at a relatively early stage, it is important to note that banks, insurers, and asset management firms represent some of the most advanced big data users in the world. However, the gap between the haves and have-nots will likely close over time.

This year marks a great opportunity for a wider collection of firms to hone in on the low-hanging fruit of big data use cases. Therefore, 2017 will see a greater focus on these basic use cases and an expanding uptake on the application of big data technologies in financial services.

The investment and groundwork laid by many major financial firms over the past several years have provided the larger financial market with solid proof that big data solutions are both real and beneficial. The financial advice banks often give their clients is something they should embrace this year for big data as well—it’s about time in the market, not market timing.

Hadoop Big Data Analytics Use Cases: Financial Services Banking on Disruption


The last decade has ushered in a perfect storm of disruption for the financial services sector – arguably the most data-intensive sector of the global economy. As a result, companies in this sector are caught in a vise. They are squeezed on one side by highly dynamic compliance and regulatory requirements that demand ever-deeper levels of reporting. And they are squeezed on the other side by legacy platforms that increasingly cannot handle these demands in the timeframes required.

Meanwhile competitive pressures are mounting to use available data to bring more and better products and services to market via better customer segmentation analysis and optimal customer service. Rapidly rising data volumes and data types – mostly unstructured – have strained legacy systems to the limits. Instead of funding innovation, many IT budgets in this sector are funding ‘defensive’ applications for compliance and fraud detection. Also these aging systems are largely unable to aggregate, store and analyze data from customers accessing services from different access points, like smartphones and tablets. Then of course there are the different social media ‘sounding boards’ where customers offer up candid opinions on the services they receive, potentially invaluable information that legacy systems cannot process. That’s a perfect storm by any definition.

For financial services companies, the arrival and rapid development of Hadoop and Big Data analytics over this same past decade couldn't be more timely. The ability of Hadoop platforms to store gigantic volumes of disparate data matches perfectly with these new incoming data streams. Meanwhile, Big Data analytics solutions offer unprecedented opportunities to actually profit from compliance while keeping fraud at bay and enabling new revenue streams. Below are several use cases for Hadoop and Big Data analytics already in full swing.

Risk Management. Post-financial crisis regulations like Basel III imposed liquidity reserve requirements forcing lenders to know precisely how much capital they need in reserve. Keep too much and you tie up capital unnecessarily, lowering profit. Keep too little and you run afoul of Basel III.

Hadoop allows lenders to tap into an ever-deepening pool of new data used to analyze credit risk, counterparty or third-party risk, and even geopolitical risk. Hadoop does this by utilizing simulations that use huge volumes of data and require massive parallel computing power, which you find in a typical Hadoop cluster.

And unlike with existing systems, the Hadoop platform will perform the analysis quickly enough for lenders to make informed decisions – as in, by the time markets open. Legacy systems can take days to perform the same analysis, and at a much higher cost. The bottom line is that Hadoop and Big Data analytics offer far deeper and far faster compliance analytics, and unrivalled scalability to deal with any new curve balls regulators can throw.
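One common pattern for this kind of parallel risk analysis is a distributed Monte Carlo simulation. The sketch below uses PySpark (Spark commonly runs on the same cluster alongside Hadoop); the portfolio value, volatility, and scenario count are purely illustrative:

```python
# Distribute Monte Carlo loss scenarios across the cluster and estimate a
# one-day 99% value-at-risk. All portfolio figures here are invented.
import random
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("risk-sim-sketch").getOrCreate()
sc = spark.sparkContext

PORTFOLIO_VALUE = 1_000_000.0
DAILY_VOL = 0.02
N_SCENARIOS = 100_000

def simulate(seed):
    rng = random.Random(seed)
    shock = rng.gauss(0.0, DAILY_VOL)      # one-day return shock
    return -PORTFOLIO_VALUE * shock        # positive value = loss

losses = (sc.parallelize(range(N_SCENARIOS), numSlices=32)
            .map(simulate)
            .collect())

losses.sort()
var_99 = losses[int(0.99 * len(losses))]
print(f"Estimated 1-day 99% VaR: {var_99:,.0f}")

spark.stop()
```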

Sentiment Analysis. Whether it is through using tweets, Facebook, Yelp, Google Reviews or literally hundreds of other opinion outlets, customers are publicly stating their sentiments by the tens of millions. Collectively, these sentiments are vital to making product and service improvements and making better decisions overall. The question is, how does a financial services firm efficiently gather, store and analyze such free-form data from so many sources?

The answer is by leveraging the Hadoop platform and its capability to enable machine learning and natural language processing to extract the pith from enormous stores of sentiment-based data. For example, a bank may find that two in ten of its customers post negative comments about it and feel that is acceptable. But further analysis might reveal that only one in ten of a competitor's customers post such comments. That's a call to action.
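To illustrate the shape of that analysis without real NLP, the toy Python below tallies the share of negative comments per institution using an invented word list and invented comments; a production pipeline would apply trained language models across the full store of posts:

```python
# Toy sentiment tally: count the share of negative comments per institution.
# The lexicon and comments are invented; real systems use trained NLP models.
NEGATIVE_WORDS = {"fees", "slow", "rude", "outage", "frustrating"}

comments = [
    ("our_bank",   "mobile app outage again, so frustrating"),
    ("our_bank",   "branch staff were friendly and quick"),
    ("competitor", "easy signup and helpful support"),
    ("competitor", "great rates, happy so far"),
]

def is_negative(text):
    return any(word in NEGATIVE_WORDS for word in text.lower().split())

counts = {}
for bank, text in comments:
    total, negative = counts.get(bank, (0, 0))
    counts[bank] = (total + 1, negative + int(is_negative(text)))

for bank, (total, negative) in counts.items():
    print(f"{bank}: {negative}/{total} comments negative")
```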

Financial firms are also leveraging sentiment analysis to peer more deeply into customer trends. This helps spot opportunities that harmonize with those trends, such as self-service mortgages or expanded customer self-service.

Fraud and Crime Prevention. From sniffing out money-laundering schemes to preventing an all-out security attack to protecting customer credit cards, fraud detection and prevention is as vital to financial services firms as is their core lending and investment business. The reason is simple. In today’s world of full-disclosure of security breaches, nothing less than an institution’s reputation and brand are on the line.

Perhaps the most formidable weapon at a bank’s disposal is the ability to create complex models of ‘normal’ behavior across a variety of activities. Doing so requires an ultra-robust, powerful and highly available platform that can:

  • Create customer personas that are self-adjusting as underlying business rules change
  • Detect relationships across different entities that could signal money laundering efforts or credit card fraud
  • Aggregate data from sources as diverse as terrorist watch lists and the near century-old Interpol
  • Integrate and interoperate with other global financial firms to mine an even deeper pool of fraud prevention data

Hadoop enables a far more in-depth degree of modeling, the result of the sheer speed of processing in a Hadoop cluster and the ability to work with virtually any type of data. The result is more aggressive fraud prevention and far fewer false positives along the way. One bank deployed an Apache Hadoop distribution to identify phishing activities in real time, thus markedly minimizing the impact. Using Big Data analytics, the bank can run far more detailed forensics, and with ease of management and peerless performance that deliver maximum ROI.
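A stripped-down illustration of the ‘model normal behavior’ idea (plain Python; the history and threshold are invented, and a real system would maintain profiles across many features and entities at cluster scale):

```python
# Per-customer behavior profile: flag transactions that deviate sharply
# from the customer's own history. Threshold and data are illustrative.
from statistics import mean, stdev

history = {
    "cust-7": [42.0, 55.5, 38.2, 61.0, 47.3, 52.8],
}

def is_suspicious(customer, amount, z_threshold=4.0):
    past = history.get(customer, [])
    if len(past) < 3:
        return True                      # too little history: route to review
    mu, sigma = mean(past), stdev(past)
    if sigma == 0:
        return amount != mu
    return abs(amount - mu) / sigma > z_threshold

print(is_suspicious("cust-7", 49.0))     # in line with the profile -> False
print(is_suspicious("cust-7", 900.0))    # far outside the profile  -> True
```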

Comprehensive 360-Degree Customer View. Perhaps the biggest change in customer service for financial services firms in the digital age is that they may well never see their customers, for any service requested. Thus a full view of individual customers – a requirement for service personalization – is derived only from data. This view must take into account analysis of as much or more data than the competition if the investment in a 360-degree view is to pay any dividends. This data can come from emails, social media profiles, complaints, forums, clickstream data, and so on.

A platform for analytics to tackle this job must combine data exploration and governance as well as access, integration and storage scalability to effectively use all this data and actually drive incremental revenue. This is done with more intelligent and robust campaigns; highly accurate cross selling; and of course better customer retention. Today only Hadoop provides this platform capability.

Conclusions. Financial services continue as one of the most risk-laden and dynamic of all business segments globally. The sheer volume of data this sector generates from customers, transactions, global trading, and many other sources would easily overwhelm traditional processing platforms. It can be accurately said that Hadoop is purpose-built for just such a data-rich and data-dependent environment. With its limitless scale, ability to handle data regardless of format, and ever-widening list of analytics solutions, this open-source platform is the shoulders on which revenue growth rests.

Top 10 Big Data Trends in 2016 for Financial Services


2015 was a groundbreaking year for banking and financial markets firms, as they continue to learn how big data can help transform their processes and organizations. Now, with an eye towards what lies ahead for 2016, we see that financial services organizations are still at various stages of their activity with big data in terms of how they’re changing their environments to leverage the benefits it can offer. Banks are continuing to make progress on drafting big data strategies, onboarding providers and executing against initial and subsequent use cases.

For banks, big data initiatives predominantly still revolve around improving customer intelligence, reducing risk, and meeting regulatory objectives. These are all activities large Tier 1 financial firms continue to tackle and will do so for the foreseeable future. Down-market, we see mid-tier and small-tier firms (brokerage, asset management, regional banks, advisors, etc.) able to more rapidly adopt new data platforms (cloud and on-premise) that are helping them leapfrog the architectural complexities that their larger brethren must work against. This segment of the market therefore can move more rapidly on growth, profitability, and strategic (conceptual/experimental) projects that are aimed at more immediate revenue contribution, versus the more long-term, compliance- and cost-dominated priorities that larger banks are focused on.

The market for data software and services providers is moving closer to a breaking point where banks will need to adopt, on larger scales and with greater confidence, solutions to manage internal operations and client-facing activities. This is not unlike the path we have seen cloud technologies take.

Here are some predictions about how big data technologies are evolving, and how these changes will affect the financial services industry:

  1. Machine learning will accelerate, and will be increasingly applied within the fraud and risk sectors. Data-scientist demand and supply continues to work towards equilibrium. Advanced techniques will start being applied within fraud and risk that improve models and allow acceleration towards more real-time analysis and alerting. This acceleration will come from education and real-world applications of market leaders.
  2. Gaps will become more evident between the leaders and the laggards. Each year we see banks that press the gas pedal and are ready to adopt new technology, and those that remain conservative in their efforts to run/change their organization. The stories and use cases will proliferate and become more varied in 2016, and will lead to increased evidence of strong, observable, and benchmarked business returns (not just cost takeout) in the broader market.
  3. Data governance, lineage, and other compliance aspects will become more deeply integrated with big data platforms. In order to find a more complete and comprehensive data solution to manage compliance mandates, many banks develop or purchase point solutions, or they try to use existing legacy platforms that are not able to deal with the data surge. Fortunately, there is an increasing number of improved data governance, lineage, and quality solutions for Hadoop. More importantly, these new platforms can reach beyond Hadoop and into traditional/legacy data stores to complete the picture for regulation, and they are doing so with the volume, speed, and detail needed to achieve compliance. In addition, 2016 will continue to see the push for “data lakes” that can serve as converged regulatory and risk (RDARR) hubs.
  4. Financial services organizations are struggling to understand how to leverage IoT data. This is the next wave of hype that is grabbing attention in big data, and questions abound in terms of financial services applications. For some industries (telco, retail, and manufacturing) it is already a reality, and these segments have driven the need for IoT data and forced the current conversation. For banks, will IoT data be used more for ATMs or mobile banking? Areas worth exploring over the coming year involve multiple streams of activity in real time. For example, real-time, multi-channel activities can use IoT data to present the right offer and advice to retail banking customers at the right time. Or perhaps we should think about this in reverse, where financial firms could embed their services into the actual “thing” or device or other client touch points, not unlike trading colocation facilities that then report home.
  5. Integration into trade, portfolio management, and advisor applications becomes a more prominent feature for software providers. The drum roll of headlines relating to “gaining benefit from big data” beats louder. Ultimately, this will be judged by end users in the financial sector and the observed (or unobserved, yet measured) benefits and ease of use. Applications that are built off the core of big data platforms will provide that bridge, and sharpen the spear that is big data. We’ve already seen this push with the likes of market data providers, but not with other business user applications, be it CRM, OMS/EMS, etc.
  6. Risk and regulatory data management continue to be the top big data platform priorities. Growth and customer-centric activities sit atop the list of corporate strategies, and there will be firms that can link those strategies to big data. Regardless of whether your bank is an advanced data-driven firm or not, the monster challenge of aggregating risk and moving towards predictive analytics amid evolving regulations is still a ways off, yet it remains a requirement and an acknowledged benefit at the C-suite level. Unless heaven opens and regulators ease up on their requirements, risk and regulatory data management will still be the major challenge for financial institutions in 2016.
  7. The adoption of Hadoop for storage and access will proliferate within financial services. Everyone arrives at the party at different times, and adopting technology is no different. The “long tail” of adoption is far from here, but middle-market or even small-tier banks will begin to see the benefits of Hadoop based on:
    • Providers that are bundling/integrating and delivering more complete solutions, services and platforms
    • A community of users that continues to grow and provide the reference base to jump into the pool

    Data offload is now a “classic” use of Hadoop (relatively speaking), while the cool kids move on to larger big data playgrounds, and the masses will climb on board for this application of big data.

  8. Financial services “big data killer apps” gain wider recognition in the market. These have been the FinTech incubators over the past two to three years, and they are helping to form the front-end links needed between the end user and the data platforms. Expect to see more banks running proof-of-concepts with these applications, which will validate the software and provide the basis for “complete solutions.” Both the front end and back end should be optimized in concert, rather than as separate projects. We see this market rapidly expanding from the service integrator end as well. This will usher in the discussion of how “big data software” vs. “legacy software” will be adopted by banks.
  9. Operations becomes, as it always has been, the last frontier to gain traction. As more reliable “big data” platforms emerge, the idea of security masters, deeper metadata enrichment, ontologies, integrating LEI, and other standards becomes a stark reality. The traditional data approaches are valid, yet some of the thinking will need to be turned on its head to gain the full leverage of new solutions – for example, dealing with schemas and data modeling. Further, with the work of big data largely taking form in the front office, marketing, and risk, there are obvious and enormous overlaps of data in middle and back office operations that can more easily leverage existing data lake efforts. We expect risk assessment and performance-related big data activities in the middle office to increase rapidly. Further, we will see deeper dialogue on how to actually bring back-office functions (reconciliation, corporate actions, etc.) on board as well.
  10. The institutional side of the bank will start to adopt and take cues from the retail line of business on ways to improve understanding, marketing, and targeting of clients. There are certainly some pure B2B firms that are leveraging big data for improved client intelligence, but largely they take a back seat to the retail B2C lines of business, be it credit card, retail banking, wealth management, or lending. An easy crossover is for fund complexes (large mutual fund managers) to improve data collection from wealth advisor networks and broker interactions, as well as improve product utilization. This is especially important as mutual funds are typically once removed from their retail client base, so understanding their institutional clients (advisors) is vital.

Confidence remains key for many larger banks and other financial services firms in adopting new provider “big data” solutions. That said, as we look towards 2016, there will be a greater push from management to move “big data” projects out of IT and into the hands of the business user. In order to do so, there will be a host of architectural, functional, speed, availability, and security questions to consider. As always, the need to apply traditional rigor to new architectural layouts does not change, as the cost and sprawl seen in traditional architectures will begin to surface in new Hadoop and converged big data build-outs.

Further, there will and should be strong leverage and utilization of existing staff/processes for activities such as data governance, quality, reference data management, and standards. This will require continued education of all parties, namely those outside of IT, to understand the rapid developments in the marketplace.

Lastly, there will be the growing conversation on what balance of open source and provider solutions makes sense. Not all open source projects are designed to purely fit the needs of the institutional user, yet open source delivers the agility required going forward—each bank’s requirement will vary, and finding the right mix will be vital to accelerate efforts with big data, which is really all data. All told, the market in 2016 will move forward and evolve to reduce confusion, which will calm the currents in this swaying ocean of “big data.”


Is Your Data Worth Anything?


One signpost that a firm is maturing its analytic and data-driven capability is when it monetizes data. IDC's Dan Vesset made this point to me, and I agree. Specifically, I'm talking about data housed within the firm that has not previously been used or leveraged as a source of revenue. It could be dark data (stored and largely unused) or internal data that is frequently used but not shared. That said, understanding where and how to monetize your data turns out not to be such an easy task either. The reasons financial firms (or other industries, for that matter) aren't finding this easy largely come down to a few key factors. While not exhaustive, the big ones tend to be:

  • They lack a robust data-driven (digital, information, analytical, etc.) strategy or are lost on what that is in the first place
  • They don't know which of their data is valuable, or could be valuable
  • New cultural, behavioral, and business model considerations need to be made. These don't have to be grand, corporate-wide sweeping initiatives; they can be more focused and tailored to start.

What does data monetization look like? There are actually very good examples in financial services from new and old firms alike. New-firm business models (digitally native) are largely doing this as a big data play, as seen in the financial peer-to-peer and crowd-funding/sourcing segment. One good example is Estimize, a give-to-get model where crowd-sourced estimates are turning the model of analyst stock earnings forecasts upside down. On the site, anyone can contribute to the earnings consensus for a particular stock. As it turns out, the predictions are eerily more accurate than what you will get from the street. That data then becomes something Estimize turns around and sells. Another post I'm writing will look more closely at these new data business models, but for now it's an example of how data is being cultivated and sold. The best examples of where older, more traditional financial firms are winning are credit card and payments firms. Other good old-guard examples are financial firms (intermediaries) that sit at the intersection of markets, such as State Street, which fairly recently launched Global Exchange to take advantage of the massive amounts of data it collects across markets and offer advanced risk analytics to its clients.

While not in financial services, BloombergBusinessweek's recent article covering The Weather Channel provides a great read on how a whole company transformed itself with this approach. It offers a nice glimpse into how it could be done and the success it can wring for a firm.

There are a few particularly interesting highlights in this story. First, the business was stuck in a traditional model and was seeking a way out. Whether it is a data-driven, innovation, growth, or some other strategy being worked through, having the top ranks recognize the opportunity and move on it is always cited as a starting point – and that is exactly how The Weather Channel started. The next important element is that they worked diligently to close the ‘digital loop.’ This means being able to digitally account for or process information across a life-cycle from start to finish. In this case, mobile was not only a pillar and linchpin in their digital strategy, but also a key component in tightening their digital loop. A firm may not have every aspect of its processes collapsed into a digital workflow, as we often see with new-economy e-commerce companies, but diligently working to improve it is a major feat and accomplishment unto itself.

The last takeaway I had was that the two steps above paved a path to data monetization. The Weather Channel's data-driven strategy is realized as it is able to apply advanced analytics to its data and recognize a major new source of revenue. And the great part is how they combined their competitively advantageous weather data with partners (Procter & Gamble) to gain insights that have begun to force them (or even us) to rethink some of our basic assumptions about weather, marketing, and how people buy products, as noted in the article:

In the beginning, WeatherFX assumed it would have to break down its data along the traditional marketing demographics of age, gender, and race, but it quickly realized it didn’t. No matter who they are or what they look like, people in the same place mostly react to weather the same way.

Some thoughts on tackling the challenges. Work to understand what being data-driven means and how it's achieved. Having researched and written on it more deeply over the past several months, it's not as obvious as one might think – it can't be an “I'll know it when I see it” approach. While the word and concept are used often (and have been for a while), there is surprisingly little that examines what it really means and how to make it a strategic part of your business.

The second is how to go about realizing what data is of worth internally. Here are some basic criteria (with a Gartner assist) for understanding what value your data has or could have, in order from strongest form to weakest:

  1. Scientific approach: firms are using advanced statistics (doing more with math and computing power) and being more experimental to test the correlations, relative values and accuracies of data. Getting a firmer grip on what influences/causes something else
  2. Timeliness: The most valuable information tends to be closer in time or fresh
  3. 1st party data: This is data that is unique to a firm about its clients, products or stores that no one else has and is a cornerstone in data as a competitive advantage.
  4. 2nd-party data: This may evolve from alliances or partnerships; it is again a unique set of information that is not openly available, yet not entirely your own
  5. Open or 3rd party data: Lowest level is freely available data that is public or from 3rd parties that can be purchased by anyone

The last major challenge is changing the business or culture with leadership… there are scores of books written on the topic, and frankly I couldn't do it justice here. So perhaps start with Inc.'s Top 10.

PwC put out an article, The Data Gold Rush, that further explores what business models can be considered when trying to monetize data; it is worth a read as well. We continually find in articles about big data that discovering new products is often one of the potential benefits of mining your data. That new product may very well already be sitting on your servers.

Getting the Digital Agenda Right for Wealth Management

Posted on

Advisors should start to embrace a smart mix of digital approaches like Amazon recommendation models and automated workflows of previous core competencies.

Even while regulations have piled up during the last few years, wealth management firms have put customer focus back at the top of priorities, especially now that they’ve gained a bit more breathing room in the form of growing assets under management. The goal is to deliver the most relevant, impactful advice that is consumable and cost effective for the client. There are several digital influences now shaping the best approach to optimal advice or recommendations. So what should wealth firms be considering?

First, there is online algorithmic money management (or “robo” advisors), which has grabbed top airtime. Many of these sites have actually been building momentum and are driving a wedge in some of the key value propositions traditional wealth managers make. Their goal is to disrupt the delivery and pricing model of personal investing and advice, and I applaud them. The interest in these firms has raised a host of questions, such as their long-term impact and, ultimately, how wealth firms should respond. Key aspects of the robo model that wealth and asset management firms need to be mindful of are:

  • Algorithmic and automated portfolio management for individuals 
  • Online-only channel for advisor communications
  • Highly transparent pricing and fee schedules
  • Advanced self-service tools and access to financial information across channels
  • Gateway to capture lower-income and youth segments

Next, when looking for best practices to model customer experience and recommendations, we typically search outside wealth management. Out in front of the customer recommendation discussion (I think of this as advice for wealth firms) you will often find Amazon, LinkedIn, and Netflix. Their smart algorithms, dynamic learning, and execution are often lauded as bleeding-edge examples of capturing value from big data, particularly when compared with the wealth and asset management industries. These companies have a phenomenal command of analytics in a way that enables them to improve their customers’ “jobs to be done” (and generate economic value) while reducing the effort required to use their services. Yet, while impressive, I see them as poor yardsticks for wealth management.

The reason for this is that in online segments where consumption (retail, movies, reading, etc.) or social networking is the primary product or service, firms can leverage big data, algorithms, and automation for large portions, if not a majority, of their interactions and delivery model. Further, different criteria dictate how consumers or clients engage with these services. For example, one click on Amazon will get you a $25 DVD that hits your door in two days. This is significantly different from weighing the pros and cons of a Roth or traditional IRA, whose worth, value, and use won’t be seen for decades.

In short, you don’t make financial decisions as quickly as an Amazon one-click buy.
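
Still, it helps to see what those “smart algorithms” actually compute. Below is a minimal item-to-item collaborative filtering sketch (cosine similarity over a tiny, made-up interaction matrix) in Python; it illustrates the general technique popularized by Amazon-style recommenders. The product names and data are assumptions for the example, not anyone’s production system.

```python
import numpy as np

# Rows are customers, columns are products; 1 means the customer bought/used it.
# The matrix and product names are made up for illustration.
interactions = np.array([
    [1, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 1, 1, 1],
    [1, 1, 1, 0],
])
products = ["index fund", "bond ladder", "529 plan", "annuity"]

def item_similarity(m: np.ndarray) -> np.ndarray:
    """Cosine similarity between product columns ('clients who hold X also hold Y')."""
    norms = np.linalg.norm(m, axis=0)
    norms[norms == 0] = 1.0           # avoid division by zero for unused products
    normalized = m / norms
    return normalized.T @ normalized

def recommend(customer: np.ndarray, m: np.ndarray, top_n: int = 2) -> list[str]:
    sim = item_similarity(m)
    scores = sim @ customer           # score items by similarity to what the customer holds
    scores[customer > 0] = -np.inf    # don't re-recommend what they already have
    return [products[i] for i in np.argsort(scores)[::-1][:top_n]]

print(recommend(np.array([1, 0, 0, 0]), interactions))
```

Notice how little of the logic is specific to what is being recommended, which is precisely the point: the hard part for wealth firms is not the math, but deciding which client interactions can sensibly be driven this way.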

But can a computer effectively manage the financial nuances of so many individual clients, or even the ultra-high-net-worth segment? Many would be inclined to say no; however, there is plenty of evidence supporting the opposing side – most consumers make only slightly informed, or, more accurately, uninformed, decisions with advice that can too often be disjointed and not particularly aligned to individual situations. Further, advisors are stretched, they lack adequate time to manage relationships, and fresh pools of talent are tightening. So which is it? Both. It’s not an “or” situation; rather, it’s an “and.”

I wouldn’t ignore the robo advisor model or Amazon-style recommendation approaches entirely, nor would I adopt them fully in the context of wealth management. Regardless of which segment a wealth firm serves, be it mass affluent, high net worth, young, or old, advisors need to automate various functions or workflows and certain aspects of their services. Algorithms, digital marketing, and communications are tools to automate workflow within the channels advisors rely on, reducing the friction of how advice is delivered and consumed and, ultimately, improving the experience. The transparency and portfolio management aspects of the robo advisor model should replace a portion of the services an advisor may perform for clients. It’s a smart mix of digital approaches and advisor interaction.

This leaves us with the following idea: embrace the robo advisor model, and consider which advisor services can be digitally in- or out-sourced, such as client portfolio management, rebalancing, news, or alerts. Future success may require abandoning previous core competencies. While account management and customer recommendations/advice cannot be fully automated (not yet at least), there is a balance that needs to be struck. Like Amazon and Netflix, wealth firms need data and analytics to find where the best-practice balances and profitable frontiers lie, and the tools are available today to dynamically evolve how advisors give better advice. Crafting a strong hybrid digital agenda is an imperative to attain profitable growth and execute against the new set of customer advice standards.
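
As one concrete example of a service that can be digitally outsourced, here is a minimal drift-based rebalancing sketch in Python. The 5% drift threshold, the asset classes and the dollar amounts are illustrative assumptions, not a recommendation or any robo advisor’s actual rule set.

```python
# Target allocation and current holdings (dollar values) are illustrative only.
target_weights = {"equities": 0.60, "bonds": 0.30, "cash": 0.10}
holdings = {"equities": 72_000, "bonds": 24_000, "cash": 4_000}
DRIFT_THRESHOLD = 0.05  # rebalance an asset class once it drifts 5 points from target

def rebalancing_trades(holdings: dict, targets: dict, threshold: float) -> dict:
    """Return buy (+) / sell (-) amounts for classes that drifted past the threshold."""
    total = sum(holdings.values())
    trades = {}
    for asset, target in targets.items():
        current_weight = holdings.get(asset, 0) / total
        if abs(current_weight - target) > threshold:
            trades[asset] = round(target * total - holdings.get(asset, 0), 2)
    return trades

print(rebalancing_trades(holdings, target_weights, DRIFT_THRESHOLD))
# equities at 72% vs a 60% target -> sell ~12,000; buy ~6,000 bonds and ~6,000 cash
```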

Is Wealth Management Missing the Message?

Posted on Updated on

Rising costs to do business and contend with regulations have forced some consolidation within wealth management over the past eight to 12 months. Consolidation of the technology these firms deploy quickly follows. As infrastructure projects start, the conversation quickly turns to how architecture decisions must focus on client and product data and marketing to improve advisors’ messaging, communications, interaction and overall productivity.

Accelerating client loyalty and advisor engagement is often a leading stated goal. Yet a recent study by Teradata Applications on data-driven marketing (DDM) highlights a number of issues arising for marketers that ring true for wealth management and are causing efforts to stumble. Drilling into the survey findings to look specifically at financial firms, one of the top priorities identified is delivering a customized/personalized experience. Other areas of focus include improving marketing efficiency, measuring marketing against business objectives and linking marketing activities to earnings.

We can point to the typical investment planning process, which involves a lengthy paper-shuffling and data-input exercise to onboard a client, followed by a plan and a review of that plan with the client. Then, if the advisor is diligent, the client may hear back in a year to review the last plan. A lot can change during those 12 months, and clients are finding the lack of interaction alarming. More important, communication is a notable influence on whether clients decide to maintain their advisor relationships. A call reminding clients that their money market positions are running too high, or a general investment newsletter, does not hit high degrees of relevancy or timeliness. Our survey highlighted that only about 50% of financial services marketers are satisfied with the achievements of their company’s marketing program. There are improvements available, such as automated alerts and improved segmentation and targeting to sharpen advisor outreach.
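
To illustrate the kind of automated alert that raises relevancy and timeliness, here is a minimal Python sketch that flags clients whose money market allocation has drifted above a threshold, so the advisor’s outreach is at least timely. The 15% threshold, client names and balances are assumptions for the example.

```python
# Each record is a client's portfolio broken out by asset type (dollar values).
# Client names, balances, and the 15% cash threshold are illustrative assumptions.
clients = {
    "Client A": {"money_market": 40_000, "equities": 120_000, "bonds": 40_000},
    "Client B": {"money_market": 5_000, "equities": 80_000, "bonds": 15_000},
}
MONEY_MARKET_ALERT = 0.15  # flag when cash-like holdings exceed 15% of the portfolio

def money_market_alerts(clients: dict, threshold: float) -> list[str]:
    """Return advisor-ready alert messages for clients holding too much in money market."""
    alerts = []
    for name, positions in clients.items():
        total = sum(positions.values())
        share = positions.get("money_market", 0) / total
        if share > threshold:
            alerts.append(f"{name}: money market is {share:.0%} of portfolio "
                          f"(threshold {threshold:.0%}) - consider a review call")
    return alerts

for alert in money_market_alerts(clients, MONEY_MARKET_ALERT):
    print(alert)
```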

One of the challenges with improving messaging and marketing activities cited in Teradata’s DDM survey was the lack of integrated data. In fact, 70% of financial services marketers agreed that data is the most underutilized asset in their marketing organization, and of those, 96% said obstacles prevent them from using data to make more informed decisions. The data problem is an issue unto itself: research from CEB Tower Group cites that 90% of advisors cannot see a consolidated view of their clients’ holdings held away from their firm. Worse, 76% cannot see a consolidated view of their clients’ holdings within their own firm.

Before improving messaging, it is critical to first improve data accuracy and availability, and then effectively measure marketing and communications activity. This drives home what Peter Drucker once said: “If you can’t measure it, you can’t manage it.” For example, our survey found that 90% of financial marketers report challenges with calculating return on marketing investment (ROMI) or with showing how marketing activities impact sales revenue. With the cost of acquiring and retaining advisors being so high, it is imperative that their outreach and the firm’s marketing efforts are squarely aligned with improving not only productivity but also revenue outcomes.
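
For reference, the ROMI arithmetic itself is simple; the difficulty respondents report is attributing revenue to marketing activity in the first place. A minimal sketch with made-up campaign numbers:

```python
def romi(attributed_revenue: float, marketing_cost: float) -> float:
    """Return on marketing investment: incremental return per dollar of marketing spend."""
    return (attributed_revenue - marketing_cost) / marketing_cost

# Illustrative campaign: $50,000 spent, $180,000 in revenue attributed to it.
print(f"ROMI: {romi(180_000, 50_000):.0%}")  # -> 260%
```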

Not surprisingly, the survey found that many financial firms (roughly 66%) cited misalignment between IT and marketing. The multiple disconnects between the business and IT affect how effectively the front lines engage clients, and marketing operations itself struggles to determine which initiatives are effective in the first place. Many wealth firms are diligently looking at these issues and starting to understand how they may improve advisor marketing – as right now, millions of wealth clients can attest that too often the message is wildly off target.