
This article was first published 11th Dec 17 on www.btrm.org

The journey from compliance to ecosystem in banking and treasury.


Judgement (noun): the ability to form an opinion, estimate, notion or conclusion, as from circumstances presented to the mind, especially in matters affecting action.


The pace at which advances in digital applications are impacting banking has been faster than many commentators expected. Over the last two years the rise in the number of disruptors and the prominence of platform companies have increased the threat to banks’ dominance of several product lines, and in particular to their perceived ownership of the customer.

In February ’17 we highlighted the urgency with which banks should collaborate with FinTech companies that provide new techniques for data analytics, customer service and workflow adaptation, suggesting that this approach would rapidly become BAU. Early adoption would enable banks to move to new business models which protect and maintain their relevance.

During October both McKinsey & Co and The Financial Brand published digital banking reports supporting this thesis and highlighting that the pace of change is ahead of where most expected. The Financial Brand notes that not only has the digital agenda become more intertwined with innovation and business strategy, but that the transformation of everything from back-office processes to customer-facing experiences should now proceed in lock-step.

In their Digital Banking Report, The Financial Brand goes on to say that “being a leader in innovation and emerging technology is no longer a luxury only for big players, it is important that all financial organisations make this a core competency”.

McKinsey, in their paper Remaking the bank for an ecosystem world, report that while pressures from regulatory change and disruptive threats to retail banking may be easing, there are new threats in capital markets and investment banking, and they expect the highly customer-centric platform companies to be watching this closely.

We believe that banks are well positioned to make the leap to new business models, but to do so they need to increase the pace at which they adopt experimentation and collaboration, move quickly to maximise the benefits of their internal data community, and adapt more swiftly to embrace new techniques for digitally industrialising core processes.

Evolving the culture within a financial services company to achieve all this is not easy, but we believe that banks, and buy-side firms, have the component parts to make it happen.


What’s at stake?

A bank’s core “IP” is the transmission and conversion of capital across a suite of products which meet the needs of depositors and borrowers, plus the unique expertise of a particular firm in a product, geography or marketplace. Relatively high regulatory barriers to entry mean that much of this “IP” is protected, but with the rise of customer-centric platforms and marketplaces focused on distribution or disintermediation, all but the very “basics of banking” are at risk.

In his excellent 2016 paper A Trillion Dollar Market, By the People, For the People, Charles Moldow of Foundation Capital highlights that marketplace platforms could not just disrupt but displace banks from all but the highest-value segment of the lending market, where deep due diligence and complex structures are the norm.

The growth of peer-to-peer marketplaces, where participation in a loan or pool is possible and secondary liquidity is emerging, offers a real alternative to a bank balance sheet, especially as cost/yield favours customers and account-opening processes are almost always more straightforward, built on excellence in user experience and the application of data.

This would suggest that in some parts of the core banking product suite long-term disruption has already been seeded and will evolve over the next 5-10 years.

PSD2 will accelerate the development of open-banking business models and to some extent play into the hands of the platform companies, who are continually deepening their share of mind with consumers and enterprises from a hardware, payments, logistics and software perspective. However, GDPR balances PSD2’s open-data push by ensuring that the customer is in control of who uses their data and how.

Banks do maintain a position of trust, with both consumers and corporations, to be the custodians of their data, and their money. In combination with the core “IP” of banking this presents an opportunity to transform business models and emerge stronger.


Some visionaries highlight a future world where an enterprise’s balance sheet is optimised by machine, funding needs are identified by predictive analytics, the origination process is triggered and managed by a TMS, syndication is conducted by robots, securities are distributed over app-store-like portals, and immutable records of ownership of these digital securities are recorded on the blockchain.

If you have recently explored ICOs, attended an AI conference or had the pleasure of a VR immersion in a future world, then you could convince yourself that this will, in fact, be the way of things.

How we get from here to there, and how a financial services company makes collaboration, experimentation and transformation a core competency, will determine who gets to play the new game. New rules apply to everything that we do, and the pace of change will only pick up during the next decade – starting immediately.

The Critical Need to Access, Analyse, Visualise and Protect Data

For the banking and financial services executive, the vast amount of data built up over decades (and growing rapidly) holds the key to making informed and tested judgements about how to tackle the shift taking place across the industry.

The challenges, though, are numerous: data is currently dispersed across legacy systems, entities and business lines built up during the product-led, silo-driven period of rapid growth by acquisition and leverage. In the years since the financial crisis of 2008, a significant part of the focus on data has been on complying with regulatory demands, which have often arrived at a pace to which complex organisations were not set up to respond.

Huge resources have been spent on regulatory implementation, sourcing, reconciling, compiling and reporting data to internal and external parties. While many efficiency gains may have been realised – through identifying data quality improvements, streamlining data reporting structures, and reducing the human resource required to perform such tasks – the industry is now at a point where really delivering value from this process is an absolute must.

Treasury, Risk and Finance sit at the heart of the bank, ensuring that the core “IP” is executed in a compliant, efficient and cost-effective manner. When Treasury gets this right, the unique IP of a firm in a product or location can be sustained and its experts can maximise returns in a new ecosystem business model.

But in an increasingly connected and data led economy how does Treasury, together with Risk & Finance teams, make best use of data to test and simulate important decisions so that the expert Judgement of Treasury or Executives is increasingly well informed?

Enter Artificial Intelligence, Machine Learning and scale simulation to improve the pace of data digestion and produce early warnings of strategic threats or uncommon behaviours across business lines, products or clients.
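To give a flavour of what such an early-warning layer might look like, the minimal Python sketch below uses an isolation forest, one common anomaly-detection technique, to flag clients whose behaviour diverges from the rest of the book. The features, thresholds and data are invented for illustration; a real treasury would substitute its own behavioural analytics.

```python
# A minimal anomaly-detection sketch: an isolation forest flags clients
# whose behavioural features diverge from the rest of the book.
# All feature names and values are illustrative, not a real treasury schema.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(seed=42)

# Synthetic per-client features: deposit balance volatility, monthly
# drawdown frequency and average payment size (stand-ins for real data).
normal = rng.normal(loc=[0.05, 2.0, 10_000], scale=[0.01, 0.5, 2_000], size=(500, 3))
unusual = rng.normal(loc=[0.30, 8.0, 90_000], scale=[0.05, 1.0, 5_000], size=(5, 3))
features = np.vstack([normal, unusual])

# Fit and score every client; a label of -1 marks the most anomalous.
model = IsolationForest(contamination=0.01, random_state=42)
labels = model.fit_predict(features)
flagged = np.where(labels == -1)[0]
print(f"{len(flagged)} clients flagged for review: {flagged}")
```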

Stress Testing is a highly complex and expanding subject for banks and other financial services companies. New divisions have been created, fusing multi-discipline expertise to ensure compliance with the rising regulatory bar and to begin to connect models across the organisation to meet newer demands such as IFRS 9 or FRTB. Taking a data community approach to this expertise, and applying the right base layer of computational development software, means that platforms may be applied and models customised to substantially change the way Treasury works and to generate deep insights upon which expert Judgement may be based.

Agent Based Modelling has been gaining an increasing level of attention from regulators, industry participants, and data scientists over the last several years, accelerating in 2017 with publication of expert papers and thought leadership pieces from the Bank of England and EBA.

WTF – What’s the Fuss?

For the Treasury layman, Agent Based Modelling enables us to model our world and run simulations of how specific agents (see below) may react and respond to events, decisions and the actions of each other.

This is a complex arena and one which (for the writer at least!) heightens the need for your Treasury to be Data Science Ready.

An agent can be constructed in many ways, depending on what you want to simulate or model. Debate, often rife, about whether agent behaviour can be modelled accurately can be addressed by beginning with a model layer that enables a bank to construct its own “world”, combining a library of agents and drivers (e.g. regulatory maps) with its internally generated behavioural analytics, enhanced by machine learning.
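To make the idea concrete, here is a minimal agent-based sketch in Python: a population of depositor agents, each with a balance and a rate sensitivity, decides month by month whether to move funds as a competitor repricing scenario unfolds. Every parameter is illustrative rather than a calibrated behavioural model.

```python
# A minimal agent-based simulation: depositor agents react to a widening
# gap between our deposit rate and a competitor's. Parameters are toy values.
import random
from dataclasses import dataclass

@dataclass
class Depositor:
    balance: float
    sensitivity: float  # switching propensity per percentage point of rate gap

    def step(self, bank_rate_pct: float, market_rate_pct: float) -> float:
        """Return the outflow this agent withdraws at this step."""
        gap = max(market_rate_pct - bank_rate_pct, 0.0)
        if self.balance > 0 and random.random() < self.sensitivity * gap:
            outflow, self.balance = self.balance, 0.0
            return outflow
        return 0.0

random.seed(7)
agents = [Depositor(balance=random.uniform(10_000, 1_000_000),
                    sensitivity=random.uniform(0.01, 0.10))
          for _ in range(1_000)]

bank_rate, market_rate = 1.00, 1.00     # both start at a 1% deposit rate
for month in range(1, 13):
    market_rate += 0.25                 # scenario: competitors reprice +25bp a month
    outflow = sum(agent.step(bank_rate, market_rate) for agent in agents)
    book = sum(agent.balance for agent in agents)
    print(f"month {month:2d}: outflow {outflow:>15,.0f}, remaining book {book:>15,.0f}")
```

Swapping the random draws for behavioural analytics derived from your own customer data is precisely the “construct your own world” step described above.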

By taking this, or a similar, approach to modelling your world in the context of the outside world, Treasury can begin to ask questions of the balance sheet that are not only important for risk management but can also be applied to business model transformation.

Prof. Moorad Choudhry explains: “adopting these techniques enables Treasury to really know its own balance sheet. The drivers of key metrics such as net interest margin (NIM) can be understood with better precision, and similarly as in the area of stress testing the bank can derive precise firm-specific scenarios and factor changes when planning and forecasting NIM. Understanding better how changes in internal and external factors impact the balance sheet from a risk and P&L perspective also facilitates more integrated assets and liabilities origination. This is a key ingredient of genuine strategic ALM capability.”

Put simply, moving from the compliance decade to the open data economy can put Treasury insights centre stage when the executive team explores strategic judgement calls. You might be simulating funding risk based on a specific market event, or you could gain insight into Net Interest Margin from a shift to an open marketplace business model.
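As a stylised illustration of the second question, the short Python sketch below computes NIM for a one-line balance sheet under a base case, a funding-cost shock and a margin-compressing “marketplace shift”. The balance sheet and shock sizes are invented for the example.

```python
# A stylised NIM scenario comparison on a one-line balance sheet.
# NIM = (interest income - interest expense) / earning assets.
def nim(assets, asset_yield_pct, liabilities, funding_cost_pct):
    income = assets * asset_yield_pct / 100
    expense = liabilities * funding_cost_pct / 100
    return (income - expense) / assets * 100

ASSETS, LIABILITIES = 100_000_000_000, 90_000_000_000  # invented balance sheet

scenarios = {                        # (asset yield %, funding cost %)
    "base":                  (4.00, 1.50),
    "funding shock +100bp":  (4.00, 2.50),  # wholesale funding reprices
    "marketplace shift":     (3.60, 1.50),  # asset yields compress as lending
}                                           # migrates to open marketplaces

for name, (asset_yield, funding_cost) in scenarios.items():
    print(f"{name:>22s}: NIM = {nim(ASSETS, asset_yield, LIABILITIES, funding_cost):.2f}%")
```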

Gary Frost, CEO & Founder of 51zero – a market leader in big data and modelling transformation in financial risks – explains the three big things that an organisation (especially a treasury) needs to do to achieve these strategic insights:

Data Governance is key.

Hundreds of data sources flow into treasury’s data warehouses. Without appropriate data governance structures, modellers may find themselves in a quagmire of data-quality issues. A data governance programme will allow you to organise and categorise your data, avoiding duplications or gaps, simplifying the on-boarding of new data sources and enabling the move from data to information.
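As a small sketch of one slice of this in Python, assuming a simple in-memory catalogue: each feed is registered against a declared schema, duplicate sources are rejected at on-boarding, and gaps are surfaced before the data reaches the modellers. The source and column names are hypothetical.

```python
# A toy data-governance catalogue: declared schemas per source, duplicate
# on-boarding rejected, missing columns surfaced early. Names are invented.
from dataclasses import dataclass, field

@dataclass
class Catalogue:
    sources: dict = field(default_factory=dict)  # source name -> declared columns

    def register(self, name: str, columns: set) -> None:
        if name in self.sources:
            raise ValueError(f"duplicate source: {name!r} is already on-boarded")
        self.sources[name] = columns

    def gaps(self, name: str, record: dict) -> list:
        """Return the columns a record is missing versus its declared schema."""
        return sorted(self.sources[name] - record.keys())

catalogue = Catalogue()
catalogue.register("trading_system_feed", {"trade_id", "notional", "maturity", "counterparty"})
missing = catalogue.gaps("trading_system_feed", {"trade_id": "T1", "notional": 5e6})
print("missing columns:", missing)   # -> ['counterparty', 'maturity']
```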

“Big Data” computational power.

Whilst not all the data volumes are huge, “Big Data” has moved from computation based on very large unstructured data to highly complex calculations on smaller sets of structured and unstructured data, run at massive scale. Leveraging Hadoop-based data warehouses to perform distributed computation will allow you to run sophisticated scenario analysis. Platforms and tooling sitting on top of the Big Data stack allow for the execution of complex agent-based simulations which generate deep insight.
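As an indicative sketch, assuming a Spark cluster sitting over a Hadoop-based warehouse (PySpark being one common entry point to that stack), the snippet below distributes the same repricing calculation across every position/scenario pair. The table contents, columns and shock sizes are hypothetical.

```python
# A minimal distributed scenario grid in PySpark: every (position, scenario)
# pair is repriced in parallel across the cluster. All data is invented.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("scenario-grid").getOrCreate()

# In practice these would come from warehouse tables via spark.read.table(...);
# tiny inline frames keep the sketch self-contained.
positions = spark.createDataFrame(
    [("P1", 50_000_000.0, 2.5), ("P2", 80_000_000.0, 0.8)],
    ["position_id", "notional", "duration_years"],
)
shocks = spark.createDataFrame(
    [("parallel_+100bp", 0.01), ("parallel_+200bp", 0.02)],
    ["scenario", "rate_shock"],
)

# Cross-join positions with scenarios and apply a simple duration
# approximation of the value change; Spark shards the work automatically.
grid = positions.crossJoin(shocks).withColumn(
    "delta_value",
    -F.col("notional") * F.col("duration_years") * F.col("rate_shock"),
)
grid.groupBy("scenario").agg(F.sum("delta_value").alias("portfolio_impact")).show()
```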

Agility.

It is no longer acceptable to plan software release cycles every quarter; the pace of regulatory change, as well as business demands for insights from data, means that IT departments must become much better at releasing software more frequently. There are tools and techniques to improve the delivery cycle, e.g. DevOps, but IT departments cannot achieve agility in isolation. Organisational and communication structures between the business and technology groups need to change to support a Continuous Delivery cycle.

As we consider this from a Treasury perspective how do we ensure that our balance sheet managers are involved in the strategic conversation about adopting these types of technology across the bank?

Our view is that as our industry changes and we modify our technology, every part of our bank will be joined by a data community. This means that innovation, data architecture, process digitisation and, by extension, how we model our world will super-charge culture change in the industry from “protective” to “expansive” behaviour. The choice between the two cultures is, really, a choice between being an infrastructure company or a customer-focused company.

It’s still early in the journey but, as we have tried to show, not only is the pace increasing exponentially but these developments will impact every part of our industry – now.

The time has come to shift to an ecosystem model based on highly informed Judgement from Data Insights.

J.F.D.I.!

Season’s Greetings from Waltham Analytics.

About the author

David Castle is the founder and CEO of Waltham Analytics Ltd.

[email protected]


© www.btrm.org
