Benchmarking and normalisation: laying the foundations for data analytics

Data analytics is a critical component of success for financial institutions, especially investment banks operating in today’s competitive FICC markets. According to Celent[1], the US rates business, for example, has shrunk by 40% over the last five years, while new liquidity providers and trading institutions compete harder for market share.

In this fiercely competitive market, banks are increasingly turning to advanced data analytics to gain a trading edge over their competitors and maximise the profitability of their existing business.

Banks are therefore exploring analytics, including advanced machine learning and artificial intelligence (AI), that integrate risk considerations (market, positional and counterparty), inventory, wider market factors and demand flow, allowing trading desks to maximise their effectiveness.

To do this, banks need to harness the goldmine of raw data they are sitting on, gaining insight into their clients’ activity and ultimately applying AI to improve service delivery to their most profitable clients.

But for a bank with numerous desks across its sales and trading departments, each with its own systems and methods of capturing data, how does it lay the groundwork needed to build an enterprise-wide view of client trading behaviour?

A series of careful steps must first be taken to lay the foundations for predictive analytics:

1. Targeting the data model

In many banks, each desk has its own ‘home-grown’ approach to data capture, recording different aspects of the transaction with no consistency across the organisation; as a broad rule, systems and processes have been designed to capture the minimum amount of information needed to confirm and settle a trade. Adding to the challenge are the trade attributes appended later by middle and back office staff, which creates immediate inconsistency in what is recorded. More recently, regulatory reporting requirements have driven the capture of additional fields, but these systems remain fundamentally operational rather than informational.

To enable accurate comparison and analysis of data, and to ensure regulatory compliance, the first step is to make sure every desk records the same foundational details about each trade.

The key to building a strong analytics foundation is helping the bank define its target data model – an explicit definition of the structure of its data. In practice, this amounts to a list of the data fields each desk must capture for all client trading activity, including variables such as product and instrument, client name and location, and the size and side of the trade. These are the essential building blocks of data analytics.
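To make this concrete, the sketch below shows what a target data model might look like as a simple Python structure. The field names and types are illustrative assumptions, not any bank’s or vendor’s actual schema; the point is that every desk populates the same explicitly defined set of fields.

```python
from dataclasses import dataclass
from datetime import datetime
from enum import Enum


class Side(Enum):
    BUY = "buy"
    SELL = "sell"


@dataclass
class TradeRecord:
    """Illustrative target data model: the minimum fields every desk must capture."""
    trade_id: str
    execution_time: datetime   # ideally timestamped in UTC
    product: str               # e.g. "GOVT_BOND", "IRS"
    instrument_id: str         # e.g. an ISIN or internal identifier
    client_name: str
    client_location: str       # e.g. an ISO country code
    side: Side                 # the client's side of the trade
    notional: float
    currency: str
    price: float
    venue: str                 # ECN or trading venue where the trade was executed
```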

2. Aggregation and standardisation

Sales and trading desks typically store their own transaction data independently of each other in separate databases. In some cases, data capture can be as rudimentary as a simple Excel spreadsheet, so simply locating the data within the bank can be one of the most substantial hurdles. This is a real challenge for anyone who requires sophisticated data analytics because, to deliver the most valuable results to the end user, a comprehensive view of trading activity across the bank is required. With incomplete data, an algorithm’s conclusions can easily be swayed by anomalous or incorrect entries, and its ability to unlock the hidden patterns within is limited.

Once the data model has been agreed, the next logical step is to unify all the bank’s transaction data in one centrally hosted location by extracting it from every repository. In an organisation as complex as an investment bank, this is no simple task: the time it takes to fully integrate all the data required to fuel analytics is directly proportional to the complexity of the bank’s data landscape.

To achieve this unified and normalised dataset, data adaptors can be used to plug into the various data sources. These adaptors allow source data to be mapped to the target data model.
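As a rough illustration of how such adaptors might work, the sketch below maps a hypothetical desk’s raw export format onto the target model defined above. The source column names (“TradeRef”, “Nominal” and so on) are invented for the example; each real source would need its own mapping.

```python
from typing import Any, Callable, Dict, Iterable, Iterator

# An adaptor is simply a function that maps one raw record, however the
# source system stores it, onto the fields of the shared target data model.
Adaptor = Callable[[Dict[str, Any]], Dict[str, Any]]


def rates_desk_adaptor(row: Dict[str, Any]) -> Dict[str, Any]:
    """Hypothetical adaptor for a desk that exports rows from a spreadsheet."""
    return {
        "trade_id": row["TradeRef"],
        "instrument_id": row["ISIN"],
        "client_name": row["Counterparty"].strip().upper(),
        "side": "buy" if row["B/S"] == "B" else "sell",
        "notional": float(row["Nominal"]),
        "currency": row["Ccy"],
        "price": float(row["Px"]),
        "venue": row.get("Venue", "VOICE"),
    }


def normalise(raw_records: Iterable[Dict[str, Any]], adaptor: Adaptor) -> Iterator[Dict[str, Any]]:
    """Run every record from one source through its adaptor into the target model."""
    for raw in raw_records:
        yield adaptor(raw)
```

One adaptor per source keeps the mapping logic isolated: adding a new desk or venue means writing a new adaptor, not reworking the central dataset.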

3. Plugging the gaps: cleansing and enrichment

The data mapping and normalisation process will inevitably reveal inconsistencies in what can be captured from each electronic communication network (ECN) or trading venue. In the rates space, for instance, ensuring that each transaction has an associated DV01 (or CS01 in credit) is a fundamental requirement; without it, the analytical value of the data is severely impaired. The ability to fill these gaps in real time is therefore a critical step in producing a consistent, enriched trade record.
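A minimal sketch of this kind of gap-filling is shown below, assuming a simplified duration-based approximation of DV01; a production system would price the instrument from a full curve and handle many more fields, so the formula and field names here are assumptions for illustration only.

```python
from typing import Any, Dict, Optional


def approximate_dv01(notional: float, modified_duration: float, price: float) -> float:
    """Rough DV01 estimate: change in market value for a one basis point yield move."""
    market_value = notional * price / 100.0
    return modified_duration * market_value * 0.0001


def enrich_trade(trade: Dict[str, Any], modified_duration: Optional[float]) -> Dict[str, Any]:
    """Fill a missing DV01 on a normalised trade record where reference data allows."""
    if trade.get("dv01") is None and modified_duration is not None:
        trade["dv01"] = approximate_dv01(trade["notional"], modified_duration, trade["price"])
        trade["dv01_source"] = "derived"  # flag enriched values versus venue-supplied ones
    return trade
```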

Moreover, consistency across reference data, notably client, product and instrument data, together with the capture and storage of all relevant market data, is fundamental to this augmentation process. The true analytical value is revealed only when there is consistency not just around the ‘who’, ‘what’ and ‘where’ of each transaction, but also around the market dynamics at the time of the trade.

4. Locating the known unknowns

One of the advantages of working with a third party to build the data analytics foundation is the detailed knowledge it has of peer banks’ data capabilities. This allows the vendor to benchmark one bank’s data landscape against the competition and understand where improvements and quick wins can be achieved.

Great data analytics is built on solid preparation. Without a complete and standardised data set, it is impossible to gain an accurate historical or real-time view, or to apply predictive analytics effectively. Correctly capturing and storing both the bank’s own data and the data that will fill the gaps in its knowledge is essential to unlock the full potential of this invaluable information. It is time- and resource-intensive work, but done well, the investment will pay off handsomely.

About Mosaic

Mosaic delivers straightforward, cost-effective integration at a modular level. This approach enables rapid set-up and the ability to scale MSX® to suit individual requirements and budgets.

Working closely with new clients, we first benchmark their current data analytics capabilities against their peers before conducting a full proof of concept programme designed to educate users about the benefits of real-time data analytics, scope resources and secure stakeholder buy-in.

Mosaic will then capture, cleanse and integrate multiple sources of transaction data to provide a single normalised view within the MSX® platform. Once this milestone has been reached, MSX® allows users to gain insight through the application of diagnostics, trade flow, market share and profitability analytics in real time on FICC instruments.

Building on this deep level of insight, MSX®’s machine-assisted decision-making capabilities then enable further automation of trading processes and empower users to optimise the efficiency of their FICC businesses.

[1] Rates Revolution – US Treasuries, Celent, 20 April 2017

By Matthew Hodgson,
CEO & Founder, Mosaic Smart Data
