Steps Ahead

Next-generation data platforms and the FX markets: Are we set to witness a transformation?

By Vivek Shankar

As the FX markets have embraced electronification, the volume of data market participants have access to has increased exponentially. A 2020 Bank for International Settlements (BIS) report found that execution algorithms accounted for 10-20% of all spot FX trading, representing $200-400 billion in daily turnover.

The rise of algorithms is just one example of how central data collection and management have become to modern institutional FX. Without data, these algorithms cannot react to market conditions intelligently. Everything from market conditions and liquidity information to transaction cost analysis (TCA) plays a role in optimal execution.

Despite the rise of electronic execution, most market participants struggle to leverage their data. Matthew Hodgson, Founder and CEO of Mosaic Smart Data, highlights a few reasons for this. “The bedrock of any effective analytics platform is aggregated, normalised and enriched data,” he says. “With FX liquidity fragmented across so many different channels – and data often delivered at ultra-low latency – it is a significant challenge for a firm to aggregate the flow it is transacting across all channels, including electronic, single dealer and voice. Correctly capturing and storing the bank’s own data is time and resource intensive and requires specialist knowledge of both FX markets and data science.”

In recent years, the markets have witnessed the rise of data-driven platforms that promise insight into trading workflows. How effective are these platforms, and do they really make a trader’s life easier?


THE CHALLENGES HINDERING DATA-DRIVEN INSIGHTS

Firms have traditionally had no issues generating and collecting data from their market activities. However, analysing these data sets and deriving insight from them has been problematic. Ironically, the electronification of the spot market has posed a challenge.

FX traders have long favoured voice since it gives them a full view of the market across different LPs. As Tier 1 LPs have become more selective about whom they interact with, the majority of spot participants now receive prices through aggregated platforms or lower-tier institutions. As a result, clients trading spot FX at a certain size will receive completely different data from different LPs, which limits the depth to which they can analyse these datasets.

Order flows are also a particular challenge. Typically, orders flow from a portfolio management system (PMS) to an order management system (OMS) and finally to an execution management system (EMS), with data gathered and analysed along the way. As algo adoption has increased, the sheer volume of data and trading frequency have left execution platforms struggling to cope with the pressure. While many platforms have introduced major updates, order flow complexity remains a significant challenge.
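To make that hand-off concrete, the sketch below models a single order record accumulating timestamps and fills as it moves from PMS to OMS to EMS, and computes one of the metrics that becomes awkward to produce when each stage keeps its data in a separate silo. The field names and the latency measure are illustrative assumptions, not any particular vendor's schema.

from dataclasses import dataclass
from datetime import datetime
from typing import Optional

# Hypothetical, simplified order record that accumulates data as it moves
# from PMS to OMS to EMS; field names are illustrative, not a vendor schema.
@dataclass
class OrderRecord:
    order_id: str
    symbol: str                                  # e.g. "EURUSD"
    side: str                                    # "BUY" or "SELL"
    quantity: float
    pms_created_at: Optional[datetime] = None    # allocation decided in the PMS
    oms_routed_at: Optional[datetime] = None     # checks passed, routed by the OMS
    ems_executed_at: Optional[datetime] = None   # fills confirmed by the EMS
    avg_fill_price: Optional[float] = None

def handoff_latency_ms(order: OrderRecord) -> Optional[float]:
    """Time between the portfolio decision and execution -- one of the metrics
    that is hard to compute when each system keeps its own silo."""
    if order.pms_created_at and order.ems_executed_at:
        return (order.ems_executed_at - order.pms_created_at).total_seconds() * 1000
    return None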

There’s also the issue of market fragmentation. While spot is, for the most part, traded electronically, the NDF and swaps markets work largely off voice or hybrid broking models. The result is multiple datasets and workflows that don’t integrate. A manager or trader who wants a bird’s eye view of their activities will always have to account for data siloed in a different system. To give a sense of the spread, firms generate data from electronic and voice transactions, sales coverage, client data, instrument data and market activity, often across multiple markets stored on different systems. Data silos are perhaps the biggest challenge firms are currently dealing with.


In the rush to gather data, firms neglected to impose data integrity and governance processes. The result is multiple data sources in different formats that don’t talk to each other. Fragmented data sources slow reporting because employees must manually convert and filter data. In turn, firms lack a global view of their operational data and must roll up individual datasets.

From a business perspective, inefficient infrastructure results in missed RFQs due to inconsistent quoting and price discovery. Market participants have long been aware of these issues, but their efforts have been stymied by the complexity of the task ahead of them. After all, most firms lack expertise in what is essentially an IT function.

Data analysis, database organization, and integrity have long been the purview of technical, back-office departments that have proved challenging to manage. To illustrate the problem, most firms run into issues defining the goal driving data analysis. Which problems will data analysis solve and which questions do firms need their data analytics to answer?

Daniel Chambers, Head of Data and Analytics at BidFX, further expands. “For a long time there were many talks, conferences and discussions around big data,” he says. “However, it normally isn’t a case of the more data the better. The questions of, ‘What am I trying to see in the data’ and ‘How can it improve my strategies and/or execution’, etc. help to target what data is collected, how it is stored and how it is analysed. It’s very easy to get lost in data and although experiments on data serve their purpose, the initial value is normally derived if targeted data collection and analysis is done.”

Chambers points out that the usability of data by stakeholders has to be considered at all times. After all, data isn’t of any use unless it can be leveraged by those who will use it routinely. “This will likely lead to an attempt to limit the number of third-party data/analytics providers that they’re connected to,” he says. “This can be difficult due to fragmentation of data and of services provided.”

Hodgson outlines three steps that all firms must take. “First, it is vital to understand what the data needs and analytics ambitions of the firm are,” he says. “Once source systems have been identified, and the parameters and ambitions are established, the second stage of the process – aggregating and normalising data – can be undertaken. There is an important third and final step which is to enrich this normalised data set with additional fields of data that are missing from the transaction record. This is where external data sets can be employed to ‘plug the gaps’ in a firm’s data.”
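A minimal sketch of the aggregate-normalise-enrich pattern Hodgson describes might look like the following, assuming two hypothetical source systems (voice tickets and an ECN feed) with different schemas and an external reference set used to plug a missing mid-price field. It is illustrative only and is not Mosaic's implementation.

import pandas as pd

# Illustrative only: two hypothetical source systems with different schemas.
voice_trades = pd.DataFrame({
    "TradeDate": ["2023-05-02"], "Ccy Pair": ["EUR/USD"],
    "Amt": [25_000_000], "Rate": [1.1012],
})
ecn_trades = pd.DataFrame({
    "timestamp": ["2023-05-02T09:14:03Z"], "symbol": ["EURUSD"],
    "qty": [5_000_000], "price": [1.1009],
})

def normalise_voice(df: pd.DataFrame) -> pd.DataFrame:
    return pd.DataFrame({
        "trade_time": pd.to_datetime(df["TradeDate"]),
        "symbol": df["Ccy Pair"].str.replace("/", "", regex=False),
        "notional": df["Amt"].astype(float),
        "price": df["Rate"].astype(float),
        "channel": "voice",
    })

def normalise_ecn(df: pd.DataFrame) -> pd.DataFrame:
    return pd.DataFrame({
        "trade_time": pd.to_datetime(df["timestamp"]),
        "symbol": df["symbol"],
        "notional": df["qty"].astype(float),
        "price": df["price"].astype(float),
        "channel": "electronic",
    })

# Step two: aggregate and normalise into one canonical table.
trades = pd.concat([normalise_voice(voice_trades), normalise_ecn(ecn_trades)],
                   ignore_index=True)

# Step three: enrich with a field missing from the transaction record, here a
# mid-price snapshot from an assumed external reference data set.
reference_mids = pd.DataFrame({"symbol": ["EURUSD"], "mid": [1.1010]})
trades = trades.merge(reference_mids, on="symbol", how="left")
trades["distance_from_mid_bps"] = (trades["price"] - trades["mid"]) / trades["mid"] * 1e4

The design point is that every downstream analytic runs against the one canonical table rather than against each source's native format.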


NEXT-GENERATION DATA PLATFORMS PROVIDE SOLUTIONS

Institutional finance was initially cagey about adopting third-party expertise, but as the challenges of running complex IT departments came to light, many firms have chosen to outsource technical capability. There are many benefits to this approach. First, firms can rely on up-to-date market technology at all times, since data platforms are built with technology as a central pillar of their business.

These platforms do the heavy technical lifting behind the scenes while presenting a low-code, easy-to-use interface upfront. The result is that anyone can run reports, irrespective of their technical ability. They also help firms normalize and gather their data onto a single platform, making it easy to run analytics. Some even come equipped with artificial intelligence (AI) algorithms that learn a user’s behavior and provide customized recommendations. From an LP’s perspective, these algorithms can spot trends in client trading volumes and allow the firm to assume a proactive stance towards client demand.

As Hodgson puts it, “Using AI and machine learning you can, for example, see which customers are about to defect, and therefore up your defensive measures – after all, it’s much more expensive to gain a new customer than maintain an existing one. You can also become more offensive, because you are able to see what customer activity you anticipate on a particular day and then serve that customer with the appropriate inventory. This technology can be leveraged to improve the service a bank’s FX desk can offer to all types of clients – corporates, retail customers, hedge funds, asset managers, central banks and even internal clients.”
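As a rough illustration of the defensive use case, the heuristic below flags clients whose recent average daily volume has fallen well below their longer-run average. A production platform would use far richer machine-learning features; this sketch only assumes a pandas DataFrame of historical trades with hypothetical column names.

import pandas as pd

def flag_defection_risk(trades: pd.DataFrame, lookback_days: int = 90,
                        recent_days: int = 14, threshold: float = 0.5) -> pd.DataFrame:
    """Naive sketch: flag clients whose recent average daily volume has fallen
    below `threshold` times their longer-run average.

    Assumes columns: client_id, trade_date (datetime), notional.
    """
    end = trades["trade_date"].max()
    hist = trades[trades["trade_date"] >= end - pd.Timedelta(days=lookback_days)]
    recent = trades[trades["trade_date"] >= end - pd.Timedelta(days=recent_days)]

    hist_adv = hist.groupby("client_id")["notional"].sum() / lookback_days
    recent_adv = recent.groupby("client_id")["notional"].sum() / recent_days

    summary = pd.DataFrame({"hist_adv": hist_adv, "recent_adv": recent_adv}).fillna(0.0)
    summary["at_risk"] = summary["recent_adv"] < threshold * summary["hist_adv"]
    return summary.sort_values("recent_adv")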

The analytics from these platforms can also power TCA that lowers costs and provides in-depth insight into execution patterns. As regulatory pressure increases compliance costs, TCA is central to ensuring greater efficiency in trade workflows. Chambers highlights a few examples. “The feedback loop from post-trade to pre-trade can be a tricky one, but it can be achieved in a number of different ways. The first is to help in more general terms of choosing which time of day to trade, which currency pairs to use to enter/exit positions and importantly how to curate your pool of liquidity. Time of execution assistance can be through deciding how long to take to trade, using an algo or RFQ, exposure to announcement headline risk and more.”
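The post-trade to pre-trade feedback loop Chambers describes can be illustrated with a simple aggregation: grouping realised slippage by hour of day so the results can inform when, and how patiently, to trade next time. The column names are assumptions and the metric is deliberately simplified.

import pandas as pd

def slippage_by_hour(fills: pd.DataFrame) -> pd.DataFrame:
    """Group post-trade slippage by hour of day to feed back into pre-trade
    decisions about when to trade.

    Assumes columns: exec_time (datetime), side ('BUY'/'SELL'),
    exec_price, arrival_mid, notional.
    """
    sign = fills["side"].map({"BUY": 1, "SELL": -1})
    fills = fills.assign(
        slippage_bps=sign * (fills["exec_price"] - fills["arrival_mid"])
                     / fills["arrival_mid"] * 1e4,
        hour=fills["exec_time"].dt.hour,
    )
    fills["w_slip"] = fills["slippage_bps"] * fills["notional"]
    out = fills.groupby("hour").agg(
        avg_slippage_bps=("slippage_bps", "mean"),
        total_notional=("notional", "sum"),
        w_slip_sum=("w_slip", "sum"),
        trades=("slippage_bps", "size"),
    )
    out["notional_weighted_bps"] = out["w_slip_sum"] / out["total_notional"]
    return out[["avg_slippage_bps", "notional_weighted_bps", "trades"]]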

Ease of use is one of the primary benefits that modern data platforms offer. For instance, BidFX brings efficiency to traders by offering a suite of analytics templates that help with TCA and the optimization of other trade workflow tasks.

DATA SCIENCE POWERS ALPHA GENERATION

The next-generation data platforms currently on offer cater equally to the buy- and sell-side. Thanks to customization, these platforms surface a variety of data-backed conclusions, and firms can use them as their business demands. For instance, sell-side traders can use these insights to figure out who their most profitable clients are and the value each client brings to the business relationship.

Insights such as these help them prioritize their time better. Spotting execution trends is also simple with these platforms. For instance, Mosaic’s Smart Data platform highlights decreasing business from certain clients and provides analytics to suggest possible causes. Firms can customize the degree of insight they receive from the platform and the data views they wish to dissect.

Mosaic’s platform also makes it simple to evaluate order flow proportions and correlate them to client behaviour. In essence, traders have a world-class quantitative analyst by their side, discovering market and client anomalies and summarizing important data on request.

Everything from a client’s seasonal flow to their lifetime value is automatically calculated, and traders don’t need special technical knowledge to make these algorithms work. Depending on a firm’s wishes, Mosaic’s platform even suggests future courses of action, thanks to its AI-powered algorithm.
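For a sense of what “seasonal flow” and “lifetime value” style metrics involve, the toy calculation below builds a per-client monthly flow profile and a naive lifetime value from trade records with assumed column names. It is not Mosaic's methodology, only an indication of the kind of aggregation such platforms automate.

import pandas as pd

def client_flow_profile(trades: pd.DataFrame) -> pd.DataFrame:
    """Toy illustration of seasonal-flow and lifetime-value style metrics.

    Assumes columns: client_id, trade_date (datetime), notional,
    spread_capture (the desk's earnings on the trade).
    """
    trades = trades.assign(month=trades["trade_date"].dt.month)
    # Seasonal profile: total notional per client per calendar month.
    seasonal = trades.pivot_table(index="client_id", columns="month",
                                  values="notional", aggfunc="sum", fill_value=0.0)
    # Naive lifetime value: cumulative spread capture per client.
    lifetime_value = trades.groupby("client_id")["spread_capture"].sum()
    return seasonal.assign(lifetime_value=lifetime_value)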

“The key benefit of next generation FX data platforms such as Mosaic is enabling banks to be smarter than ever when it comes to gaining a comprehensive view of their data and extracting value from it,” says Hodgson. “Productivity is key to retaining a competitive edge, and AI can allow the FX desk to predict its clients’ needs far more effectively and, ultimately, drive more business.”

Liquidity-level data analysis offers significant advantages to buy-side teams. Especially relevant is data’s ability to isolate the most effective areas of a trading operation and to answer questions such as whether the firm has enough access to liquidity to exit its trades efficiently.

LPs and sell-side firms can compare their datasets to the broader market to analyse gaps in their business. Data analysis also opens up benchmarking opportunities, and banks can use these datasets as a selling point. For instance, comparing current execution data to historical benchmarks gives banks instant feedback on whether they are meeting their standards. Benchmarks also enhance execution transparency and help clients understand where they stand.

BidFX offers its clients the ability to annotate data, thereby simplifying policy management. Clients can annotate change points in their data and easily measure post-change effectiveness against historical performance. This ability to derive real-time conclusions makes it easy to leverage datasets. BidFX is also optimistic about its Verdict system, which allows clients to define trade execution parameters (spread, speed, etc.) in line with internal cost control policies. These checks take place pre-trade and reduce the chances of having to run a post-mortem on a faulty trade. Data from this system can also enhance client-LP relationships.
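In the spirit of those pre-trade checks, a policy gate might look like the sketch below, where the parameter names and limits are invented for illustration and are not BidFX's Verdict implementation.

from dataclasses import dataclass

# Hypothetical internal cost-control policy; limits are illustrative only.
@dataclass
class ExecutionPolicy:
    max_spread_bps: float = 2.0      # widest acceptable quoted spread
    max_duration_min: float = 30.0   # longest acceptable expected execution time
    min_lp_count: int = 3            # minimum number of LPs quoting

def pre_trade_check(quoted_spread_bps: float, expected_duration_min: float,
                    lp_count: int, policy: ExecutionPolicy) -> list[str]:
    """Return a list of policy breaches; an empty list means the order may proceed."""
    breaches = []
    if quoted_spread_bps > policy.max_spread_bps:
        breaches.append(f"spread {quoted_spread_bps:.2f}bps exceeds {policy.max_spread_bps}bps limit")
    if expected_duration_min > policy.max_duration_min:
        breaches.append(f"expected duration {expected_duration_min:.0f}min exceeds limit")
    if lp_count < policy.min_lp_count:
        breaches.append(f"only {lp_count} LPs quoting, minimum is {policy.min_lp_count}")
    return breaches

An empty list of breaches lets the order proceed; anything else can be logged and later shared with an LP when explaining why flow was or was not routed to them.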

Clients can share data with their LPs and explain their decision to either choose the LP or an alternative source. “Clients will move away from only looking at execution costs and TCA, but also analysing liquidity provision. This is easiest done with platforms already collecting the data as it is large amounts of data on a daily basis,” says Chambers. “They are also using analytics less often solely for box-tick regulatory reasons, but more and more for actual execution or strategy improvement.”


EVALUATING SERVICE PROVIDERS

While the advantages of embracing a data platform are many, how does one go about evaluating service providers? Given the complexity of trading workflows and the features on offer, this task might seem intimidating at first.

The best place to begin is to evaluate how well a service provider understands a firm’s business, whether it’s on the buy-side or the sell-side. An ideal service provider is effectively a partner in their client’s business rather than a mere infrastructure provider. Insights provided by the vendor can also power new trading strategies via advanced analytics. Needless to say, data security and partitioning are key issues to consider. The best service providers highlight their ability to provide data pipelines without having access to what flows through them. Client data remains in the client’s hands, with the service provider securing all ends of the network.

Cybersecurity is paramount these days, and a good service provider will highlight the steps it has taken to ensure top-notch protection at all times. Volatile markets will cause turmoil, and a good service provider ensures that basics such as data integrity and security are always in place, allowing firms to focus on the markets at such times.

The quality of a data platform’s dashboards is also a great indicator of performance. Firms should ideally partner with platforms that give them actionable intelligence with which to work. These recommendations should be sourced from historical trading patterns and must incorporate the specifics of a firm’s workflow. Generic recommendations that could apply to multiple situations are unhelpful.

One of the biggest advantages of outsourcing data infrastructure needs to an expert third-party provider is the frequency of updates and infrastructure maintenance. An ideal data platform partner will be up to speed with market developments, helping firms assume a proactive stance towards market movements and anticipate sources of volatility through deep data analysis.

WHAT THE FUTURE HOLDS

As innovation in FX continues to rise, adoption rates are increasing across the board. Cloud-based technology is making it easier than ever to adopt and scale electronic solutions.

Mosaic’s Hodgson highlights another interesting development. “Advancements in natural language generation (NLG), a software process that automatically transforms data into a written narrative, will also continue to develop,” he says. “Imagine if your highest performing, most experienced quant could write all the reports your bank generates. And now imagine those reports could be produced in seconds and distributed across the bank. The gains in efficiency, performance and business insight would have an almost immediate impact on your bottom line,” he concludes.

BidFX’s Chambers predicts greater electronification away from the spot market. “The product coverage will expand,” he says. “For example, analysis that was done on equities and futures spread into FX, and now within FX there’s spot analysis that is moving into outrights and NDFs as well as options.”

Add to all this the rise of platform-compatible solutions and a greater need for insight into workflows, and there’s no doubt that the markets are set to witness significant changes.