Steps Ahead

MOSAIC SMART DATA IS PROUD TO BE A SPONSOR OF FRONTIERS IN QUANTITATIVE FINANCE

Frontiers in Quantitative Finance

A monthly event hosted by the Oxford Mathematical and Computational Finance Group, providing a forum for academics and practitioners to discuss and debate new ideas in the modelling, management and regulation of financial risks, as well as an opportunity for networking. Students, academics and professionals from the finance and government sectors are welcome to join the online seminar.

Attendance is free of charge but requires prior registration.

Date:

Thursday 24th September, 18:00-19:15

Speaker:

Vladimir Piterbarg is a Managing Director and Head of Quantitative Analytics and Quantitative Development at NatWest Markets.

Event synopsis:

LIBOR reform and the arc-sine law

The fallback spread that will be used to calculate Libor rates of a given tenor in the future is defined as the median (50th percentile) of five years of historical observations of the spread between this Libor and compounded OIS rates, calculated on the future date of the Libor cessation announcement. Some of the observations that will enter this calculation have already occurred and some are still in the future. In this note we assert that the future realised median is a non-linear function of future, and hence as yet unknown, spread observations, and that its fair value calculation must therefore account for spread dynamics and not just forward values. We propose a model of the future evolution of spreads and derive a numerically efficient algorithm for calculating the fair value of the median that incorporates both the historical observations and the future dynamics of the spread. We establish that, given our model, the market expectations of the fallback spreads are at, or somewhat beyond, the upper range of theoretically justifiable values. The approximation method we develop is based in part on the arc-sine law and should be of independent interest to mathematical finance professionals.
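To make the non-linearity concrete, here is a minimal sketch of the calculation under loudly simplified assumptions: toy historical data, a driftless Gaussian random walk for the future spread and illustrative observation counts, none of which come from the paper. It shows why the fair value of the realised median must account for spread dynamics rather than forward values alone.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical setup: ~5 years of daily Libor-OIS spread observations,
# of which 800 are already realised and 460 are still in the future.
n_total, n_past = 1260, 800
n_future = n_total - n_past

past = 0.0025 + 0.0003 * rng.standard_normal(n_past)  # realised spreads (toy data)

sigma = 0.0002      # assumed daily volatility of the spread
s0 = past[-1]       # start future paths from the last observation

n_paths = 10_000
# Simulate future spreads as a driftless random walk (a model assumption).
future = s0 + np.cumsum(sigma * rng.standard_normal((n_paths, n_future)), axis=1)

# Realised median on the cessation-announcement date, path by path.
window = np.concatenate([np.broadcast_to(past, (n_paths, n_past)), future], axis=1)
medians = np.median(window, axis=1)

print(f"fair value of the median (Monte Carlo): {medians.mean():.6f}")
# Replacing future observations by their forwards (all s0) gives a different
# answer, which is precisely the non-linearity the paper addresses:
print(f"median of forward spreads:              "
      f"{np.median(np.concatenate([past, np.full(n_future, s0)])):.6f}")
```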

Points of discussion:

June 25th

For the second Frontiers in Quantitative Finance webcast in June, Professor Loriana Pelizzon from Goethe University, Frankfurt, gave a presentation on ‘Loss sharing in Central Counterparties: Winners and Losers’. Her research on the subject is a collaboration with Christian Kubitza from the University of Bonn and Mila Getmansky Sherman from the University of Massachusetts.

The aim of their work is to determine the relative benefits of central clearing to different participants following default losses in derivatives transactions. Professor Pelizzon started the webcast by giving a brief background on the over-the-counter (OTC) derivatives market and default losses before the role of the Central Counterparty (CCP) was mandated. She described the role of the CCP and outlined two central clearing mechanisms: multilateral netting, which offsets gains and losses across multiple counterparties; and loss-sharing in the event of a member default, in which losses are shared among the remaining members.
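As a toy illustration of the first mechanism, the sketch below compares bilateral netting with multilateral netting through a CCP. The exposures are hypothetical figures chosen for illustration, not taken from the research.

```python
# exposures[(i, j)] = amount member i is owed by member j (hypothetical figures).
exposures = {
    ("A", "B"): 100, ("B", "A"): 80,
    ("B", "C"): 50,  ("C", "B"): 70,
    ("C", "A"): 40,  ("A", "C"): 30,
}

# Bilateral netting: net each pair separately, then sum the absolute nets.
pairs = {tuple(sorted(k)) for k in exposures}
bilateral = sum(abs(exposures.get((i, j), 0) - exposures.get((j, i), 0))
                for i, j in pairs)

# Multilateral netting via a CCP: each member nets across ALL counterparties.
members = {m for pair in exposures for m in pair}
net = {m: sum(v for (i, _), v in exposures.items() if i == m)
          - sum(v for (_, j), v in exposures.items() if j == m)
       for m in members}
multilateral = sum(abs(x) for x in net.values()) / 2  # owed and owing halves match

print("bilateral net exposure:   ", bilateral)     # 50
print("multilateral net exposure:", multilateral)  # 40.0
```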

For their model, Professor Pelizzon and her co-researchers considered systematic risk and portfolio directionality using two networks, one consisting of dealers only and another including all users, and then observed how losses are managed or shared by the CCP.

Their research concludes that central clearing favours dealers over other market participants such as asset managers, hedge funds and non-dealer banks, because the latter are penalised by the loss-sharing rules adopted by CCPs. Furthermore, this imbalance in benefits is a reason why participants are reluctant to clear centrally when it is not mandated.

Points of discussion:

June 11th

For June’s Frontiers in Quantitative Finance webcast, Professor Marco Avellaneda from New York University explored ‘Statistical Clustering, Hierarchical Principal Component Analysis and Portfolio Management’. The research presented was carried out in collaboration with Juan Serur from New York University.

Professor Avellaneda discussed different types of factor models, comparing those based on explicit factors with those based on mathematical factors. He went on to describe Principal Component Analysis (PCA) and the first eigenportfolio, which corresponds to the market portfolio. He then showed results of PCA applied to stock markets and concluded that it is difficult to find a financial explanation for higher-order eigenportfolios.
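A minimal sketch of the first eigenportfolio, on synthetic one-factor returns rather than the market data from the talk, shows how the leading eigenvector of the correlation matrix yields a portfolio with same-sign weights, i.e. a market-like portfolio:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy return panel: 250 days x 50 stocks driven by one common factor (an assumption).
n_days, n_stocks = 250, 50
market = 0.01 * rng.standard_normal(n_days)
betas = 0.5 + rng.random(n_stocks)
returns = np.outer(market, betas) + 0.005 * rng.standard_normal((n_days, n_stocks))

# PCA of the correlation matrix of returns.
corr = np.corrcoef(returns, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)        # ascending eigenvalues
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# First eigenportfolio: top eigenvector scaled by inverse volatility.
w = eigvecs[:, 0] / returns.std(axis=0)
w /= np.abs(w).sum()

print("variance explained by PC1:", round(eigvals[0] / eigvals.sum(), 3))
print("all weights share one sign:", bool(np.all(w > 0) or np.all(w < 0)))
```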

Professor Avellaneda continued by introducing Hierarchical Principal Component Analysis (HPCA), which partitions the stock market into clusters. He explained how he and his research partner constructed the correlation matrix by selecting a representative portfolio for each sector, extending the work of Cont and Kan [2011], and showed the results of applying the method to S&P 500 returns. Further analysis of four major equity markets, the US, Europe, Emerging Markets and China, showed that HPCA gave interpretable higher-order eigenportfolios.
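The sketch below gives one plausible reading of that construction on synthetic data: within-sector correlation blocks are kept as measured, while cross-sector entries are rebuilt from each stock's correlation with its sector's representative portfolio and the correlation between the sector portfolios. The two-sector structure and all figures are illustrative assumptions, not the paper's specification.

```python
import numpy as np

rng = np.random.default_rng(1)

def first_eigenportfolio_returns(block):
    """Time series of the leading eigenportfolio of a return block."""
    corr = np.corrcoef(block, rowvar=False)
    _, vecs = np.linalg.eigh(corr)
    w = vecs[:, -1] / block.std(axis=0)        # top eigenvector, vol-scaled
    return block @ (w / np.abs(w).sum())

# Toy data: two 'sectors' of 20 stocks sharing a common market component.
n_days = 500
market = 0.01 * rng.standard_normal((n_days, 1))
sectors = [market + 0.01 * rng.standard_normal((n_days, 1))
                  + 0.01 * rng.standard_normal((n_days, 20)) for _ in range(2)]

factors = np.column_stack([first_eigenportfolio_returns(s) for s in sectors])
rho = np.corrcoef(factors, rowvar=False)       # sector-factor correlations

# Keep empirical within-sector blocks; set the cross-sector block to
# beta_a * rho_kl * beta_b, with beta = correlation with the own-sector factor.
blocks = [np.corrcoef(s, rowvar=False) for s in sectors]
betas = [np.array([np.corrcoef(s[:, i], factors[:, k])[0, 1]
                   for i in range(s.shape[1])])
         for k, s in enumerate(sectors)]
cross = rho[0, 1] * np.outer(betas[0], betas[1])
hpca_corr = np.block([[blocks[0], cross], [cross.T, blocks[1]]])

print("top 5 eigenvalues:", np.round(np.linalg.eigvalsh(hpca_corr)[::-1][:5], 2))
```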

Comparing PCA against HPCA using the percentage of variance explained, he concluded that the explained variance rises faster under PCA, since it is the greedier algorithm, while HPCA concentrates less of the variance in a given number of components and is therefore more interpretable.

He ended the talk with a brief explanation of Statistically Generated Clusters and gave some suggestions for future work.

Points of discussion:

May 7th

May’s Frontiers in Quantitative Finance webcast, co-sponsored by Mosaic Smart Data, was led by Dr Brian Healy from Decision Science Ltd and Professor Andrew Papanicolaou from New York University, who gave a joint presentation on Principal Component Analysis (PCA) of Implied Volatility Surfaces.

The presentation was based on new research carried out in collaboration with Professor Marco Avellaneda from New York University and Professor George Papanicolaou from Stanford University.

As an introduction to the subject, Dr Healy began by explaining how they applied PCA to U.S. equity returns. He discussed issues such as how many components should be removed for the residuals to be random, how to construct the explanatory factors, and the natural structure of the data, and then demonstrated these points by applying results from random matrix theory to the data.
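One standard way to apply random matrix theory here is to compare the empirical eigenvalues of the correlation matrix with the Marchenko-Pastur upper edge for pure noise, keeping only the components above the edge. The sketch below does this on a synthetic panel with three planted factors; the dimensions and data are assumptions for illustration, not those from the talk.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic panel: T=1000 days of returns on N=200 stocks with 3 genuine factors.
T, N, k = 1000, 200, 3
returns = (rng.standard_normal((T, k)) @ rng.standard_normal((k, N))
           + 3.0 * rng.standard_normal((T, N)))

eigvals = np.linalg.eigvalsh(np.corrcoef(returns, rowvar=False))[::-1]

# Marchenko-Pastur upper edge for a pure-noise correlation matrix: eigenvalues
# above it are treated as signal, the rest as noise to be removed.
q = N / T
mp_edge = (1 + np.sqrt(q)) ** 2
print(f"MP edge = {mp_edge:.2f}, significant components = {(eigvals > mp_edge).sum()}")
```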

Professor Andrew Papanicolaou followed by discussing the application of PCA, using tensors, to implied volatility surface data, which is a higher-dimensional problem. Performing matrix PCA on the volatility surface data, they concluded that the number of significant components is nine. Because of the structure contained in option prices, this is lower than is typical for equity returns, where the number is usually 20.
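The matrix-PCA step can be sketched on synthetic surfaces driven by level, skew and term-structure factors; the grid, the factors and the resulting counts are illustrative assumptions, and the tensor variant discussed in the talk is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic implied-vol surfaces: 500 days on a 10 (moneyness) x 8 (maturity) grid.
days, n_m, n_t = 500, 10, 8
m = np.linspace(-1.0, 1.0, n_m)[:, None]       # moneyness axis
t = np.linspace(0.1, 2.0, n_t)[None, :]        # maturity axis
level, skew, term = 0.02 * rng.standard_normal((3, days, 1, 1))
surfaces = (0.2 + level + skew * m + term * np.sqrt(t)
            + 0.001 * rng.standard_normal((days, n_m, n_t)))  # small noise

# Matrix PCA: flatten each surface into a vector, then eigendecompose the
# covariance of day-on-day changes.
X = np.diff(surfaces.reshape(days, n_m * n_t), axis=0)
eigvals = np.linalg.eigvalsh(np.cov(X, rowvar=False))[::-1]
explained = np.cumsum(eigvals) / eigvals.sum()
print("components for 99% of variance:", int(np.searchsorted(explained, 0.99)) + 1)
```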

In an approach analogous to a market-capitalisation-weighted equity factor or index, Healy and Papanicolaou used each option's open interest and vega to create a portfolio of implied volatility returns that tracks the volatility surface ‘EigenPortfolio’, suggesting it can be used in similar ways to the VIX, the S&P 500 volatility index.
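A minimal sketch of that weighting scheme, with hypothetical per-option figures:

```python
import numpy as np

# Hypothetical per-option data: implied-vol returns, open interest and vega.
iv_returns = np.array([0.012, -0.004, 0.007, 0.001])
open_interest = np.array([1500, 800, 2200, 400])
vega = np.array([0.35, 0.20, 0.50, 0.10])

# Open-interest-and-vega weighting, analogous to market-cap weighting in equities.
w = open_interest * vega
w = w / w.sum()
print("portfolio implied-vol return:", float(w @ iv_returns))
```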

Healy and Papanicolaou concluded the talk by explaining how retaining the tensor structure in the ‘EigenPortfolio’ improves how well it tracks the open interest weighted implied volatility returns portfolio.

The online event was hosted by the Oxford Mathematical and Computational Finance Group and was attended by more than 130 participants from around the world. Follow Mosaic Smart Data on LinkedIn to ensure you don’t miss information on the next Group meeting, which will be in June.

Points of discussion:

April 16th

In April’s Frontiers in Quantitative Finance webcast, co-sponsored by Mosaic Smart Data, Professor Johannes Ruf from the London School of Economics (LSE) spoke about the use of neural networks as estimation tools for the hedging of options.

As Professor of Mathematics at the LSE, he has conducted research with one of his PhD students, Weiguan Wang. The aim of Ruf and Wang’s work was to investigate the application of neural networks (NNs) as a tool for non-parametric estimation, discussing their implementation, comparing the results to several benchmarks and analysing the results to offer possible explanations.

Professor Ruf discussed how NNs have been applied to the pricing and hedging of derivatives over the past 30 years, commenting that most research papers have claimed that NNs have superior performance over other benchmarks.

He and Wang implemented a neural network, ‘HedgeNet’, which is trained to minimise the hedging error rather than the pricing error. Professor Ruf described their implementation of the model, the choice of input features and the time periods used. The data sets used were end-of-day and tick prices of S&P 500 and Euro Stoxx 50 options.
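The objective can be sketched as follows. This is a toy stand-in with synthetic data, an assumed two-feature input and a deliberately small architecture, not the actual HedgeNet configuration; the point is only that the loss is the hedging error, not the pricing error.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Synthetic stand-in for the option panel: features are moneyness and maturity.
n = 20_000
S, K = 100 * torch.ones(n), 90 + 20 * torch.rand(n)
tau = 0.05 + torch.rand(n)
dS = 2.0 * torch.randn(n)                      # next-period underlying move (toy)
true_delta = torch.sigmoid((S - K) / 5.0)      # placeholder 'true' hedge ratio
dC = true_delta * dS + 0.1 * torch.randn(n)    # option price change (toy)

x = torch.stack([S / K, tau], dim=1)

net = nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 1), nn.Sigmoid())
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

for _ in range(200):
    delta = net(x).squeeze(1)
    hedging_error = dC - delta * dS            # P&L of the delta-hedged position
    loss = (hedging_error ** 2).mean()         # minimise hedging, not pricing, error
    opt.zero_grad()
    loss.backward()
    opt.step()

print(f"final mean squared hedging error: {loss.item():.4f}")
```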

He compared their results using Black-Scholes and linear regression as benchmarks. The results showed that HedgeNet significantly outperformed the Black-Scholes benchmark but not the linear regression model, which incorporated the leverage effect.

Professor Ruf argued that the outperformance of NNs previously reported in the literature is most likely due to a lack of data hygiene, such as incorrect treatment of the time-series data or an incorrect split between training and test data.

The online event was hosted by the Oxford Mathematical and Computational Finance Group and was attended by 167 people from around the world.

To read the full presentation of Hedging with Neural Networks, by Professor Johannes Ruf and Weiguan Wang, click here.

Points of discussion:

February 20th

In February’s Frontiers in Quantitative Finance event co-sponsored by Mosaic Smart Data, Professor Olivier Guéant from the Université Paris 1 spoke about how automation in financial trading is taking the next step in market making.

Professor Guéant discussed research that he and his team have conducted on how to build market making algorithms for options on liquid assets. He discussed problems faced by market makers and the models that price-setting is based upon.

He formulated the problem as one of optimisation, which is complicated by the ‘curse of dimensionality’. His research uses the Greeks to reduce the dimensionality, turning the problem into a system of linear ordinary differential equations that is easier to solve.
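Once in that form, the system can be integrated backwards from its terminal condition with a standard solver. In the sketch below the matrix `A`, the source term `b` and the dimension are hypothetical placeholders, not the paper's coefficients.

```python
import numpy as np
from scipy.integrate import solve_ivp

rng = np.random.default_rng(4)

# Placeholder linear system dv/dt = A v + b with terminal condition v(T) = 0.
n = 5
A = -np.eye(n) + 0.1 * rng.standard_normal((n, n))
b = rng.standard_normal(n)
T = 1.0

def rhs(t, v):
    return A @ v + b

# Integrate from T back to 0, as for a terminal value problem in optimal control.
sol = solve_ivp(rhs, [T, 0.0], y0=np.zeros(n))
print("value components at t=0:", np.round(sol.y[:, -1], 3))
```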

In giving numerical results he referred to assumptions he and his team have made, as well as ideas for further work in the algo market making space.

The event was hosted by the Oxford Mathematical and Computational Finance Group and held at Citi Stirling Square.