MathFinance Conference Recap

The 17th MathFinance Conference, held in Frankfurt on 20-21 April 2017, was once more the key event in Germany for quants. More than 110 registered participants attended, at least 30 of them from countries outside Germany, including Taiwan, and this time with a high density of model validators.

In a market dominated by increased “radical uncertainty”, saturated equity markets, persistently low interest rates in EUR/CHF/JPY, and banks being pushed by regulators to avoid business as much as possible and focus on writing compliance papers, there are still many areas for quants to work in; in fact, quantitative modeling is likely to be even more in demand.

Let me recap what I learned:

The first morning Karel in’t Hout organized a symposium on Numerical Methods selecting key innovations from four countries.

1. Kathrin Glau illustrated in her talk on complexity reduction techniques for finance how hard real-time pricing in complex models can be: the rBergomi model, for example, is non-Markovian, leads to nested Monte Carlo, and consequently requires 20 minutes to compute a single price. Her solution is called POP, which stands for parametric option pricing. Examples include FFT (Fast Fourier Transforms) or Reduced Basis methods for PDEs. She explained Magic Point Interpolation – interpolation of a surface in a smart way. It is done via Chebyshev interpolation, which also works in higher dimensions. Having fun with Chebyshev polynomials is possible in the Chebfun package. The method is especially designed for higher-dimensional path-dependent derivatives.

2. Álvaro Leitao spoke about an efficient one and multiple time-step Monte Carlo simulation of the SABR model. For Monte Carlo, the computational cost depends mostly on the CDF inversion. The key challenge is the integrated variance process. A copula-based simulation allows one-time-step simulation; he reports that the Gumbel copula performs best. He concludes that mSABR, the generalization to m time steps, is a good alternative to the Hagan formula and can be used for barrier options and negative interest rates. Hopes are raised to extend the method to SLV.

3. Maarten Wyns presented ADI finite difference schemes for the calibration of stochastic local volatility models: an adjoint method. After a motivation of SLV as a popular industry model, he explained that the leverage function ensures matching vanilla option prices. The Gyöngy (1986) approach is known to be tricky. Instead, he proposes to numerically solve the backward PDE. He prices European options via LV and exotics via SLV consistently. He checks two scenarios, one by Clark and one by Andersen which strongly violates the Feller condition. He requires 1 second calibration time on his laptop using Matlab. It turns out that the Feller condition is not relevant when the backward solution is applied.

4. Matthieu Mariapragassam impressed with his calibration of a four-factor hybrid local-stochastic volatility model with a new control variate particle method. Taking FX EUR/USD derivatives as an example, he recapitulates that the pure local volatility (LV) surface is used a lot in practice, but has known drawbacks. He recycles Adil Reghai’s very fast calibration via fixed-point iteration and applies it to a Heston SLV plus two CIR processes for the short rates of EUR and USD, which he calls Heston-2CIR++. Wyns’ calibration can be a valuable part of the calibration flow chart. This leads to a new result for his model where both interest rates are stochastic. After dealing with the violated Feller condition, he suggests using SLV with constant interest rates as a control variate for valuation. He restates that the Feller condition is almost always violated in FX markets, and solves this by a clever change of variables in combination with Finite Element Methods (FEM). The LSV scheme is implemented via QE. The graph shows the solution of the Fokker-Planck density with FEM, yielding a very smooth result.
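The control-variate idea in its most generic form: simulate the target payoff together with a correlated quantity whose expectation is known exactly, and subtract the known part. A minimal sketch (my own toy lognormal example, not his Heston-2CIR++ setup):

```python
import numpy as np

rng = np.random.default_rng(1)
n, K = 100_000, 1.05
z = rng.standard_normal(n)
S = np.exp(-0.02 + 0.2 * z)          # lognormal with E[S] = 1 exactly
X = np.maximum(S - K, 0.0)           # payoff whose mean we want
Y = S                                # control variate with known mean 1

b = np.cov(X, Y)[0, 1] / np.var(Y)   # variance-optimal coefficient
cv = X - b * (Y - 1.0)               # control-variate estimator, same mean
```

The estimator `cv.mean()` is unbiased for the same price but has markedly lower variance than `X.mean()` whenever payoff and control are strongly correlated, which is exactly why a cheap constant-rates SLV price makes a good control for the full four-factor model.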

The conference proceeded with more highlights:

Matthias Fengler presented a discrete-time stochastic volatility model with Meixner innovations. Unlike the common GARCH models, this is a proper discrete SV model with its own driving factors. The Meixner (MXN) distribution has semi-heavy tails; its moment-generating function and all moments exist. Interestingly, MXN is the only distribution other than the normal that can be fully characterized. Asset pricing applications included a Heston-Nandi style vanilla formula. The model allows an explicit mapping between physical and risk-neutral parameters. And truly innovative: very heavy skew on short-term equity options or forward skew can be modeled.

In his talk about a universal pairwise local correlation model, Frank Koster motivated his approach by demonstrating that a multi-asset LV model with constant correlation does not fit the basket implied volatility smile. There are stochastic correlation models and local correlation models. Index models suffer from what he calls the Chewing Gum Effect: what if two spots move, but the index doesn’t? The solution he presented is a pairwise local correlation model. However, he notes that this typically produces a higher price, as correlation is pushed towards 1 to satisfy all constraints. The advantage is that, with up to 10 basket constituents, the pairwise approach beats local correlation in CPU time.

Frank Lehrbass asked whether replacing VaR by Expected Shortfall (ES) is much ado about nothing. Regulation requires this change already. He points out that ES can’t be backtested. Banks using Historical Simulation would have had less risk capital before the Lehman default if they had used ES instead of VaR. Thorsten Schmidt and Jörg Behrens point out that ES is in fact backtestable in combination with VaR. We conclude that the required regulatory change doesn’t appear to add any value and, in fact, makes business riskier. One important add-on to risk management is using scenarios.
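For reference, the two measures side by side on a historical sample (plain historical simulation, my own minimal sketch): VaR is a loss quantile, ES the average loss beyond it.

```python
import numpy as np

def hist_var_es(returns, alpha=0.99):
    """Historical-simulation VaR and Expected Shortfall (losses positive)."""
    losses = -np.asarray(returns, dtype=float)
    var = np.quantile(losses, alpha)          # loss quantile at level alpha
    es = losses[losses >= var].mean()         # mean of tail losses beyond VaR
    return var, es
```

By construction ES is at least as large as VaR, which is the sense in which it is the more conservative (and coherent) measure.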

Roel Oomen talked about the practice of FX spot trading and competition amongst liquidity providers. He first pointed out that FX spot is OTC because of the way dealers interact; in fact, they know their counterparts. Liquidity is driven by internalization: about 66% of transactions are squared by other internal transactions, and the speed of this internalization process influences the cost heavily. Roel applies queuing theory to the internalization of spot trades to come up with some general statements.

Wolfgang Scherer asked whether quantum computing in finance is hype or hyperspeed?
He presented an overview of the current state of quantum computing. Much of this is not yet ready for real applications, but looks promising. We saw tensors all over the place and learned about the true meaning of Commerzbank 4.0.

Manuel Wittke and Mikhail Beketov talked about Robo Advisors: The algorithms behind the user screens. Their market survey shows a gigantic rise of capital under robo management; how this may eventually impact the market is an open question. There exist 147 robo advisors, but most of them are black boxes. Therefore, there is very little one can say about the algorithms. In fact, I would even say that an investment algorithm that has gone into the public domain is unlikely to be successful.

Christoph Burgard presented his vision of the second quantization of banks. Markets have moved from individual derivatives to portfolio valuations; products are simpler but underlyings more complicated; counterparty risk had to be rethought and is in fact hedged/risk-managed on a CVA desk; the next step was DVA, which, however, is not hedgeable; Vladimir Piterbarg came up with FVA, proposing to discount with the funding rate rather than the LIBOR rate, so that one can use FVA to monetize DVA; furthermore, balance sheet impact has become an important issue; on top of all, accounting must be aligned with the economic value; attention should be paid to network effects: a new systemic risk in CCP networks is building up[1]. After all of this we are still left with unhedged risk and exposure, particularly when looking at the portfolio under the real-world measure. Conclusion: right now, there is an awful lot to do. In fact, risk analytics and front office analytics are being linked. Processes and process platforms are the requirements for a modern bank if one could start it from scratch.

Mauricio I. González Evans showed a real application of a cloud calculation service for the FX volatility smile. Using real live market data, the FX volatility smile can be built, interpolated, extrapolated, checked for arbitrage. We saw volatility on the delta space, total variance on the log-moneyness space, which is helpful to spot calendar arbitrage, and the probability density on the strike space, which is helpful to spot butterfly arbitrage. The financial market data platform can manage all incoming and calculated data with all entitlement and logging features required in today’s regulatory environment. The platform combines the market data management of BCC group with the MathFinance library MFVal.
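The two no-arbitrage checks mentioned can be stated very compactly in discrete form; here is my own simplified sketch of the conditions such a platform verifies (butterfly arbitrage as convexity of call prices in strike, calendar arbitrage as monotonicity of total variance in maturity):

```python
import numpy as np

def butterfly_arbitrage_free(strikes, calls, tol=1e-10):
    """Calls must be non-increasing and convex in strike; convexity of the
    price is equivalent to a non-negative implied probability density."""
    k, c = np.asarray(strikes, float), np.asarray(calls, float)
    decreasing = np.all(np.diff(c) <= tol)
    # discrete butterfly: slope must be non-decreasing (handles uneven strikes)
    slopes = np.diff(c) / np.diff(k)
    return decreasing and np.all(np.diff(slopes) >= -tol)

def calendar_arbitrage_free(total_var, tol=1e-10):
    """Total implied variance w(k, T) must be non-decreasing in maturity
    at each fixed log-moneyness k (rows = maturities, columns = k)."""
    w = np.asarray(total_var, float)
    return np.all(np.diff(w, axis=0) >= -tol)
```

These are necessary discrete conditions on a grid, not a full continuous-surface proof, but they catch the violations one sees in raw quoted data.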

In the panel discussion, chaired by Tino Senge (Head of Quantitative Analytics Europe + Global Head of QA Macro, Structured Products & Strategies / Barclays), we covered recent challenges in derivatives technology, with Christoph Burgard (Head of Risk Analytics for Global Markets / BAML), Wolfgang Gerhardt (Speaker of the Management Board / Bank Vontobel Europe AG), Martin Hellmich (Managing Director / SCDM) and Uwe Wystup (Managing Director / MathFinance). The panel concluded that technology will be a deciding factor of survival in the derivatives industry.

Hans Bühler cracked the challenge of implementing a discrete local volatility model, which was triggered by a model review process. He showed how to construct an arbitrage-free surface with very simple linear programming using simple linear constraints. The construction is quick and even bucketed vega risk is easy to calculate. It appears to be an approach ready to go for practitioners.
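The linear-programming construction can be sketched in a few lines: fit quoted call prices in the least-absolute-deviation sense subject to the linear no-arbitrage constraints (monotonicity and convexity in strike). This is my own minimal single-maturity toy version, not Bühler's actual discrete local volatility construction; the quote at K=100 deliberately contains a butterfly arbitrage that the LP repairs.

```python
import numpy as np
from scipy.optimize import linprog

strikes = np.arange(80.0, 125.0, 5.0)          # 9 equally spaced strikes
mkt = np.array([22.0, 18.0, 14.5, 11.0, 9.5,   # the 9.5 at K=100 creates
                6.2, 4.4, 3.2, 2.1])           # a butterfly arbitrage
n = len(mkt)

# variables x = [c_0..c_{n-1}, e_0..e_{n-1}]; minimise the total error sum(e)
cost = np.concatenate([np.zeros(n), np.ones(n)])

rows, rhs = [], []
I = np.eye(n)
# |c_i - mkt_i| <= e_i, encoded as two linear inequalities
rows.append(np.hstack([I, -I]));  rhs.append(mkt)
rows.append(np.hstack([-I, -I])); rhs.append(-mkt)
# monotonicity in strike: c_{i+1} - c_i <= 0
D1 = np.zeros((n - 1, 2 * n))
for i in range(n - 1):
    D1[i, i], D1[i, i + 1] = -1.0, 1.0
rows.append(D1); rhs.append(np.zeros(n - 1))
# convexity (no butterfly arbitrage): -(c_i - 2 c_{i+1} + c_{i+2}) <= 0
D2 = np.zeros((n - 2, 2 * n))
for i in range(n - 2):
    D2[i, i], D2[i, i + 1], D2[i, i + 2] = -1.0, 2.0, -1.0
rows.append(D2); rhs.append(np.zeros(n - 2))

res = linprog(cost, A_ub=np.vstack(rows), b_ub=np.concatenate(rhs),
              bounds=[(0, None)] * (2 * n), method="highs")
fit = res.x[:n]                                 # arbitrage-free call prices
```

Because every constraint is linear in the prices, bumping a quote and re-solving is cheap, which is what makes bucketed vega risk easy to compute in this framework.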

Kris Wulteputte gave a speech on the future of quants in risk management. Times are changing and require a new way of thinking. The push for increased accuracy of risk measures (not pricing) had to be rethought, especially facing the huge problems after the Lehman default. All illusions about risk models disappeared. Most models work in most cases most of the time, but we need to understand where they can fail. The industry is looking for simplicity, accepting that whatever we do we will miss something. The future quant must be holistic, with the entire enterprise in mind, much in the sense of Christoph Burgard’s view that the portfolio matters more than individual transactions. Kris is convinced that markets need more graphical ways to communicate risk to executives, on a platform, like dashboarding. He points out that there is no perfect model; it is more important to realize the shortcomings. We are plagued by radical uncertainty, and we need to accept that.

Peter Schwendner responded with lots of graphical tools in the subsequent talk on sovereign bond network dynamics. Negative correlation in bond yields is considered a warning sign. Therefore, Peter introduces his concept of net-correlation, which is linked to what journalists call shearing forces. Dendrograms show hierarchical patterns. His conclusions are:

  • In periods of financial and political distress like 2012, the market fears contagion risk, leading to shearing forces between a core and a periphery bloc.
  • Market trusts the guarantee structure of EFSF: market treats EFSF as a “core” issuer.
  • Market believes in stability framework: reattachment of periphery to core since 2013, Greece a special case.
  • The Brexit referendum has caused only very short-term turbulences on the bond market, without triggering a “domino effect”.
  • The French bond spread moves along with the probability of a Le Pen victory in the second round, an event which would weaken the country’s credit.
  • Separation into core and periphery becomes less and less pronounced. Contagion risk decreases.

Is this alarming? Peter says yes!

In his presentation on adaptive market risk measurement in the trading book Peter Quell deals with risk in non-stationary markets. His findings include:

  • Filtered historical simulation with adapted mean and volatility is better suited for markets described by heteroscedastic time series.
  • Adaptive risk measurement is also suitable for time series with structural breaks, e.g. moving from a normal period into stress periods.
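A minimal sketch of the filtered historical simulation idea (my own toy version, using a RiskMetrics-style EWMA filter rather than a fitted GARCH model): rescale each past return from its own volatility regime to today's volatility before reading off the loss quantile.

```python
import numpy as np

def fhs_var(returns, alpha=0.99, lam=0.94):
    """Filtered historical simulation VaR: scale past returns by the ratio
    of current EWMA volatility to the volatility prevailing on each day."""
    r = np.asarray(returns, dtype=float)
    var_t = np.empty_like(r)
    var_t[0] = r[0] ** 2                      # crude seed for the recursion
    for t in range(1, len(r)):
        var_t[t] = lam * var_t[t - 1] + (1 - lam) * r[t - 1] ** 2
    sigma = np.sqrt(var_t)                    # vol estimate for each day
    sigma_now = np.sqrt(lam * var_t[-1] + (1 - lam) * r[-1] ** 2)
    scaled = r * (sigma_now / sigma)          # returns in today's regime
    return np.quantile(-scaled, alpha)
```

In a heteroscedastic market, this keeps the empirical shape of the return distribution while adapting its scale, which is the point of the first bullet above.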

Thorsten Schmidt explains in his talk on unbiased estimation of risk measures that expected shortfall can indeed be backtested. The new notion of unbiasedness introduced is motivated from economic principles rather than from statistical reasoning, which links this concept to a better performance in backtesting. Some unbiased estimators, for example the unbiased estimator for value-at-risk in the Gaussian case, can be computed in closed form, while for many other cases numerical methods are available.
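To see what the bias means in practice, here is a small Monte Carlo experiment of mine (my reading of the Gaussian closed form: the unbiased VaR estimator replaces the normal quantile by a Student-t quantile scaled by sqrt(1+1/n)): exceedances of the plug-in VaR occur more often than the nominal level, while the unbiased estimator hits it.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n, alpha, trials = 20, 0.05, 20000
z = stats.norm.ppf(alpha)                              # plug-in quantile
t = stats.t.ppf(alpha, df=n - 1) * np.sqrt(1 + 1 / n)  # predictive quantile

exceed_plugin = exceed_unbiased = 0
for _ in range(trials):
    x = rng.standard_normal(n + 1)       # n observations + next-day P&L
    sample, x_next = x[:n], x[n]
    m, s = sample.mean(), sample.std(ddof=1)
    exceed_plugin += x_next < m + s * z      # loss beyond plug-in VaR
    exceed_unbiased += x_next < m + s * t    # loss beyond unbiased VaR

p_plugin = exceed_plugin / trials
p_unbiased = exceed_unbiased / trials
```

The key fact is that (X_{n+1} - x̄)/(s·sqrt(1+1/n)) follows a t distribution with n-1 degrees of freedom, so the unbiased threshold is exceeded with probability exactly alpha, while the plug-in version systematically underestimates the risk for small samples.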

Sebastian Schlenkrich presents a quasi-Gaussian model for model validation and pricing analysis. This allows quantifying model risk of long-term structured interest rate derivatives using a very general model, and then considering special cases. Given a formula for swaptions he can calibrate the quasi-Gaussian model to observable swaption volatility market data. He proves his concept by a callable CMS spread swap and mid-curve option case study. Interested quants may download his code from https://github.com/sschlenkrich/QuantLib/tree/master/ql/experimental/templatemodels.

Natalie Packham continues with current developments in model risk measurement. Her key statement is “In a frictionless, complete market any P&L on a perfectly hedged position is due to hedging in a misspecified model.” She provides two examples:

  • the gap risk for the down-and-out call option can be quantified;
  • energy derivatives valuation with a complex model is less informative than the valuation with a simple model.

Overall, I learned a lot from all the speakers, and I would like to thank all speakers and sponsors for helping to make this conference the key content-driven event for quants. We had participants from Europe and Asia, and many quantitative topics. Innovations in risk management and technology are key success factors for the financial industry, and it appears that mathematical finance is needed more than ever, thanks to increasing regulation and market uncertainty, but also because of an increasing awareness of market participants and executives.

Watch a summary of the conference in this video.

We hope to see you all again at our future events.

Save the next conference date: 16-17 April 2018.

Uwe Wystup, Managing Director of MathFinance


[1] R. Barker, A. Dickinson, A. Lipton, R. Virmani, “Systemic risks in CCP networks”, Risk, 2017



Careers

MathFinance Openings
Senior Quant/Consultant

We are looking for a senior quant/consultant in the areas of

Insurance

  • Actuary with 5 to 7 years of experience in insurance or re-insurance
  • Experience in quantitative Risk Management in relation to regulatory issues (Solvency II)
  • Experience in Capital Management

Banking

  • Quant with 5 to 7 years of experience in Banking, ideally in Trading
  • Experience in quantitative Risk Management in relation to regulatory issues (Basel III)
  • Experience in Capital Management

Investment

  • Quant with 5 to 7 years of experience in Asset Management (Funds, Insurance and Family Offices), ideally with emphasis on Risk Management
  • Experience in quantitative Risk Management in relation to regulatory issues (German KAGB and KARBV)

Please send us your CV to recruitments@mathfinance.com


Junior Quant

Do the following apply to you?

  • Master’s degree or diploma in (business) mathematics or physics
  • PhD or CFA is a bonus
  • First experience in mathematical finance is desirable
  • Very good programming skills, e.g. C++, Python or Matlab
  • Good language skills in German and English
  • Outstanding analytical skills and a problem-solving attitude
  • High motivation to develop your knowledge and skills
  • Good communication skills and team spirit

Then we would like to hear from you. Please send us your CV to recruitments@mathfinance.com