Friday, 13 October 2017
8.30 – 9.00 Registration
Morning session (Chair: Peter Friz)
9.00 – 9.15 Opening remarks
Conventional measures of fat-tailedness, namely (1) the tail index for power laws and (2) kurtosis for finite-moment distributions, fail to apply to some distributions, and do not allow comparisons between power laws outside the Lévy-stable basin, between power laws and distributions in other classes, or between power laws with different numbers of summands. How can one compare a sum of 100 Student t-distributed random variables with 3 degrees of freedom to one in a Lévy-stable or a lognormal class? How can one compare a sum of 100 Student t variables with 3 degrees of freedom to a single Student t with 2 degrees of freedom?
We propose a heuristic measure that allows us to compare n-summed independent variables under all distributions with a finite first moment. The method is based on the speed of convergence of the Law of Large Numbers for finite sums with n summands. We obtain either explicit expressions or simulation results and bounds for the lognormal, exponential, Pareto, and Student t distributions in their various calibrations, in addition to the general Pearson class.
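The convergence-speed idea can be sketched numerically: measure how fast the mean absolute deviation (MAD) of partial sums grows with the number of summands. The metric below is a simplified illustration with an assumed normalization, not the paper's exact definition; it is 0 for the Gaussian and positive for fatter tails.

```python
import numpy as np

rng = np.random.default_rng(0)

def mad_of_sum(sampler, n, trials=50_000):
    """Mean absolute deviation of the n-summand sum around its mean."""
    sums = sampler((trials, n)).sum(axis=1)
    return np.mean(np.abs(sums - sums.mean()))

def kappa(sampler, n=100):
    """Convergence-speed metric 2 - log(n) / log(M(n)/M(1)).

    M(n) = M(1) * sqrt(n) for the Gaussian, giving kappa = 0; fatter
    tails make M(n) grow faster than sqrt(n), giving kappa > 0."""
    return 2.0 - np.log(n) / np.log(mad_of_sum(sampler, n) / mad_of_sum(sampler, 1))

kappa_gauss = kappa(lambda size: rng.normal(size=size))
kappa_t3 = kappa(lambda size: rng.standard_t(3, size=size))
print(f"kappa Gaussian     ~ {kappa_gauss:.3f}")   # close to 0
print(f"kappa Student t(3) ~ {kappa_t3:.3f}")      # clearly positive
```

With this normalization, a sum of Student t(3) variables visibly converges more slowly (per summand) than a Gaussian sum, which is the comparison the abstract asks for.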
This talk first gives an overview of dynamic stochastic optimization techniques, followed by a discussion of yield curve scenario generation in the current low-rate environment. The latter employs a new, accurate approximation for the Black (1995) correction to a Gaussian 3-factor affine short-rate model. The talk goes on to briefly illustrate the two main applications of this technology – pricing complex financial products and dynamic risk-controlled asset-liability management. The presented examples arise from actual practical applications to variable annuity insurance products with guarantees.
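The Black (1995) correction can be sketched in a few lines: simulate a Gaussian (shadow) short rate and floor the observed nominal rate at zero. The snippet below uses a one-factor Vasicek stand-in for the talk's 3-factor model, with purely illustrative parameters.

```python
import numpy as np

rng = np.random.default_rng(1)

# One-factor Gaussian (Vasicek) stand-in for the 3-factor affine model;
# parameters are illustrative, not calibrated to any market.
kappa, theta, sigma, r0 = 0.3, 0.01, 0.012, -0.005
T, steps, paths = 10.0, 520, 10_000
dt = T / steps

shadow = np.full(paths, r0)
floored = np.empty((steps, paths))
for t in range(steps):
    shadow += kappa * (theta - shadow) * dt \
              + sigma * np.sqrt(dt) * rng.standard_normal(paths)
    # Black (1995): the observed nominal rate is the shadow rate floored at
    # zero, because holding cash offers a zero-rate alternative.
    floored[t] = np.maximum(shadow, 0.0)

print("terminal mean shadow rate :", shadow.mean())
print("terminal mean floored rate:", floored[-1].mean())
```

The floored rate is never negative and its mean always sits above the shadow rate's, which is precisely the asymmetry the correction is meant to capture in low-rate environments.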

10.45 – 11.15 COFFEE BREAK
11.15 – 12.00 Sasha Stoikov: The microprice [Slides]
I define the microprice to be the limit of a sequence of expected mid-prices and provide conditions for this limit to exist. The microprice is a martingale by construction and can be considered to be the ‘fair’ price of an asset, conditional on the information in the order book. The microprice may be expressed as an adjustment to the mid-price that takes into account the bid-ask spread and the imbalance. The microprice can be estimated using high-frequency data. I show empirically that it is a better predictor of short-term prices than the mid-price or the weighted mid-price.
The paper is available here.
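The building blocks of the estimator are simple top-of-book features; a minimal sketch follows. The microprice itself iterates conditional expectations of the mid given imbalance and spread, with the size-weighted mid as a first-order version.

```python
def book_features(bid, ask, bid_size, ask_size):
    """Top-of-book features used by microprice-style estimators."""
    mid = (bid + ask) / 2.0
    spread = ask - bid
    imbalance = bid_size / (bid_size + ask_size)
    # size-weighted mid: leans toward the ask when bid volume dominates
    weighted_mid = (ask * bid_size + bid * ask_size) / (bid_size + ask_size)
    return mid, spread, imbalance, weighted_mid

mid, spread, imb, wmid = book_features(99.0, 101.0, bid_size=80.0, ask_size=20.0)
print(mid, spread, imb, wmid)   # 100.0 2.0 0.8 100.6
```

With 80% of resting volume on the bid, the weighted mid sits above the mid, anticipating upward price pressure; the microprice refines this adjustment from data.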

I present a set of tools to construct the implied volatility surface and many of its by-products: variance swap term structure, forward variances, local volatilities, risk-neutral densities and, when applicable, earnings estimates and election predictions. I also consider the case when no options are available, making use either of the surface of a related index or of the price time series.

12.45 – 2.30 LUNCH BREAK
Afternoon session (Chair: Dylan Possamai)
2.30 – 3.15 Marco Avellaneda: Statistics of VIX Futures and Applications to Trading Volatility ETFs [Slides]
We study the dynamics of VIX futures and ETFs. We find that VIX and VIX futures exhibit high volatility and skewness compared to other commodities, consistent with the absence of cash-and-carry arbitrage for VIX. The VIX futures term structure can be modeled as a stationary stochastic process in which the most likely state is a contango with VIX ~ 12% and a long-term futures price ~ 20%. We analyse the behavior of ETFs and ETNs based on constant-maturity rolling futures strategies, such as VXX, XIV and VXZ, both non-parametrically and through a multifactor model calibrated to historical data.
We find that buy-and-hold strategies consisting of shorting ETNs which roll long futures, or buying ETNs which roll short futures, produce sure profits in the long run, assuming that constant-maturity futures (CMFs) are stationary and ergodic. To quantify this, we estimate a 2-factor lognormal model with mean-reverting factors using VIX and CMF historical data. The results confirm the profitability of ETN buy-and-hold strategies, but also indicate that these strategies have relatively modest Sharpe ratios, of the order of 0.3 to 0.6, and that they are volatile over finite-time windows (e.g. 1 year). This is due to the sporadic surges in VIX and CMF backwardations which are inevitable in volatility markets.
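A stylized simulation of a 2-factor lognormal mean-reverting futures model gives a feel for the stationarity argument; all parameters below are illustrative, not the paper's estimates.

```python
import numpy as np

rng = np.random.default_rng(2)

# Stylized 2-factor lognormal model for a constant-maturity future (CMF):
# log F_t = m + x1_t + x2_t with fast and slow mean-reverting OU factors.
m = np.log(20.0)                   # long-run CMF level around 20 (vol points)
k1, k2 = 12.0, 1.5                 # mean-reversion speeds (per year)
s1, s2 = 1.5, 0.6                  # factor volatilities
dt, steps = 1.0 / 252, 252 * 20    # twenty years of daily steps

x1 = x2 = 0.0
F = np.empty(steps)
for t in range(steps):
    x1 += -k1 * x1 * dt + s1 * np.sqrt(dt) * rng.standard_normal()
    x2 += -k2 * x2 * dt + s2 * np.sqrt(dt) * rng.standard_normal()
    F[t] = np.exp(m + x1 + x2)

# Stationarity keeps the simulated CMF fluctuating around its long-run level,
# which is what makes systematic roll-down strategies profitable on average.
print("median simulated CMF:", np.median(F))
```

Occasional large excursions of the factors produce the VIX spikes and backwardations that make the realized Sharpe ratios of such strategies modest over short windows.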

In this talk, we show a bridge representation for the joint probability density of the lognormal fractional SABR model and a small-time expansion of the density. An approximation of implied volatility is readily obtained by applying the Laplace asymptotic formula to the call or put prices and comparing coefficients. As an application, we price a volatility-linked derivative, the target volatility option (TVO), in the fractional SABR model. TVOs are a type of derivative instrument that explicitly depends on the evolution of an underlying asset as well as its realized volatility. We present small volatility-of-volatility approximations of the TVO price by resorting to a chaos expansion as well as the decomposition formula.

4.00 – 4.15 COFFEE BREAK
4.15 – 5.00 Petter Kolm: Multiperiod Portfolio Selection and Bayesian Dynamic Models [Slides]
We describe a novel approach to the study of multiperiod portfolio selection problems with time-varying alphas, trading costs, and constraints. We show that, to each multiperiod portfolio optimization problem, one may associate a “dual” Bayesian dynamic model. The dual model is constructed so that the most likely sequence of hidden states is the trading path which optimizes the expected utility of the portfolio. The existence of such a model has numerous implications, both theoretical and computational. Sophisticated computational tools developed for Bayesian state estimation can be brought to bear on the problem, and the intuitive theoretical structure attained by recasting the problem as a hidden state estimation problem allows for easy generalization to other problems in finance. Time permitting, we discuss several applications of this approach.
This is joint work with Gordon Ritter.
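The correspondence can be illustrated with a toy example: a Viterbi-style dynamic program over a discretised position grid, where per-period utility plays the role of log-emission probabilities and the trading-cost penalty plays the role of log-transition probabilities. The alphas and cost parameters below are hypothetical, and this is only a sketch of the duality idea, not the paper's construction.

```python
import numpy as np

# Hypothetical per-period alphas and cost parameters, purely illustrative.
alphas = np.array([0.5, 0.3, -0.2, -0.4, 0.1])   # predicted returns
positions = np.linspace(-1.0, 1.0, 21)           # discretised position grid
risk, tcost = 0.25, 0.05                         # risk aversion, linear cost

T, S = len(alphas), len(positions)
value = np.zeros((T, S))
back = np.zeros((T, S), dtype=int)

def reward(t, x):
    # per-period utility: plays the role of a log-emission probability
    return alphas[t] * x - risk * x**2

value[0] = reward(0, positions) - tcost * np.abs(positions)   # start flat
for t in range(1, T):
    for j, x in enumerate(positions):
        # trading-cost penalty plays the role of a log-transition probability
        cand = value[t - 1] - tcost * np.abs(x - positions)
        back[t, j] = int(np.argmax(cand))
        value[t, j] = cand[back[t, j]] + reward(t, x)

# Backtracking the MAP hidden-state sequence recovers the optimal trade path.
idx = [int(np.argmax(value[-1]))]
for t in range(T - 1, 0, -1):
    idx.append(int(back[t, idx[-1]]))
path = positions[idx[::-1]]
print(path)   # long while alpha is positive, short when it turns negative
```

The backward recursion here is exactly the Viterbi algorithm for a hidden Markov model, which is the computational payoff of the duality the talk describes.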

We introduce and establish the main properties of QHawkes (“Quadratic” Hawkes) models, which allow for all feedback effects in the jump intensity that are linear and quadratic in past returns. Our model exhibits two properties that we believe are crucial for modelling and understanding the volatility process: first, the model is time-reversal asymmetric, like financial markets, whose time evolution has a preferred direction; second, it generates a multiplicative, fat-tailed volatility process, which we characterize in detail in the case of exponentially decaying kernels. Several other interesting properties of QHawkes processes are discussed, in particular the fact that they can generate long memory without necessarily being at the critical point. A non-parametric fit of the QHawkes model on NYSE stock data shows that the off-diagonal component of the quadratic kernel indeed has a structure that standard Hawkes models fail to reproduce. We provide numerical simulations of our calibrated QHawkes model, which is indeed seen to reproduce, with only a small amount of quadratic non-linearity, the correct magnitude of fat tails and time-reversal asymmetry seen in empirical time series.
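A discrete-time sketch of the quadratic-feedback mechanism, using a ZHawkes-type special case with exponential kernels (base intensity plus a sign-blind Hawkes term plus the square of an exponentially weighted signed trend). Parameters are illustrative, not the paper's NYSE calibration.

```python
import numpy as np

rng = np.random.default_rng(3)

# Discrete-time sketch of a ZHawkes-type special case of the QHawkes model:
# intensity = base + linear Hawkes feedback + (exponentially weighted trend)^2.
dt, steps = 0.01, 100_000
lam_inf, nh, nz = 0.5, 0.3, 0.4      # base intensity, feedback norms
wh, wz = 5.0, 2.0                    # exponential kernel decay rates

h = z = 0.0                          # Hawkes state and trend state
events = np.zeros(steps, dtype=bool)
lam_path = np.empty(steps)
for t in range(steps):
    lam = lam_inf + h + z * z        # quadratic feedback on the signed trend
    lam_path[t] = lam
    if rng.random() < lam * dt:      # an event (price jump) occurs
        events[t] = True
        sign = 1.0 if rng.random() < 0.5 else -1.0
        h += nh * wh                 # linear self-excitation (sign-blind)
        z += sign * np.sqrt(nz * wz) # trend feedback: trends raise volatility
    h -= wh * h * dt                 # exponential kernel decay
    z -= wz * z * dt

print("realised event rate:", events.mean() / dt, "vs base rate:", lam_inf)
```

Because the intensity reacts to the square of a causal trend, reversing a simulated path changes its statistics, which is the time-reversal asymmetry the abstract emphasizes.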

Saturday, 14 October 2017
Morning session (Chair: Stefano De Marco)
9.00 – 9.45 Julien Guyon: On the Joint Calibration of SPX and VIX Options [Slides]
Back in 2007, Jim Gatheral was one of the first to investigate the joint calibration of models to both SPX options and VIX options. Jim showed that, at the time, a diffusive model (the double mean-reverting model) could approximately match both markets. Later, others have argued that jumps in the SPX were needed to jointly calibrate both markets. I revisit this problem, trying to answer the following questions: Does there exist a diffusive model on the SPX that jointly calibrates to SPX options, VIX futures, and VIX options? If so, how can one build such a model? In the limiting case of an instantaneous VIX, the answer is clear and involves convex stochastic ordering and a novel application of martingale transport to finance. The real case of a 30-day VIX is more involved.

There is evidence of heterogeneous driving factors underlying the observable outcome of univariate economic processes. We introduce an Extended Wold Decomposition that decomposes a weakly stationary time series as the sum, across time scales, of uncorrelated components associated with different degrees of persistence. Our representation allows us to define multiscale impulse response functions and a multiscale variance decomposition. Using these tools, we revisit the Heterogeneous Market Hypothesis for the volatility dynamics, and quantify the role of investors with different time horizons. Analysis of the multiscale response functions leads the way to a "structural" interpretation of the rough volatility modeling approach.
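A simple filter-bank analogue of the multiscale idea (not the Extended Wold Decomposition itself, which operates on the Wold innovations): split a series into detail components at dyadic scales, built from differences of trailing moving averages, that sum back exactly to the original.

```python
import numpy as np

rng = np.random.default_rng(4)
x = np.cumsum(rng.standard_normal(512))   # toy persistent series

def multiscale_components(x, J=4):
    """Split x into detail components at scales 2^1..2^J plus a smooth
    remainder, via differences of trailing moving averages (Haar-style)."""
    def trailing_mean(x, w):
        # average of the last w observations (reflect-pad the start)
        xp = np.concatenate([x[:w - 1][::-1], x]) if w > 1 else x
        c = np.cumsum(np.concatenate([[0.0], xp]))
        return (c[w:] - c[:-w]) / w
    details, prev = [], x
    for j in range(1, J + 1):
        cur = trailing_mean(x, 2 ** j)
        details.append(prev - cur)   # fluctuations living at scale ~2^j
        prev = cur
    return details, prev

details, smooth = multiscale_components(x)
recon = smooth + sum(details)
print(np.max(np.abs(recon - x)))     # telescoping sum: numerically zero
```

Each detail component isolates fluctuations with a characteristic persistence, which is the decomposition on which the multiscale impulse responses and variance decomposition are built.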

10.30 – 11.00 COFFEE BREAK
11.00 – 11.45 Roel Oomen: Internalisation by electronic FX spot dealers [Slides]
Dealers in over-the-counter financial markets provide liquidity to customers on a principal basis and manage the risk position that arises out of this activity in one of two ways. They may internalise a customer's trade by warehousing the risk in anticipation of future offsetting flow, or they can externalise the trade by hedging it out in the open market. It is often argued that internalisation underlies much of the liquidity provision in the currency markets, particularly in the electronic spot segment, and that it can deliver significant benefits in terms of depth and consistency of liquidity, reduced spreads, and a diminished market footprint. However, for many market participants the internalisation process can be somewhat opaque, data on it is scarcely available, and even the largest and most sophisticated customers in the market often do not appreciate or measure the impact that internalisation has on their execution costs and liquidity access. This paper formulates a simple model of internalisation and uses queuing theory to provide important insights into its mechanics and properties. We derive closed-form expressions for the internalisation horizon and demonstrate, using data from the Bank for International Settlements' triennial FX survey, that a representative tier 1 dealer takes on average several minutes to complete the internalisation of a customer's trade in the most liquid currencies, increasing to tens of minutes for emerging markets. Next, we analyse the costs of internalisation and show that they are lower for dealers that are willing to hold more risk and for those that face more price-sensitive traders. The key message is that a customer's transaction costs and liquidity access are determined both by their own trading decisions as well as the dealer's risk management approach.
A customer should not only identify the externalisers but also distinguish between passive and aggressive internalisers, and select those that provide liquidity compatible with their execution objectives. Joint work with Max Butz.
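A toy queuing sketch of the internalisation horizon (illustrative numbers, not the paper's model): offsetting customer flow arrives as a Poisson process, and the horizon is the time needed to accumulate enough offsetting volume.

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy internalisation queue: after absorbing a customer trade of size Q,
# the dealer waits for offsetting flow arriving as a Poisson process
# (rate lam per unit time, exponential sizes with mean s).
Q, lam, s, sims = 50.0, 0.8, 2.0, 20_000

horizons = np.empty(sims)
for i in range(sims):
    filled, t = 0.0, 0.0
    while filled < Q:
        t += rng.exponential(1.0 / lam)   # wait for the next offsetting trade
        filled += rng.exponential(s)      # its random size
    horizons[i] = t

# By Wald's identity the mean horizon is about (Q/s + 1) / lam here.
print("mean internalisation horizon:", horizons.mean())
```

Even in this crude setting the horizon scales as trade size over the offsetting flow rate, which is why internalisation takes minutes in liquid currencies and much longer in emerging-market pairs.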

Motivated by price impact models and Cournot games with a large number of players, we introduce extended mean field games in a general framework, and review the few instances where they have been solved. The goal is to emphasize the ad hoc nature of the existing solutions, and to discuss some of the obstacles to a general solution approach.

12.30 – 2.00 LUNCH BREAK
Afternoon session (Chair: Blanka Horvath)
2.00 – 2.45 Masaaki Fukasawa: Hedging and calibration for lognormal rough volatility models [Slides]
We discuss miscellaneous aspects of the celebrated lognormal rough volatility models, putting emphasis on the forward variance curve.

One of Jim Gatheral’s most recent contributions to volatility modelling is the notion of rough volatility. Rough volatility models produce very realistic realised volatility paths and also reproduce the stylised facts of volatility surfaces accurately, but they are still challenging to calibrate due to the lack of analytical pricing formulae. While efficient lowbias Monte Carlo discretisations of rough volatility models have been proposed, their computational cost in Monte Carlo pricing is still too large for practical applications. In this talk, we concoct a cocktail of variance reductions (celebrating Jim’s birthday!) that promises to reduce the computational cost considerably, hopefully paving the way for practical adoption of these models. Joint work with Ryan McCrickerd.
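As a generic illustration of one variance-reduction ingredient, here is antithetic sampling on a plain Black-Scholes call with zero rates; the talk's estimators target rough Bergomi specifically and are considerably more elaborate. Parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(6)

# Antithetic variates on a Black-Scholes call (zero rates), as a toy
# stand-in for the rough Bergomi variance reductions discussed in the talk.
S0, K, sigma, T, n = 100.0, 105.0, 0.2, 1.0, 100_000

def payoff(z):
    ST = S0 * np.exp(-0.5 * sigma**2 * T + sigma * np.sqrt(T) * z)
    return np.maximum(ST - K, 0.0)

Z = rng.standard_normal(n)
plain = payoff(Z)
anti = 0.5 * (payoff(Z) + payoff(-Z))   # pair each draw with its mirror image

# The call payoff is monotone in Z, so the paired terms are negatively
# correlated and the per-sample variance drops (each antithetic sample
# costs two payoff evaluations, so compare per pair).
print("plain     :", plain.mean(), "std", plain.std())
print("antithetic:", anti.mean(), "std", anti.std())
```

In rough volatility models the same logic applies path by path, and can be combined with control variates built from the conditional Black-Scholes price given the variance path.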

3.30 – 4.00 COFFEE BREAK
4.00 – 4.45 Mathieu Rosenbaum: Rough volatility, market impact and no-arbitrage [Slides]

Prompted by recent works of Jim Gatheral in the context of stochastic volatility models, there is renewed interest in stochastic processes driven by fractional Brownian motion, in particular in the rough regime (Hurst index H < 1/2, often even H < 1/4). A general theoretical framework for such processes is, however, currently not available, so works in the recent literature either rely on classical Ito calculus (where made possible by the structure of the model, as, for instance, in the original rough fractional stochastic volatility model or the rough Bergomi model) or on specialized constructions (for instance, in the rough Heston model). Hairer's regularity structures allow us to provide a unified treatment of these problems, at least in principle. We explore this approach in the context of a simple rough Bergomi model and a rough Heston model, and show how one can derive Wong-Zakai-type approximation theorems as well as general large deviation principles. More specifically, we show in which sense the stock price and instantaneous variance processes can be approximated by mollifying the underlying Brownian motions, and we provide an LDP for the abstract regularity structure itself. (Joint work with P. Friz, P. Gassiat, J. Martin, B. Stemper.)

Rough stochastic volatility models have attracted a lot of attention recently, in particular for the linear option pricing problem. In this talk, starting with power utilities, we propose to use a martingale distortion representation of the optimal value function for the nonlinear asset allocation problem in a (non-Markovian) fractional stochastic environment (for any Hurst index H in (0, 1)). We rigorously establish a first-order approximation of the optimal value when the return and volatility of the underlying asset are functions of a stationary, slowly varying fractional Ornstein-Uhlenbeck process. We prove that this approximation can also be generated by the zeroth-order trading strategy, providing an explicit strategy which is asymptotically optimal among all admissible controls. Furthermore, we extend the discussion to general utility functions, and obtain the asymptotic optimality of this strategy within a specific family of admissible strategies. If time permits, we will also discuss the problem in a fast mean-reverting fractional stochastic environment. Joint work with Ruimeng Hu (UCSB).
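A sketch of the fractional stochastic environment: fractional Gaussian noise generated exactly by Cholesky factorisation of its covariance, driving an Euler scheme for a fractional Ornstein-Uhlenbeck process. The Hurst index and dynamics parameters below are illustrative, not those of the talk.

```python
import numpy as np

rng = np.random.default_rng(7)

# Fractional Gaussian noise (fGn) via exact Cholesky factorisation of its
# covariance, driving an Euler scheme for a fractional Ornstein-Uhlenbeck
# (fOU) environment. Parameters are illustrative.
H, a, nu = 0.8, 0.5, 0.3       # Hurst index, mean-reversion speed, vol
n, dt = 1000, 0.01

k = np.arange(n)
# autocovariance of unit-spacing fGn at lag k
gamma = 0.5 * ((k + 1.0)**(2*H) - 2.0 * k**(2*H) + np.abs(k - 1.0)**(2*H))
cov = gamma[np.abs(k[:, None] - k[None, :])]
L = np.linalg.cholesky(cov + 1e-12 * np.eye(n))
fgn = (L @ rng.standard_normal(n)) * dt**H    # self-similar scaling to step dt

y = np.zeros(n)                               # slowly varying fOU environment
for t in range(1, n):
    y[t] = y[t-1] - a * y[t-1] * dt + nu * fgn[t]

lag1 = np.corrcoef(fgn[:-1], fgn[1:])[0, 1]
print("fGn lag-1 autocorrelation:", lag1)     # positive for H > 1/2
```

For H > 1/2 the increments are positively correlated (a persistent environment); taking H < 1/2 in the same construction gives the rough, anti-persistent regime.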

Sunday, 15 October 2017
Morning session (Chair: Jan Obloj)
9.00 – 9.45 Peter Carr: Semiparametric Pricing and Hedging of Volatility Derivatives on Jump Diffusions [Slides]
Joint with Roger Lee and Matt Lorig.

In this work, motivated by the approximation of Martingale Optimal Transport problems, we are interested in sampling methods preserving the convex order for two probability measures $\mu$ and $\nu$ on $\mathbb{R}^d$, with $\nu$ dominating $\mu$. When $(X_i)_{1\le i\le I}$ (resp. $(Y_j)_{1\le j\le J}$) are independent and identically distributed according to $\mu$ (resp. $\nu$), in general $\mu_I=\frac{1}{I}\sum_{i=1}^I\delta_{X_i}$ and $\nu_J=\frac{1}{J}\sum_{j=1}^J\delta_{Y_j}$ are not rankable for the convex order. We investigate modifications of $\mu_I$ (resp. $\nu_J$) smaller than $\nu_J$ (resp. greater than $\mu_I$) in the convex order and weakly converging to $\mu$ (resp. $\nu$) as $I,J\to\infty$. We first consider the one-dimensional case $d=1$, where, according to Kertz and Rösler (1992), the set of probability measures with a finite first-order moment is a lattice for the increasing and the decreasing convex orders. Given $\mu$ and $\nu$ in this set, we define $\mu\vee\nu$ (resp. $\mu\wedge\nu$) as the supremum (resp. infimum) of $\mu$ and $\nu$ for the decreasing convex order when $\int_\mathbb{R} x\mu(dx)\le\int_\mathbb{R} x\nu(dx)$, and for the increasing convex order otherwise. This way, $\mu\vee\nu$ (resp. $\mu\wedge\nu$) is greater than $\mu$ (resp. smaller than $\nu$) in the convex order. We give efficient algorithms for computing $\mu\vee\nu$ and $\mu\wedge\nu$ (and therefore $\mu_I\vee\nu_J$ and $\mu_I\wedge \nu_J$) when $\mu$ and $\nu$ are convex combinations of Dirac masses. In general dimension, when $\mu$ and $\nu$ have finite moments of order $\rho\ge 1$, we define the projection $\mu\curlywedge_\rho \nu$ (resp. $\mu\curlyvee_\rho\nu$) of $\mu$ (resp. $\nu$) on the set of probability measures dominated by $\nu$ (resp. larger than $\mu$) in the convex order, for the Wasserstein distance with index $\rho$.
When $\rho=2$, $\mu_I\curlywedge_2 \nu_J$ can be computed efficiently by solving a quadratic optimization problem with linear constraints. It turns out that, in dimension $d=1$, the projections do not depend on $\rho$ and their quantile functions are explicit in terms of those of $\mu$ and $\nu$, which leads to efficient algorithms for convex combinations of Dirac masses. Lastly, we illustrate by numerical experiments the resulting convex-order-preserving sampling methods and their application to the approximation of Martingale Optimal Transport problems.
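In dimension one, the convex-order comparison underlying all of this reduces to comparing call transforms $C_\mu(k)=\int(x-k)^+\mu(dx)$, which for discrete measures are piecewise linear and (given equal means) need only be checked at the atoms. A minimal sketch:

```python
import numpy as np

def call_transform(atoms, weights, k):
    """C_mu(k) = integral of (x - k)^+ against a discrete measure mu."""
    return float(np.sum(weights * np.maximum(atoms - k, 0.0)))

def convex_order_leq(x, wx, y, wy, tol=1e-12):
    """Test mu <=_cx nu for discrete measures on R: equal means and
    C_mu <= C_nu everywhere. Both call transforms are piecewise linear
    with kinks only at the atoms, so checking at the atoms suffices."""
    if abs(np.dot(x, wx) - np.dot(y, wy)) > tol:
        return False
    kinks = np.concatenate([x, y])
    return all(call_transform(x, wx, k) <= call_transform(y, wy, k) + tol
               for k in kinks)

# A mean-preserving spread dominates in the convex order...
print(convex_order_leq(np.array([0.0]), np.array([1.0]),
                       np.array([-1.0, 1.0]), np.array([0.5, 0.5])))  # True
# ...but two independent empirical measures are typically not rankable,
# since their sample means almost surely differ:
rng = np.random.default_rng(8)
a, b = rng.normal(size=20), rng.normal(size=40)
print(convex_order_leq(a, np.full(20, 1 / 20), b, np.full(40, 1 / 40)))
```

The second check is exactly the failure mode of raw empirical measures that motivates the paper's modified samplers.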

10.30 – 11.00 COFFEE BREAK
11.00 – 11.45 Johannes Muhle-Karbe: Intermediation, Liquidity, and Asset Prices
We develop a dynamic equilibrium model for market liquidity. To wit, we solve for the equilibrium prices at which liquidity takers' demands are absorbed by liquidity providers, who can in turn gradually transfer these positions to a group of end users.
(Joint work in progress with Peter Bank and Ibrahim Ekren)

Using the Fukasawa approach to the Chriss-Morokoff-Gatheral formula, we obtain a closed formula for the VIX in the SSVI parameterization of Gatheral and Jacquier. This allows us to reformulate the SSVI volatility surface with the forward variance curve as a parameter instead of the ATM total implied variance, which is useful in practice in the context of Bergomi and rough Bergomi models.
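For reference, the SSVI total implied variance slice of Gatheral and Jacquier, with the common power-law choice for $\varphi$, can be coded in a few lines; the parameter values below are illustrative.

```python
import numpy as np

def ssvi_total_variance(k, theta, rho=-0.7, eta=1.0, gam=0.5):
    """SSVI total implied variance w(k, theta) with the power-law
    phi(theta) = eta / theta**gam; parameter values are illustrative."""
    phi = eta / theta**gam
    return 0.5 * theta * (1.0 + rho * phi * k
                          + np.sqrt((phi * k + rho)**2 + 1.0 - rho**2))

theta = 0.04                        # ATM total implied variance of the slice
k = np.linspace(-0.5, 0.5, 11)      # log-moneyness grid
w = ssvi_total_variance(k, theta)
print(w)
# At the money the slice returns the parameter itself, w(0, theta) = theta,
# and a negative rho produces the familiar negative equity skew.
print(ssvi_total_variance(0.0, theta))
```

Since each slice is pinned down by the single number $\theta_t$, replacing the ATM total variance by the forward variance curve, as the abstract describes, re-parameterizes the whole surface in the natural coordinates of Bergomi-type models.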

Inspired by the work of Gatheral and Oomen (2010), we evaluate a comprehensive set of high-frequency volatility estimators on real market data, for a wide range of futures products. We compare the realised variance produced by the estimators with actual price changes across time intervals ranging from ten minutes to several hours. We demonstrate that several estimators are quite successful at reducing price changes to unit normal variates. (Joint work with Baruch student Linwei Shang.)
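A toy version of the exercise: on a simulated noisy price path, tick-by-tick realised variance is badly inflated by microstructure noise, while sparse subsampling recovers the right magnitude. All parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(9)

# Simulated trading day: constant-vol efficient price observed once a
# second, contaminated by i.i.d. microstructure noise.
sigma, n, noise_sd = 0.2, 23_400, 5e-4     # annual vol, ticks in 6.5h, noise
dt = 1.0 / (252 * n)                       # year-fraction per tick
true_iv = sigma**2 * n * dt                # the day's integrated variance

p = np.cumsum(sigma * np.sqrt(dt) * rng.standard_normal(n))
obs = p + noise_sd * rng.standard_normal(n)

def realised_variance(x, every=1):
    r = np.diff(x[::every])
    return float(np.sum(r * r))

print("true IV       :", true_iv)
print("RV every tick :", realised_variance(obs))            # noise-dominated
print("RV every 5 min:", realised_variance(obs, every=300)) # right magnitude
```

Dividing observed price changes by the square root of an estimated variance over the same window is what "reducing price changes to unit normal variates" amounts to, so an estimator's quality shows up directly in how close those standardized changes are to N(0, 1).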
