
STAFF
Jeroen Lamb
Martin Rasmussen
Dmitry Turaev
Sebastian van Strien
HONORARY STAFF
Tiago Pereira
POSTDOCS
Boumediene Hamzi
PHD STUDENTS
George Chappelle
Matheus de Castro
Federico Graceffa
Victoria Klein
Matteo Tabaro
Wei Hao Tey
Kalle Timperi
VISITORS
Ole Peters
Cristina Sargent
Bill Speares
RELATED STAFF
Mauricio Barahona
Davoud Cheraghi
Martin Hairer
Darryl Holm
Xue-Mei Li
Greg Pavliotis
Kevin Webster

DynamIC Seminars (Complete List)

Speaker: John Harlim (Pennsylvania State University)
Title: Machine learning of missing dynamical systems
Date: Tuesday, 18 May 2021, 14:00
Abstract: In the talk, I will discuss a general closure framework to compensate for the model error arising from missing dynamical systems. The proposed framework reformulates the model error problem into a supervised learning task to estimate a very high-dimensional closure model, deduced from the Mori-Zwanzig representation of a projected dynamical system with a projection operator chosen based on Takens embedding theory. Besides theoretical convergence, this connection provides a systematic framework for closure modeling using available machine learning algorithms. I will demonstrate numerical results using a kernel-based linear estimator as well as neural-network-based nonlinear estimators. If time permits, I will also discuss error bounds and mathematical conditions that allow the estimated model to reproduce the underlying stationary statistics, such as one-point statistical moments and auto-correlation functions, in the context of learning Ito diffusions.
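
As a rough illustration of the kernel-based flavour of such closure modeling (a minimal sketch, not the speaker's Mori-Zwanzig construction; the delay embedding, Gaussian kernel and synthetic data below are illustrative assumptions):

import numpy as np

def delay_embed(x, k):
    # Takens-style delay embedding: row j collects (x[j+k-1], ..., x[j])
    return np.column_stack([x[k - 1 - i : len(x) - i] for i in range(k)])

def gaussian_kernel(A, B, ell):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * ell ** 2))

# Synthetic stand-in: an observed scalar series and a "closure" target,
# i.e. the tendency that a truncated model fails to produce.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 40.0, 1001)
x = np.sin(t) + 0.05 * rng.standard_normal(t.size)
closure_target = 0.5 * np.cos(2.0 * t)      # stands in for the missing dynamics
k = 5                                        # embedding dimension
Z = delay_embed(x, k)                        # features built from observed history
y = closure_target[k - 1:]                   # targets aligned with embedding rows

# Kernel ridge regression: estimated closure(z) = K(z, Z) @ alpha.
ell, lam = 1.0, 1e-3
K = gaussian_kernel(Z, Z, ell)
alpha = np.linalg.solve(K + lam * np.eye(len(Z)), y)
pred = gaussian_kernel(Z[-1:], Z, ell) @ alpha   # closure estimate at latest state
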
Speaker: Manjunath Gandhi
Title: Universal set of Observables for Koopman's Operator through Causal Embedding
Date: Tuesday, 25 May 2021, 13:00
Abstract: The talk is about finding models from data, assuming the temporal data was generated by a dynamical system. Data-driven methods employing the Koopman operator, and algorithms in reservoir computing, transform data into another phase space for model construction. They have shown great promise in forecasting some chaotic dynamical systems. Finding the right observables for the Koopman operator in data-driven approaches is an open problem: existing methods like SINDy determine a set of observables only after assuming that the vector field to be learned lies in the span of a set of predetermined functions. Igor Mezić, in a Fields symposium talk last year, mentioned the ambitious idea of finite faithful representations: to obtain observables that determine a topologically conjugate system, as in Takens delay embedding. Echo state networks (ESNs), employing the reservoir computing paradigm, transform the temporal data into a nonautonomous attractor in another space, but there is no guarantee of the existence of a learnable map. We solve both problems through the idea of causal embedding, a notion of embedding temporal data into another space, and consider its implementation through recurrent conjugate networks (RCNs), an adaptation of ESNs. RCNs render a learnable map that is topologically conjugate to the system generating the data (as in Takens delay embedding) and also determine observables forming a finite faithful representation for the Koopman operator. RCNs give exceptional long-term consistency in numerical forecasting experiments as well. This is joint work with Adriaan de Clercq.
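
For contrast with the RCN construction above, here is a minimal plain echo state network in the reservoir computing paradigm (an illustrative baseline, not the speaker's recurrent conjugate network; reservoir size, spectral radius and the teacher signal are assumptions):

import numpy as np

rng = np.random.default_rng(1)
n_res = 300
W_in = rng.uniform(-0.5, 0.5, (n_res, 1))
W = rng.standard_normal((n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius < 1 heuristic

def run_reservoir(u):
    # Drive the reservoir with the scalar input sequence u; collect states.
    states, r = np.zeros((len(u), n_res)), np.zeros(n_res)
    for t, ut in enumerate(u):
        r = np.tanh(W @ r + W_in[:, 0] * ut)
        states[t] = r
    return states

# Teacher task: one-step-ahead prediction of a scalar time series.
u = np.sin(np.linspace(0.0, 60.0, 3000)) ** 3
X, y = run_reservoir(u[:-1]), u[1:]
washout, lam = 200, 1e-6                          # discard transient states
A, b = X[washout:], y[washout:]
W_out = np.linalg.solve(A.T @ A + lam * np.eye(n_res), A.T @ b)  # linear readout

# Autonomous forecasting: feed the readout's prediction back as input.
r, x_prev, preds = X[-1], u[-1], []
for _ in range(200):
    r = np.tanh(W @ r + W_in[:, 0] * x_prev)
    x_prev = r @ W_out
    preds.append(x_prev)
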
Speaker: Qianxiao Li
Title: Approximation theory for machine learning and dynamical systems
Date: Tuesday, 1 June 2021, 12:00
Abstract: In this talk, I will discuss some recent approximation results at the intersection of dynamical systems and machine learning. In the first direction, I will present some universal approximation theorems for continuous-time idealizations of deep residual networks. In the second, I will focus on the approximation of dynamical relationships by recurrent and convolutional structures for time series, whose analysis reveals some interesting connections between approximation, memory and sparsity.
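
The "continuous-time idealization" can be read as follows: a residual block is one forward-Euler step of an ODE, so depth plays the role of integration time (a schematic sketch under that reading; widths, step size and weights below are arbitrary):

import numpy as np

def residual_forward(h, weights, dt=0.1):
    # Each residual block h <- h + dt * f(h) is one Euler step of dh/dt = f(h).
    for Wk, bk in weights:
        h = h + dt * np.tanh(h @ Wk + bk)
    return h

rng = np.random.default_rng(2)
d, depth = 8, 20
weights = [(0.1 * rng.standard_normal((d, d)), np.zeros(d)) for _ in range(depth)]
h0 = rng.standard_normal((4, d))        # a batch of 4 inputs
hT = residual_forward(h0, weights)      # depth*dt plays the role of final time T
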
Speaker: Soon Hoe Lim
Title: Noisy Recurrent Neural Networks
Date: Tuesday, 1 June 2021, 13:00
Abstract: We provide a general framework for studying recurrent neural networks (RNNs) trained by injecting noise into hidden states. Specifically, we consider RNNs that can be viewed as discretizations of stochastic differential equations driven by input data. This framework allows us to study the implicit regularization effect of general noise injection schemes by deriving an approximate explicit regularizer in the small-noise regime. We find that, under reasonable assumptions, this implicit regularization promotes flatter minima; it biases towards models with more stable dynamics; and, in classification tasks, it favors models with a larger classification margin. Sufficient conditions for global stability are obtained, highlighting the phenomenon of stochastic stabilization, where noise injection can improve stability during training. Our theory is supported by empirical results which demonstrate improved robustness with respect to various input perturbations, while maintaining state-of-the-art performance.
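
A minimal sketch of the noise-injection viewpoint: one hidden-state update as an Euler-Maruyama step of an input-driven SDE (the drift f(h, x) = -h + tanh(Wh + Ux + b) and all sizes are illustrative assumptions, not the paper's architecture):

import numpy as np

def noisy_rnn_step(h, x, W, U, b, dt=0.1, sigma=0.05, rng=None):
    # Euler-Maruyama step of dh = f(h, x) dt + sigma dB_t,
    # with drift f(h, x) = -h + tanh(W h + U x + b).
    rng = rng if rng is not None else np.random.default_rng()
    drift = -h + np.tanh(W @ h + U @ x + b)
    noise = sigma * np.sqrt(dt) * rng.standard_normal(h.shape)
    return h + dt * drift + noise          # noise injected into the hidden state

rng = np.random.default_rng(3)
n_h, n_x = 16, 4
W = 0.5 * rng.standard_normal((n_h, n_h))
U = rng.standard_normal((n_h, n_x))
b, h = np.zeros(n_h), np.zeros(n_h)
for x_t in rng.standard_normal((50, n_x)):  # drive with a random input sequence
    h = noisy_rnn_step(h, x_t, W, U, b, rng=rng)
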
Speaker: Hedy Attouch (Université Montpellier, France)
Title: Acceleration of first-order optimization algorithms via damped inertial dynamics
Date: Tuesday, 8 June 2021, 13:00
Room: Online
Abstract: We report on recent advances regarding the acceleration of first-order algorithms for continuous optimization. We rely on the damped inertial dynamics driven by the gradient of the function $f$ to be minimized, and on the algorithms obtained by temporal discretization. The first (main) part of the lecture is devoted to convex optimization in a general Hilbert framework. We review classical results, from Polyak's heavy ball with friction method to Nesterov's accelerated gradient method. Then we introduce the geometric damping driven by the Hessian, which enters the dynamic in the form $\nabla^2 f(x(t))\dot{x}(t)$. Treating this term as the time derivative of $\nabla f(x(t))$ gives, in discretized form, first-order algorithms [1]. This geometric damping makes it possible to attenuate the oscillations. Besides the fast convergence of the values, the algorithms thus obtained show rapid convergence of the gradients towards zero. Numerical results for structured optimization and Lasso problems support our theoretical results. Next, we consider the introduction into the dynamics/algorithms of a Tikhonov regularization term with asymptotically vanishing coefficient. Based on a proper tuning of the parameters, we obtain both fast convergence of the values and strong convergence towards the minimum-norm solution [2]. Then, for linearly constrained convex optimization, we introduce a new dynamic approach to the inertial ADMM algorithms, and thus obtain optimal convergence rates. We finally describe some basic tools to deal with inertial algorithms for non-convex non-smooth optimization: quasi-gradient dynamics, the Kurdyka-Łojasiewicz inequality, tame analysis and semialgebraic functions [3].

References:
[1] H. Attouch, Z. Chbani, J. Fadili, H. Riahi, First-order optimization algorithms via inertial systems with Hessian driven damping, Math. Program. (2020), https://doi.org/10.1007/s10107-020-01591-1, arXiv:1907.10536.
[2] H. Attouch, S. Laszlo, Convex optimization via inertial algorithms with vanishing Tikhonov regularization: fast convergence to the minimum norm solution (2021), arXiv:2104.11987.
[3] H. Attouch, R. I. Boț, E. R. Csetnek, Fast optimization via inertial dynamics with closed-loop damping, Journal of the European Mathematical Society (2021), arXiv:2008.02261.
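
For orientation, the Hessian-driven damping discussed above enters an inertial dynamic of roughly the following form (a representative instance in the spirit of [1]; exact parameter conventions vary across the cited papers):

$$\ddot{x}(t) + \frac{\alpha}{t}\,\dot{x}(t) + \beta\,\nabla^2 f(x(t))\,\dot{x}(t) + \nabla f(x(t)) = 0.$$

Since $\nabla^2 f(x(t))\,\dot{x}(t) = \frac{d}{dt}\nabla f(x(t))$, a temporal discretization of this system can be written using gradient evaluations only, which is what makes the resulting algorithms first-order.
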

DynamIC Workshops and Mini-Courses (Complete List)

Title | Date | Venue
Online Workshop on Critical Transitions in Complex Systems | Wednesday, 29 July 2020 – Friday, 31 July 2020 | Online
Workshop on Critical Transitions in Complex Systems | Monday, 25 March 2019 – Friday, 29 March 2019 | Imperial College London
Workshop on Random Dynamical Systems | Wednesday, 20 March 2019 – Friday, 22 March 2019 | Imperial College London
Symposium on Machine Learning and Dynamical Systems | Monday, 11 February 2019 – Wednesday, 13 February 2019 | Imperial College London

Short-term DynamIC Visitors (Complete List)

Name | Affiliation | Arrival | Departure | Host
No visitors currently scheduled
