4-5 July 2019, Imperial College London
Raimund Ober
Quantitative aspects of single molecule microscopy image analysis: an overview
Dylan Owen
Describing the distribution and dynamics of signalling
molecules using data from single-molecule microscopy
In T cells, like many cell types, signalling pathways are
regulated, at least in part, by the spatio-temporal
arrangement of the proteins that make them up. In particular,
protein nanoclustering has been shown to digitise signal
transduction, setting basal levels and defining thresholds.
Since appropriate T cell activation is crucial for mounting an
immune response while at the same time avoiding autoimmune
disease, understanding the nanoclustering behaviour of
proteins is vital. Single-molecule localisation microscopy
(SMLM), a variety of super-resolution imaging, is a commonly
used tool for acquiring such data. Here, I will present a
variety of tools to statistically analyse the output from SMLM
to provide quantitative descriptions of protein nanoscale
organisation. These will include Bayesian, model-based
methods with extensions to 3D and live-cell SMLM data and
machine-learning approaches applicable to large data sets.
Methods will be demonstrated using a variety of T cell
proteins, in particular PTPN22: mutations in this protein, present in the human population, alter its nanoscale organisation in T cells, produce cell-level phenotypes and predispose individuals to autoimmune diseases including arthritis and diabetes. I will also present emerging
challenges to the field going forward.
Scott Ward
Testing complete spatial randomness of the type 6 secretion
system in Pseudomonas aeruginosa
In this talk we shall discuss methodology to explore spatial
properties of the type 6 secretion system (T6SS) of
Pseudomonas aeruginosa (P. aeruginosa). We model the
underlying stochastic mechanism, giving rise to the observed
T6SS, as a point process on the cell membrane of P.
aeruginosa, where we approximate the cell membrane by an
ellipsoid. Current methodologies in spatial point pattern
analysis have been developed for planar and spatial data and, although this theory applies to a wide variety of settings, it is not directly applicable here: the point patterns lie on an ellipsoid rather than in the plane or in 3D space, and the geometry of the space must be
accounted for. We will discuss methodology to test complete
spatial randomness of point patterns observed on an ellipsoid.
By using the invariance of Poisson processes under
transformations between metric spaces (known as the Mapping
Theorem), we can transform a Poisson process on any ellipsoid to a Poisson
process on the unit sphere and take advantage of its
rotational symmetries to construct functional summary
statistics. In particular, we shall focus on Ripley's
K-function and its inhomogeneous counterpart. Based on these
functional summary statistics we can then determine whether a
pattern observed on an ellipsoid exhibits complete spatial
randomness or not and we highlight features of these derived
summary statistics through simulations.
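To make the sphere-based summary statistic concrete, here is a minimal sketch (not the authors' implementation, and ignoring the ellipsoid-to-sphere mapping step and inhomogeneity corrections) of Ripley's K-function for points on the unit sphere, compared with its CSR expectation K(r) = 2*pi*(1 - cos r):

```python
import numpy as np

def sample_csr_sphere(n, seed=0):
    """Simulate n uniformly distributed points on the unit sphere (CSR)."""
    rng = np.random.default_rng(seed)
    v = rng.normal(size=(n, 3))
    return v / np.linalg.norm(v, axis=1, keepdims=True)

def ripley_k_sphere(points, r):
    """Estimate Ripley's K-function at geodesic distances r on the unit sphere."""
    n = len(points)
    cosd = np.clip(points @ points.T, -1.0, 1.0)
    d = np.arccos(cosd)                       # pairwise great-circle distances
    np.fill_diagonal(d, np.inf)               # exclude self-pairs
    intensity = n / (4.0 * np.pi)             # points per unit area of the sphere
    counts = (d[:, :, None] <= r[None, None, :]).sum(axis=(0, 1))
    return counts / (intensity * n)

r = np.linspace(0.0, np.pi / 2, 50)
pts = sample_csr_sphere(200)
k_hat = ripley_k_sphere(pts, r)
k_csr = 2.0 * np.pi * (1.0 - np.cos(r))       # CSR expectation on the sphere
```

Significant departures of the estimated curve from the CSR expectation, assessed for example against simulation envelopes, indicate clustering or regularity.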
Juliette Griffié
How much should we trust quantitative bioimaging?
Fluorescence microscopy is an incredible toolbox for the
investigation of the cellular machinery over multiple length
scales. It presents unique advantages such as protein specificity, multi-color imaging and access to 2D, 3D and whole-cell data from both fixed and live samples, to name a few. This, however, comes at a cost: complex microscopy set-ups, with many user-definable parameters to set for image acquisition, and no clear guidelines for doing so. This lack of a robust framework for optimizing acquisition parameters often impacts negatively on the output data: image
quality deterioration and limited reproducibility are common
issues faced by the field. In the face of these limitations,
how much should we trust the statistics extracted from these
images, independently of the analysis tool used? We propose
here a Bayesian framework for the optimization of acquisition parameters, allowing for user-free microscopy acquisition while ensuring the reproducibility of the
generated images. We will focus on the application of this
framework to SMLM as well as a novel, real time SMLM simulator
that we have developed for its validation.
Sumeetpal Singh
Identification of multi-object dynamical systems:
consistency and Fisher information
Learning the model parameters of a multi-object dynamical
system from partial and perturbed observations is a
challenging task. Despite recent numerical advancements in
learning these parameters, theoretical guarantees are
extremely scarce. In this article we aim to help fill
this gap and study the identifiability of the model parameters
and the consistency of the corresponding maximum likelihood
estimate (MLE) under assumptions on the different components
of the underlying multi-object system. In order to
understand the impact of the various sources of observation
noise on the ability to learn the model parameters, we study
the asymptotic variance of the MLE through the associated
Fisher information matrix. For example, we show that specific
aspects of the multi-target tracking (MTT) problem such
as detection failures and unknown data association lead
to a loss of information which is quantified in special cases
of interest. To the best of the authors' knowledge, these are
new theoretically backed insights on the subtleties of MTT
parameter learning.
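For context, the generic asymptotic result that connects the MLE to the Fisher information (a textbook statement under identifiability and regularity conditions, not the specific theorem of this work) is

$$\sqrt{N}\left(\hat{\theta}_N - \theta^{*}\right) \xrightarrow{d} \mathcal{N}\!\left(0,\, I(\theta^{*})^{-1}\right), \qquad I(\theta) = \mathbb{E}_{\theta}\!\left[\nabla_{\theta} \log p_{\theta}(Y)\,\nabla_{\theta} \log p_{\theta}(Y)^{\top}\right],$$

so that information lost through detection failures or unknown data association manifests as an inflated asymptotic covariance $I(\theta^{*})^{-1}$.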
Mansoor Sheikh
Analysis of over-fitting in the regularized
Cox model
The Cox proportional hazards model is ubiquitous in the
regression analysis of time-to-event data. However, when the
data dimension $p$ is comparable to the sample size $N$,
maximum likelihood estimates for its regression parameters are
known to be biased or break down entirely due to overfitting.
This prompted the introduction of regularizers, leading to the
so-called regularized Cox model. In this paper we use the
replica method from statistical physics to investigate the
relationship between the true and inferred regression
parameters in penalised multivariate Cox regression with $L_2$
regularization, in the regime where both $p$ and $N$ are large
but with $\zeta = p/N \sim \mathcal{O}(1)$. We thereby generalize
a recent study from maximum likelihood to maximum a posteriori
inference. We also establish a relationship between the
optimal regularization parameter and $\zeta$, allowing for
straightforward overfitting corrections in time-to-event
analysis.
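For concreteness, a standard form of the ridge-penalised Cox objective of the kind studied here (a generic formulation, not necessarily the exact one used in the paper) is

$$\hat{\beta} = \arg\max_{\beta}\ \sum_{i=1}^{N} \delta_i \left[\beta^{\top} x_i - \log \sum_{j:\, t_j \ge t_i} e^{\beta^{\top} x_j}\right] - \frac{\eta}{2}\,\|\beta\|_2^2,$$

where $\delta_i$ is the event indicator, $t_i$ the event or censoring time, $x_i$ the covariate vector, and $\eta$ the $L_2$ regularization strength whose optimal value is linked to $\zeta = p/N$.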
Heba Sailem
Deriving phenotypic signatures of angiogenesis using
advanced image analysis of vascular networks
Angiogenesis plays an important role in many diseases
including cancer invasion, cardiovascular disease, and
Alzheimer disease. The endothelial tube formation assay provides a powerful approach for studying the effects of perturbations on the ability of endothelial cells to form vascular networks in vitro.
However, the analysis of resulting imaging datasets has been
limited to a few phenotypic features such as the total
branching length or the number of branches and nodes. Here we
develop an image analysis method for detailed quantification
of various aspects of network formation including network
symmetry and hypothetical flow efficiency. Using this approach
we identified six biologically relevant phenotypes based on a
high content screen of 1280 drugs. Clustering analysis
revealed a novel group of proangiogenic drugs that share a similar mechanism of action, and we validated their association with Alzheimer disease progression. In summary, our work shows
that detailed image analysis of complex phenotypes can be
highly valuable for targeted drug discovery.
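By way of illustration only (a minimal sketch, not the authors' pipeline), basic network descriptors such as junction count and total branching length can be read off a skeletonised binary mask of the vascular network; the mask and the pixel size px_um are assumed inputs:

```python
import numpy as np
from scipy.ndimage import convolve
from skimage.morphology import skeletonize

def network_metrics(mask, px_um=1.0):
    """Crude branch metrics from a binary tube-formation mask (illustrative only)."""
    skel = skeletonize(mask.astype(bool))
    kernel = np.ones((3, 3))
    kernel[1, 1] = 0                                   # 8-connected neighbour count
    nbrs = convolve(skel.astype(int), kernel, mode="constant")
    return {
        "junctions": int(np.sum(skel & (nbrs >= 3))),            # nodes joining 3+ branches
        "endpoints": int(np.sum(skel & (nbrs == 1))),
        "total_branching_length_um": float(skel.sum() * px_um),  # skeleton pixel count as length
    }
```

Richer descriptors such as network symmetry and hypothetical flow efficiency would require building an explicit graph of nodes and branches on top of this skeleton.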
George Ashdown
Machine learning identification of antimalarials using
high-throughput, high-resolution imaging of malaria parasite
cell development
As microscopy data becomes increasingly complex and dataset sizes increase towards the population level, fast, automated acquisition and analysis are required for interpretation. By
applying semi-supervised machine learning analysis to
high-resolution fluorescence images of malaria-infected red
blood cells we demonstrate an approach which can successfully
classify the different life-cycle stages within asynchronous
parasite cultures. This architecture successfully organises
parasite images into their natural developmental order and
upon treatment with known antimalarial drugs, detects and segregates the resulting phenotypes. This tool can now be extended to analyse large-scale imaging datasets to drive the discovery of novel antimalarials, and may elucidate mechanisms of action with greater sensitivity and speed than existing methods.
Iain Styles
Quantitative in vivo molecular imaging
Pre-clinical in vivo studies are an essential part of the path
to approval of new drugs, and imaging using bioluminescent or
fluorescent labels is one of the main tools used to assess
where drugs go and the extent to which they affect disease
progression. How much can we trust these experiments though?
Can we really infer quantitative drug effects from these images when the remitted luminescence measured at the detector is influenced by multiple factors that are typically not controlled for in these experiments: geometry, orientation and subject optical properties? Motivated by both technical and
ethical concerns, we argue that these experiments should not
be considered to be quantitative in any way, and describe a
program of work in which we systematically account for and
correct the different sources of error and uncertainty to take
steps towards truly quantitative in vivo imaging.
Susan Cox
Information in localisation microscopy
Jean-Christophe Olivo Marin
TBC
Florian Levet
A tale of tiles: gathering tessellation and
point clouds for SMLM data analysis
Over the last decade, single-molecule localization microscopy
(SMLM) has revolutionized cell biology, making it possible to
monitor molecular organization and dynamics with spatial
resolution of a few nanometers. By identifying molecule coordinates instead of producing images, SMLM represents an important paradigm shift from conventional fluorescence microscopy. Consequently, developing dedicated analysis tools has become essential to properly quantify SMLM data.
Due to their intrinsic geometrical characteristics,
tessellations have been widely used for analyzing problems
from domains as diverse as remeshing, astronomy or geology. In
this family of space-partitioning techniques, the Voronoï diagram is of particular interest as it encapsulates its seeds inside polytopes. A few years ago, we developed an
open-source framework called SR-Tesseler [1] in which the
localization coordinates were directly used to reconstruct a
Voronoï diagram. By using the Voronoï polytopes’ properties,
we defined a per-localization local density that enabled a
robust multiscale segmentation of biological systems with
diverse shapes and sizes, ranging from whole cells to
small molecular complexes. More recently, we have used Voronoï
diagrams to tackle the colocalization analysis of SMLM data.
To this end, we proposed to extend their intrinsic multiscale capabilities by adding a new pair-density descriptor to the localizations. We then used this new metric to adapt the
well-known Manders’ and Pearson’s coefficients to SMLM data,
resulting in a normalized colocalization analysis of SMLM
data. We validated our method on simulations as well as on
experimental data.
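As a minimal illustration of the per-localization density idea (a sketch using SciPy, not the SR-Tesseler implementation), each localization can be assigned the inverse area of its Voronoï cell:

```python
import numpy as np
from scipy.spatial import Voronoi, ConvexHull

def voronoi_local_density(locs):
    """Local density = 1 / area of each localization's Voronoi cell (2D coordinates)."""
    vor = Voronoi(locs)
    density = np.full(len(locs), np.nan)
    for i, region_idx in enumerate(vor.point_region):
        verts = vor.regions[region_idx]
        if -1 in verts or len(verts) == 0:
            continue                          # unbounded border cell: density left undefined
        cell = vor.vertices[verts]
        area = ConvexHull(cell).volume        # Voronoi cells are convex; 'volume' is area in 2D
        density[i] = 1.0 / area
    return density
```

Thresholding this density, for example relative to the average density over the whole field of view, then yields a multiscale segmentation of clusters.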
Jean-Baptiste Masson
Single biomolecules in the age of Big Data:
Probabilistic pipeline and unsupervised analysis of random
walks
Twenty years after its inception, the field of Single Molecule
(SM) biology is undergoing a transition towards a data-generating
science [1-3]. At the nanometer scale, the dynamics of
individual biomolecules are inherently controlled by random
processes, due to thermal noise and stochastic molecular
interactions. By accessing the distribution of molecular
properties, rather than simply their average value, the great
advantage of SM measurements is thus to identify static and
dynamic heterogeneities, and rare behaviours.
In recent years, the experimental limits of SM approaches have been progressively alleviated with the advent of new, game-changing
methods. Thanks to photoactivatable probes (protein-based or
synthetic dyes), millions of individual trajectories can now
be recorded in live cells in a few minutes. PALM/STORM images
can be reliably acquired over many hours (or even days),
yielding up to hundreds of millions of individual
localizations.
As SM experiments enter the age of "big data", the development of a proper and unifying statistical framework becomes more necessary than ever. "Big data" approaches certainly open up new research avenues for our understanding of biological processes, as they enable the inference of molecular dynamics. Yet they also come at a price. Often, adding more data brings both more information and more variability and noise. Specific tools are required to handle the complex structure of results associated with large datasets and to account for the sources of experimental and
systemic variability.
Here, we show a global probabilistic pipeline, TRamWAy [4-7], that automatically analyses single molecule experiments from images to random walk analysis. TRamWAy relies on deep neural networks to deconvolve single molecule images, belief propagation coupled to ghost graph summing to perform probabilistic assignments between images, and both supervised and unsupervised Bayesian analysis to extract information from random walks.
We demonstrate the approach on two datasets: glycine receptors in synapses, and Gag dynamics during the formation of the virion in HIV-1 [4]. We demonstrate two ways of applying the
probabilistic pipeline TRamWAy. In the first we use
model-based learning with automated results extraction and
statistics. In the second we show that unsupervised learning
with structured inference allows full analysis without
assigning a model to the biomolecules' dynamics.
Lekha Patel
A hidden Markov model approach to
characterizing the photoswitching behaviour of fluorophores
Super-resolution imaging via STORM (Stochastic Optical
Reconstruction Microscopy) exploits the inherent stochasticity
of a photo-switchable fluorophore to reconstruct molecular
positions at high resolutions. During an experiment, multiple
blinks from each molecule can lead to misleading
representations of their true spatial locations, therefore
placing great importance on proper identification and
inference of the unknown photo-switching rates. In order to
characterise a molecule's photo-switching behaviour, we model
its true photon emission state as a continuous-time
homogeneous Markov process with m+3 states. The first m+1
states 0, 0_1, ..., 0_m refer to the unknown number of dark
states, 1 refers to the photon-emission (On) state and 2
refers to the photo-bleached (permanently dark) state. During
a single frame, the integral of this process gives rise to a
binary discrete time imaged process indicating whether or not
a molecule is detected. We describe a hidden Markov model
(HMM) relating the observed process with the hidden continuous
time signal, whereby observations are dependent on both
current and past hidden states thereby producing emissions
dependent upon the unknown photo-switching rates. We construct transmission matrices that capture all such dependencies and
which are needed to derive the log-likelihood function of
observations. This likelihood can be numerically maximised to
produce rate estimates and the Bayesian Information Criterion
(BIC) for each model considered. We show through simulation
studies the effectiveness of our procedure in rate estimation
and the power of BIC in selecting the correct model given a
range of different proposals.
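As a minimal sketch of the continuous-time part of the model for the simplest case m = 1 (states dark 0, dark 0_1, On and bleached), with hypothetical rate values, the matrix exponential of the generator gives the state-transition probabilities of the hidden chain over one frame:

```python
import numpy as np
from scipy.linalg import expm

# Hypothetical switching rates (s^-1), m = 1; state order: [0, 0_1, 1 (On), 2 (bleached)]
k_01 = 2.0     # dark 0   -> On
k_0d = 0.5     # dark 0   -> deeper dark 0_1
k_d1 = 0.1     # dark 0_1 -> On
k_10 = 5.0     # On       -> dark 0
k_12 = 0.05    # On       -> bleached

Q = np.array([
    [-(k_01 + k_0d),  k_0d,            k_01,  0.0 ],
    [ 0.0,           -k_d1,            k_d1,  0.0 ],
    [ k_10,            0.0, -(k_10 + k_12),   k_12],
    [ 0.0,             0.0,             0.0,  0.0 ],   # bleached state is absorbing
])

delta = 1 / 30.0                # frame length in seconds (assumed)
P = expm(Q * delta)             # state-transition probabilities over one frame
assert np.allclose(P.sum(axis=1), 1.0)
```

The full HMM additionally needs the emission probabilities, which depend on the time the molecule spends in the On state within each frame; these are encoded in the transmission matrices described above and are not reproduced in this sketch.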
Edward Avezov
Single particle tracking reveals
nanofluidic properties of the Endoplasmic Reticulum
The Endoplasmic Reticulum (ER), a network of membranous sheets
and pipes, supports functions encompassing biogenesis of
secretory proteins and delivery of functional solutes
throughout the cell periphery. Molecular mobility through the
ER network enables these functionalities. Diffusion-driven molecular motion (traditionally presumed by default) alone is
not sufficient to explain the kinetics of luminal transport
across supramicron distances. Understanding the ER
structure-function relationship is critical in light of
mutations in ER morphology-regulating proteins that give rise
to neurodegenerative disorders.
Applying super-resolution microscopy and stochastic analysis
of single particle trajectories of ER luminal proteins
revealed that the topological organization of the ER
correlates with distinct trafficking modes of its luminal
content: with a dominant diffusive component in tubular
junctions and a fast flow component in tubules. Particle
trajectory orientations resolved over time revealed an
alternating current of the ER contents, whilst live ER fast
structured illumination microscopy analysis identified
energy-dependent tubule contraction events at specific points
as a plausible mechanism for generating active ER luminal
flow. The discovery of active flow in the ER has implications
for timely ER content distribution throughout the cell,
particularly important for cells with extensive ER-containing projections, e.g. neurons, warranting efforts to understand ER transport through mathematical modeling and biophysical
analysis.
Olaf Ronneberger
Modelling ambiguities and uncertainty in
biomedical image analysis with deep neural networks
Dominic Waithe
Advanced processing and characterisation of scanning
fluorescence correlation spectroscopy acquired through
conventional confocal microscopy
Scanning Fluorescence Correlation Spectroscopy (scanning FCS)
is a variant of conventional point FCS that allows molecular
diffusion at multiple locations to be measured simultaneously.
It enables disclosure of potential spatial heterogeneity in
molecular diffusion dynamics and also the acquisition of a
large amount of FCS data at the same time, providing high statistical accuracy. In this talk we characterise the
processing and analysis of these large-scale acquired sets of
FCS data. On the one hand, we present FoCuS-scan, scanning FCS
software that provides an end-to-end solution for processing
and analysing scanning data acquired on commercial turnkey
confocal systems. On the other hand, we provide a thorough
characterisation of large-scale scanning FCS data over its
intended time-scales and applications and propose a unique
solution for the bias and variance observed when studying
slowly diffusing species. Our work enables researchers to
straightforwardly utilise scanning FCS as a powerful technique
for measuring diffusion across a broad range of
physiologically relevant length scales without specialised
hardware or expensive software.
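For orientation only (a generic point-FCS sketch rather than the FoCuS-scan implementation), the correlation curve of an intensity trace and a simple 2D free-diffusion model that can be fitted to it look like this; the trace, sampling interval dt and maximum lag are assumed inputs:

```python
import numpy as np
from scipy.optimize import curve_fit

def autocorrelation(trace, max_lag):
    """Normalised FCS correlation G(tau) = <dF(t) dF(t+tau)> / <F>^2 for integer lags."""
    f = np.asarray(trace, dtype=float)
    df = f - f.mean()
    return np.array([np.mean(df[:-lag] * df[lag:]) for lag in range(1, max_lag)]) / f.mean() ** 2

def g_2d(tau, g0, tau_d, offset):
    """Single-component 2D free-diffusion FCS model."""
    return g0 / (1.0 + tau / tau_d) + offset

# On a measured intensity trace sampled every dt seconds one would fit, e.g.:
#   g = autocorrelation(trace, max_lag=2000)
#   lags = np.arange(1, 2000) * dt
#   (g0, tau_d, offset), _ = curve_fit(g_2d, lags, g, p0=(g[0], 1e-3, 0.0))
```

In scanning FCS this fit is repeated for every position along the scanned line, turning the transit time tau_d into a spatial map of diffusion.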
Falk Schneider
Statistical analysis of large sFCS data-sets discloses
hindered diffusion dynamics
The plasma membrane of living cells is a profoundly bioactive
structure and a platform for numerous interactions of
proteins, solutes and lipids. Its functional organisation is
highly spatiotemporally heterogeneous and heavily involved in
regulating cellular function.
Here we use large scanning fluorescence correlation
spectroscopy (sFCS) data-sets to differentiate free (Brownian)
from hindered (non-Brownian) diffusion dynamics enabling a
glance into molecular membrane heterogeneity. Accurate
determination of diffusion coefficients and diffusion
behaviour can be performed by statistical analysis and fitting
of transit time histograms. We make use of an inherent
sampling bias in sFCS data and present a novel fitting
approach including model selection criteria. Our biological
findings line up with results previously obtained using
super-resolution stimulated emission depletion (STED)
nanoscopy combined with FCS. For instance, we observe free
diffusion for phospholipids in model membranes and cell
membranes whereas sphingolipids or GPI-anchored proteins
undergo more complex diffusion behaviours in cellular plasma
membranes.
Overall, we present a novel toolkit to investigate nano-scale molecular diffusion dynamics, shedding new light on membrane organisation and heterogeneity.
Notably, our statistical analysis pipeline can be applied to
data acquired on standard turn-key confocal microscopes using
conventional fluorescent dyes or fluorescent proteins.
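As a toy illustration of histogram-based model selection (not the authors' transit-time fitting approach; a Gaussian mixture on log transit times is used here purely as an example), one can compare one- and two-component models by BIC:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
# Synthetic transit times (ms): a fast, free-diffusing pool plus a slower, hindered pool
transit_ms = np.concatenate([rng.lognormal(np.log(2.0), 0.3, 5000),
                             rng.lognormal(np.log(8.0), 0.4, 1500)])
x = np.log(transit_ms).reshape(-1, 1)

bic = {k: GaussianMixture(n_components=k, random_state=0).fit(x).bic(x) for k in (1, 2)}
preferred = min(bic, key=bic.get)       # lower BIC indicates the preferred model
print(bic, "-> preferred number of components:", preferred)
```

In the same spirit, information criteria can arbitrate between free- and hindered-diffusion models fitted to the measured transit-time histograms.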
Ricardo Henriques
Democratising high-quality live-cell super-resolution
microscopy enabled by open-source analytics in ImageJ
In this talk I will present high-performance open-source
approaches we have recently developed to enable and enhance
optical super-resolution microscopy in most modern
microscopes: NanoJ-SRRF, NanoJ-SQUIRREL and NanoJ-Fluidics. SRRF (read as 'surf') is a new super-resolution method capable of enabling live-cell nanoscopy with illumination intensities orders of magnitude lower than methods such as SMLM or STED. The low phototoxicity of SRRF allows unprecedented imaging over long acquisition times at resolutions equivalent to or better than SIM. For the second part of the talk, I will introduce
SQUIRREL, an analytical approach that provides quantitative
assessment of super-resolution image quality, capable of
guiding researchers in optimising imaging parameters. By
comparing diffraction-limited images and super-resolution
equivalents of the same acquisition volume, this approach
generates a quality score and quantitative map of
super-resolution defects. To illustrate its broad
applicability to super-resolution approaches, we demonstrate
how we have used SQUIRREL to optimise several image
acquisition and analysis pipelines. Finally, I will showcase a
novel fluidics approach to automate complex sequences of
treatment, labelling and imaging of live and fixed cells at
the microscope. The NanoJ-Fluidics system is based on low-cost
LEGO hardware controlled by ImageJ-based software and can be
directly adapted to any microscope, providing
easy-to-implement high-content, multimodal imaging with high
reproducibility. We demonstrate its capacity to carry out
complex sequences of experiments such as super-resolved
live-to-fixed imaging to study actin dynamics;
highly-multiplexed STORM and DNA-PAINT acquisitions of
multiple targets; and event-driven fixation microscopy to
study the role of adhesion contacts in mitosis.
David Gaboriau
Super resolution microscopy shows the organisation of the
Escherichia coli outer membrane
The early events of infection by Herpesvirus, such as genome
uncoating, nuclear transport and the start of transcription
from the viral genome, are not clearly understood.
We have developed new procedures to visualise single genomes
and transcripts in human cells.
We used bioorthogonal chemistry to visualise viral genomes
incorporating traceable precursors (Sekine et al., 2017, PLOS
Pathogens) and single-molecule RNA fluorescence in situ hybridisation (FISH) to detect
single transcripts.
By using these two techniques together, we were able to image
the early stages of infection with widefield microscopy and
deconvolution. Images were then processed with Icy, an
open-source platform for bioimage analysis, with custom-made
automated analysis sequences called protocols.
The multiple outputs generated by the protocols included
number, geometry, intensity and distance measurements, as well
as the position of each genome and transcript within the
nucleus.
With these, we were able to characterise active transcription
events by defining a transcription burst as a transcript of a
certain size, intensity and distance from a genome.
We studied transcription of an immediate-early gene, ICP0,
over time and followed the changes brought about by blocking
viral genome replication and protein synthesis in the cell. We
also used cells infected with a single virus, enabling us to
follow transcription from one infecting genome.
Finally, by multiplexing the detection of transcripts, we
followed transcription of two immediate-early genes, ICP0 and
ICP4.
These results offer new perspectives on very early events of
viral genome presentation after infection and transcription
patterns from those genomes.
Jorge Bernardino de la Serna
Simultaneous spatiotemporal resolution of lipid lateral
packing and molecular diffusion
Sandip Kumar
Super resolution microscopy shows the organisation of
the Escherichia coli outer membrane
Antibiotic resistance in bacteria is on the rise for all
classes of antibiotics. In the case of pathogenic
Gram-negative bacteria, many are intrinsically resistant to
some classes of antibiotics due to the outer-membrane (OM)
acting as a diffusion barrier for drug molecules.
Bacteria maintain this barrier function during growth and division, and this is mostly due to the structure and organisation
of the OM. The OM is asymmetric, where the outer surface has
lipopolysaccharides (LPS) and outer membrane proteins (OMPs)
exposed to the environment. While a great deal is known about
the biogenesis of OM and how the LPS and OMPs are deposited on
the surface of Gram-negative bacteria, surprisingly little is
known about how these molecules are organised in a live
bacterium. Here we fluorescently label LPS and OMPs and show
their organisation in live E. coli. Fluorescence recovery after photobleaching (FRAP) and single particle tracking (SPT) show that the OM is immobile in live bacteria. Time-lapse imaging of dividing cells shows that new LPS is deposited all along the OM surface, while OMPs are preferentially incorporated at mid-cell.
Astigmatism based 3D localisation microscopy shows that LPS is
uniformly distributed all along the OM and co-localises with
OMPs.