
Research grants

Current research grants

Network Stochastic Processes and Time Series (NeST) -- An EPSRC Programme Grant led by Imperial College London 

Awarding body: EPSRC (Engineering and Physical Sciences Research Council)
Total value: £6,451,752 (LSE: £668,569)
Grant holders: 9 academics from 6 UK universities; the LSE grant holder is Professor Qiwei Yao
Start/end date: 01/12/2022 - 30/11/2028 

Summary: Dynamic networks occur in many fields of science, technology and medicine, as well as everyday life. Understanding their behaviour has important applications. For example, whether it is to uncover serious crime on the dark web, intrusions in a computer network, or hijacks at global internet scales, better network anomaly detection tools are desperately needed in cyber-security. Characterising the network structure of multiple EEG time series recorded at different locations in the brain is critical for understanding neurological disorders and therapeutics development. Modelling dynamic networks is of great interest in transport applications, such as for preventing accidents on highways and predicting the influence of bad weather on train networks. Systematically identifying, attributing, and preventing misinformation online requires realistic models of information flow in social networks. 

Whilst the theory of simple random networks is well-established in maths and computer science, the recent explosion of dynamic network data has exposed a large gap in our ability to process real-life networks. Classical network models have led to a body of beautiful mathematical theory, but do not always capture the rich structure and temporal dynamics seen in real data, nor are they geared to answer practitioners' typical questions, e.g. relating to forecasting, anomaly detection or data ethics issues. Our NeST programme will develop robust, principled, yet computationally feasible ways of modelling dynamically changing networks and the statistical processes on them. 

Some aspects of these problems, such as quantifying the influence of policy interventions on the spread of misinformation or disease, require advances in probability theory. Dynamic network data are also notoriously difficult to analyse. At a computational level, the datasets are often very large and/or only available "on the stream". At a statistical level, they often come with important collection biases and missing data. Often, even understanding the data and how they may relate to the analysis goal can be challenging. Therefore, to tackle these research questions in a systematic way we need to bring probabilists, statisticians and application domain experts together. 

NeST's six-year programme will see probabilists and statisticians with theoretical, computational, machine learning and data science expertise collaborate across six world-class institutes to conduct leading and impactful research. In different overlapping groups, we will tackle questions such as: How do we model data to capture the complex features and dynamics we observe in practice? How should we conduct exploratory data analysis or, to quote a famous statistician, "Looking at the data to see what it seems to say" (Tukey, 1977)? How can we forecast network data, or detect anomalies, changes, trends? To ground techniques in practice, our research will be informed and driven by challenges in many key scientific disciplines through frequent interaction with industrial & government partners in energy, cyber-security, the environment, finance, logistics, statistics, telecoms, transport, and biology. A valuable output of this work will be high-quality, curated, dynamic network datasets from a broad range of application domains, which we will make publicly available in a repository for benchmarking, testing & reproducibility (responsible innovation), partly as a vehicle to foster new collaborations. We also have a strategy to disseminate knowledge through a diverse range of scientific publication routes, high-quality free software (e.g. R packages, Python notebooks accompanying data releases), conferences, patents and outreach activities. NeST will also carefully nurture and develop the next generation of highly-trained and research-active people in our area, which will contribute strongly to satisfying the high demand for such people in industry, government and academia.


Statistical Methods in Offline Reinforcement Learning

Awarding body: EPSRC (Engineering and Physical Sciences Research Council)
Total value: £398,393
Grant holder: Dr. Chengchun Shi
Start/end date: 01/04/2022 - 31/03/2025
 
Summary: Reinforcement learning (RL) is concerned with how intelligent agents take actions in a given environment to learn an optimal policy that maximises the cumulative reward they receive. It has arguably been one of the most vibrant research frontiers in machine learning over the last few years. According to Google Scholar, over 40,000 scientific articles were published in 2020 with the phrase "reinforcement learning". Over 100 papers on RL were accepted for presentation at ICML 2020 (a premier conference in the machine learning area), accounting for more than 10% of all accepted papers. Significant progress has been made in solving challenging problems across various domains using RL, including games, robotics, healthcare, bidding and automated driving. Nevertheless, statistics as a field, as opposed to computer science, has only recently begun to engage with RL both in depth and in breadth. The proposed research will develop statistical learning methodologies to address several key issues in offline RL. Our objective is to propose RL algorithms that utilise previously collected data, without additional online data collection. The proposed research is primarily motivated by applications in healthcare. Most existing state-of-the-art RL algorithms were motivated by online settings (e.g., video games); their generalisations to applications in healthcare remain unknown. We also remark that our solutions will be transferable to other fields (e.g., robotics).

A fundamental question the proposed research will consider is offline policy optimisation where the objective is to learn an optimal policy to maximise the long-term outcome based on an offline dataset. Solving this question faces at least two major challenges. First, in contrast to online settings where data are easy to collect or simulate, the number of observations in many offline applications (e.g., healthcare) is limited. With such limited data, it is critical to develop RL algorithms that are statistically efficient. The proposed research will devise some "value enhancement" methods that are generally applicable to state-of-the-art RL algorithms to improve their statistical efficiency. For a given initial policy computed by existing algorithms, we aim to output a new policy whose expected return converges at a faster rate, achieving the desired "value enhancement" property. Second, many offline datasets are created via aggregating over many heterogeneous data sources. This is typically the case in healthcare where the data trajectories collected from different patients might not have a common distribution function. We will study existing transfer learning methods in RL and develop new approaches designed for healthcare applications, based on our expertise in statistics.

Another question the proposed research will consider is off-policy evaluation (OPE). OPE aims to learn a target policy's expected return (value) with a pre-collected dataset generated by a different policy. It is critical in applications from healthcare and automated driving where new policies need to be evaluated offline before online validation. A common assumption made in most of the existing works is that of no unmeasured confounding. However, this assumption is not testable from the data. It can be violated in observational datasets generated from healthcare applications. Moreover, many offline applications will benefit from having a confidence interval (CI) that quantifies the uncertainty of the value estimator, due to the limited sample size. The proposed research is concerned with constructing a CI for a target policy's value in the presence of latent confounders. In addition, in a variety of applications, the outcome distribution is skewed and heavy-tailed. Criteria such as quantiles are more sensible than the mean. We will develop methodologies to learn the quantile curve of the return under a target policy and construct its associated confidence band.
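A rough illustration of the basic OPE setting (not of the confounder-robust methods this project proposes): the sketch below estimates a target policy's value from logged trajectories by per-trajectory importance sampling, with a naive 95% normal-approximation confidence interval. The function names are ours, and we assume known behaviour-policy action probabilities, no unmeasured confounding and a known discount factor.

```python
import numpy as np

def ope_importance_sampling(trajectories, pi_target, pi_behaviour, gamma=0.99):
    """Per-trajectory importance-sampling estimate of a target policy's value,
    with a naive 95% normal-approximation confidence interval.

    trajectories: list of trajectories, each a list of (state, action, reward)
    pi_target, pi_behaviour: functions (state, action) -> action probability
    """
    weighted_returns = []
    for traj in trajectories:
        weight, ret, discount = 1.0, 0.0, 1.0
        for s, a, r in traj:
            weight *= pi_target(s, a) / pi_behaviour(s, a)  # likelihood ratio
            ret += discount * r
            discount *= gamma
        weighted_returns.append(weight * ret)
    wr = np.asarray(weighted_returns)
    est = wr.mean()
    half_width = 1.96 * wr.std(ddof=1) / np.sqrt(len(wr))
    return est, (est - half_width, est + half_width)
```

Importance-sampling estimators of this kind are unbiased under the stated assumptions but can have very high variance over long horizons, which is one reason the limited-sample and heavy-tailed issues mentioned above matter.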


Change-point analysis in high dimensions

Awarding body: EPSRC (Engineering and Physical Sciences Research Council)
Total value: £229,022
Grant holder: Dr. Tengyao Wang
Start/end date: 01/03/2021 - /02/2024

Summary: Modern applications routinely generate time-ordered datasets in which many covariates are measured simultaneously over time. Examples include wearable technologies recording the health state of individuals from multi-sensor feedback, internet traffic data collected by tens of thousands of routers, and functional magnetic resonance imaging (fMRI) scans that record the evolution of chemical contrast in different areas of the brain. The explosion in the number of such high-dimensional data streams calls for methodological advances for their analysis.

Change-point analysis is an essential statistical technique used in identifying abrupt changes in such data streams. The identified 'change-points' often signal interesting or abnormal events, and can be used to carve up the data streams into shorter segments that are easier to analyse.

Classical change-point analysis methods identify changes in a single variable over time. However, they often suffer from significant performance loss in high-dimensional datasets when applied componentwise. The area of high-dimensional change-point analysis grew out of the need to respond to the challenge created by high-dimensional data streams. A few methods have been proposed in this relatively new area. However, they often require simplifying assumptions that restrict their usefulness in many applications.

In this proposal, I will develop new methods that can handle more realistic data settings. Specifically, I will develop (1) an algorithm that can monitor the data stream 'online' as data points are observed one after another, so that it responds to changes as quickly as possible while maintaining a low rate of false alarms; (2) a change-point procedure that can handle highly correlated component series, a situation that is very common in multi-sensor measurements; (3) a robust method for change-point estimation in the presence of missing or contaminated data. I will provide theoretical performance guarantees for the developed methods and implement them in open-source R packages.
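For readers unfamiliar with online monitoring, here is a minimal sketch of a classical one-sided CUSUM detector for an upward mean shift: a textbook baseline, not the new methods this proposal will develop. The drift and threshold parameters, whose values here are arbitrary, control the trade-off between fast detection and false alarms.

```python
import numpy as np

def online_cusum(stream, mean0=0.0, drift=0.5, threshold=8.0):
    """One-sided CUSUM: flag the first time the cumulative deviation of the
    stream above mean0 + drift exceeds the threshold. Returns the index of
    the first alarm, or None if no alarm is raised."""
    stat = 0.0
    for t, x in enumerate(stream):
        stat = max(0.0, stat + (x - mean0 - drift))
        if stat > threshold:
            return t  # alarm: evidence of an upward mean shift
    return None

# Example: the mean shifts from 0 to 1 at time 500
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0, 1, 500), rng.normal(1, 1, 200)])
print(online_cusum(x))  # alarms shortly after t = 500
```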


Was that change real? Quantifying uncertainty for change points

Awarding body: EPSRC (Engineering and Physical Sciences Research Council)
Total value: £323,942
Grant holder: Professor Piotr Fryzlewicz
Start/end date: 1/10/2021 - 30/09/2024

Summary: Detecting changes in data is currently one of the most active areas of statistics. In many applications there is interest in segmenting the data into regions with the same statistical properties, either as a way to model data flexibly, to help with down-stream analysis, or to ensure predictions are made based only on relevant data. In others, the main interest lies in detecting when changes have occurred, as they indicate features of interest, from potential failures of machinery to security breaches or the presence of genomic features such as copy number variations.

To date, most research in this area has focused on developing methods for detecting changes: algorithms that input data and output a best guess as to whether there have been relevant changes and, if so, how many there have been and when they occurred. A comparatively ignored problem is assessing how confident we are that a specific change has occurred in a given part of the data.

In many applications, quantifying the uncertainty around whether a change has occurred is of paramount importance. For example, if we are monitoring a large communication network, and changes indicate potential faults, it is helpful to know how confident we are that there is a fault at any given point in the network so that we can prioritise the use of limited resources available for investigating and repairing faults. When analysing calcium imaging data on neuronal activity, where changes correspond to times at which a neuron fires, it is helpful to know how certain we are that a neuron fired at each time point so as to improve down-stream analysis of the data.

A naive approach to this problem is to first detect changes and then apply standard statistical tests for their presence. But this approach is flawed as it uses the data twice, first to decide where to test and then to perform the test. We can overcome this using sample splitting ideas - where we use half the data to detect a change, and the other half to perform the test. But such methods lose power, e.g. from using only part of the data to detect changes.
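A minimal sketch of the sample-splitting idea just described, under simplifying assumptions we have added (a single change in mean, roughly independent observations): one half of the data locates the most likely change point with a CUSUM-type scan, and the held-out half supplies an ordinary two-sample test at that location. All names are ours.

```python
import numpy as np
from scipy import stats

def split_detect_then_test(x):
    """Detect a single mean change on the odd-indexed observations,
    then test for it on the even-indexed observations."""
    x = np.asarray(x, dtype=float)
    detect, test = x[::2], x[1::2]
    n = len(detect)
    # CUSUM-type scan statistic for the most likely single change point
    scan = [abs(detect[:k].mean() - detect[k:].mean()) * np.sqrt(k * (n - k) / n)
            for k in range(1, n)]
    k_hat = int(np.argmax(scan)) + 1
    # Two-sample t-test on the held-out half, split at the detected location
    t_stat, p_value = stats.ttest_ind(test[:k_hat], test[k_hat:], equal_var=False)
    return k_hat, p_value
```

The loss of power mentioned above is visible here: the scan sees only half the observations, and the test uses only the other half.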

This proposal will develop statistically valid approaches to quantifying uncertainty that are more powerful than sample-splitting approaches. These approaches are based on two complementary ideas: (i) performing inference prior to detection; and (ii) developing tests for a change that account for earlier detection steps. The output will be a new general toolbox for change points, encompassing both new general statistical methods and their implementation within software packages.


Statistical Network Analysis: Model Selection, Differential Privacy, and Dynamic Structures

Awarding body: EPSRC (Engineering and Physical Sciences Research Council)
Total value: £631,743
Grant holder: Professor Qiwei Yao
Start/end date: 01/06/2021 - 31/05/2024

Summary: In this proposal we tackle some challenging problems in the following three aspects of statistical network analysis. 

1. Jittered resampling for selecting network models.

The first and arguably the most important step in statistical modelling is to choose an appropriate model for a given data set. We propose a new 'bootstrap jittering' or 'jittered resampling' method for selecting an appropriate network model. The method does not impose any specific forms or conditions, and therefore provides a generic tool for network model selection. 

2. Edge differential privacy for network data.

Network data often contain sensitive individual/personal information. Hence the primary concern for data privacy is two-fold: (a) to release only a sanitized version of the original network data to protect privacy, and (b) the sanitized data should preserve the information of interest, such that analysis based on the sanitized data is still meaningful. We will adopt the so-called dyadwise randomized response approach, and further develop this scheme to handle networks with additional node features/attributes (e.g., social networks with additional information on age, gender, hobby, occupation etc). 
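To illustrate the dyadwise randomized response idea in its simplest binary, undirected form (the extension to networks with node attributes is this project's contribution, not what is shown here), each edge indicator is flipped independently with a probability calibrated to a privacy budget eps; flipping with probability 1/(1 + e^eps) makes the release satisfy eps-edge differential privacy. All names are ours.

```python
import numpy as np

def dyadwise_randomized_response(adj, eps, seed=None):
    """Release a sanitized version of a binary, undirected adjacency matrix:
    each dyad's edge indicator is flipped independently with probability
    1 / (1 + exp(eps)), the standard randomized response calibration."""
    rng = np.random.default_rng(seed)
    flip_prob = 1.0 / (1.0 + np.exp(eps))
    n = adj.shape[0]
    flips = rng.random((n, n)) < flip_prob
    flips = np.triu(flips, 1)      # decide each undirected dyad only once
    flips = flips | flips.T        # mirror the decision to keep symmetry
    return np.where(flips, 1 - adj, adj)
```

Smaller eps means more flipping and stronger privacy, but the released network preserves less of the structure of interest, which is exactly the tension between (a) and (b) above.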

3. Modelling and forecasting dynamic networks.

A substantial proportion of real networks are dynamic in nature. Understanding, and being able to forecast, their changes over time is of immense importance for, e.g., monitoring anomalies in internet traffic networks, predicting demand and setting pricing in electricity supply networks, managing natural resources using environmental readings from sensor networks, and understanding how news and opinions propagate in online social networks. Combining recent developments in tensor decomposition and factor-driven dimension reduction with efficient time series tools such as exponential smoothing and Kalman filters, we will take on this challenge and build new dynamic models.
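As a deliberately crude baseline illustrating how classical time series tools can drive network forecasts (not the tensor- and factor-based models proposed here), one can exponentially smooth a sequence of adjacency snapshots and read the smoothed matrix as a one-step-ahead forecast of edge intensities. The function name and smoothing constant are ours.

```python
import numpy as np

def smooth_and_forecast(snapshots, alpha=0.3):
    """Exponentially weighted smoothing of a sequence of adjacency matrices.
    The smoothed matrix doubles as a one-step-ahead forecast: entry (i, j)
    predicts the intensity of edge i-j at the next time point."""
    level = np.asarray(snapshots[0], dtype=float)
    for A in snapshots[1:]:
        level = alpha * np.asarray(A, dtype=float) + (1 - alpha) * level
    return level
```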

Recent grants

Methods for the Analysis of Longitudinal Dyadic Data with an Application to Intergenerational Exchanges of Family Support

Awarding body: ESRC (Economic and Social Research Council) and EPSRC (Engineering and Physical Sciences Research Council)
Total value: £633,392
Grant holder: Professor Fiona Steele
Co-investigators: Siliang Zhang, Professor Jouni Kuha, Professor Irini Moustaki, Professor Chris Skinner, Dr Tania Burchardt (Centre for Analysis of Social Exclusion (CASE), LSE), Dr Eleni Karagiannaki (CASE, LSE), Professor Emily Grundy (University of Essex)
Start/end date: 01/10/2017 - 30/09/2021

Summary: Data on pairs of subjects (dyads) are commonly collected in social research. In family research, for example, there is interest in how the strength of parent-child relationships depends on characteristics of parents and children. Dyadic data provide detailed information on interpersonal processes, but they are challenging to analyse because of their highly complex structure: they are often longitudinal because of interest in dependencies between individuals over time, dyads may be clustered into larger groups (e.g. in families or organisations), and variables of interest such as perceptions and attitudes may be measured by multiple indicators. This research will develop a general latent variable modelling framework for the analysis of clustered multivariate dyadic data, with applications to the study of exchanges of support between non-coresident family members. A particular feature of this framework will be to allow modelling of associations between an individual's exchanges over time, between help given and received (reciprocity), between exchanges of time and money, between respondent-child and respondent-parent exchanges, and between members of the same household. Sensitivity of results to measurement error and non-ignorable attrition will be considered.



New challenges in time series analysis

Awarding body: EPSRC (Engineering and Physical Sciences Research Council)
Total value: £1,306,110.05
Grant holder: Professor Piotr Fryzlewicz
Start/end date: 01/04/2014 - 31/03/2019

Summary: This research will break new ground in the analysis of non-stationary, high-dimensional and curve-valued time series. Although many of the problems we propose to tackle are motivated by financial applications, our solutions will be transferable to other fields. In particular, we will:
(a) Re-define the way in which people think of non-stationarity. We will define (non-)stationarity to be a problem-dependent, rather than ‘fixed’, property of time series, and propose new statistical model selection procedures in accordance with this new point of view. This will lead to the concept of (non-)stationarity being put to much better use in solving practical problems (such as forecasting) than it has been so far;

(b) Propose new, problem-dependent dimensionality reduction procedures for time series which are both high-dimensional and non-stationary (dimensionality reduction is useful in practice as low-dimensional time series are much easier to handle). We hope that this problem-dependent approach will induce a completely new way of thinking of high-dimensional time series data and high-dimensional data in general;

(c) Propose new methods for statistical model selection in high-dimensional time series regression problems, including the non-stationary setting. Our new methods will be useful in fields such as financial forecasting or statistical market research;

(d) Investigate new methods for statistical model selection in high-dimensional time series (of, e.g., financial returns) in which the dependence structure changes in an abrupt fashion due to `shocks', e.g. macroeconomic announcements;

(e) Propose new multiscale time series models, specifically designed to solve a longstanding problem in finance of consistent modelling of financial returns on multiple time scales, e.g. intraday and daily;

(f) Propose new ways of analysing time series of curves (e.g. yield curves) which can be non-stationary in a variety of ways.


Tackling Selection Bias in Sentence Data Analysis through Bayesian Statistics and Elicitation of Experts’ Opinions

Awarding body: National Centre for Research Methods
Total value: £106,800 (LSE: £14,524)
Grant holder (LSE): Dr Sara Geneletti 
Start/end date: 01/10/2017 - 31/12/2018

Summary: Statistical models are widely used to investigate how criminal offenders are sentenced in courts of law. Through these types of models much has been learnt regarding the functioning and fairness of the processes taking place within courts. However, the validity and reliability of the findings obtained have been limited as a result of the important compromises that researchers dealing with sentencing data have had to make. There is a vast array of sentence outcomes available to judges with which to punish offenders. Importantly, these punishments vary in nature and cannot be measured in a straightforward manner. As a result, most model-based studies have focused on the analysis of simpler sentence outcomes, such as the probability of prison and/or the duration of custodial sentences. Focussing on these two specific outcomes involves a tremendous loss of information that reduces the model's capacity to grasp many of the nuances of the sentencing process, while vastly limiting the generalisability of findings.

For more than four decades some of the best statisticians working in the field of criminal justice have sought to apply more sophisticated statistical models with which to expand the generalisability of findings based on custodial sentence lengths to the whole realm of sentencing. However, their efforts have been unsatisfactory: although they manage to incorporate non-custodial outcomes into the model and thereby improve the scope of their findings, they do so based on unrealistic assumptions - detailed in the case for support - that undermine the robustness of their approaches.

We propose to use a new and more flexible statistical paradigm (based upon Bayesian inference) to develop a model in which custodial and non-custodial outcomes can be integrated in a meaningful way. To do so we will rank sentence outcomes in terms of their relative severity, with custody being more severe than a suspended sentence, which in turn is more severe than a community order, which in turn is more severe than a fine. To refine this scale of sentence severity, and given the deeply subjective nature of the severity of punishments, the model will be further informed by the personal views of Crown Court judges on the topic. The result of this work will be a new framework capable of producing more insightful and robust analyses of sentencing data, overcoming a methodological conundrum that has affected the literature on the topic for far too long. Perhaps more importantly, the application of this new modelling strategy will allow academics and government researchers to provide higher-quality information to policy-makers in the field of sentencing, a sector that is currently being reformed by government policy and through the application of sentencing guidelines both in England and Wales, and in Scotland.


Combined efficient large scale integrated urban systems (CELSIUS)

Awarding body: European Commission FP7 (Smart Cities & Communities 2011 call)
Total value (LSE): €411,829
Grant holder (LSE): Professor Henry Wynn
Start/end date: 01/04/2013 - 31/03/2018

Summary: This multi-partner EU project, led by the City of Gothenburg, involves a number of leading utilities organisations as well as academic partners. It aims to maximise carbon savings in cities by unlocking unused energy-saving potential, developing ways to recover energy losses effectively and efficiently.

http://www.lse.ac.uk/CATS/Research%20Grants/researchGrants.aspx
http://eu-smartcities.eu/content/celsius-smart-district-heating-and-cooling-solutions


Legal norms and crime control: a comparative cross-national analysis

Awarding body: ESRC (Economic and Social Research Council)
Grant holder (LSE Statistics): Dr Jouni Kuha
Principal Investigator: Professor Jonathan Jackson (LSE, Department of Methodology)
Total value: £279,574 (Department of Statistics: £15,139)
Start/end date: 01/07/2014 - 30/06/2016

Summary: This is a comparative, cross-national study into attitudes towards legal authorities, compliance with the law, co-operation with legal authorities and the policing of minority and majority groups. The proposal is to address questions of deterrence, legitimacy, co-operation and compliance using a powerful new dataset that we have generated from national probability sample surveys of 30 different countries. The goal is to mount an ambitious cross-national empirical test of deterrence theory and procedural justice theory.


Methods of analysis and inference for social survey data within the framework of latent variable modelling and pairwise likelihood

Awarding body: ESRC (Economic and Social Research Council)
Total value: £236,809
Grant holder: Dr Myrsini Katsikatsou (ESRC future research leaders fellowship)
Start/end date: 01/10/2014 - 30/09/2017

Summary: This project aims to contribute to methodological research and provide tools for latent variable modelling of social survey data. The methods will be applied to the analysis of data from the OECD Programme for the International Assessment of Adult Competencies (PIAAC) and from the European Social Survey (ESS). The goals of the methodological research are to evaluate, further develop and disseminate innovative statistical methods and practical tools for latent variable modelling of social data regardless of the model complexity and size, the data type and size, or the presence of item non-response (missingness in some survey questions). The research will develop pairwise likelihood (PL) methods of estimation and testing for latent variable modelling (LVM), also known as structural equation modelling (SEM). SEM and LVM are standard well-established tools for modelling social survey data. PL is an emerging method for estimation and inference that has recently become popular in many disciplines because of its computational simplicity and statistical merits.
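A toy example of the pairwise likelihood idea, far simpler than the latent variable models targeted by this project: estimating a common correlation rho in an exchangeable zero-mean multivariate normal by maximising the sum of bivariate log-likelihoods over all pairs of variables, rather than the full joint likelihood. All names are ours.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import multivariate_normal

def pairwise_likelihood_corr(data):
    """Estimate a common correlation rho by maximising the pairwise
    (composite) likelihood built from all bivariate margins."""
    n, p = data.shape
    pairs = [(i, j) for i in range(p) for j in range(i + 1, p)]

    def neg_pl(rho):
        cov = np.array([[1.0, rho], [rho, 1.0]])
        # Sum of bivariate normal log-likelihoods over all variable pairs
        return -sum(multivariate_normal.logpdf(data[:, [i, j]], cov=cov).sum()
                    for i, j in pairs)

    return minimize_scalar(neg_pl, bounds=(-0.99, 0.99), method="bounded").x
```

Each bivariate margin is cheap to evaluate, which is the computational appeal of PL when the full likelihood involves high-dimensional integrals, as it does for latent variable models with categorical data.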


Using multi-level multi-source auxiliary data to investigate nonresponse bias in UK general social surveys

Awarding body: ESRC (Economic and Social Research Council)
Total value: £322,797 (LSE: £17,124)
Grant holder (LSE): Professor Chris Skinner
Start/end date: 31/08/2014 - 31/05/2016 (extended end date)

Summary: This project will explore the extent to which the predictive power of various forms of "Big Data" can be harnessed to overcome the impact of poor response to surveys - one of the major challenges facing social research today. Social surveys are a key tool used by the media, policy makers, and academics to understand more about public attitudes and behaviour. However, the value of surveys is put at risk by the fact that a large and growing number of those selected to take part in surveys do not respond. As non-respondents may be very different from respondents, nonresponse can introduce significant bias into the conclusions drawn from survey data. There is a pressing need therefore to understand more about the extent and sources of nonresponse bias. This requires having information about both respondents and nonrespondents. In the absence of interview data being available for non-respondents, this information must be obtained from other, external, sources.


Modelling vast time series

Awarding body: EPSRC (Engineering and Physical Sciences Research Council)
Total value: £486,564
Grant holder: Professor Qiwei Yao
Start/end date: 30/03/2014 - 29/03/2017

Summary: The challenges of our project are two-fold. First, we need to develop statistical inference methods, and the associated theory, for identifying the sparse structure and for fitting sparse VAR models with large dimensions. Let p denote the dimension of the time series. We aim to reduce the number of model parameters from the order of p squared to the order of p, and to develop valid inference methods when log(p) = o(n). Second, we need to identify the linear transformation that reveals the latent segmentation structure, i.e. the block-diagonal autocovariance structure, when such a structure exists.
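For intuition about sparsity in this setting, a common off-the-shelf approach (not the inference theory this project develops) is to estimate each row of the VAR coefficient matrix by an l1-penalised regression, shrinking most of the p-squared coefficients to zero. The sketch below fits a sparse VAR(1) with scikit-learn; the names and penalty level are ours.

```python
import numpy as np
from sklearn.linear_model import Lasso

def fit_sparse_var(X, lam=0.1):
    """Fit a sparse VAR(1) model X_t = A X_{t-1} + e_t by running one lasso
    regression per component series; row j of A holds series j's coefficients."""
    n, p = X.shape
    lagged, current = X[:-1], X[1:]
    A = np.zeros((p, p))
    for j in range(p):
        fit = Lasso(alpha=lam, fit_intercept=False).fit(lagged, current[:, j])
        A[j] = fit.coef_
    return A
```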


The regression discontinuity design: a novel approach to evaluating the effects of drugs and treatments in primary care

Awarding body: MRC (Medical Research Council)
Total value (LSE): £24,078
Grant holder (LSE): Dr Sara Geneletti
Start/end date: 02/09/2013 - 01/02/2016

Summary: A fundamental task in clinical practice is to determine whether a particular drug is being prescribed in the most effective way. While Randomised Clinical Trials (RCTs) are considered the best scientific method for evaluating drug efficacy, these studies often have poor external validity. Prescription guidelines are not always evidence-based, and it typically falls to clinical experts to set them. The regression discontinuity design (RDD) is an econometric quasi-experimental design aimed at estimating the causal effects of a treatment by exploiting naturally occurring treatment rules. It was first introduced in the educational economics literature in the 1960s but was not widely used outside this field until recently. This project has both substantive and methodological aims: the assessment of statin effectiveness in primary care, and the application and development of the RDD in epidemiology.
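As a simple illustration of the RDD logic (a sharp design with a rectangular-kernel local linear fit; the function names and bandwidth choice are ours, not the project's methodology), the treatment effect is estimated as the jump in fitted values at the treatment cutoff, e.g. a risk-score threshold above which a drug is prescribed.

```python
import numpy as np

def rdd_effect(x, y, cutoff, bandwidth):
    """Sharp RDD sketch: fit separate linear regressions within a bandwidth
    on each side of the cutoff and return the jump in fitted values there."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)

    def fitted_at_cutoff(mask):
        X = np.column_stack([np.ones(mask.sum()), x[mask] - cutoff])
        beta, *_ = np.linalg.lstsq(X, y[mask], rcond=None)
        return beta[0]  # intercept = fitted value at the cutoff

    below = (x >= cutoff - bandwidth) & (x < cutoff)
    above = (x >= cutoff) & (x <= cutoff + bandwidth)
    return fitted_at_cutoff(above) - fitted_at_cutoff(below)
```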


Topics on probability and convexity in finance (PROCONFIN)

Awarding body: European Commission FP7: Marie Curie International Career Integration Grant (CIG)
Total value: €100,000
Grant holder (LSE): Dr Kostas Kardaras
Start/end date: 01/08/2013 - 31/07/2017

Summary: While the field of Financial Mathematics has witnessed a plethora of major achievements, there is an ever-present need for more in-depth resolution of important problems. This project aims at addressing a representative collection of three areas: (1) financial equilibria with heterogeneous agents in incomplete markets; (2) viability of financial models with investment constraints and an infinite number of traded assets; and (3) hedging under model uncertainty. All three directions relate to topics under recent or ongoing scrutiny, stemming from a desire to improve the quality of financial modelling, to allow for imperfections appearing in real markets and seek to comprehend them, and to manage the risk involved in complicated financial positions by exploiting the structure of simpler traded assets. The last point in particular is of direct practical importance, since the field of Financial Mathematics has been criticised precisely for having failed to appreciate correctly the risks associated with the introduction of financial instruments of vast complexity, the incorrect valuation of which was a major factor in the recent economic crisis.


Item nonresponse and measurement error in cross-national surveys: methods of data collection and analysis

Awarding body: NCRM (National Centre for Research Methods); ESRC (Economic and Social Research Council)
NCRM/ESRC grant # DU/512589106
Total value: £192,247
Grant holder (LSE): Dr Jouni Kuha
Start/end date: 01/04/2013 - 30/09/2014

Summary: Cross-national surveys are one of the key resources of social sciences. The complexity of the surveys raises methodological challenges, which need to be met in order to make the best use of the data. Two of these are problems of data quality: measurement error where the answers by survey respondents are in some way erroneous, and nonresponse where some questions are not answered at all. The goal of this project is to develop and evaluate research methods for these problems.


Bayesian inference on implied volatility

Awarding body: EPSRC (Engineering and Physical Sciences Research Council)
EPSRC grant # EP/K001264/1
Total value: £129,460
Grant holder (LSE): Dr Kostas Kalogeropoulos
Start/end date: 01/02/2013 - 31/01/2015

Summary: A substantial number of publicly available datasets represent educated predictions on the evolution of stochastic processes. These include financial derivative instruments, such as option prices, which can be formulated as expectations of the underlying price process. This project considers models with latent diffusion processes that can be linked to direct observations, but also to such conditional expectations. The goal is to utilise advanced computational methods to estimate the data-generating mechanism from both datasets and, moreover, to develop a general inferential framework to handle parameter and model uncertainty.


Advances in algebraic statistics

Awarding body: The Leverhulme Trust
Leverhulme Trust Emeritus Fellowship # EM-2011-046
Total value: £11,800
Grant holder (LSE): Professor Henry Wynn
Start/end date: 01/08/2011 - 30/09/2013

Summary: Algebraic statistics is a fast-moving area on the interface between statistics and computational algebraic geometry. The project will consolidate research in a number of sub-areas in which the grant holder is heavily engaged in collaboration with research colleagues in Italy, Spain and Japan; for example, the application of the theory of monomial ideals in reliability, experimental design and hierarchical model structures.


Evaluation of interventions and diagnostics of neglected tropical diseases in sub-Saharan Africa 

Awarding body: MRC (Medical Research Council)
MRC grant # G0902130
Total value: £348,381 (LSE: £13,387)
Grant holder: Dr Artemis Koukounai (Imperial College)
Start/end date: 10/01/2011 - 31/08/2013

Summary: To use advanced biostatistical analysis to further understanding of the effect of interventions based on Mass Drug Administration (MDA) upon the prevalence and intensity of schistosomiasis and of the ocular bacteria causing trachoma, and upon the likelihood of their elimination, as well as to evaluate the performance of the diagnostic tools currently used for the Monitoring & Evaluation (M&E) of these two infections.


High-Dimensional Time Series, Common Factors, and Nonstationarity

Awarding body: EPSRC (Engineering and Physical Sciences Research Council)
EPSRC grant # EP/H010408/1
Total value: £331,455
Grant holder (LSE): Professor Qiwei Yao
Start/end date: 01/06/2010 - 31/05/2013

Summary: http://stats.lse.ac.uk/q.yao/qyao.links/project/epsrc09.html


Enhancing the use of information on survey data quality

Awarding body: ESRC (Economic and Social Research Council)
ESRC grant # ES/H004343/1
Total value: £256,091
Grant holder (LSE): Professor Chris Skinner
Start/end date: 01/10/2011 - 31/01/2013

Summary: The quality of data collected in surveys is subject to a wide range of threats in the modern world, including the public's declining willingness to take part at all. Yet sources of information about this quality are increasing, in particular as a by-product of the evolving technologies used in survey data collection. This fellowship investigates new ways of using this information to address a range of data quality issues which face social science researchers when analysing survey data. The research addresses methodological questions such as: is it possible to improve analyses by giving greater emphasis to parts of the data which are of higher relative quality and, if so, how?


Latent variable modelling of categorical data: Tools of analysis for cross-national surveys

Awarding body: ESRC (Economic and Social Research Council)
ESRC grant # ES/H030796/1
Total value: £215,000
Grant holder (LSE): Dr Jouni Kuha
Start/end date: 01/04/2010 - 30/09/2012

Summary: To develop and encourage the use of particular statistical tools that will lead to better utilisation of the data collected in cross-national social surveys, more valid conclusions, and more relevant input into social science and public policy making.

Older grants

i. End-to-End Quantification of Uncertainty for Impacts Prediction (EQUIP)
Awarding body: NERC (Natural Environment Research Council)
NERC grant # NE/H003479/1
Total value: £185,630
Grant holder (LSE): Professor Leonard Smith
Start/end date: 01/10/2009 - 30/09/2012
Summary: EQUIP brings together the UK climate modelling, statistical modelling and impacts communities to work closely together for the first time on developing risk-based prediction for decision making in the face of climate variability and change.

ii. RAPID-RAPIT
Awarding body: NERC (Natural Environment Research Council)
NERC grant # NE/G015392/1
Total value: £67,116
Grant holder (LSE): Dr David Stainforth
Start/end date: 01/10/2009 - 30/09/2013
Summary: A NERC-funded collaborative project led by the National Oceanography Centre, Southampton, that will attempt to quantify the likelihood of a shutdown of the Meridional Overturning Circulation (MOC) in the North Atlantic.

iii. Dimension reduction with factor models: with application to finance
Awarding body: STICERD (Suntory and Toyota International Centres for Economic and Related Disciplines)
Total value: £20,000
Grant holder (LSE): Dr Clifford Lam
Start/end date: 01/04/2009 - 31/07/2010
Summary: Applying factor models, with possible relaxations and further regularisation, to reduce the dimension of multivariate time series, with applications in finance.

iv. Climate Change and the Insurance Industry
Awarding body: EC FP-7
EC grant # FP7-PEOPLE-3-1-IAPP
EC FP7 People 'Industry and Academia Partnerships and Pathways' scheme
Total value: €927,974 (LSE: €294,883)
Grant holder (LSE): Professor Henry Wynn
Start/end date: August 2008 - July 2012
Summary: To develop methods to assess uncertainty in large-scale mathematical models in a variety of scientific areas, particularly models implemented as computer simulators.

v. Managing Uncertainty in Complex Models (MUCM)
(Consortium project with Universities of Sheffield, Durham, Aston, Southampton and LSE).
Awarding body: EPSRC
EPSRC grant # EP/D048893/1
Total value: £2,167,671 (LSE: £295,210)
Grant holder (LSE): Professor Henry Wynn
Start/end date: 01/06/2006 - 31/05/2010
Summary: A multidisciplinary project concerned with quantifying and reducing uncertainty in the predictions of complex models across a wide range of application areas, including basic science, environmental science, engineering, technology, biosciences, and economics.

vi. Ensemble-based Predictions of Climate Changes and their Impacts (ENSEMBLES)
Awarding body: EU 6th framework programme / Integrated project
Grant #: GOCE-CT-2003-505539-ENSEMBLES
Total value (LSE CATS): £108,306
Grant holder: Professor Leonard Smith
Start/end date: 01/09/2004 - 31/12/2009
Summary: To develop an ensemble prediction system of climate changes and their impacts.


vii. Valuation and Hedging of Life Insurance Derivatives
Awarding body: EPSRC
EPSRC first grant scheme # EP/E013120/1
Total value: £161,635
Grant holder (LSE): Dr Thorsten Rheinlander
Start/end date: 29/06/2007 - 28/09/2009
Summary: Valuation and hedging of general insurance contracts and their embedded financial options. Exploring the transfer of systematic mortality risk to the financial market via the design of mortality derivatives and the study of their risk management.

viii. Dimension Reduction Modelling
Awarding body: EPSRC
EPSRC grant #: EP/C549058/1
Total value: £167,573
Grant holder: Professor Howell Tong, Professor Qiwei Yao, Dr Jeremy Penzer
Start/end date: 01/10/2005 - 30/09/2008
Summary: Dimension reduction for multivariate time series: modelling the first and second conditional moments.

ix. Nonlinear Analysis & Prediction Statistics from Timeseries & Ensemble-forecast Realizations (NAPSTER)
Awarding body: NERC
NERC grant #: NE/D00120X/1
Total value: £152,481
Grant holder: Professor Leonard Smith
Start/end date: 01/11/2005 - 31/10/2007
Summary: To set a basis for an innovative knowledge transfer mechanism between the science base and users of environmental predictions.

x. Micro-scale Robust Engineering Design (m-RED)
Awarding body: EPSRC
EPSRC grant #: GR/S63502/01
Total value: £144,633
Grant holder: Professor Henry Wynn, Dr Ron Bates
Start/end date: 01/04/2004 - 31/03/2007
Summary: To study micro-scale systems and components.

xi. Holistic Integrated Process Control (HIPCON)
Awarding body: European Commission
Grant #: NMP2-CT-20030505467
Total value: £346,667
Grant holder: Professor Henry Wynn
Start/end date: 01/01/2004 - 31/12/2006

xii. Volatility of Time Series
Awarding body: EPSRC
EPSRC grant #: GR/R97436/01
Total value: £154,142
Grant holder: Professor Qiwei Yao
Start/end date: 28/06/2003 - 31/08/2006
Summary: The study of statistical inference for volatility of time series.

xiii. Climate Variability
Awarding body: University of California, San Diego
Grant #: 10255373
Total value: £16,026
Grant holder: Professor Leonard Smith
Start/end date: 01/11/2005 - 30/06/2006
Summary: Ensemble simulations of observed climate variability.

xiv. Weather Risk Management
Awarding body: University Corporation for Atmospheric Research (UCAR)
UCAR grant #: S05-54803
Total value: £10,526
Grant holder: Professor Leonard Smith
Start/end date: 16/05/2005 - 15/04/2006
Summary: Improving operational weather risk management, demand forecasts and the use of joint distributions.

xv. Spatial and Spatio-temporal modelling
Awarding body: The Leverhulme Trust
Grant #: F/07004/O
Total value: £127,823
Grant holder: Professor Qiwei Yao
Start/end date: 01/10/2002 - 31/01/2006

xvi. Overseas Travel grant: To measure the risk and uncertainty at the interface of insurance and finance
Awarding body: EPSRC
EPSRC grant #: GR/T23879/01
Total value: £8,590
Grant holder: Dr Pauline Barrieu
Start/end date: 01/07/2004 - 30/09/2005

xvii. Direct & Inverse Modelling in End-to-End Environmental Estimation (DIME)
Smith Institute Faraday Partnership
Awarding body: EPSRC
EPSRC grant #: GR/R92363/01
Total value: £94,360
Grant holder: Professor Leonard Smith
Start/end date: 01/03/2003 - 31/08/2005
Summary: To track uncertainty, both from model inadequacy and from the unknown initial state of the atmosphere, all the way through the modelling process, to yield estimates of the uncertainty in quantities of industrial interest.

xviii. Real-time Modelling of Nonlinear Datastreams (REMIND)
Awarding body: EPSRC
EPSRC grant #: GR/R92271/01
Total value: £85,827
Grant holder: Professor Leonard Smith
Start/end date: 01/03/2003 - 28/02/2005

xix. Credit Risk Modelling (Levy processes for credit risk modelling and pricing)
Awarding body: Credit Suisse First Boston Europe Ltd.
Total value: £18,000
Grant holder: Professor Henry Wynn, Dr Rafael Schmidt
Start/end date: 13/12/2004 - 28/02/2005

xx. Towards identifying and increasing the Socio-Economic Value of High-Impact Weather Forecasts
Awarding body: National Oceanic & Atmospheric Administration (NOAA)
(See also UCAR grant above)
Total value: £94,538
Grant holder: Professor Leonard Smith
Start/end date: 01/10/2003 - 30/09/2004
Summary: To support a Pembroke research fellowship in applied probabilistic meteorology.

xxi. Improved Risk Management via Probabilistic Weather Forecasts
Awarding body: Royal Dutch Shell
Total value: £21,873
Grant holder: Professor Leonard Smith
Start/end date: 01/06/2002 - 01/01/2004

xxii. Nonlinear time series modelling of periodically fluctuating vertebrate population: a spatio-temporal approach
Awarding body: BBSRC/EPSRC
Total value: £113,772
Grant holder: Professor Howell Tong, NC Stenseth & Professor Qiwei Yao
Start/end date: 01/01/2000 - 31/01/2002

Miscellaneous grants

Professor Pauline Barrieu
Awarded £6,000 for a six-month visit by Dr Giacomo Sandolo from the University of Verona. STICERD visitors' programme. (2013)

Dr Angelos Dassios
Awarded £11,000 from the Leverhulme Trust for a 7-month period visiting universities in the US (2003/4).

Professor Ragnar Norberg
Mathematical Finance Network, Grant No. 9800335 from the Danish Social Science Research Council.

Dr Jeremy Penzer
ESRC 1+3 recognition for MSc and PhD programmes in the Department of Statistics, LSE. Recognition awarded 2001.

Professor Leonard Smith
2002 EC Marie Curie Postdoctoral Fellowship, held by Dr Antje Weisheimer, to work within the Centre for the Analysis of Time Series at LSE on predictability in large climate models with LAS.