It is a great pleasure to announce the following Keynote Speakers for the geoENV 2014 conference:
- Håvard Rue
- Gerard Heuvelink
- Laurent Bertino
Håvard Rue is professor of statistics at the Department of Mathematical Sciences, Norwegian University of Science and Technology. His research interests include Bayesian computing and spatial statistics, work that is summarised in the R-INLA package (see www.r-inla.org). He has been an associate editor for JRSS Series B, the Scandinavian Journal of Statistics, Statistics Surveys, the Annals of Statistics and Environmetrics. His main research interest has been Gaussian Markov random field (GMRF) models, on which he has written a monograph with Leonhard Held, published by Chapman & Hall. GMRFs are also a main ingredient in (fast and accurate) approximate Bayesian analysis for latent Gaussian models using integrated nested Laplace approximations (INLA), published as a discussion paper in JRSS Series B (2009), co-authored with S. Martino and N. Chopin. GMRFs also appear in geostatistics, with stochastic partial differential equations serving as the bridge: they provide an explicit link between certain Gaussian fields and GMRFs on triangulated lattices (published as a discussion paper in JRSS Series B in 2011, with F. Lindgren and J. Lindström).
Keynote lecture: The stochastic partial differential equation (SPDE) approach to represent Gaussian random fields
Continuously indexed Gaussian fields (GFs) are the most important ingredient in spatial statistical modelling and geostatistics. The specification through the covariance function gives an intuitive interpretation of the field properties. On the computational side, however, GFs are hampered by the big-n problem, since the cost of factorizing dense covariance matrices is cubic in the dimension. Although computational power today is at an all-time high, this remains a computational bottleneck in many applications. Alongside GFs there is the class of Gaussian Markov random fields (GMRFs), which are discretely indexed. The Markov property makes the precision matrix sparse, which enables the use of numerical algorithms for sparse matrices that, for fields in R², require only the square root of the time needed by general algorithms. A GMRF is specified through its full conditional distributions, but its marginal properties are not transparent in such a parameterization. In this talk, I will review our ongoing work on an alternative approach: using an approximate stochastic weak solution to (linear) stochastic partial differential equations, we can, for some GFs in the Matérn class, provide an explicit link, for any triangulation, between GFs and GMRFs, formulated as a basis function representation. The consequence is that we can take the best from the two worlds: do the modelling with GFs but the computations with GMRFs. Perhaps more importantly, and somewhat surprisingly, our approach generalizes to other covariance functions generated by SPDEs, including oscillating and non-stationary GFs, as well as GFs on manifolds and multivariate GFs.
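To make the computational contrast concrete, here is a minimal Python sketch (not the SPDE construction itself, and with an entirely hypothetical precision model): a GMRF on a 2-D grid has a sparse precision matrix, so kriging-type linear solves can use sparse factorizations instead of factorizing a dense covariance matrix.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import spsolve

# Hypothetical illustration: a GMRF on an n x n grid with sparse
# precision Q = kappa^2 * I + L, where L is the grid's graph Laplacian.
# This loosely mimics the Matern/SPDE link: the precision is sparse,
# so solves are far cheaper than with the dense covariance Sigma = Q^{-1}.
n = 50                      # grid side; state dimension is n*n = 2500
kappa2 = 0.5                # hypothetical precision parameter
I1 = sp.identity(n, format="csr")
D = sp.diags([-1, 2, -1], [-1, 0, 1], shape=(n, n), format="csr")
L = sp.kron(I1, D) + sp.kron(D, I1)   # 2-D grid Laplacian (Kronecker sum)
Q = kappa2 * sp.identity(n * n, format="csr") + L

# Kriging-type computation: solve Q x = b directly, never forming
# or factorizing the dense covariance matrix.
b = np.random.default_rng(0).standard_normal(n * n)
x = spsolve(Q.tocsc(), b)

# Sparsity: each row of Q has at most 5 non-zeros (Markov neighbours),
# versus n*n entries per row of the dense covariance.
print(Q.nnz, "non-zeros out of", (n * n) ** 2)
```

In an operational SPDE model the precision would come from a finite element discretisation on a triangulation rather than this toy Laplacian, but the sparsity pattern, and hence the computational gain, is of the same nature.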
Gerard Heuvelink holds an MSc in Applied Mathematics from Twente Technical University (1987) and a PhD in Environmental Sciences from Utrecht University (1993). He was assistant professor in Geostatistics and Stochastic Simulation at the University of Amsterdam from 1991 to 2003. Between 2003 and 2011 he worked as a senior researcher in Geostatistics at Alterra, Wageningen. He is currently employed as associate professor in Geostatistics with the Soil Geography and Landscape group of Wageningen University (since 2003) and as senior researcher in Pedometrics and Digital Soil Mapping with ISRIC World Soil Information (since 2011). Dr. Heuvelink has written over 200 scientific publications on geostatistics, spatial uncertainty analysis and pedometrics, about 85 of which appeared in peer-reviewed international journals. He has been involved in many research projects dealing with spatial uncertainty in environmental modelling and spatial analysis and is recognised worldwide as a leading scientist in pedometrics and spatial uncertainty analysis, mainly through his publications and his active involvement as a Program Committee member in various GIS and accuracy-related conference series. Dr. Heuvelink chaired the Pedometrics commission of the International Union of Soil Sciences from 2003 to 2006 and was president of the Netherlands Soil Science Society from 2004 to 2007. He is associate editor of the European Journal of Soil Science and Spatial Statistics, and editorial board member of Geoderma, Environmental and Ecological Statistics, the International Journal of Applied Earth Observation and Geoinformation, and Geographical Analysis.
Expertise: Geostatistics, Spatial Uncertainty Analysis, Statistics, Pedometrics, Sampling Design Optimisation, Spatial Aggregation and Disaggregation, Space-Time Kriging
Keynote lecture: Much ado about spatial uncertainty
Research into statistical modelling and analysis of spatial uncertainty in geo-information science dates back to the 1980s. It developed more or less independently from concurrent advances in geostatistics. The focus was on modelling the errors in the position and attribute values of spatial objects and on analysing how these errors propagate through GIS operations. We give a brief historical account and present the main achievements to date, concentrating on approaches that use a probabilistic description of uncertainty. We discuss initial simple approaches such as the epsilon-band method for modelling positional error and the first-order Taylor method for analysing uncertainty propagation in simple GIS operations. Next we describe how more elaborate approaches were developed and applied, where geostatistical methods were introduced to model spatial dependence in positional and attribute uncertainty, and where Monte Carlo and ensemble methods were used to analyse the propagation of uncertainty in complex, spatially distributed, dynamic models. We also review developments in the communication and visualisation of spatial uncertainty and the development of dedicated software tools. The second part of the presentation looks to the future and outlines a comprehensive framework for integrated handling of input, parameter, model and observational error in spatial uncertainty analysis. While uncertainties about model inputs are derived from external sources, such as instrument and lab specifications, the kriging variance or expert elicitation, uncertainties about model parameters and model structure are obtained using Bayesian calibration, whereby model predictions are compared with independent observations. In a first stage, model structural uncertainty may be represented by an additive stochastic residual, but more complex approaches such as stochastic PDEs are within practical reach.
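The contrast between the two propagation approaches mentioned above can be sketched in a few lines of Python: a first-order Taylor approximation of the output variance versus a Monte Carlo estimate, for a hypothetical non-linear operation `g` on an uncertain attribute (the function, mean and error standard deviation below are all assumptions for illustration).

```python
import numpy as np

rng = np.random.default_rng(42)

def g(z):
    # Hypothetical non-linear local operation on an uncertain
    # attribute (e.g. a transform of a measured soil property).
    return np.exp(0.1 * z)

mu, sigma = 5.0, 1.0          # assumed attribute mean and error std

# First-order Taylor: Var[g(Z)] is approximated by g'(mu)^2 * sigma^2,
# cheap but only accurate for mildly non-linear operations.
dg = 0.1 * np.exp(0.1 * mu)   # analytical derivative of g at mu
var_taylor = dg**2 * sigma**2

# Monte Carlo: draw realisations of the uncertain input and push them
# through the operation; works for arbitrarily complex models, at the
# price of many model runs.
z = rng.normal(mu, sigma, size=200_000)
var_mc = g(z).var()

print(f"Taylor: {var_taylor:.5f}  Monte Carlo: {var_mc:.5f}")
```

For spatially distributed models the Monte Carlo inputs become geostatistically simulated error fields rather than independent draws, but the propagation logic is the same.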
The Bayesian calibration procedure must take into account that part of the discrepancy between observed and predicted model output is caused by input uncertainty and observational error. Methodological development and implementation of the framework will take time and require a joint effort from multiple disciplines, but once realised it will truly help end users and decision-makers take more informed decisions and perform risk assessments. The various methods and tools reviewed in this presentation, and the proposed integrated framework, are all illustrated with real-world examples.
Laurent Bertino is a researcher at the Nansen Environmental and Remote Sensing Center in Bergen, Norway. He holds a PhD from Ecole des Mines de Paris (2001) in data assimilation applied to the hydrodynamics of the Odra Lagoon, which he reformulated in a geostatistical framework, opening new theoretical developments based on geostatistical concepts such as Gaussian anamorphosis. He has since applied the Ensemble Kalman Filter to the primitive-equation HYCOM model and has been responsible for the development of the TOPAZ operational ice-ocean forecasting system since January 2003. He has authored or co-authored about 25 peer-reviewed publications in the field of data assimilation and operational oceanography.
Expertise: Data Assimilation, Geostatistics, Numerical Ocean Modeling
Keynote lecture: The use of Geostatistics for data assimilation in operational oceanography
Data assimilation aims at solving a hidden-state Markov chain model, with a non-linear forward model in most practical cases of geophysical fluids. A regular flow of observations is available from several satellites and autonomous drifting buoys, but these cover only a small fraction of the hidden state vector (a few of the variables of interest). The temporal evolution of the hidden state obeys the general laws of fluid mechanics (and solid mechanics in the case of sea ice), among which the conservation of quantities like momentum and mass, which form the core of a large numerical simulation code, similar to that of climate models, applied on a large 3D spatial grid. The resulting state space thus reaches a dimension of 10^8 in present-day operational oceanography systems, while the number of observations to be assimilated on a weekly basis is about half a million. In the perspective of operational oceanography, additional time constraints impose a focus on methods that can provide a forecast within a short time. Data assimilation is therefore asked to do a lot with little … and to do it fast! It is therefore important that the numerical forward model is used optimally in data assimilation: it should not only provide a first guess, but also the error covariances necessary to carry out the spatial, temporal and multivariate interpolation. Modern sequential data assimilation methods have therefore developed from the square root version of the Kalman Filter. The latter being optimal only in linear systems, several extensions have gradually been put forward, one of them being the Ensemble Kalman Filter (EnKF), which uses an ensemble of non-linear forward simulations and conditions the simulations to observations by a – linear – least squares update step.
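The linear least-squares update step of the EnKF can be sketched as follows. This is a minimal stochastic (perturbed-observation) analysis step in Python; all dimensions, the observation operator and the error covariances are hypothetical toy values, whereas operational systems work with state dimensions near 10^8, ensemble sizes of around 100, and never form covariance matrices explicitly.

```python
import numpy as np

rng = np.random.default_rng(1)
n_state, n_obs, n_ens = 500, 40, 100

# Observation operator: each observation picks one state component.
H = np.zeros((n_obs, n_state))
H[np.arange(n_obs), rng.choice(n_state, size=n_obs, replace=False)] = 1.0
R = 0.25 * np.eye(n_obs)                 # observation error covariance

# Forecast ensemble from the (non-linear) forward model -- here random
# draws standing in for the model simulations -- and observations.
Xf = rng.normal(0.0, 1.0, (n_state, n_ens))
y = rng.normal(0.0, 1.0, n_obs)

# Ensemble anomalies define the forecast error covariance implicitly:
# Pf = A @ A.T / (n_ens - 1).
A = Xf - Xf.mean(axis=1, keepdims=True)
HA = H @ A

# Kalman gain from the ensemble covariances (the least-squares step).
S = HA @ HA.T / (n_ens - 1) + R          # innovation covariance
PH = A @ HA.T / (n_ens - 1)              # cross-covariance Pf H^T
K = np.linalg.solve(S, PH.T).T           # K = Pf H^T S^{-1} (S symmetric)

# Perturb the observations so the analysis ensemble keeps the
# correct posterior spread, then apply the linear update.
Y = y[:, None] + rng.multivariate_normal(np.zeros(n_obs), R, n_ens).T
Xa = Xf + K @ (Y - H @ Xf)
```

The conditioning pulls the ensemble towards the observations and reduces its spread at the observed locations, exactly the role that conditional simulation plays in geostatistics.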
The presentation will review the data assimilation methods currently used in operational ocean and sea ice forecasts, stressing the links with known concepts of Geostatistics. Some geostatistical tools have made their way into the data assimilation community while others have not, whether due to technical limitations, operational constraints, or too little dedicated effort. The presentation will provide examples of such hard puzzles still to be solved in data assimilation.