National Physical Laboratory

Data Analysis and Uncertainty Evaluation

Measurement provides data that is used as the basis for making inferences, informing decisions and controlling systems. These activities depend on the reliability, traceability and mutual recognition of measurement results derived from the data, which are established through modelling, mathematical and statistical methods of data analysis implemented in validated software, and valid statements of measurement uncertainty. We undertake research to build mathematical, statistical and computational capability in the areas of metrology data analysis and measurement uncertainty evaluation. We influence standardisation and regulatory bodies through the maintenance and development of Standards and Guides. We disseminate good practice through journal papers, guides, reports, training and validated software.

Research undertaken in the science area has two themes. The theme 'Mathematics and modelling to support traceability and the SI' addresses conventional measurements for which models are well understood and model inputs are well characterised. The major drivers for the theme are the 'Guide to the expression of uncertainty in measurement' (GUM), the primary reference document for measurement uncertainty evaluation, and the 'Mutual Recognition Arrangement' (MRA), supported by key comparisons, which are used to establish the comparability of measurements made by national metrology institutes (NMIs) and so underpin traceability. With the evolution of increasingly sophisticated measuring systems, data analysis problems are becoming more complex and call for research into Bayesian approaches that do not depend on the assumptions and simplifications inherent in conventional approaches.
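The Monte Carlo method of GUM Supplement 1 propagates probability distributions assigned to the model inputs through the measurement model by repeated sampling, and summarises the resulting distribution for the measurand by an estimate, a standard uncertainty and a coverage interval. A minimal sketch in Python, using a hypothetical measurement model (electrical power P = V²/R) with assumed Gaussian input distributions, not any particular NPL implementation:

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# Hypothetical measurement model Y = f(X1, X2): electrical power P = V**2 / R.
def model(v, r):
    return v**2 / r

M = 200_000                             # number of Monte Carlo trials
v = rng.normal(5.0, 0.02, M)            # voltage / V, assumed Gaussian
r = rng.normal(100.0, 0.5, M)           # resistance / ohm, assumed Gaussian

y = model(v, r)                         # sampled distribution for P / W

estimate = y.mean()                     # estimate of the measurand
u = y.std(ddof=1)                       # standard uncertainty
lo, hi = np.percentile(y, [2.5, 97.5])  # 95 % coverage interval (equal tails)

print(f"P = {estimate:.4f} W, u(P) = {u:.4f} W, "
      f"95 % interval [{lo:.4f}, {hi:.4f}] W")
```

Unlike the law of propagation of uncertainty in the GUM itself, this approach does not linearise the model, so it remains valid when the model is strongly non-linear or the output distribution is markedly non-Gaussian.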

The theme 'Mathematics and modelling to address societal challenge' addresses measurements made in support of societal challenges, such as climate and the environment, energy and sustainability, smart infrastructure, health and wellbeing, and high-value manufacturing. These challenges require a new metrology paradigm in which models are complex and only partially understood, data may be of limited integrity, and data may be obtained from networks of distributed sensors.
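When several sensors in a network observe the same quantity with differing uncertainties, a standard way to combine them is the inverse-variance weighted mean, which is also the least-squares estimate. A small illustrative sketch with hypothetical sensor readings, not data from any specific system:

```python
import numpy as np

# Hypothetical readings of one quantity from a distributed sensor network,
# each reading x_i accompanied by its standard uncertainty u_i.
x = np.array([20.1, 19.8, 20.3, 20.0])  # sensor readings, e.g. temperature / degC
u = np.array([0.10, 0.05, 0.20, 0.10])  # standard uncertainties / degC

w = 1.0 / u**2                          # inverse-variance weights
x_hat = np.sum(w * x) / np.sum(w)       # weighted mean (least-squares estimate)
u_hat = 1.0 / np.sqrt(np.sum(w))        # standard uncertainty of the weighted mean

print(f"combined value {x_hat:.3f} degC, standard uncertainty {u_hat:.3f} degC")
```

The combined uncertainty is smaller than that of the best single sensor; more realistic treatments must also account for correlations between sensors and for data of limited integrity, which is where the research described above comes in.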

Recent publications

  • P. M. Harris, C. E. Matthews, M. G. Cox and A. B. Forbes. Summarizing the output of a Monte Carlo method for uncertainty evaluation. Metrologia, 51:243-252, 2014.
  • A. B. Forbes and H. D. Minh. Design of linear calibration experiments. Measurement, 46(9):3730-3736, 2013.
  • A. B. Forbes. A two-stage MCM/MCMC algorithm for uncertainty evaluation. In F. Pavese, M. Bär, J.-R. Filtz, A. B. Forbes, L. Pendrill and K. Shirono, editors, Advanced Mathematical and Computational Tools for Metrology IX, pages 159-170, 2012.
  • A. B. Forbes. Approaches to evaluating measurement uncertainty. Int. J. Metrol. Qual. Eng., 3:71-77, 2012.
  • A. B. Forbes and H. D. Minh. Generation of numerical artefacts for geometric form and tolerance assessment. Int. J. Metrol. Qual. Eng., pages 145-150, 2012.
  • A. B. Forbes. Weighting observations from multi-sensor coordinate measuring systems. Meas. Sci. Technol., 23, 2012.
  • M. G. Cox, A. B. Forbes, P. M. Harris and C. E. Matthews. Numerical aspects in the evaluation of measurement uncertainty. In A. Dienstfrey and R. Boisvert, editors, Uncertainty Quantification in Scientific Computing, pages 180-194. Springer, Boston, 2012.
  • A. B. Forbes. An MCMC algorithm based on GUM supplement 1 for uncertainty evaluation. Measurement, 45:1188-1199, 2012.
  • A. B. Forbes and J. A. Sousa. The GUM, Bayesian inference and the observation and measurement equations. Measurement, 44:1422-1435, 2011.
