The digital revolution and new measurement modalities, or ways of making measurements, within the medical area are creating increasingly vast amounts of data, which are often high-dimensional. The data come from disparate sources and are of varying quality. Healthcare will increasingly rely on the integration of large datasets, and will need trusted and robust tools to analyse them. Accessing and linking different sources of medical data is a difficult task because of the multi-modal and confidential nature of the data.
Our wide experience in understanding and evaluating measurement uncertainty will underpin the degree of confidence in medical data. Research into algorithm stability, uncertainty, data curation and data fusion is key. Machine learning algorithms are increasingly being promoted as an effective means to analyse and process the vast amounts of imaging and non-imaging data that are routinely generated in healthcare. We will assess the reliability of these algorithms for these applications.
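One simple way to probe the kind of algorithm stability mentioned above is to refit a model on bootstrap resamples of the training data and check how often its predictions change. The sketch below is purely illustrative and not a method described in the source: the `nearest_centroid` classifier, the dataset, and the 0/1 labels are all hypothetical, chosen only to show the resampling pattern.

```python
import random
import statistics

def nearest_centroid(train):
    """Fit a one-feature nearest-centroid rule on (value, label) pairs, labels 0/1."""
    c0 = statistics.mean(x for x, y in train if y == 0)
    c1 = statistics.mean(x for x, y in train if y == 1)
    return lambda x: 0 if abs(x - c0) <= abs(x - c1) else 1

def prediction_stability(data, test_points, n_resamples=200, seed=0):
    """For each test point, the fraction of bootstrap refits that agree on
    the majority label -- 1.0 means the prediction never changes."""
    rng = random.Random(seed)
    votes = [[] for _ in test_points]
    for _ in range(n_resamples):
        boot = [rng.choice(data) for _ in data]
        if len({y for _, y in boot}) < 2:  # skip resamples missing a class
            continue
        predict = nearest_centroid(boot)
        for i, x in enumerate(test_points):
            votes[i].append(predict(x))
    return [max(v.count(0), v.count(1)) / len(v) for v in votes]

# Two well-separated classes; the third test point sits on the decision boundary.
data = [(0.1, 0), (0.4, 0), (0.9, 0), (1.2, 0), (0.7, 0),
        (9.1, 1), (9.6, 1), (10.2, 1), (9.8, 1), (10.5, 1)]
scores = prediction_stability(data, [1.0, 10.0, 5.25])
```

Points well inside either class score 1.0 (every refit agrees), while the boundary point scores much lower: its prediction flips from resample to resample, flagging it as a case where the algorithm's output should not be trusted without an accompanying uncertainty statement.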
We improve the evaluation and propagation of uncertainty within quantitative imaging and the inferences drawn from the images. For instance, if imaging shows an artery is 70% blocked, it isn't clear whether this means between 69% and 71% blocked, or between 50% and 90% blocked. There is no standardised approach to applying uncertainty analysis to quantitative medical imaging systems and applications, whether clinical or research. We are developing a framework for quantitative imaging which will support standardisation.
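The artery example can be made concrete with a Monte Carlo uncertainty propagation, in the spirit of GUM Supplement 1. The sketch below is an illustration, not the framework described in the source: the stenosis formula (percent diameter reduction, 100 × (1 − d_min/d_ref)), the diameter values, and the standard uncertainties are all assumed for the example.

```python
import random
import statistics

def stenosis_percent(d_min, d_ref):
    """Percent diameter reduction of the narrowest point relative to a reference segment."""
    return 100.0 * (1.0 - d_min / d_ref)

def monte_carlo_interval(d_min, u_min, d_ref, u_ref, n=100_000, seed=1):
    """Propagate Gaussian measurement uncertainty through the stenosis formula,
    returning the sample mean and an empirical 95% coverage interval."""
    rng = random.Random(seed)
    samples = sorted(
        stenosis_percent(rng.gauss(d_min, u_min), rng.gauss(d_ref, u_ref))
        for _ in range(n)
    )
    lo, hi = samples[int(0.025 * n)], samples[int(0.975 * n)]
    return statistics.mean(samples), (lo, hi)

# Assumed measurements: minimum diameter 1.2 mm (u = 0.05 mm),
# reference diameter 4.0 mm (u = 0.1 mm) -> nominal stenosis 70%.
mean, (lo, hi) = monte_carlo_interval(1.2, 0.05, 4.0, 0.1)
print(f"stenosis = {mean:.1f}% (95% interval {lo:.1f}% to {hi:.1f}%)")
```

Under these assumed uncertainties the 70% figure carries an interval of roughly ±3 percentage points, which is exactly the kind of statement ("69–71% or 50–90%?") the measurement alone does not provide.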
Our research and measurement solutions support innovation and product development. We work with companies to deliver business advantage and commercial success. Contact our Customer Services team on +44 20 8943 7070