


The project aims to develop a good practice guide, with an accompanying code repository and benchmark problems, that specifies how the uncertainties of machine learning models should be quantified when they are applied to photoplethysmography (PPG) signal evaluation. In particular, the project will investigate the performance, accuracy and reliability of machine learning models applied to data from subjects with diverse physiological backgrounds, such as different skin tones, sexes or ages, in order to prevent model bias and discrimination.


PPG Benchmark Problems

To ensure uptake of the developed methods by industry, the QUMPHY project will create clinically relevant benchmark problems that include a broad range of data from diverse physiological backgrounds for testing machine learning algorithms. Such benchmarks are essential to establish the reliability of predictions and the comparability of different methods.
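As an illustration of how such a benchmark could surface performance gaps across physiological subgroups, the minimal sketch below evaluates a hypothetical heart-rate model separately per skin-tone group. All data, group labels and the simulated bias are invented for the example; they are not QUMPHY results.

```python
import numpy as np

# Hypothetical example: heart-rate estimates from a PPG model, evaluated
# per subgroup to surface performance gaps. Data and group labels are
# simulated for illustration only.
rng = np.random.default_rng(0)

n = 300
true_hr = rng.uniform(50, 110, n)                      # reference heart rate (bpm)
skin_tone = rng.choice(["I-II", "III-IV", "V-VI"], n)  # Fitzpatrick-style groups
# Simulated model output: noisier for one group, as a stand-in for bias
noise = np.where(skin_tone == "V-VI", 4.0, 2.0)
pred_hr = true_hr + rng.normal(0, noise)

mae_by_group = {}
for group in np.unique(skin_tone):
    mask = skin_tone == group
    mae_by_group[group] = float(np.mean(np.abs(pred_hr[mask] - true_hr[mask])))
    print(f"group {group}: MAE = {mae_by_group[group]:.2f} bpm (n={mask.sum()})")
```

Reporting one aggregate error would hide the gap; stratifying the benchmark by subgroup makes it visible, which is the point of including diverse data.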

Machine Learning

Machine learning algorithms are indispensable for analysing medical data and detecting physiological parameters or disease patterns. The QUMPHY project will implement and investigate a variety of existing high-performance machine learning algorithms and compare their performance on clinically relevant diagnostic tasks involving PPG signals.

Uncertainty Quantification

Knowledge of the uncertainties of algorithms used to evaluate medical data is essential for preventing false diagnoses and the potential mistreatment of patients. The QUMPHY project will investigate how the uncertainties of machine learning methods trained on PPG data can be quantified. The developed methods will be described in a good practice guide, which will also include hands-on examples for the specification of uncertainty budgets.
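One common way to quantify predictive uncertainty, sketched below purely as an illustration (it is not the QUMPHY method), is an ensemble: several models are fitted to bootstrap resamples of the data, and the spread of their predictions serves as a per-sample uncertainty estimate. The "models" here are simple polynomial fits to a noisy PPG-like waveform; every name and parameter is an assumption for the example.

```python
import numpy as np

# Illustrative sketch: ensemble-based uncertainty quantification.
# Each ensemble member is a polynomial fit to a bootstrap resample;
# the standard deviation across members estimates predictive uncertainty.
rng = np.random.default_rng(1)

t = np.linspace(0, 1, 80)
signal = np.sin(2 * np.pi * 3 * t) + rng.normal(0, 0.2, t.size)  # noisy PPG-like wave

n_models, degree = 10, 9
preds = []
for _ in range(n_models):
    idx = rng.integers(0, t.size, t.size)            # bootstrap resample
    coeffs = np.polyfit(t[idx], signal[idx], degree)
    preds.append(np.polyval(coeffs, t))
preds = np.array(preds)

mean_pred = preds.mean(axis=0)   # ensemble prediction
std_pred = preds.std(axis=0)     # per-sample uncertainty (standard deviation)
print(f"mean uncertainty: {std_pred.mean():.3f}")
```

The same pattern (many plausible models, spread as uncertainty) carries over to neural networks via deep ensembles or Monte Carlo dropout; an uncertainty budget then assigns such per-prediction spreads to the individual error sources.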


The QUMPHY project will provide implementations of machine learning models and uncertainty quantification methods for the analysis of PPG signals in an open-access software repository, including a FAIR dataset for testing and applying the methods provided. Furthermore, the methods will be described in a corresponding user guide.