
Exploring the Application of High Performance Computing to Enable the Analysis of Physiological Brain Injury Data

Martin Shaw1, Laura Moss1, Ian Piper1, Annie O'Donnell2, Andrew Judson2
1Dept. of Clinical Physics & Bioengineering, NHS Greater Glasgow & Clyde, 2Aridhia Informatics

Aim

Traumatic Brain Injury (TBI) is devastating; not only to the victim but also to carers and society. TBI is a leading worldwide health problem. Country-based estimates of the annual incidence of hospitalization following TBI range from 108-332 new cases per 100,000 inhabitants.

To aid in the treatment and management of TBI patients, most modern TBI centres now have sophisticated physiological bedside monitoring devices. This use of sensor technology has transformed clinical medicine, and consequently large volumes of high-frequency patient data are now available.

Figure 1: Example patient monitoring equipment in a neurointensive care unit.

To make treatment more effective, high-frequency patient data could be linked to lower-frequency clinical data to enable the detection and prediction of clinically relevant events.

For these analyses to be used in clinical practice, the rate of sampled physiological data has to be reduced (down-sampled). However, due to the data's size, these analyses cannot be completed in clinically meaningful timescales.

We detail a pilot study between NHS GG&C and Aridhia to research the use of high performance computing (HPC) as a solution.

Results

Benchmark tests were carried out to determine both the optimum size of data (to be processed at one time) and the optimum processor combination. A subset of these results is shown in Table 1.

Number of processor cores    Time taken (minutes) to process 1 million rows of data
Single                       10.10
Two                           5.55
Four                          3.91
Eight                         2.02

Table 1: Results of benchmark testing

Following the benchmark testing, the timings from down-sampling 12 days of patient data (470 million rows) on the existing computing environment and on the best performing HPC configuration (8 cores, 1 million rows at a time) were compared: the HPC infrastructure took 48 minutes, whereas the existing computing environment took 16 hours.
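The core down-sampling step (reducing the sampling rate of a high-frequency physiological signal by averaging over fixed time windows) can be sketched as follows. This is a minimal illustrative Python version, not the study's R/dplyr implementation, and the window width and readings are invented for the example:

```python
from statistics import mean

def downsample(samples, window_s=10.0):
    """Down-sample (timestamp, value) pairs by averaging the values
    that fall within each fixed-width time window."""
    buckets = {}
    for ts, value in samples:
        key = int(ts // window_s)              # index of the time window
        buckets.setdefault(key, []).append(value)
    # one averaged sample per window, ordered by time
    return [(key * window_s, mean(vals)) for key, vals in sorted(buckets.items())]

# e.g. invented 1 Hz readings reduced to one averaged value per 10 s
raw = [(t, 10.0 + (t % 3)) for t in range(30)]
print(downsample(raw))  # → [(0.0, 10.9), (10.0, 11.0), (20.0, 11.1)]
```

The same grouping-and-summarising pattern is what dplyr's grouped operations express in R.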

Method

First Stage
In the first stage of analysis, an existing algorithm for down-sampling data, written for a single (computer) processor (a common NHS computing environment), was optimised. This consisted of the following steps:

- Identifying the optimal sample size of data to be down-sampled in each iteration of the algorithm.
- Re-writing the algorithm using in-memory dplyr functions to reduce the iterations and time taken; dplyr enables efficient management of data within the R statistical package.
- Redistributing and indexing the data in the database by the timestamp signal.

Second Stage
The second stage utilised the collaborative aspect and HPC infrastructure of Aridhia's AnalytiXagility data science platform. Data storage was optimised to use distribution across multiple nodes, and the algorithm was adapted to work across multiple parallel processors.

Conclusion

The pilot study indicates that HPC is viable for the processing and analysis of high-frequency data from brain-injured patients.

The recently awarded Innovate UK funding for the CHART-ADAPT project will build on this initial research. High-frequency data will be automatically extracted from in-hospital patient monitoring devices, anonymised, and transferred to the AnalytiXagility platform; the platform will provide data storage and deployment of clinical analysis algorithms on its HPC infrastructure, and apps will enable clinicians to control their analyses. Finally, results will be presented back to clinicians at the patient's bedside.

This work has the potential to revolutionise the treatment of brain-injured patients.

Further Details

www.chartadapt.org
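The chunked, multi-core processing behind the benchmark figures can be sketched as follows. This is illustrative Python rather than the study's R/AnalytiXagility setup, and process_block is a hypothetical stand-in for the down-sampling algorithm; the chunk sizes here are tiny, whereas the benchmarks found about 1 million rows per block optimal:

```python
def chunk(rows, size):
    """Split a long recording into independent fixed-size blocks."""
    return [rows[i:i + size] for i in range(0, len(rows), size)]

def process_block(block):
    # placeholder per-block work: here, just average the block's values
    return sum(block) / len(block)

rows = list(range(10))          # stand-in for 470 million rows of data
blocks = chunk(rows, size=4)
# sequential map for illustration; because the blocks are independent,
# multiprocessing.Pool.map can distribute them across processor cores,
# which is what produced the near-linear speedup in Table 1
results = list(map(process_block, blocks))
print(results)                  # → [1.5, 5.5, 8.5]
```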
