The Hyve is a partner in the recently started RADAR-CNS project, a research programme of the Innovative Medicines Initiative (IMI) focused on new ways to monitor and treat brain disorders. The project has a budget of 22 million euro and concentrates on three disease areas in particular: depression, multiple sclerosis (MS) and epilepsy. The goal of RADAR-CNS is to develop new ways to monitor relapse in these diseases through smart use of wearable devices and other digital data streams. This supports a move to a new paradigm of managing health: instead of focusing only on curing diseases, we also want to give patients tools to manage their own health and move towards prevention.
Together with the other partners in the data processing work package, among others King's College London, Northwestern University, and Intel, The Hyve will build an open source infrastructure for processing data from wearable devices, and use this infrastructure in a number of clinical studies within the project. The architecture is currently under discussion between the partners, but so far the elements are:
- Open mHealth Shimmer (for connecting to the cloud APIs of existing devices) and Purple Robot (for collecting data from sensors on mobile phones)
- Apache Kafka, possibly together with Apache Storm or Apache Spark Streaming, to collect and process the data (see the sketch after this list)
- Hadoop HDFS as the storage layer for persisting the collected data
- tranSMART as an analysis data warehouse to combine processed wearables data with clinical data endpoints
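To make the ingestion step a bit more concrete, here is a minimal sketch of how a collector component could publish a single wearable reading to Kafka. The topic name, JSON payload and broker address are placeholders for illustration only; the actual RADAR-CNS topics and message schemas are still being defined by the consortium.

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class WearableReadingProducer {

    public static void main(String[] args) {
        Properties props = new Properties();
        // Placeholder broker address; a real deployment would point at the project's Kafka cluster.
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Hypothetical topic and JSON payload for a single heart rate reading.
            String subjectId = "subject-001";
            String reading = "{\"sensor\":\"heart_rate\",\"value\":72,\"time\":\"2016-11-01T08:00:00Z\"}";
            producer.send(new ProducerRecord<>("wearable-readings", subjectId, reading));
        }
    }
}
```

Keying each record by a subject identifier would keep all readings of one subject in the same partition, so downstream stream processors (for example in Spark Streaming) see them in order.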
The endpoints we will be using in RADAR vary somewhat between the diseases. For epilepsy, for example, the gold standard is the video EEG (a device that is not very portable!), and the goal is to find a combination of sensors that approximates similar readouts to register seizures. For depression, on the other hand, the range of potentially relevant digital data streams is much broader, varying from activity data and other body sensor readouts to audio processing and email usage. For MS, a complicating factor is that patients in more advanced stages are not very mobile, and may have trouble putting on devices, let alone charging them. Overall, it is clear that we will very much need the multidisciplinary collaboration between the top clinicians, device engineers and software engineers in the consortium to realise these goals!
According to the Description of Work, the main task is to adapt tranSMART to load biosignature data captured in Purple Robot, and to make the other programmatic changes needed to search, display, and retrieve those data in conjunction with clinical data in an intuitive manner (for example, to better handle the time series typically observed in data collected from wearables). The Hyve is open to using any open source technologies available to accomplish these tasks, and foresees a close collaboration with the other parties on cross-infrastructural aspects such as security and scalability.
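As an illustration of what "dealing with time series" can mean in practice, the sketch below condenses raw wearable readings into one summary value per subject per day, a shape that sits more naturally next to clinical data endpoints in an analysis warehouse such as tranSMART. The Reading record, the field names and the choice of a daily mean are hypothetical and not part of tranSMART's API or the RADAR-CNS data model.

```java
import java.time.Instant;
import java.time.LocalDate;
import java.time.ZoneOffset;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class DailySummary {

    // Hypothetical representation of a single wearable reading.
    record Reading(String subjectId, Instant timestamp, double heartRate) {}

    // Collapse raw readings into a per-subject, per-day mean heart rate:
    // one value per subject-day that can sit next to clinical endpoints.
    static Map<String, Map<LocalDate, Double>> dailyMeans(List<Reading> readings) {
        return readings.stream().collect(Collectors.groupingBy(
                Reading::subjectId,
                Collectors.groupingBy(
                        r -> r.timestamp().atZone(ZoneOffset.UTC).toLocalDate(),
                        Collectors.averagingDouble(Reading::heartRate))));
    }

    public static void main(String[] args) {
        List<Reading> readings = List.of(
                new Reading("subject-001", Instant.parse("2016-11-01T08:00:00Z"), 68),
                new Reading("subject-001", Instant.parse("2016-11-01T20:00:00Z"), 74),
                new Reading("subject-001", Instant.parse("2016-11-02T08:00:00Z"), 71));
        // Prints {subject-001={2016-11-01=71.0, 2016-11-02=71.0}}
        System.out.println(dailyMeans(readings));
    }
}
```

Which derived variables (means, variances, event counts, etc.) are clinically meaningful will differ per disease area and is a decision for the consortium; the sketch only shows the mechanical step of collapsing a stream into endpoint-style values.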
Please get in touch to see how we can support your mHealth project too.