Functional Neuroimaging
Functional neuroimaging and the associated computing technologies play a principal role in this and other efforts of this kind, with functional magnetic resonance imaging (fMRI) and electroencephalography (EEG) being the most widely used and thoroughly studied modalities for brain mapping.
Their objectives include analysis of the activity and functional connectivity of the brain, and their popularity arises from their non-invasive nature and their ability to provide maps of high spatial (fMRI) or temporal (EEG) resolution.
Each of these modalities exhibits its own advantages and disadvantages and provides a complementary view of brain function. fMRI has been used to answer research questions in a wide range of disciplines, such as cognitive neuroscience and experimental psychology.
There are various objectives in the analysis of fMRI data, the most common of which are the localization of brain regions activated by a certain task, the determination of distributed functional brain networks (FBNs) that correspond to certain brain functions, and the prediction of the evolution or outcome of certain diseases, either via classification of the subjects or via computation of diagnostic biomarkers.
Use Case
Determination of FBNs and activated regions
During an fMRI experiment, while the subject performs a set of tasks responding to external stimuli (task-related fMRI) or no tasks (resting-state fMRI), a series of 3-D brain images is acquired.
The localization of the activated brain areas is a challenging Blind Source Separation (BSS) problem, in which the sources consist of a combination of spatial maps (areas activated) and time-courses (timings of activation).
During a task-related fMRI acquisition (for example), the brain is not activated only by the task being performed, but also by external and (sometimes) even unconscious stimuli (e.g. heartbeat, breathing, thoughts about personal and family issues unrelated to the task), hence separating the activation directly connected to the task at hand is a difficult and complex problem.
We can perform different types of analysis in order to study the activated areas in single or multi-subject studies.
As mentioned above, the activated areas of subjects can be used in different fields, from psychological analysis to marketing purposes.
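As an illustration of the BSS formulation described above, the following minimal sketch applies spatial ICA (via scikit-learn's FastICA) to a placeholder time-by-voxel data matrix to recover candidate spatial maps and their associated time courses; the data, dimensions and component count are illustrative assumptions, not part of any specific AINIGMA pipeline.

```python
# Minimal sketch of the BSS view of task-related fMRI, assuming preprocessed
# data already reshaped to a (time x voxels) matrix; all values are
# illustrative placeholders.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
n_timepoints, n_voxels, n_sources = 120, 5000, 10

# Placeholder for motion-corrected, detrended fMRI data (time x voxels).
fmri_data = rng.standard_normal((n_timepoints, n_voxels))

# Spatial ICA: each independent component pairs a spatial map (which voxels
# co-activate) with a mixing column that acts as its time course.
ica = FastICA(n_components=n_sources, random_state=0, max_iter=1000)
spatial_maps = ica.fit_transform(fmri_data.T).T   # (components x voxels)
time_courses = ica.mixing_                        # (timepoints x components)

print(spatial_maps.shape, time_courses.shape)
```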
Connectivity
The human brain is a highly complex machine; different types of connections exist among its regions, and different modalities are used to visualize and study them.
The study of brain connectivity is important both to better understand how the human brain works and to delineate areas that must remain untouched during brain surgery in order to prevent severe adverse effects.
Structural (or anatomical) connectivity refers to the existence and structural integrity of tracts connecting different brain areas (i.e. white matter tracts connecting cortical areas/nuclei), and MRI and diffusion MRI are used to study and visualize it.
Functional and effective connectivity are neuroimaging terms. Functional connectivity refers to the statistical dependence (correlation) between the signals from different brain areas, usually measured with resting-state fMRI.
Effective connectivity bridges the two other connectivity measures and brings in the element of causation (i.e. activation in one area directly causes a change, activation or depression, in another area).
We can offer different connectivity analyses and visualizations based on the task at hand.
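As a minimal illustration of functional connectivity as statistical dependence between regional signals, the sketch below computes a region-by-region Pearson correlation matrix from placeholder ROI time series and thresholds it into a simple adjacency matrix; the data and the threshold value are illustrative assumptions.

```python
# Minimal sketch of a functional connectivity matrix: Pearson correlation
# between the average resting-state signals of a few brain regions.
# The ROI time series here are random placeholders for real parcellated data.
import numpy as np

rng = np.random.default_rng(0)
n_regions, n_timepoints = 8, 200
roi_timeseries = rng.standard_normal((n_regions, n_timepoints))

# Functional connectivity as region-by-region correlation of the signals.
connectivity = np.corrcoef(roi_timeseries)        # (n_regions x n_regions)

# Optionally threshold weak correlations before visualising as a graph.
adjacency = (np.abs(connectivity) > 0.3).astype(int)
np.fill_diagonal(adjacency, 0)
print(connectivity.round(2))
```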
Epilepsy detection / prediction
Worldwide, more than 70 million people suffer from epilepsy, a neurological condition characterized by seizures that affect their quality of life. Despite the different treatments currently available, 35% of patients still continue to experience seizures for the rest of their lives.
Different stakeholders can benefit from improved monitoring of epilepsy patients. The quality of life of epilepsy patients and their relatives can be improved substantially, either by an increased feeling of security at home or by better treatment selection (decreased seizure frequency). Neurologists gain access to more accurate and objective information about ongoing treatments through automated seizure diaries, which allows better treatment selection.
Such an automated monitoring system allows the pharmaceutical industry to select more efficiently the right type of patients for their clinical trials, which could lead to large cost savings. The developed methods can also be reused for other health applications by the medical device industry. Insurance companies can also see a decrease in expenses as a seizure warning system decreases the risk of injuries or post-ictal complications.
Automated seizure warning and detection systems can be used to improve the patients’ quality of life. Such systems can automatically detect ongoing seizures and alert the relatives. This way, both patients and relatives can feel reassured, knowing someone will help them when a seizure occurs in their everyday environment. Such applications require mobile measurements. Based on the type of seizure the patient exhibits, multimodal algorithms can be employed and data from different sources can be collected (e.g. accelerometer, ECG, EMG). As an example, for the detection of seizures in a patient with absence seizures, only the EEG signal is used; in patients with tonic-clonic seizures, a combination of EMG and EEG is optimal (possibly also ECG); whereas for frontal seizures, ECG and EEG are employed. We offer different types of models and algorithms that can be used for accurate seizure detection and prediction.
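To make the multimodal idea concrete, here is a hedged sketch of window-based seizure detection: EEG band-power features combined with a simple EMG statistic are fed to an off-the-shelf classifier. The sampling rate, features, labels and model choice are illustrative assumptions, not the actual deployed algorithms.

```python
# Hedged sketch of window-based seizure detection: band-power features from
# EEG plus a line-length statistic from EMG, fed to a generic classifier.
# Data, sampling rate and labels are synthetic placeholders.
import numpy as np
from scipy.signal import welch
from sklearn.ensemble import RandomForestClassifier

fs = 256  # assumed sampling rate (Hz)

def window_features(eeg, emg):
    """Band power of EEG (delta/theta/alpha/beta) plus EMG line length."""
    freqs, psd = welch(eeg, fs=fs, nperseg=fs)
    bands = [(1, 4), (4, 8), (8, 13), (13, 30)]
    powers = [psd[(freqs >= lo) & (freqs < hi)].mean() for lo, hi in bands]
    line_length = np.sum(np.abs(np.diff(emg)))
    return np.array(powers + [line_length])

rng = np.random.default_rng(0)
n_windows = 200
X = np.array([window_features(rng.standard_normal(fs * 2),
                              rng.standard_normal(fs * 2))
              for _ in range(n_windows)])
y = rng.integers(0, 2, n_windows)  # placeholder seizure / non-seizure labels

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
print(clf.predict_proba(X[:3]))    # per-window seizure probabilities
```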
Use Case
Automated seizure diary
Currently, treatment evaluation relies on manual seizure diaries maintained by the patients themselves, which have been shown to have an accuracy below 50%. An objective and automated offline seizure diary could lead to better patient treatment due to more reliable information for the clinician.
Automated seizure detection in hospital environment
In order to help neurologists decrease the time needed to review and annotate full-day recordings from a large number of patients, we offer ML algorithms that can pre-classify segments with a high probability of being a seizure. The neurologist then needs to review and annotate only such segments instead of going through long recordings. Algorithms with high sensitivity are a prerequisite for this application.
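Since high sensitivity is the stated prerequisite, the sketch below shows one common way to tune such a pre-classifier: pick the highest decision threshold that still keeps the true positive rate above a target sensitivity, and flag only segments scoring above it for review. The scores, labels and the 95% target are placeholders.

```python
# Minimal sketch of picking a decision threshold that keeps sensitivity
# (recall on the seizure class) above a target, so the neurologist reviews
# only high-probability segments; scores and labels are placeholders.
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(0)
y_true = rng.integers(0, 2, 1000)                         # 1 = seizure segment
scores = np.clip(y_true * 0.3 + rng.random(1000), 0, 1)   # model probabilities

fpr, tpr, thresholds = roc_curve(y_true, scores)
target_sensitivity = 0.95

# Highest threshold that still achieves the required sensitivity (TPR).
valid = tpr >= target_sensitivity
chosen = thresholds[valid][0]
flagged = scores >= chosen
print(f"threshold={chosen:.2f}, segments flagged for review={flagged.sum()}")
```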
Artificial Intelligence for the early prediction of Atrial Fibrillation
Definitions and epidemiology
Atrial fibrillation (AF) is a supraventricular tachyarrhythmia characterized by an uncoordinated electrical activation of the heart atria (the upper chambers of the heart). This irregular atrial electrical activity leads to an ineffective atrial contraction. AF can be classified based on its duration as paroxysmal (intermittent episodes of arrhythmia), persistent or permanent. AF is the most common arrhythmia in adults globally and a major cause of morbidity and mortality. It is estimated that AF’s prevalence is about 2-4% in the adult population. Increasing age is the most prominent risk factor for the development of AF, along with several other comorbid risk factors such as diabetes mellitus, hypertension, coronary artery disease, chronic kidney disease and more. Modifiable risk factors such as smoking, physical inactivity and obesity also play an important role in the risk of AF development.
Clinical Characteristics and Outcomes
AF can have various clinical presentations, from asymptomatic or silent, to symptomatic with variable severity of symptoms, from palpitations and chest discomfort to more severe symptoms such as syncope and cardiogenic shock. Regardless of its clinical presentation, the main cause of AF morbidity is its deleterious outcomes. AF is a major cause of stroke, since it causes about 20-30% of all ischemic strokes. Moreover, a significant portion of AF patients (about 20-30%) will develop heart failure. Furthermore, AF predisposes to cognitive decline, depression and impaired quality of life. Finally, AF patients have an increased annual hospitalization rate and, more importantly, an increased mortality rate, since they are at a 1.5- to 3.5-fold increased risk of death.
Diagnosis and screening for Atrial Fibrillation
The diagnosis of AF requires rhythm documentation. An electrocardiogram (ECG) showing atrial fibrillation for at least 30 seconds is diagnostic for clinical AF. Since AF can be asymptomatic yet equally hazardous regarding its outcomes (e.g. stroke), screening and early diagnosis could be of great importance in reducing its associated morbidity and mortality. Various wearable devices (wearable monitors, smartwatches, smartphones, etc.) can provide important data for the screening of AF.
Use Case
Atrial Fibrillation risk assessment for early diagnosis
AINIGMA technologies is developing AI-based applications for the prediction of the development of Atrial Fibrillation (AF). We use demographics, medical history, physical activity, simple clinical and laboratory data, as well as ECG recordings, to train our algorithms. We aim to provide a sensitive and user-friendly application that will be able to predict which patients are at high risk of developing AF in the near future, and to unmask which of them could already have subclinical AF. Early risk assessment of AF will lead to closer medical attention and more aggressive screening, and thus to early diagnosis and treatment of AF.
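The sketch below illustrates the general shape of such a tabular risk model: a gradient-boosted classifier is trained on a handful of invented features (age, comorbidity flags, activity and an ECG-derived marker) and produces per-patient risk scores. The feature set, data and model choice are illustrative assumptions, not the actual AINIGMA feature set.

```python
# Illustrative sketch of a tabular AF-risk model; feature names and data
# are invented placeholders.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 2000
# Hypothetical features: age, hypertension flag, diabetes flag, BMI,
# daily step count, and a resting-ECG-derived marker (e.g. PR interval).
X = np.column_stack([
    rng.normal(65, 10, n),          # age (years)
    rng.integers(0, 2, n),          # hypertension
    rng.integers(0, 2, n),          # diabetes
    rng.normal(27, 4, n),           # BMI
    rng.normal(6000, 2500, n),      # daily steps
    rng.normal(170, 25, n),         # PR interval (ms)
])
y = rng.integers(0, 2, n)           # placeholder: developed AF during follow-up

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
risk = model.predict_proba(X_te)[:, 1]          # individual AF risk scores
print("AUC on held-out data:", round(roc_auc_score(y_te, risk), 2))
```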
Sepsis Prediction / Detection
Sepsis definitions
Sepsis is a life-threatening syndromic condition characterized by new-onset organ dysfunction due to a dysregulated host response to infection. Sepsis is diagnosed as an increase of at least 2 points in the Sequential Organ Failure Assessment (SOFA) score in a patient with infection. The SOFA score includes clinical and laboratory parameters that assess the cognitive, metabolic, circulatory and ventilatory function of the patient. Septic shock is the most devastating subset of sepsis, in which profound circulatory, cellular and metabolic abnormalities are associated with a greater risk of mortality.
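The SOFA-based criterion above can be expressed as a trivial check, sketched here for clarity; the SOFA scores themselves are assumed to be computed upstream from the clinical and laboratory parameters.

```python
# Minimal sketch of the SOFA-based screening rule described above: flag
# suspected sepsis when the SOFA score rises by at least 2 points from
# baseline in a patient with documented or suspected infection.
def sepsis_suspected(baseline_sofa: int, current_sofa: int,
                     infection_present: bool) -> bool:
    return infection_present and (current_sofa - baseline_sofa) >= 2

print(sepsis_suspected(baseline_sofa=1, current_sofa=4, infection_present=True))
```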
Sepsis epidemiology and burden of disease
Sepsis is a major global health problem, and the World Health Organization (WHO) is calling for multidisciplinary actions to improve the morbidity and mortality due to sepsis. It is estimated that sepsis affects about 49 million people annually and causes about 11 million related deaths. The incidence of sepsis shows heterogeneity among global regions, and tends to vary with the development status of each region. Apart from the serious death toll, the burden of sepsis is further increased by its various social and economic implications, since sepsis is a major cause of increased length of in-hospital stay, readmission and disability.
Sepsis in the hospital setting
Of all sepsis cases, a substantial proportion is hospital-acquired (HA). It is estimated that HA sepsis represents 24% of the total in-hospital treated sepsis cases. It is attributed to Hospital-Acquired Infections (HAI) and is a major cause of mortality in the Intensive Care Unit (ICU) setting. A crucial factor that determines sepsis outcome is the time of intervention: earlier medical intervention leads to decreased in-hospital mortality in sepsis patients. The development of tools that raise the awareness of healthcare providers could significantly decrease the burden of in-hospital sepsis.
Use Case
Early sepsis prediction and detection
AINIGMA technologies develops AI-enabled algorithms for the early prediction and detection of sepsis incidents in the hospital setting. The algorithm is trained by processing real-world clinical and laboratory data and their variations in all-cause hospital admissions (excluding admissions where sepsis is the initial diagnosis). We aim to develop a sensitive tool that will predict the risk of sepsis in admitted patients and detect sepsis incidents early in the course of the disease, hence raising the awareness of healthcare providers so that they can intervene as soon as possible.
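As a rough illustration of the cohort construction described above, the sketch below filters out admissions where sepsis is the initial diagnosis and labels the remaining admissions using the SOFA-based criterion sketched earlier; the table columns and values are invented placeholders.

```python
# Rough sketch of cohort filtering and labelling for training, assuming a
# tabular admissions dataset; column names and values are invented placeholders.
import pandas as pd

admissions = pd.DataFrame({
    "admission_id": [1, 2, 3, 4],
    "initial_diagnosis": ["pneumonia", "sepsis", "hip fracture", "stroke"],
    "max_sofa_increase": [3, 5, 0, 2],
    "infection_documented": [True, True, False, True],
})

# Exclude admissions where sepsis is already the initial diagnosis.
cohort = admissions[admissions["initial_diagnosis"] != "sepsis"].copy()

# Label in-hospital sepsis using the SOFA-based criterion.
cohort["sepsis_label"] = (cohort["infection_documented"]
                          & (cohort["max_sofa_increase"] >= 2))
print(cohort[["admission_id", "sepsis_label"]])
```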
In-home estimation and Prediction of Sarcopenia
Sarcopenia is a generalized and progressive disorder of the skeletal muscles that is age-related or secondary to comorbid conditions. Sarcopenia is diagnosed when low muscle strength occurs in combination with low muscle quantity or quality; when these findings are also associated with low physical performance, sarcopenia is considered severe. Muscle wasting is an important finding in cancer patients, since it is a hallmark of cancer cachexia, a multifactorial clinical condition with deleterious effects on patients’ quality of life and disease prognosis. The loss of muscle mass is a common feature of sarcopenia and cachexia, while sarcopenia in cancer patients can manifest in isolation or as a component of cancer cachexia. Muscle wasting throughout the course of a neoplastic disease is generally attributed to poor nutrition, physical inactivity and cytokine-mediated inflammation, and this mechanism is further exacerbated in older patients.
The early identification of muscle loss in cancer patients could be a key measure for early intervention. The current state of the art defines muscle loss (sarcopenia) using measurements of muscle thickness, muscle quantity and physical performance. All measurements are usually performed in the hospital environment. Until now, imaging modalities such as MRI, CT or X-rays have been used to assess muscle quantity.
AINIGMA develops an innovative solution for the remote monitoring of muscle mass and the early detection of sarcopenia, with the aim of providing timely intervention for the prevention of cachexia. The proposed solution will be based on the combination of measurements from a handheld dynamometer, triaxial accelerometers and ultrasound. The resulting signals, combined with the dynamometer measurements, will be automatically processed by signal processing and machine learning algorithms. These measurements will be monitored at a weekly level, and will allow clinicians and caregivers to measure the rate of muscle mass loss and adapt the nutrition and exercise schedule of patients to prevent the onset of cachexia and associated comorbidities. The timely detection/prediction of sarcopenia will lead to early nutritional and psychological interventions, hence improving the palliative care of this frail patient group.
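One simple way to turn the weekly measurements into a rate of muscle loss, sketched below under illustrative assumptions, is to fit a straight line to the weekly muscle-thickness estimates and use its slope as the loss rate, raising an alert when it exceeds a chosen limit; the measurement values and the alert threshold are placeholders.

```python
# Hedged sketch of the weekly-monitoring idea: estimate the rate of muscle
# loss from the slope of a linear fit to weekly muscle-thickness estimates.
import numpy as np

weeks = np.arange(8)
# Placeholder ultrasound-derived muscle thickness estimates (mm) per week.
thickness = np.array([31.0, 30.8, 30.5, 30.1, 29.8, 29.4, 29.0, 28.7])

slope, intercept = np.polyfit(weeks, thickness, deg=1)   # mm per week
rate_of_loss = -slope

ALERT_RATE = 0.25  # illustrative threshold, mm of thickness lost per week
if rate_of_loss > ALERT_RATE:
    print(f"Alert: losing {rate_of_loss:.2f} mm/week, review nutrition/exercise plan")
```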
Prediction of evolution of Multiple Sclerosis
Multiple sclerosis (MS) is a chronic, non-communicable, incurable inflammatory disorder of the brain and spinal cord. The body’s immune system incorrectly attacks its central nervous system, causing variable, unpredictable symptoms. It is the leading cause of non-traumatic disability in young and middle-aged adults in many developed countries. It affects more than 2.5 million people worldwide and, though it remains impossible to cure MS, early and effective treatment can slow down progression.
State-of-the-art literature shows that predicting disability progression due to MS can be achieved to a significant degree. However, to date, there is no Decision Support System (DSS) for MS progression that facilitates accurate prediction and can be employed in a sufficiently trustworthy manner in a real-world setting. Several methodological issues remain, such as the lack of assessment of probabilistic calibration, possible bias in cohort selection and, mainly, the lack of use of high-dimensional data, which are essential for sensitive predictions of progression.
AINIGMA technologies offers a DSS which evaluates different types of biomarkers in order to offer a multimodal estimation of the evolution of MS. We combine the following biomarkers in model fusion approaches: neuroimaging biomarkers, which are considered the gold standard for disease activity; electrophysiological biomarkers, such as Evoked Potential disturbances, which are widely utilised in MS to demonstrate the involvement of sensory, visual, auditory and motor pathways and can be considered a functional counterpart of the anatomical findings on MRI; and digital biomarkers. The devices from which digital biomarkers stem are quite broad, ranging from wearables that collect patients’ activity and digitized cognitive tests (e.g., the MS Performance Test and speech tests) to digitalized diagnostic procedures (e.g., optical coherence tomography) and biomarkers extracted from serious games.
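As a minimal illustration of the model fusion idea, the sketch below trains one simple classifier per biomarker family (imaging, evoked potentials, digital) and fuses their predicted probabilities with a weighted average (late fusion); the data, models and weights are illustrative assumptions, not the actual DSS.

```python
# Minimal late-fusion sketch for a multimodal DSS: one model per biomarker
# family, with the final progression risk taken as a weighted average of
# their probabilities. Models and data are synthetic placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 300
modalities = {
    "imaging": rng.standard_normal((n, 12)),
    "evoked_potentials": rng.standard_normal((n, 6)),
    "digital": rng.standard_normal((n, 20)),
}
y = rng.integers(0, 2, n)  # placeholder: disability progression yes/no

# Fit one classifier per modality, then fuse the predicted probabilities.
models = {name: LogisticRegression(max_iter=1000).fit(X, y)
          for name, X in modalities.items()}
probs = np.column_stack([models[name].predict_proba(modalities[name])[:, 1]
                         for name in modalities])
weights = np.array([0.5, 0.25, 0.25])   # illustrative fusion weights
fused_risk = probs @ weights
print(fused_risk[:5].round(2))
```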