Multimodal Data Fusion

Tulay Adali, Department of Computer Science and Electrical Engineering
Ruchir Saheba, Department of Computer Science and Electrical Engineering

Data collection from different sensors or modalities is becoming increasingly popular in neurological studies, since each modality is expected to provide unique yet complementary information about brain function. Maximizing the use of the joint information available in such interrelated modalities is therefore the fundamental motivation for performing fusion of multimodal data. Since the relationships among modalities are not well understood, it is important to minimize the assumptions placed on the data and let the modalities fully interact with one another. To that end, the emphasis of this research project is to develop data-driven fusion methods based on blind source separation (BSS) techniques, such as independent component analysis (ICA), its multi-dataset extension independent vector analysis (IVA), and canonical correlation analysis (CCA), to jointly analyze multimodal neurological data. Data-driven fusion methods minimize the modeling assumptions placed on the data and thus enable full interaction among the modalities. This yields meaningful decompositions of the multimodal data that can be used either as informative features or as interpretable biomarkers of disease.
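
To give a concrete sense of what such a data-driven decomposition looks like, the sketch below applies two of the named techniques, a joint-ICA-style fusion and CCA, to synthetic two-modality data. All variable names, array shapes, the toy generative model, and the use of scikit-learn are illustrative assumptions for this sketch, not the project's actual implementation.

```python
# A minimal sketch of data-driven fusion on synthetic two-modality data.
# Joint-ICA-style fusion and CCA are shown; all names, shapes, and the use
# of scikit-learn here are assumptions, not the project's code.
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(0)
n_subjects, n_comp, n_feat1, n_feat2 = 50, 5, 40, 30

# Toy generative model: shared subject loadings A drive both modalities
# through modality-specific, non-Gaussian source maps (think feature
# matrices from two modalities, one row per subject).
A = rng.standard_normal((n_subjects, n_comp))
S1 = rng.laplace(size=(n_comp, n_feat1))
S2 = rng.laplace(size=(n_comp, n_feat2))
X1 = A @ S1 + 0.1 * rng.standard_normal((n_subjects, n_feat1))
X2 = A @ S2 + 0.1 * rng.standard_normal((n_subjects, n_feat2))

# Joint-ICA-style fusion: concatenate modalities along the feature axis and
# decompose the transpose, so independence is assumed across features and
# both modalities share one set of subject-wise loadings.
X_joint = np.hstack([X1, X2])                  # (subjects, feat1 + feat2)
ica = FastICA(n_components=n_comp, random_state=0, max_iter=1000)
joint_maps = ica.fit_transform(X_joint.T)      # (features, components)
loadings = ica.mixing_                         # (subjects, components)
maps1, maps2 = joint_maps[:n_feat1], joint_maps[n_feat1:]  # per modality

# CCA-based fusion: paired projections of the two modalities whose
# subject-wise profiles are maximally correlated.
cca = CCA(n_components=n_comp)
U, V = cca.fit_transform(X1, X2)               # canonical variates
corrs = [abs(np.corrcoef(U[:, k], V[:, k])[0, 1]) for k in range(n_comp)]
print("canonical correlations:", np.round(corrs, 2))
```

In the joint-ICA step, the shared loadings link subjects across modalities while the per-modality portion of each joint map can be inspected as a candidate feature or biomarker; the CCA step instead exposes maximally correlated subject profiles across the two datasets.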