Brain-Computer Interfaces and Human-Robot Interaction

Dr. Ramana Kumar Vinjamuri, Department of Computer Science and Electrical Engineering

The human central nervous system (CNS) effortlessly performs complex hand movements, controlling and coordinating multiple degrees of freedom. It is hypothesized that the CNS may use kinematic synergies, shared patterns of joint coordination, to reduce the complexity of movement control, but how these synergies are encoded in the CNS remains unclear. To investigate the neural representations of kinematic synergies, scalp electroencephalographic (EEG) signals and hand kinematics are recorded during representative types of hand grasping, and these multimodal signals are analyzed using high-performance computing (sketched in the examples below). The results have promising applications in noninvasive neural control of synergy-based prostheses and exoskeletons and in human-robot interaction.

Emotionally intelligent interaction between humans and robots is essential for accomplishing shared tasks in collaborative processes. Robots may exploit various communication channels, including hearing, speech, sight, and touch, together with learning, to recognize human emotions, and can then adapt their behavior in a way that is socially appropriate for humans. In this research, we focus on neurophysiological and behavioral signals of communication.
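To make the synergy hypothesis described above concrete, the following is a minimal sketch of kinematic synergy extraction from recorded hand kinematics. The array shapes, the ten-joint hand model, and the use of principal components via singular value decomposition are illustrative assumptions, not the exact pipeline used in this research.

```python
"""Sketch: extracting kinematic synergies from hand joint-angle data."""
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: 5000 hand postures (rows) x 10 joint angles (columns),
# pooled across many recorded grasps; real glove recordings would replace this.
n_obs, n_joints = 5000, 10
angles = rng.standard_normal((n_obs, n_joints))

# Center the data and take the SVD; the leading right-singular vectors are
# the kinematic synergies (shared patterns of joint coordination).
mean_posture = angles.mean(axis=0)
centered = angles - mean_posture
U, S, Vt = np.linalg.svd(centered, full_matrices=False)

# Keep the few synergies that explain most of the postural variance,
# illustrating how a small set can describe many different grasps.
explained = S**2 / np.sum(S**2)
k = int(np.searchsorted(np.cumsum(explained), 0.90)) + 1
synergies = Vt[:k]                # k x n_joints synergy vectors
weights = centered @ synergies.T  # how strongly each posture recruits them

# Any recorded posture can be approximated from the reduced synergy set.
reconstructed = weights @ synergies + mean_posture
err = np.linalg.norm(reconstructed - angles) / np.linalg.norm(angles)
print(f"{k} synergies, relative reconstruction error {err:.2%}")
```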
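The multimodal analysis step can likewise be sketched as a decoding problem: predicting synergy recruitment weights from EEG features. The feature layout, ridge regression, and scikit-learn used here are illustrative assumptions rather than the methods reported in this research.

```python
"""Sketch: decoding synergy recruitment weights from scalp EEG features."""
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

# Hypothetical features: 200 grasp trials x (64 channels * 5 frequency bands),
# with 3 synergy weights per trial as the decoding target.
n_trials, n_features, n_synergies = 200, 64 * 5, 3
X = rng.standard_normal((n_trials, n_features))
true_map = rng.standard_normal((n_features, n_synergies))
y = X @ true_map + 0.1 * rng.standard_normal((n_trials, n_synergies))

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Linear decoder from EEG band power to synergy activations; held-out accuracy
# indicates how well the synergies are represented in scalp EEG.
decoder = Ridge(alpha=10.0).fit(X_train, y_train)
print(f"held-out R^2: {decoder.score(X_test, y_test):.2f}")
```

A decoder of this form could, in principle, drive a synergy-based prosthesis or exoskeleton by mapping decoded weights back into joint trajectories through the extracted synergies.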