Two research engineer positions (Human-Robot-Interaction Engineer and EEG/MoCAP Data Scientist) are open at INSERM U1093 CAPS and its Robot Cognition Laboratory (RCL) in Dijon. Both positions are for 24 months, with the possibility of subsequent employment at Bioserenity. We wish to hire with starting dates no later than October 1, 2022. Candidates should have an MSc or equivalent.
INSERM CAPS / RCL has been a pioneer in human-robot interaction, working with the iCub since 2009 and with Pepper since 2017. In our new environment in Dijon we benefit from state-of-the-art motion capture and electrophysiology facilities, which significantly extend our human-robot interaction capabilities.
The two positions are part of ReadapTIC, a new GIS “Starter” run in direct cooperation with Bioserenity, which aims to use innovative strategies and artificial intelligence for the rehabilitation of motor function and the preservation of autonomy.
Position 1 – Human-Robot-Interaction Engineer:
The research project studies physical human-robot interaction. The project will use humanoid robots (iCub and Pepper), a state-of-the-art motion capture system (Vicon), and EEG recording systems (Bioserenity Neuronaute, Biosemi). Work will include synchronizing signals from these systems for offline analysis and real-time interaction. The engineer will work with these robots, including maintenance and installation of software updates using GitHub and YARP. The engineer will develop behavioral scripts for the robots to control human-robot interaction experiments, and will interface the robots with the 3D motion analysis and EEG systems to help researchers develop protocols for human-robot interaction. The engineer will also adapt our existing control software (SWoOz), which manages the interface between the Vicon motion capture system and the iCub controllers; this can include implementing real-time teleoperation of the iCub based on Vicon motion capture. The candidate should be motivated for hands-on work with robots!
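To give a flavor of the signal-synchronization work described above: the Vicon, EEG, and robot streams run at different sampling rates, so a basic building block is aligning one stream's samples to another's timestamps. The sketch below is purely illustrative (it is not SWoOz or YARP code, and the rates and tolerance are invented for the example); it shows nearest-neighbor timestamp alignment with NumPy.

```python
import numpy as np

def align_streams(t_ref, t_other, other_values, max_lag=0.0075):
    """For each reference timestamp, pick the nearest sample from the
    other stream; samples farther than max_lag seconds become NaN."""
    idx = np.searchsorted(t_other, t_ref)
    idx = np.clip(idx, 1, len(t_other) - 1)
    # choose the closer of the two neighboring candidates
    left, right = t_other[idx - 1], t_other[idx]
    idx -= (t_ref - left) < (right - t_ref)
    aligned = other_values[idx].astype(float)
    aligned[np.abs(t_other[idx] - t_ref) > max_lag] = np.nan
    return aligned

# Example: a 100 Hz mocap stream aligned to 250 Hz EEG timestamps
t_eeg = np.arange(0, 1, 1 / 250)      # 250 Hz timestamps
t_mocap = np.arange(0, 1, 1 / 100)    # 100 Hz timestamps
mocap = np.sin(2 * np.pi * t_mocap)   # dummy marker trajectory
aligned = align_streams(t_eeg, t_mocap, mocap)
```

In a real pipeline the timestamps would come from a shared clock or hardware trigger; the alignment step itself stays the same.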
Position 2 – EEG/MoCAP Data Scientist:
The research project will explore how state-of-the-art physiological measures (kinematics, electroencephalography) can be used to quantify and adapt human motor function training. Data will be generated in experiments where human subjects perform motor tasks – human-robot interaction, upper limb movement, whole body movement – in a real or virtual reality environment and under various sensory stimulation conditions (visual, proprioceptive, vestibular). The level of human performance in the environment will be quantified by various state-of-the-art techniques (kinematic, EEG, electromyographic), and neural correlates will be sought in electroencephalographic measurements. The project will use humanoid robots (iCub and Pepper), a state-of-the-art motion capture system (Vicon), and EEG recording systems (Bioserenity Neuronaute and Biosemi).
The data scientist will collect and analyze the behavioral and electroencephalographic data from these movement experiments. In particular, they will develop automatic analysis tools capable of detecting markers of motor learning/adaptation that are common to the different signals recorded in the experiments. These markers will then be used to regulate the stimulation conditions during online motor training.
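As a minimal illustration of what a motor learning/adaptation marker might look like on one such signal: trial-by-trial movement error typically decreases as a subject adapts, so a crude marker is the slope of a linear fit to the error curve, together with the relative drop from early to late trials. Everything here (the error signal, window size, and decay constants) is synthetic; the actual markers will be developed on real kinematic and EEG data.

```python
import numpy as np

def adaptation_marker(trial_errors, window=10):
    """Crude adaptation marker: slope of a linear fit to the error
    curve (negative = error decreasing, i.e. learning) and the
    relative drop between the first and last `window` trials."""
    trials = np.arange(len(trial_errors))
    slope = np.polyfit(trials, trial_errors, 1)[0]
    early = np.mean(trial_errors[:window])
    late = np.mean(trial_errors[-window:])
    rel_drop = (early - late) / early if early else 0.0
    return slope, rel_drop

# Synthetic example: exponentially decaying reach error plus noise
rng = np.random.default_rng(0)
errors = 5.0 * np.exp(-np.arange(60) / 15) + 0.5 \
         + 0.05 * rng.standard_normal(60)
slope, rel_drop = adaptation_marker(errors)
```

In the online-training setting described above, such a marker would be recomputed as trials accumulate and fed back to adjust the stimulation.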