Interest in virtual reality has grown strongly over the last decade, with applications ranging from entertainment to rehabilitation therapies and the simulation of complex teleoperation scenarios. Successful interaction with a virtual system requires robust and reliable control interfaces, which should rely on intuitive command inputs to ensure rapid proficiency and minimize the task-associated workload (1), and should provide appropriate feedback (visual, auditory, haptic) to strengthen the operator’s awareness (2). However, most current interfaces consist of simple third-party devices (joysticks, remote controls) and show limited performance even with systems that have few degrees of freedom (2). The development of intuitive interfaces becomes even more challenging in “non-homologous” interactions, that is, when the operator’s command behaviors differ significantly from the behaviors the machine can realize.
In this project, we are studying and developing Body-to-Machine Interfaces (3, 4) as more intuitive approaches to controlling and interacting with virtual systems. In particular, we aim to determine whether, for a given non-homologous scenario, an efficient one-size-fits-all model can exist. In addition, we are using these unconventional interactions and coordination patterns to gain a better understanding of motor learning.
1. J. V. Draper, L. M. Blair, in Proceedings of the IEEE International Conference on Robotics and Automation (1996), vol. 2, pp. 1030–1035.
2. J. Y. C. Chen, E. C. Haas, M. J. Barnes, Human performance issues and user interface design for teleoperated robots. IEEE Trans. Syst. Man Cybern. Part C Appl. Rev. 37, 1231–1245 (2007).
3. M. Casadio, R. Ranganathan, F. A. Mussa-Ivaldi, The Body-Machine Interface: A new perspective on an old theme. J. Mot. Behav. 44, 419–433 (2012).
4. C. Pierella et al., Remapping residual coordination for controlling assistive devices and recovering motor functions. Neuropsychologia 79, Part B, 364–376 (2015).
If you are interested in this research topic and would like to learn more, don’t hesitate to contact us:
Jenifer Miehlbradt (firstname.lastname@example.org)