| Name: | Description: | Size: | Format: |
|---|---|---|---|
| | | 1.13 MB | Adobe PDF |
Abstract(s)
As the age demographics of the population continue to change, politicians and scientists are paying increasing attention to the needs of senior individuals. The well-being and needs of disabled individuals are likewise becoming highly valued in political and entrepreneurial circles. Intelligent wheelchairs are adapted electric wheelchairs with environmental perception, semi-autonomous behaviour and flexible human-machine interaction. This paper presents the specification and development of a user-friendly multimodal interface, as a component of the IntellWheels Platform project. The developed prototype combines several input modules, allowing the wheelchair to be controlled through flexible, user-defined input sequences of distinct types (speech, facial expressions, head movements and joystick). To validate the effectiveness of the prototype, two experiments were performed with a number of individuals, who first tested the system by driving a simulated wheelchair in a virtual environment. The second experiment was performed using the real IntellWheels wheelchair prototype. The results show that the multimodal interface can be used successfully, owing to the interaction flexibility it provides.
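The abstract describes controlling the wheelchair through flexible, user-defined input sequences that mix modalities. The Python sketch below illustrates one possible way such a sequence-to-command mapping could be organised; the class names, event tokens, and commands are hypothetical and are not taken from the IntellWheels implementation.

```python
from __future__ import annotations
from dataclasses import dataclass


@dataclass(frozen=True)
class InputEvent:
    modality: str   # e.g. "speech", "facial", "head", "joystick"
    token: str      # recognised symbol within that modality


class SequenceMapper:
    """Matches incoming input events against user-defined sequences (illustrative only)."""

    def __init__(self) -> None:
        self._bindings: dict[tuple[InputEvent, ...], str] = {}
        self._buffer: list[InputEvent] = []

    def bind(self, sequence: list[InputEvent], command: str) -> None:
        """Associate a user-defined sequence of events with a wheelchair command."""
        self._bindings[tuple(sequence)] = command

    def feed(self, event: InputEvent) -> str | None:
        """Buffer an event and return a command when a full sequence is matched."""
        self._buffer.append(event)
        key = tuple(self._buffer)
        if key in self._bindings:
            self._buffer.clear()
            return self._bindings[key]
        # Discard the buffer once it can no longer be the prefix of any bound sequence.
        if not any(seq[:len(key)] == key for seq in self._bindings):
            self._buffer.clear()
        return None


# Example: the spoken word "go" followed by a head nod moves the chair forward.
mapper = SequenceMapper()
mapper.bind([InputEvent("speech", "go"), InputEvent("head", "nod")], "MOVE_FORWARD")

assert mapper.feed(InputEvent("speech", "go")) is None
assert mapper.feed(InputEvent("head", "nod")) == "MOVE_FORWARD"
```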
Keywords
Multimodal interface; Intelligent robotics; Intelligent wheelchair; IntellWheels
Citation
Reis, L. P., Faria, B. M., Vasconcelos, S., & Lau, N. (2015). Invited paper: multimodal interface for an intelligent wheelchair. Informatics in Control, Automation and Robotics, 325, 1–34. https://doi.org/10.1007/978-3-319-10891-9_1