Publication
Empiric evaluation of a real-time robot dancing framework based on multi-modal events
dc.contributor.author | Oliveira, João Lobato | |
dc.contributor.author | Reis, Luis Paulo | |
dc.contributor.author | Faria, Brigida Monica | |
dc.date.accessioned | 2025-03-27T15:45:37Z | |
dc.date.available | 2025-03-27T15:45:37Z | |
dc.date.issued | 2012-12 | |
dc.description.abstract | Musical robots have already inspired the creation of worldwide robotic dancing contests, such as RoboCupJunior Dance, where school teams, formed by children aged eight to eighteen, put their robots in action, dancing to music in a display that emphasizes creativity of costumes and movement. This paper describes and assesses a framework for robot dancing edutainment applications. The proposed robotics architecture enables the definition of choreographic compositions, which result in a combination of reactive dancing motions in real-time response to multi-modal inputs. These inputs are shaped by three rhythmic events (representing soft, medium, and strong musical note onsets), different dance floor colors, and awareness of surrounding obstacles. This design was applied to a humanoid robot built with two Lego NXT kits and running on a hand-made dance stage. We report on an empirical evaluation of the overall robot dancing performance, conducted with a group of students after a set of live demonstrations. This evaluation validated the framework's potential application in edutainment robotics and its ability to sustain the interest of a general audience by offering a reasonable compromise between musical synchrony, animacy, and variability of the dance performance. | pt_PT |
dc.description.version | info:eu-repo/semantics/publishedVersion | pt_PT |
dc.identifier.citation | Oliveira, J. L., Reis, L. P., & Faria, B. M. (2012). Empiric evaluation of a real-time robot dancing framework based on multi-modal events. TELKOMNIKA, 10(8), 1917–1928. doi:10.11591/telkomnika.v10i8.1327 | pt_PT |
dc.identifier.doi | 10.11591/telkomnika.v10i8.1327 | pt_PT |
dc.identifier.eissn | 2087-278X | |
dc.identifier.uri | http://hdl.handle.net/10400.22/29910 | |
dc.language.iso | eng | pt_PT |
dc.peerreviewed | n/a | |
dc.publisher | Universitas Ahmad Dahlan | pt_PT |
dc.relation | SFRH/BD/43704/2008 and SFRH/BD/44541/2008; RIPD/ADA/109636/2009 | |
dc.rights.uri | N/A | |
dc.subject | Robot dancing | pt_PT |
dc.subject | Human-robot interaction evaluation | pt_PT |
dc.subject | Edutainment robotics | pt_PT |
dc.title | Empiric evaluation of a real-time robot dancing framework based on multi-modal events | pt_PT |
dc.type | journal article | |
dspace.entity.type | Publication | |
oaire.citation.endPage | 1928 | pt_PT |
oaire.citation.startPage | 1917 | pt_PT |
oaire.citation.title | TELKOMNIKA | pt_PT |
oaire.citation.volume | 10 (8) | pt_PT |
oaire.version | http://purl.org/coar/version/c_970fb48d4fbd8a85 | |
person.familyName | Faria | |
person.givenName | Brigida Monica | |
person.identifier | R-000-T1F | |
person.identifier.ciencia-id | 0D1F-FB5E-55E4 | |
person.identifier.orcid | 0000-0003-2102-3407 | |
person.identifier.rid | C-6649-2012 | |
person.identifier.scopus-author-id | 6506476517 | |
rcaap.rights | openAccess | pt_PT |
rcaap.type | article | pt_PT |
relation.isAuthorOfPublication | 85832a40-7ef9-431a-be0c-78b45ebbae86 | |
relation.isAuthorOfPublication.latestForDiscovery | 85832a40-7ef9-431a-be0c-78b45ebbae86 |