Publication

Empiric evaluation of a real-time robot dancing framework based on multi-modal events

dc.contributor.author: Oliveira, João Lobato
dc.contributor.author: Reis, Luis Paulo
dc.contributor.author: Faria, Brigida Monica
dc.date.accessioned: 2025-03-27T15:45:37Z
dc.date.available: 2025-03-27T15:45:37Z
dc.date.issued: 2012-12
dc.description.abstract: Musical robots have already inspired the creation of worldwide robotic dancing contests, such as RoboCup Junior's Dance, where school teams formed by children aged eight to eighteen put their robots into action, dancing to music in a display that emphasizes creativity of costumes and movement. This paper describes and assesses a framework for robot dancing edutainment applications. The proposed robotics architecture enables the definition of choreographic compositions, which result in a combination of reactive dancing motions performed in real-time response to multi-modal inputs. These inputs are shaped by three rhythmic events (representing soft, medium, and strong musical note-onsets), different dance floor colors, and awareness of the surrounding obstacles. This design was applied to a Lego NXT humanoid robot, built with two Lego NXT kits and running on a hand-made dance stage. We report on an empirical evaluation of the overall robot dancing performance, conducted with a group of students after a set of live demonstrations. This evaluation validated the framework's potential application in edutainment robotics and its ability to sustain the interest of a general audience by offering a reasonable compromise between musical synchrony, animacy, and variability of the dance performance.
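Illustrative note (not part of the original record): the abstract's mapping from multi-modal inputs to reactive dance motions could be sketched roughly as below. All names, motions, and thresholds here are hypothetical, assumed only for illustration; the paper itself should be consulted for the actual architecture.

# Hypothetical sketch: combine rhythmic note-onset strength, floor color,
# and obstacle awareness to select a dance motion each control cycle.
from dataclasses import dataclass
import random

ONSET_MOTIONS = {
    "soft": ["sway_arms"],
    "medium": ["step_side", "nod_head"],
    "strong": ["spin", "raise_arms"],
}

@dataclass
class SensorState:
    onset: str          # "soft" | "medium" | "strong"
    floor_color: str    # e.g. "red", "blue"
    obstacle_near: bool

def choose_motion(state: SensorState) -> str:
    # Obstacle awareness overrides the choreography.
    if state.obstacle_near:
        return "back_off"
    # Onset strength picks the motion set; floor color biases the choice.
    candidates = list(ONSET_MOTIONS[state.onset])
    if state.floor_color == "red":
        candidates.append("stomp")
    return random.choice(candidates)

print(choose_motion(SensorState(onset="strong", floor_color="red", obstacle_near=False)))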
dc.description.version: info:eu-repo/semantics/publishedVersion
dc.identifier.citation: Oliveira, J. L., Reis, L. P., & Faria, B. M. (2012). Empiric evaluation of a real-time robot dancing framework based on multi-modal events. TELKOMNIKA, 10(8), 1917–1928. doi:10.11591/telkomnika.v10i8.1327
dc.identifier.doi: 10.11591/telkomnika.v10i8.1327
dc.identifier.eissn: 2087-278X
dc.identifier.uri: http://hdl.handle.net/10400.22/29910
dc.language.iso: eng
dc.peerreviewed: n/a
dc.publisher: Universitas Ahmad Dahlan
dc.relation: SFRH/BD/43704/2008 and SFRH/BD/44541/2008; RIPD/ADA/109636/2009
dc.rights.uri: N/A
dc.subject: Robot dancing
dc.subject: Human-robot interaction evaluation
dc.subject: Edutainment robotics
dc.title: Empiric evaluation of a real-time robot dancing framework based on multi-modal events
dc.type: journal article
dspace.entity.type: Publication
oaire.citation.endPage: 1928
oaire.citation.startPage: 1917
oaire.citation.title: TELKOMNIKA
oaire.citation.volume: 10 (8)
oaire.version: http://purl.org/coar/version/c_970fb48d4fbd8a85
person.familyName: Faria
person.givenName: Brigida Monica
person.identifier: R-000-T1F
person.identifier.ciencia-id: 0D1F-FB5E-55E4
person.identifier.orcid: 0000-0003-2102-3407
person.identifier.rid: C-6649-2012
person.identifier.scopus-author-id: 6506476517
rcaap.rights: openAccess
rcaap.type: article
relation.isAuthorOfPublication: 85832a40-7ef9-431a-be0c-78b45ebbae86
relation.isAuthorOfPublication.latestForDiscovery: 85832a40-7ef9-431a-be0c-78b45ebbae86

Files

Original bundle
Name: ART_Brígida Faria.pdf
Size: 3.11 MB
Format: Adobe Portable Document Format
License bundle
Name: license.txt
Size: 1.71 KB
Format: Item-specific license agreed upon to submission