Publication

DeepSpatial: Intelligent Spatial Sensor to Perception of Things

dc.contributor.author: Simões Teixeira, Marco António
dc.contributor.author: Neves JR, Flávio
dc.contributor.author: Koubaa, Anis
dc.contributor.author: Ramos de Arruda, Lúcia Valéria
dc.contributor.author: Schneider de Oliveira, André
dc.date.accessioned: 2021-02-25T14:33:53Z
dc.date.available: 2021-02-25T14:33:53Z
dc.date.issued: 2021
dc.description.abstract: This paper discusses a spatial sensor that identifies and tracks objects in the environment. The sensor is composed of an RGB-D camera, which provides point clouds and RGB images, and an egomotion sensor able to measure the sensor's displacement in the environment. The proposed sensor also incorporates a data processing strategy developed by the authors that confers different skills on the sensor. The adopted approach is based on four analysis steps: egomotion, lexical, syntax, and prediction analysis. As a result, the proposed sensor can identify objects in the environment, track them, calculate their direction, speed, and acceleration, and predict their future positions. The online detector YOLO is used to identify objects, and its output is combined with the point cloud information to obtain the spatial location of each identified object. The sensor can operate with higher precision and a lower update rate, using YOLOv2, or with a higher update rate and lower accuracy, using YOLOv3-tiny. The object tracking, egomotion, and collision prediction skills are tested and validated with a mobile robot under precise speed control. The presented results show that the proposed sensor (hardware + software) achieves satisfactory accuracy and update rate, supporting its use in mobile robotics. This paper's contribution is an algorithm for identifying, tracking, and predicting the future positions of objects, embedded in compact hardware. Thus, the contribution of this paper is to convert raw data from traditional sensors into useful information.
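The tracking-and-prediction skill described in the abstract (per-object speed, acceleration, and future position) could be sketched as follows. This is a minimal illustration assuming constant-acceleration kinematics with finite-difference estimates over the last three tracked positions; it is not the paper's actual algorithm, and the function name `predict_position` and the sample data are hypothetical.

```python
import numpy as np

def predict_position(positions, timestamps, horizon):
    """Estimate an object's velocity and acceleration from its last three
    tracked positions (finite differences) and extrapolate its position
    `horizon` seconds ahead, assuming constant acceleration."""
    p = np.asarray(positions, dtype=float)   # shape (3, dims), oldest -> newest
    t = np.asarray(timestamps, dtype=float)  # matching sample times, seconds
    v1 = (p[1] - p[0]) / (t[1] - t[0])       # mean velocity over first interval
    v2 = (p[2] - p[1]) / (t[2] - t[1])       # mean velocity over second interval
    a = (v2 - v1) / ((t[2] - t[0]) / 2.0)    # acceleration from velocity change
    # constant-acceleration extrapolation from the newest sample
    return p[2] + v2 * horizon + 0.5 * a * horizon ** 2

# Example: an object moving at 1 m/s along x with no acceleration
pred = predict_position([[0.0, 0.0], [1.0, 0.0], [2.0, 0.0]],
                        [0.0, 1.0, 2.0], horizon=1.0)
# -> array([3., 0.])
```

In the paper's setting the input positions would come from fusing YOLO detections with point-cloud depth; here they are given directly to keep the sketch self-contained.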
dc.description.version: info:eu-repo/semantics/publishedVersion
dc.identifier.doi: 10.1109/JSEN.2020.3035355
dc.identifier.issn: 1558-1748
dc.identifier.uri: http://hdl.handle.net/10400.22/17150
dc.language.iso: eng
dc.peerreviewed: yes
dc.publisher: Institute of Electrical and Electronics Engineers
dc.relation.publisherversion: https://ieeexplore.ieee.org/abstract/document/9248019/authors#authors
dc.subject: Spatial sensor
dc.subject: Egomotion
dc.subject: YOLO
dc.subject: Mobile robot
dc.title: DeepSpatial: Intelligent Spatial Sensor to Perception of Things
dc.type: journal article
dspace.entity.type: Publication
oaire.citation.endPage: 3976
oaire.citation.issue: 4
oaire.citation.startPage: 3966
oaire.citation.title: IEEE Sensors Journal
oaire.citation.volume: 21
person.familyName: Koubaa
person.givenName: Anis
person.identifier: 989131
person.identifier.ciencia-id: CA19-2399-D94A
person.identifier.orcid: 0000-0003-3787-7423
person.identifier.scopus-author-id: 15923354900
rcaap.rights: openAccess
rcaap.type: article
relation.isAuthorOfPublication: 0337d7df-5f77-46a4-8269-83d14bd5ea6b
relation.isAuthorOfPublication.latestForDiscovery: 0337d7df-5f77-46a4-8269-83d14bd5ea6b

Files

Original bundle
Name: ART_CISTER_Koubaa_2021.pdf
Size: 6.01 MB
Format: Adobe Portable Document Format