Publication

On-Board Deep Q-Network for UAV-Assisted Online Power Transfer and Data Collection

dc.contributor.author: Li, Kai
dc.contributor.author: Ni, Wei
dc.contributor.author: Tovar, Eduardo
dc.contributor.author: Jamalipour, Abbas
dc.date.accessioned: 2020-01-16T14:06:53Z
dc.date.embargo: 2120
dc.date.issued: 2019
dc.description.abstract: Unmanned Aerial Vehicles (UAVs) with Microwave Power Transfer (MPT) capability provide a practical means to deploy a large number of wireless powered sensing devices into areas with no access to persistent power supplies. The UAV can charge the sensing devices remotely and harvest their data. A key challenge is online MPT and data collection in the presence of on-board control of a UAV (e.g., patrolling velocity) for preventing battery drainage and data queue overflow of the devices, while up-to-date knowledge on battery level and data queue of the devices is not available at the UAV. In this paper, an on-board deep Q-network is developed to minimize the overall data packet loss of the sensing devices, by optimally deciding the device to be charged and interrogated for data collection, and the instantaneous patrolling velocity of the UAV. Specifically, we formulate a Markov Decision Process (MDP) with the states of battery level and data queue length of devices, channel conditions, and waypoints given the trajectory of the UAV; and solve it optimally with Q-learning. Furthermore, we propose the on-board deep Q-network that enlarges the state space of the MDP, and a deep reinforcement learning based scheduling algorithm that asymptotically derives the optimal solution online, even when the UAV has only outdated knowledge on the MDP states. Numerical results demonstrate that our deep reinforcement learning algorithm reduces the packet loss by at least 69.2%, as compared to existing non-learning greedy algorithms.
dc.description.version: info:eu-repo/semantics/publishedVersion
dc.identifier.doi: 10.1109/TVT.2019.2945037
dc.identifier.issn: 1939-9359
dc.identifier.uri: http://hdl.handle.net/10400.22/15286
dc.language.iso: eng
dc.peerreviewed: yes
dc.publisher: Institute of Electrical and Electronics Engineers
dc.relation: ARNET, ref. POCI-01-0145-FEDER029074
dc.relation.publisherversion: https://ieeexplore.ieee.org/document/8854903
dc.subject: Unmanned aerial vehicle
dc.subject: Microwave power transfer
dc.subject: Online resource allocation
dc.subject: Deep reinforcement learning
dc.subject: Markov decision process
dc.title: On-Board Deep Q-Network for UAV-Assisted Online Power Transfer and Data Collection
dc.type: journal article
dspace.entity.type: Publication
oaire.citation.endPage: 12226
oaire.citation.issue: 12
oaire.citation.startPage: 12215
oaire.citation.title: IEEE Transactions on Vehicular Technology
oaire.citation.volume: 68
person.familyName: Li
person.familyName: Tovar
person.familyName: Jamalipour
person.givenName: Kai
person.givenName: Eduardo
person.givenName: Abbas
person.identifier.ciencia-id: EE10-B822-16ED
person.identifier.ciencia-id: 6017-8881-11E8
person.identifier.orcid: 0000-0002-0517-2392
person.identifier.orcid: 0000-0001-8979-3876
person.identifier.orcid: 0000-0002-1807-7220
person.identifier.scopus-author-id: 7006312557
rcaap.rights: closedAccess
rcaap.type: article
relation.isAuthorOfPublication: 21f3fb85-19c2-4c89-afcd-3acb27cedc5e
relation.isAuthorOfPublication: 80b63d8a-2e6d-484e-af3c-55849d0cb65e
relation.isAuthorOfPublication: 81b6f15e-ec7f-4610-819c-57aedf9abc08
relation.isAuthorOfPublication.latestForDiscovery: 80b63d8a-2e6d-484e-af3c-55849d0cb65e
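The abstract above formulates the UAV's charge-and-collect scheduling as an MDP over device battery levels and data-queue lengths, solved with Q-learning to minimize packet loss. As a rough illustration only, the sketch below runs tabular Q-learning on a drastically simplified toy version of such a problem (two devices, discretized battery and queue states, reward = negative packets lost). Every state size, transition rule, and parameter here is a hypothetical assumption for illustration, not the paper's model.

```python
# Toy sketch (NOT the paper's implementation): tabular Q-learning on a
# simplified charge-and-collect MDP. All dynamics below are assumptions.
import random

N_DEVICES = 2          # devices the UAV can charge/interrogate
LEVELS = 3             # discretized battery levels per device
QUEUE = 3              # discretized data-queue lengths per device
ALPHA, GAMMA, EPS = 0.1, 0.9, 0.1

def step(state, action):
    """One slot: the chosen device is charged via MPT and its queue is
    collected; the others drain battery, and a new sample is dropped if
    the battery is dead or the queue is full (counted as packet loss)."""
    battery, queue = list(state[0]), list(state[1])
    lost = 0
    for d in range(N_DEVICES):
        if d == action:
            battery[d] = LEVELS - 1          # recharged
            queue[d] = 0                     # data harvested
        else:
            battery[d] = max(0, battery[d] - 1)
            if battery[d] == 0 or queue[d] == QUEUE - 1:
                lost += 1                    # dead battery or overflow
            else:
                queue[d] += 1                # new sample enqueued
    return (tuple(battery), tuple(queue)), -lost  # reward = -packets lost

def train(episodes=2000, horizon=50, seed=0):
    rng = random.Random(seed)
    q = {}  # Q-table: (state, action) -> value
    for _ in range(episodes):
        s = ((LEVELS - 1,) * N_DEVICES, (0,) * N_DEVICES)
        for _ in range(horizon):
            if rng.random() < EPS:           # epsilon-greedy exploration
                a = rng.randrange(N_DEVICES)
            else:
                a = max(range(N_DEVICES), key=lambda x: q.get((s, x), 0.0))
            s2, r = step(s, a)
            best_next = max(q.get((s2, x), 0.0) for x in range(N_DEVICES))
            q[(s, a)] = q.get((s, a), 0.0) + ALPHA * (
                r + GAMMA * best_next - q.get((s, a), 0.0))
            s = s2
    return q

q = train()
```

The paper's on-board deep Q-network replaces the table `q` with a neural approximator so the enlarged state space (channel conditions, waypoints, outdated device knowledge) stays tractable; this tabular version only illustrates the underlying update rule.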

Files

Original bundle
Name: ART_CISTER_kaili_2019.pdf
Size: 2.17 MB
Format: Adobe Portable Document Format
License bundle
Name: license.txt
Size: 1.71 KB
Format: Item-specific license agreed upon to submission