Please use this identifier to cite or link to this item: http://hdl.handle.net/10773/26100
Full metadata record
DC field: value [language]
dc.contributor.author: Piunovskiy, Alexey [pt_PT]
dc.contributor.author: Plakhov, Alexander [pt_PT]
dc.contributor.author: Torres, Delfim F. M. [pt_PT]
dc.contributor.author: Zhang, Yi [pt_PT]
dc.date.accessioned: 2019-05-23T15:08:53Z
dc.date.available: 2019-05-23T15:08:53Z
dc.date.issued: 2019
dc.identifier.issn: 0363-0129 [pt_PT]
dc.identifier.uri: http://hdl.handle.net/10773/26100
dc.description.abstract: Using the tools of Markov Decision Processes, we justify the dynamic programming approach to the optimal impulse control of deterministic dynamical systems. We prove the equivalence of the integral and differential forms of the optimality equation. The theory is illustrated by an example from mathematical epidemiology. The developed methods can also be useful for the study of piecewise deterministic Markov processes. [pt_PT]
dc.language.iso: eng [pt_PT]
dc.rights: openAccess [pt_PT]
dc.rights.uri: https://creativecommons.org/licenses/by/4.0/ [pt_PT]
dc.subject: Dynamical System [pt_PT]
dc.subject: Impulse Control [pt_PT]
dc.subject: Total Cost [pt_PT]
dc.subject: Discounted Cost [pt_PT]
dc.subject: Randomized Strategy [pt_PT]
dc.subject: Piecewise Deterministic Markov Process [pt_PT]
dc.title: Optimal impulse control of dynamical systems [pt_PT]
dc.type: article [pt_PT]
dc.description.version: in publication [pt_PT]
dc.peerreviewed: yes [pt_PT]
degois.publication.firstPage: 1 [pt_PT]
degois.publication.lastPage: 29 [pt_PT]
degois.publication.title: SIAM Journal on Control and Optimization [pt_PT]
dc.identifier.essn: 1095-7138 [pt_PT]
Appears in Collections:CIDMA - Artigos
AGG - Artigos

Files in This Item:
File      Description  Size       Format
impF.pdf               527.87 kB  Adobe PDF



Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.