Please use this identifier to cite or link to this item:
http://hdl.handle.net/10773/26100
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Piunovskiy, Alexey | pt_PT |
dc.contributor.author | Plakhov, Alexander | pt_PT |
dc.contributor.author | Torres, Delfim F. M. | pt_PT |
dc.contributor.author | Zhang, Yi | pt_PT |
dc.date.accessioned | 2019-05-23T15:08:53Z | - |
dc.date.available | 2019-05-23T15:08:53Z | - |
dc.date.issued | 2019 | - |
dc.identifier.issn | 0363-0129 | pt_PT |
dc.identifier.uri | http://hdl.handle.net/10773/26100 | - |
dc.description.abstract | Using the tools of Markov Decision Processes, we justify the dynamic programming approach to the optimal impulse control of deterministic dynamical systems. We prove the equivalence of the integral and differential forms of the optimality equation. The theory is illustrated by an example from mathematical epidemiology. The developed methods can also be useful for the study of piecewise deterministic Markov processes. | pt_PT |
dc.language.iso | eng | pt_PT |
dc.rights | openAccess | pt_PT |
dc.rights.uri | https://creativecommons.org/licenses/by/4.0/ | pt_PT |
dc.subject | Dynamical System | pt_PT |
dc.subject | Impulse Control | pt_PT |
dc.subject | Total Cost | pt_PT |
dc.subject | Discounted Cost | pt_PT |
dc.subject | Randomized strategy | pt_PT |
dc.subject | Piecewise Deterministic Markov Process | pt_PT |
dc.title | Optimal impulse control of dynamical systems | pt_PT |
dc.type | article | pt_PT |
dc.description.version | in publication | pt_PT |
dc.peerreviewed | yes | pt_PT |
degois.publication.firstPage | 1 | pt_PT |
degois.publication.lastPage | 29 | pt_PT |
degois.publication.title | SIAM Journal on Control and Optimization | pt_PT |
dc.identifier.essn | 1095-7138 | pt_PT |
Appears in Collections: | CIDMA - Artigos; AGG - Artigos |