The intent of this book is to present recent results in the control theory for the long run average continuous control problem of piecewise deterministic Markov processes (PDMPs). The book focuses mainly on the long run average cost criterion and extends to PDMPs some well-known techniques related to discrete-time and continuous-time Markov decision processes, including the so-called "average inequality approach", "vanishing discount technique", and "policy iteration algorithm". We believe that what is unique about our approach is that, by using the special features of PDMPs, we trace a parallel with the general theory for discrete-time Markov decision processes rather than the continuous-time case. The two main reasons for doing so are to use the powerful tools developed in the discrete-time framework and to avoid working with the infinitesimal generator associated with a PDMP, whose domain of definition is in most cases difficult to characterize. Although the book is mainly intended to be a theoretically oriented text, it also contains some motivational examples. The book is targeted primarily at advanced students and practitioners of control theory, and it will also be a valuable source for experts in the field of Markov decision processes. Moreover, it should be suitable for certain advanced courses or seminars. As background, one needs an acquaintance with the theory of Markov decision processes and some knowledge of stochastic processes and modern analysis.
Edition: Continuous Average Control of Piecewise Deterministic Markov Processes. 2013, Springer New York, Imprint: Springer. Electronic resource, in English. ISBN 146146983X, 9781461469834.