Office of Academic Resources
Chulalongkorn University


Author: Miller, Boris M.
Title: Impulsive Control in Continuous and Discrete-Continuous Systems [electronic resource] / by Boris M. Miller, Evgeny Ya. Rubinovich
Imprint: Boston, MA : Springer US : Imprint: Springer, 2003
Description: XII, 447 p. online resource


Impulsive Control in Continuous and Discrete-Continuous Systems is an up-to-date introduction to the theory of impulsive control in nonlinear systems, a new branch of optimal control theory that is closely connected to the theory of hybrid systems. The text introduces the reader to optimal control problems with discontinuous solutions and discusses the application of a new and effective method of discontinuous time transformation. With a large number of examples, illustrations, and applied problems arising in the area of observation control, the book is well suited as a textbook or reference for a senior- or graduate-level course on the subject, as well as a reference for researchers in related fields.
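As a minimal illustration of the kind of discrete-continuous system the book studies (this sketch is not taken from the text; the function name, dynamics, and all parameter values are illustrative assumptions), consider a scalar state with continuous decay dynamics and a single instantaneous jump applied at a chosen impulse instant, simulated with forward Euler:

```python
# Illustrative simulation of a simple impulsive (discrete-continuous) system:
# continuous flow dx/dt = -a*x between impulses, plus an instantaneous
# state jump x -> x + u applied once at time t_imp. Forward Euler; all
# names and values are arbitrary, for illustration only.

def simulate_impulsive(x0=1.0, a=0.5, u=2.0, t_imp=1.0, t_end=2.0, dt=0.001):
    x, t = x0, 0.0
    jumped = False
    traj = [(t, x)]
    while t < t_end:
        if not jumped and t >= t_imp:
            x += u          # impulse: instantaneous state jump
            jumped = True
        x += dt * (-a * x)  # continuous flow between impulses
        t += dt
        traj.append((t, x))
    return traj

traj = simulate_impulsive()
```

The resulting trajectory is discontinuous at the impulse instant, which is why classical ODE solution concepts must be extended (e.g., to differential equations with measures, treated in the book's appendix).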


Contents:
1 Introduction
1.1 Concept of discrete-continuous (hybrid) system and some typical problems
1.2 Robust and non-robust discrete-continuous systems
1.3 The structure of the book
2 Discrete-continuous systems with impulse control
2.1 Definition of impulsive control and the system solution
2.2 Stability (robustness) conditions for systems with impulse control
2.3 Generalized solutions of systems with impulse control and their representations
3 Optimal impulse control problem with restricted number of impulses
3.1 Introduction and the problem statement
3.2 Auxiliary optimal control problem
3.3 Necessary and sufficient optimality conditions
4 Representation of generalized solutions via differential equations with measures
4.1 Generalized solutions of nonlinear differential equations
4.2 Generalized solutions of differential equations with affine dependence on unbounded controls
5 Optimal control problems within the class of generalized solutions
5.1 Statement of the optimal control problems with phase constraints
5.2 Existence of the optimal generalized solution
5.3 Optimal control problems for DCS with ordinary and impulse controls
5.4 Optimal generalized solutions in nonlinear hybrid systems
6 Optimality conditions in control problems within the class of generalized solutions
6.1 Introduction
6.2 Generalized maximum principle
6.3 Applications of generalized maximum principle
6.4 Generalized maximum principle in linear-convex problems
7 Observation control problems in discrete-continuous stochastic systems
7.1 Statement of the problem
7.2 Generalized solutions in observation control problems
7.3 Convex properties of observation control problems
7.4 Examples of observation control problems
7.5 Observation control problem with constrained number of the observation instants
8 Appendix. Differential equations with measures
8.1 Auxiliary results
8.2 Linear differential equations with a measure
8.3 Nonlinear differential equations with a measure

Subjects:
Mathematics
Difference equations
Functional equations
Differential equations
System theory
Calculus of variations
Calculus of Variations and Optimal Control; Optimization
Systems Theory, Control
Ordinary Differential Equations
Difference and Functional Equations


Office of Academic Resources, Chulalongkorn University, Phayathai Rd. Pathumwan Bangkok 10330 Thailand

Contact Us

Tel. 0-2218-2929,
0-2218-2927 (Library Service)
0-2218-2903 (Administrative Division)
Fax. 0-2215-3617, 0-2218-2907
