The book is devoted to systems with discontinuous control. The study of discontinuous dynamic systems is a multifaceted problem embracing mathematical, control-theoretic and application aspects. Time and again, this problem has been approached by mathematicians, physicists and engineers, each profession treating it from its own standpoint. Interestingly, the results obtained by specialists in different disciplines have almost always had a significant effect on the development of control theory. It suffices to mention works on the theory of oscillations of discontinuous nonlinear systems, mathematical studies of ordinary differential equations with discontinuous right-hand sides, or variational problems in nonclassical settings. The unremitting interest in discontinuous control systems, enhanced by their effective application to problems most diverse in physical nature and functional purpose, is, in the author's opinion, a cogent argument for the importance of this area of study. It seems a useful effort to consider, from a control-theoretic viewpoint, the mathematical and application aspects of the theory of discontinuous dynamic systems and to determine their place within present-day control theory. The first attempt was made by the author in 1975-1976 in his courses "The Theory of Discontinuous Dynamic Systems" and "The Theory of Variable Structure Systems", given to postgraduates at the University of Illinois, USA, and then presented in 1978-1979 at seminars held in the Laboratory of Systems with Discontinuous Control at the Institute of Control Sciences in Moscow.
The lectures gathered in this volume present several different aspects of Mathematical Control Theory. Adopting the viewpoints of Geometric Control Theory and Nonlinear Control Theory, the lectures focus on aspects of the optimization and control of nonlinear, not necessarily smooth, dynamical systems. Specifically, three of the five lectures discuss, respectively, logic-based switching control, sliding mode control, and the input-to-state stability paradigm for the control and stabilization of nonlinear systems. The remaining two lectures are devoted to Optimal Control: one investigates the connections between Optimal Control Theory, Dynamical Systems and Differential Geometry, while the other presents a very general version, in a non-smooth context, of the Pontryagin Maximum Principle. The material of the volume is self-contained and is addressed to everyone working in Control Theory. It offers a sound presentation of the methods employed in the control and optimization of nonlinear dynamical systems.