This is one of the first books to present the stabilizability of nonlinear systems in a well-organized and detailed way, covering the problem itself, its motivation, its main features, and the principal results. It deals with control systems defined by ordinary differential equations, and many worked examples are included. The main focus is on the mathematical aspects of the problem, but some important applications are also described. The book is suitable as a textbook for advanced university courses and as a reference for control theorists and researchers. An extensive list of references is included.
This advanced textbook introduces the main concepts and advances in systems and control theory, and highlights the importance of geometric ideas in the context of possible extensions to more recent developments in nonlinear systems theory. Although inspired by engineering applications, the content is presented within a strong theoretical framework and with a solid mathematical background, and the reference models are always finite-dimensional, time-invariant, multivariable linear systems. The book focuses on the time-domain approach, but also considers the frequency-domain approach, discussing the relationship between the two, especially for single-input, single-output systems. It includes topics not usually addressed in similar books, such as a comparison between the frequency-domain and time-domain approaches, bounded-input bounded-output stability (including a characterization in terms of canonical decomposition), and static output feedback stabilization, for which a simple and original criterion in terms of generalized inverse matrices is proposed. The book is an ideal learning resource for graduate students of control theory and automatic control courses in engineering and mathematics, as well as a reference or self-study guide for engineers and applied mathematicians.
This book presents a modern and self-contained treatment of the Liapunov method for stability analysis, within the framework of mathematical nonlinear control theory. A particular focus is on the existence of Liapunov functions (converse Liapunov theorems) and their regularity, questions whose interest is especially motivated by applications to automatic control. Many recent results in this area have been collected and presented in a systematic way. Some of them are given in extended, unified versions with new, simpler proofs. In the second edition of this successful book, several new sections were added and existing sections improved, for example on Zubov's method, Liapunov functions for discontinuous systems, and cascaded systems. Many new examples, explanations, and figures were added, making the book accessible and readable for engineers as well as mathematicians.
This book offers a complete and detailed introduction to the theory of discrete dynamical systems, with special attention to the stability of fixed points and periodic orbits. It provides a solid mathematical background and the essential basic knowledge for further developments such as deterministic chaos theory, for which many other references are available (though sometimes without an exhaustive presentation of the preliminary notions). Readers will find a discussion of topics sometimes neglected in the research literature, such as a comparison between the predictions achievable by the discrete-time and continuous-time models of the same application. Another novel aspect of this book is a careful analysis of the ways a fixed point may lose stability, introducing and comparing several notions of instability: simple instability, repulsivity, and complete instability. To help the reader and to show the flexibility and potential of the discrete approach to dynamics, many examples, numerical simulations, and figures have been included. The book can be used as reference material for courses at the doctoral or upper undergraduate level in mathematics and theoretical engineering.
From the reviews: "The book is an excellent combination of theory and real-world applications. Each application not only demonstrates the power of the theoretical results but also is important on its own behalf." IEEE Control Systems Magazine