

BOOK REVIEWS

STOCHASTIC OPTIMAL CONTROL: THEORY AND APPLICATION, Robert F. Stengel, Wiley, New York, 1986, ISBN 0-471-86462-5, Price £47.50, xvi + 638 pp.

This book gives an excellent general background to stochastic optimal control theory. Most previous books on the subject are difficult for students to understand. As the author states in the Preface, this book is intended to introduce stochastic optimal control concepts for application to real problems, with sufficient mathematical background to justify their use. It is suitable for readers who have a fairly basic mathematics background and are beginning to learn optimal control. The book provides rich introductory material, such as properties of matrix algebra and a review of control and estimation, in the first two chapters. The advanced reader might skip these chapters entirely and proceed directly to Chapter 3.

Chapter 3 addresses optimal control of systems which may be non-linear and time-varying and illustrates how open-loop control generalizes to closed-loop control laws when the system dynamics are linear and the cost function is quadratic.

Chapter 4 is a useful chapter for readers who are interested in optimal state estimation. It covers most of the general topics and avoids the more complex theoretical issues of optimal estimation for highly non-linear systems.

In Chapter 5 the author discusses the general problems of stochastic optimal control, such as linear quadratic design. The asymptotic stability of the linear quadratic regulator and of the Kalman-Bucy filter is illustrated with a nice example.
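The stability property praised here is easy to verify in the scalar case. The following sketch is not taken from the book: for a hypothetical plant x' = ax + bu with cost integral of (qx^2 + ru^2), the algebraic Riccati equation has a closed-form positive root, and the resulting closed-loop pole is always in the left half-plane, even when the open-loop plant is unstable.

```python
import math

def scalar_lqr(a, b, q, r):
    """Solve the scalar algebraic Riccati equation
    2*a*p + q - (b**2 / r) * p**2 = 0 for its positive root p,
    then return the optimal gain k and the closed-loop pole."""
    p = r * (a + math.sqrt(a**2 + b**2 * q / r)) / b**2
    k = b * p / r            # optimal state feedback u = -k*x
    pole = a - b * k         # closed loop: x' = (a - b*k) * x
    return k, pole

# Illustrative (hypothetical) numbers: an unstable plant, a > 0.
k, pole = scalar_lqr(a=1.0, b=1.0, q=1.0, r=1.0)
print(pole)  # pole = -sqrt(a**2 + b**2*q/r), strictly negative
```

Algebraically, a - b*k collapses to -sqrt(a^2 + b^2*q/r), which is negative for any q, r > 0, mirroring the book's asymptotic-stability claim for the linear quadratic regulator.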

Chapter 6 is the most important part for control engineers. It has updated information regarding linear multivariable control. The robustness properties of a control system are well explained in Section 6.5. Many interesting developments in control system analysis have also been mentioned.

The main text of the book runs to 622 pages, with references at the end of each chapter. This book is especially good for practical engineers looking for guidance with a particular problem. The comprehensive overview of different subjects provides the reader with a deeper understanding of optimal control before tackling the more advanced texts.

To conclude, this book is highly recommended for control engineers. It can be used as a textbook for undergraduate teaching, although the high price may put it beyond the reach of students.

DANIEL HO
Industrial Control Unit
University of Strathclyde
Glasgow, U.K.

MATHEMATICS FOR DYNAMIC MODELING, Edward Beltrami, Academic Press, Orlando, 1987, ISBN 0-12-085555-0, Price £20.50, 227 pp.

This book will be welcomed by everyone who is interested in non-linear modelling without being conversant in the formal language of mathematics. Only a few important theorems and proofs are employed in conveying the information. Instead, theory is put to work from the start: readers learning the steps of analysis are immediately rewarded with an insight into the mathematics of heart activity, as an example. By the end of the book, an understanding even of advanced concepts, like bifurcation, catastrophe and chaos, will appear within reach of the non-mathematician.

The majority of non-linear equations cannot be solved explicitly but can be analysed by surrogate methods. In the first four chapters, comprising about a third of the book, the reader is equipped with the basic set of mathematical tools. The remainder of the book then deals with problems which are still of interest in current research.

Simple differential equation models, familiar from physics, are recalled in Chapter 1. They arise from one of two principles: Newton’s laws of motion or the conservation of mass. The use of state space descriptions, trajectories and equilibrium states and the importance of stability are explained.

The next chapter proceeds to a stability analysis of linearized non-linear systems. We learn about eigenvectors and eigenvalues of the Jacobian matrix and their relationship to equilibria, saddle points and manifolds. Pendulum and mass-spring models illustrate the phase plane methods presented.
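The linearization idea can be sketched numerically for the pendulum mentioned above. This example is not from the book, and the parameter values are hypothetical: evaluating the Jacobian of a damped pendulum's vector field at its two equilibria shows the hanging position is asymptotically stable while the inverted position is a saddle point.

```python
import numpy as np

# Damped pendulum: theta'' = -(g/l)*sin(theta) - c*theta'
# State x = (theta, omega); parameter values chosen for illustration.
g_over_l, c = 9.81, 0.5

def jacobian(theta):
    """Jacobian of the pendulum vector field at the equilibrium (theta, 0)."""
    return np.array([[0.0, 1.0],
                     [-g_over_l * np.cos(theta), -c]])

eig_bottom = np.linalg.eigvals(jacobian(0.0))   # hanging equilibrium
eig_top = np.linalg.eigvals(jacobian(np.pi))    # inverted equilibrium

print(eig_bottom.real)  # both real parts negative: asymptotically stable
print(eig_top.real)     # one positive, one negative: a saddle point
```

The sign pattern of the eigenvalues' real parts is exactly the criterion the chapter develops for classifying equilibria of the linearized system.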

In Chapter 3 Liapunov’s theorem is proved in