Master's Degree Syllabi

Optimal Control for Managers

Mode of instruction: Seminar
Semester hours: 3
Credits: 3

Course topics:

Mathematical foundations: state-space representation of systems, continuous-time and discrete-time representations. Controllability, observability, stability.
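
As a point of reference for this topic (standard notation, assumed here rather than taken from the syllabus), the linear state-space models have the form

\[
\dot{x}(t) = A\,x(t) + B\,u(t), \qquad y(t) = C\,x(t) + D\,u(t) \quad \text{(continuous time)},
\]
\[
x_{k+1} = A\,x_k + B\,u_k, \qquad y_k = C\,x_k + D\,u_k \quad \text{(discrete time)},
\]

where x is the state vector, u the input, and y the output; the pair (A, B) is controllable when the controllability matrix [B, AB, ..., A^{n-1}B] has full rank.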

Least-squares method.
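
As a minimal sketch (standard notation, not quoted from the course material), the least-squares estimate for the linear model y = Hx + e minimizes \|y - Hx\|^2; when H has full column rank it is given by the normal equations

\[
H^{\top}H\,\hat{x} = H^{\top}y \qquad\Longrightarrow\qquad \hat{x} = \bigl(H^{\top}H\bigr)^{-1}H^{\top}y .
\]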

Necessary optimality conditions. Dynamic programming method. State feedback control, closed-loop systems. Linear quadratic regulator (LQR). Stabilization.
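
For reference (a standard formulation, assumed rather than quoted from the course material), the infinite-horizon continuous-time LQR problem minimizes

\[
J = \int_{0}^{\infty}\bigl(x^{\top}Q\,x + u^{\top}R\,u\bigr)\,dt, \qquad Q \succeq 0,\; R \succ 0,
\]

subject to \dot{x} = Ax + Bu; the optimal control is the state feedback u = -Kx with K = R^{-1}B^{\top}P, where P is the stabilizing solution of the algebraic Riccati equation

\[
A^{\top}P + PA - PBR^{-1}B^{\top}P + Q = 0 .
\]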

Bibliography

 

1. B. Anderson and J. Moore. Optimal Control: Linear Quadratic Methods. Prentice-Hall, 1989.

2. A. Ioffe (Editor), et al., Calculus of Variations and Optimal Control: Technion 1998 (Chapman & Hall/CRC Research Notes in Mathematics, No. 411), Paperback, 1999.

3. S. Barnett and R. G. Cameron, Introduction to Mathematical Control Theory, Paperback, 1991.

4. A. E. Bryson. Dynamic Optimization. Hardcover, 1998.

5. D. G. Luenberger, Introduction to Dynamic Systems: Theory, Models, and Applications, Hardcover, 1979.

6. A. Saberi, et al., Control of Linear Systems With Regulation and Input Constraints (Communications and Control Engineering), Hardcover, 1999.

7. G. M. Siouris. An Engineering Approach to Optimal Control and Estimation Theory, Paperback, 1996.

8. William S. Levine (Editor), The Control Handbook (Electrical Engineering Handbook Series), Hardcover, 1996.

9. William J. Palm III, Control Systems Engineering, Paperback, 1986.

10. J. Zabczyk, Mathematical Control Theory: An Introduction, Birkhäuser, 1992.

11. A. Balakrishnan and L. Neustadt, Mathematical Theory of Control, N.Y.: Springer, 1967.

12. A. Balakrishnan, Applied Functional Analysis, N.Y.: Springer, 1976.

13. M. Hestenes, Calculus of Variations and Optimal Control Theory, N.Y.: Wiley, 1966.

14. L. Neustadt, Optimization: A Theory of Necessary Conditions, N.Y.: Springer, 1976.

15. R. T. Rockafellar, Convex Analysis, 1973.