Introduction to the basic theory and methods for the optimization of control systems. Topics include matrix calculus, optimization with and without constraints, calculus of variations, dynamic programming with applications, optimal control of continuous and discrete systems, state estimation, and Kalman filters, with electrical engineering applications.
Units: 3.0