
Welcome to the Optimal-Control wiki!

Optimal control theory is a mathematical framework for finding a control strategy that steers a dynamic system so as to optimize a given performance criterion. This work presents a numerical analysis of optimal control problems, covering fundamental concepts, solution methods, and computational aspects. First, the basic concepts of optimal control are introduced to provide a foundation for formulating such problems. Then, various solution methods, including direct methods, indirect methods, dynamic programming, and data-driven approaches, are discussed. Finally, practical challenges such as computational complexity, scalability, handling of uncertainty, and robustness are addressed.
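
For reference, a standard continuous-time formulation (the Bolza form) of the problem these methods address is

$$
\min_{u(\cdot)} \ \Phi\bigl(x(t_f)\bigr) + \int_{t_0}^{t_f} L\bigl(x(t), u(t), t\bigr)\,dt
\quad \text{s.t.} \quad \dot{x}(t) = f\bigl(x(t), u(t), t\bigr), \qquad x(t_0) = x_0,
$$

where $x$ is the state, $u$ the control, $f$ the system dynamics, $L$ the running cost, and $\Phi$ the terminal cost.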

Solution methods for optimal control problems:

  • Calculus of Variations
  • Pontryagin's Maximum Principle
  • Dynamic Programming (a minimal sketch follows this list)
  • Numerical Optimization
  • Model Predictive Control
  • Machine Learning
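
As a concrete illustration of the dynamic-programming entry above, here is a minimal sketch that solves a finite-horizon, discrete-time LQR problem with the backward Riccati recursion. The double-integrator dynamics, horizon length, and cost weights are illustrative assumptions chosen for this example, not values taken from this repository.

```python
# Minimal dynamic-programming sketch: finite-horizon discrete-time LQR
# solved by the backward Riccati recursion. All problem data below are
# illustrative assumptions (double integrator, dt = 0.1 s, horizon N = 50).
import numpy as np

dt = 0.1
# Dynamics x_{k+1} = A x_k + B u_k (double integrator: position, velocity)
A = np.array([[1.0, dt],
              [0.0, 1.0]])
B = np.array([[0.5 * dt**2],
              [dt]])

# Quadratic cost: sum_k (x_k' Q x_k + u_k' R u_k) + x_N' Qf x_N
Q = np.eye(2)
R = np.array([[0.1]])
Qf = 10.0 * np.eye(2)
N = 50

# Backward pass: P is the cost-to-go matrix, K_k the optimal feedback gain
P = Qf
gains = []
for _ in range(N):
    K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
    P = Q + A.T @ P @ A - A.T @ P @ B @ K
    gains.append(K)
gains.reverse()  # gains[k] now applies at time step k

# Forward pass: roll out the optimal policy u_k = -K_k x_k from x_0
x = np.array([1.0, 0.0])
for k in range(N):
    u = -gains[k] @ x
    x = A @ x + B @ u
print("final state:", x)  # driven close to the origin
```

The backward pass computes the optimal cost-to-go and feedback gains once; the forward pass then applies them as a state-feedback law, which is exactly the two-stage structure that makes dynamic programming attractive when a closed-form recursion exists.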