Instituto de Ciencias Matemáticas, CSIC-UAM-UC3M-UCM, 28006 Madrid, Spain
Department of Fundamental Mathematics, Faculty of Mathematics, University of La Laguna, C/ Astrofísico Fco. Sánchez s/n, 38071 La Laguna, Tenerife, Canary Islands, Spain
Instituto de Ciencias Matemáticas, C/ Serrano 123, 28006 Madrid, Spain
The Hamilton-Jacobi-Bellman equation is a partial differential equation that is central to optimal control theory. It results from the theory of dynamic programming, pioneered in the 1950s by Richard Bellman and coworkers. The corresponding discrete-time equation is usually referred to as the Bellman equation. In continuous time, the result can be seen as an extension of earlier work in classical physics on the Hamilton-Jacobi equation by Hamilton and Jacobi.
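For concreteness, a standard form of the equations mentioned above can be sketched as follows (the notation is generic and not taken from this issue): for a finite-horizon problem with dynamics $\dot{x} = f(x,u)$, running cost $\ell(x,u)$, and terminal cost $\phi(x)$, the Hamilton-Jacobi-Bellman equation for the value function $V(x,t)$ reads

```latex
\frac{\partial V}{\partial t}(x,t)
  + \min_{u}\Bigl\{ \nabla_x V(x,t)\cdot f(x,u) + \ell(x,u) \Bigr\} = 0,
\qquad V(x,T) = \phi(x),
```

while its discrete-time analogue, the Bellman equation, takes the fixed-point form $V(x) = \min_{u}\{\ell(x,u) + V(f(x,u))\}$, solved backwards in time by dynamic programming.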
This special issue on Hamilton-Jacobi theory brings together specialists from different areas of research to show how useful Hamilton-Jacobi theory is in their domains: completely integrable systems, nonholonomic mechanics, the Schrödinger equation, optimal control theory, and, in particular, applications in engineering and economics.