Instituto de Ciencias Matemáticas, CSIC-UAM-UC3M-UCM, 28006 Madrid, Spain
Department of Fundamental Mathematics, Faculty of Mathematics, University of La Laguna, C/ Astrofísico Fco. Sánchez s/n, 38071, La Laguna, Tenerife, Canary Islands, Spain
Instituto de Ciencias Matemáticas, C/ Serrano 123, 28006 Madrid, Spain
In addition, the Hamilton-Jacobi-Bellman equation is a partial differential equation that is central to optimal control theory. It arises from the theory of dynamic programming, pioneered in the 1950s by Richard Bellman and coworkers; the corresponding discrete-time equation is usually referred to as the Bellman equation. In continuous time, the result can be seen as an extension of earlier work in classical physics on the Hamilton-Jacobi equation by Hamilton and Jacobi.
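For reference, the Hamilton-Jacobi-Bellman equation can be written in a standard form (the notation below is a common convention, not fixed by this preface): with value function $V$, dynamics $\dot{x} = f(x,u)$, running cost $\ell$, and terminal cost $\varphi$,

```latex
% Continuous-time Hamilton-Jacobi-Bellman equation (standard form,
% notation assumed: V value function, f dynamics, \ell running cost)
\frac{\partial V}{\partial t}(x,t)
  + \min_{u}\Bigl\{ \nabla_x V(x,t)\cdot f(x,u) + \ell(x,u) \Bigr\} = 0,
\qquad V(x,T) = \varphi(x).

% Its discrete-time counterpart, the Bellman equation:
V(x) = \min_{u}\bigl\{ \ell(x,u) + V\bigl(f(x,u)\bigr) \bigr\}.
```

Setting $\ell = 0$ and replacing the minimized term by a Hamiltonian recovers the classical Hamilton-Jacobi equation, which is the sense in which the HJB equation extends the earlier work of Hamilton and Jacobi.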
This special issue on Hamilton-Jacobi theory brings together specialists from different areas of research to show how useful Hamilton-Jacobi theory is in their domains: completely integrable systems, nonholonomic mechanics, the Schrödinger equation, optimal control theory, and, in particular, applications in engineering and economics.