Preface
1. Instituto de Ciencias Matemáticas, CSIC-UAM-UC3M-UCM, 28006 Madrid, Spain
2. Department of Fundamental Mathematics, Faculty of Mathematics, University of La Laguna, C/ Astrofísico Fco. Sánchez s/n, 38071 La Laguna, Tenerife, Canary Islands, Spain
3. Instituto de Ciencias Matemáticas, C/ Serrano 123, 28006 Madrid, Spain
In addition, the Hamilton-Jacobi-Bellman equation is a partial differential equation that is central to optimal control theory. It arises from the theory of dynamic programming, pioneered in the 1950s by Richard Bellman and his coworkers. The corresponding discrete-time equation is usually referred to as the Bellman equation. In continuous time, the result can be seen as an extension of earlier work in classical physics on the Hamilton-Jacobi equation by Hamilton and Jacobi.
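To make the connection concrete, a standard form of the equations mentioned above can be sketched as follows; the symbols $f$, $C$, $D$, and $V$ are generic choices for illustration, not notation fixed by this preface. For a finite-horizon problem with state dynamics $\dot{x} = f(x,u)$, running cost $C(x,u)$, and terminal cost $D(x)$, the Hamilton-Jacobi-Bellman equation for the value function $V(x,t)$ reads

\[
\frac{\partial V}{\partial t}(x,t) + \min_{u} \left\{ \nabla_x V(x,t) \cdot f(x,u) + C(x,u) \right\} = 0,
\qquad V(x,T) = D(x),
\]

while its discrete-time counterpart, the Bellman equation, takes the form

\[
V(x_k) = \min_{u_k} \left\{ C(x_k,u_k) + V\big(f(x_k,u_k)\big) \right\}.
\]

In the continuous-time equation, setting $C \equiv 0$ and removing the minimization recovers the classical Hamilton-Jacobi equation of mechanics, which is the sense in which the Bellman theory extends the earlier work of Hamilton and Jacobi.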
This special issue on Hamilton-Jacobi theory aims to bring together specialists from different areas of research and to show how useful Hamilton-Jacobi theory is in their domains: completely integrable systems, nonholonomic mechanics, the Schrödinger equation, optimal control theory, and, in particular, applications in engineering and economics.