
 

Time inconsistent stochastic control

 

When: Thursday 25th February at 5.15pm
Where: B617, Leverhulme Library, Columbia House
Speaker: Tomas Bjork
From: Stockholm School of Economics
Abstract: We present a theory for stochastic control problems which, in various ways, are time inconsistent in the sense that they do not admit a Bellman optimality principle. We attack these problems by viewing them within a game-theoretic framework, and we look for subgame perfect Nash equilibrium points.
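
As an informal illustration (not part of the original abstract), the equilibrium notion commonly used in this literature can be sketched as follows; the notation $J$, $\hat u$, $u_h$ is introduced here for illustration only, and the precise formulation used in the talk may differ. A candidate control $\hat u$ is compared with a local perturbation $u_h$ that plays an arbitrary admissible control $u$ on a short interval and reverts to $\hat u$ afterwards:
\[
u_h(s,y) =
\begin{cases}
u(s,y), & t \le s < t+h,\\[2pt]
\hat u(s,y), & t+h \le s \le T,
\end{cases}
\qquad
\liminf_{h \downarrow 0} \frac{J(t,x,\hat u) - J(t,x,u_h)}{h} \;\ge\; 0 .
\]
If the inequality holds for every state $(t,x)$ and every admissible perturbation $u$, then $\hat u$ is called an equilibrium control and $V(t,x) = J(t,x,\hat u)$ the equilibrium value function.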

For a general controlled Markov process and a fairly general objective functional, we derive an extension of the standard Hamilton-Jacobi-Bellman equation, in the form of a system of non-linear equations, for the determination of the equilibrium strategy as well as the equilibrium value function. All known examples of time inconsistency in the literature are easily seen to be special cases of the present theory. We also prove that for every time inconsistent problem there exists an associated time consistent problem such that the optimal control and the optimal value function for the consistent problem coincide with the equilibrium control and value function, respectively, for the time inconsistent problem. We also study some concrete examples.
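
As a hedged illustration of the kind of time inconsistency involved, and not necessarily the concrete examples treated in the talk, two objective functionals that are standard in this literature and fail the Bellman principle are mean-variance utility and non-exponential (e.g. hyperbolic) discounting; the symbols $J$, $X^u$, $\gamma$, $\varphi$ and $C$ below are illustrative notation only:
\[
J(t,x,u) = \mathbb{E}_{t,x}\!\left[X^u_T\right] - \frac{\gamma}{2}\,\mathrm{Var}_{t,x}\!\left(X^u_T\right),
\qquad
J(t,x,u) = \mathbb{E}_{t,x}\!\left[\int_t^T \varphi(s-t)\, C(X^u_s, u_s)\, ds\right].
\]
In the first case the variance term is a nonlinear function of an expectation, and in the second the discount factor $\varphi$ depends on the current time $t$; either feature destroys the tower-property argument behind the standard dynamic programming principle.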

For further information: Sabina Allam (Postgraduate Administrator), Ext. 6879
Department of Statistics, Columbia House