Abstract
The problem of uniformly optimal control is posed in the context of linear time-varying discrete-time systems; the controller is optimal uniformly over a pre-specified set of exogenous signals. Existence of an optimal controller is proved and a formula for the minimum cost is derived. The time-invariant case is treated in the frequency domain. It is shown that for time-invariant systems an optimal time-varying controller is no better than an optimal time-invariant controller.
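One plausible way to read the uniform-optimality criterion described above is as a worst-case minimization over the pre-specified signal set. The sketch below is illustrative only: the set $\mathcal{W}$, the closed-loop map $T_{zw}(K)$, and the norm are assumed notation, not the authors' own formulation.

```latex
% Hedged sketch of a worst-case (uniform) optimality criterion
% over a pre-specified set W of exogenous signals.
% Notation is assumed for illustration; it is not taken from the paper.
\documentclass{article}
\usepackage{amsmath}
\begin{document}
\[
  \mu^{*}
    \;=\;
  \inf_{K\ \text{stabilizing}}
  \;\sup_{w \in \mathcal{W}}
  \bigl\lVert T_{zw}(K)\,w \bigr\rVert ,
\]
where $w$ ranges over the pre-specified set $\mathcal{W}$ of exogenous
signals, $T_{zw}(K)$ denotes the closed-loop map from $w$ to the
regulated output $z$, and a controller attaining $\mu^{*}$ would be
called uniformly optimal in this reading.
\end{document}
```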
| Original language | English |
| --- | --- |
| Pages (from-to) | 563-574 |
| Number of pages | 12 |
| Journal | Automatica |
| Volume | 21 |
| Issue number | 5 |
| State | Published - 1 Jan 1985 |
Keywords
- Multivariable control systems
- functional analysis
- optimal control
- regulator theory
- time-varying systems
ASJC Scopus subject areas
- Control and Systems Engineering
- Electrical and Electronic Engineering