Thomas F. Edgar (UT-Austin) RLS Linear Models Virtual Control Book 12/06
Recursive Least Squares Parameter Estimation
for Linear Steady State and Dynamic Models

Thomas F. Edgar
Department of Chemical Engineering
University of Texas
Austin, TX 78712
Outline
Static model, sequential estimation
Multivariate sequential estimation
Example
Dynamic discrete-time model
Closed-loop estimation
Least Squares Parameter Estimation
Linear Time Series Models

ref: P. C. Young, Control Engineering, p. 119, Oct. 1969

Scalar example (no dynamics):

model:  y = ax
data:  measurements $y^*$, where $y^* = ax + \epsilon$  ($\epsilon$: error)

Least squares estimate of a:

$$\min_{a} \sum_{i=1}^{k} \left( a x_i - y_i^* \right)^2 \qquad (1)$$
Simple Example

The analytical solution for the minimum (least squares) estimate is

$$\hat{a}_k = \underbrace{\left( \sum_{i=1}^{k} x_i^2 \right)^{-1}}_{p_k} \underbrace{\left( \sum_{i=1}^{k} x_i y_i^* \right)}_{b_k} \qquad (2)$$

$p_k$, $b_k$ are functions of the number of samples $k$.

This is the non-sequential (non-recursive) form.
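As a quick numerical check, Eq. (2) can be evaluated directly in a few lines of code. This is a sketch with made-up data, generated roughly from y = 2x with small measurement errors:

```python
# Non-sequential (batch) least squares for the scalar model y = a*x:
# a_hat = (sum of x_i^2)^(-1) * (sum of x_i * y_i*), Eq. (2).

def batch_least_squares(x, y):
    """Least squares estimate of a in y = a*x from all k samples at once."""
    return sum(xi * yi for xi, yi in zip(x, y)) / sum(xi * xi for xi in x)

# Hypothetical measurements y* = 2x + error
x = [1.0, 2.0, 3.0, 4.0]
y_meas = [2.1, 3.9, 6.2, 7.8]
a_hat = batch_least_squares(x, y_meas)   # close to the underlying slope 2
```

Note that each new sample forces both sums to be rebuilt over all k points, which is exactly what the sequential form avoids.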
Sequential or Recursive Form

To update based on a new data point $(y_k, x_k)$, in Eq. (2) let

$$p_k^{-1} = \sum_{i=1}^{k} x_i^2 = p_{k-1}^{-1} + x_k^2 \qquad (3)$$

and

$$b_k = \sum_{i=1}^{k} x_i y_i^* = b_{k-1} + x_k y_k^* \qquad (4)$$
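Eqs. (3)-(4) mean the estimate can be refreshed with O(1) work per new point by carrying the two sums forward. A minimal sketch, with hypothetical data generated roughly from y = 2x:

```python
# Sequential update of Eq. (2) via the running sums of Eqs. (3)-(4):
# p_k^{-1} accumulates x_k^2 and b_k accumulates x_k*y_k*, so each new
# point costs O(1) instead of re-summing all k points.

x = [1.0, 2.0, 3.0, 4.0]
y_meas = [2.1, 3.9, 6.2, 7.8]     # hypothetical measurements of y = 2x

p_inv, b = 0.0, 0.0
estimates = []
for xk, yk in zip(x, y_meas):
    p_inv += xk * xk              # Eq. (3)
    b += xk * yk                  # Eq. (4)
    estimates.append(b / p_inv)   # a_hat_k = p_k * b_k
```

The last entry of `estimates` equals the batch answer from Eq. (2); the earlier entries show how the estimate evolves as data arrive.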
Recursive Form for Parameter Estimation

$$\hat{a}_k = \hat{a}_{k-1} - K_k \underbrace{\left( x_k \hat{a}_{k-1} - y_k^* \right)}_{\text{estimation error}} \qquad (5)$$

where

$$K_k = p_{k-1} x_k \left( 1 + p_{k-1} x_k^2 \right)^{-1} \qquad (6)$$

To start the algorithm, need initial estimates $\hat{a}_0$ and $p_0$. To update $p$,

$$p_k = p_{k-1} - p_{k-1}^2 x_k^2 \left( 1 + p_{k-1} x_k^2 \right)^{-1} \qquad (7)$$

(Set $p_0$ = large positive number.)

Eqn. (7) shows $p_k$ is decreasing with $k$.
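Eqs. (5)-(7) translate directly into code. With a large p_0, the recursion reproduces the batch answer of Eq. (2) to within roughly 1/p_0 (a sketch; the data are made up, roughly y = 2x):

```python
# Scalar recursive least squares, Eqs. (5)-(7).

def rls_scalar(x, y, a0=0.0, p0=1e6):
    """Return (a_hat, p) after processing all (x_k, y_k*) pairs in order."""
    a, p = a0, p0                                      # initial guesses
    for xk, yk in zip(x, y):
        K = p * xk / (1.0 + p * xk * xk)               # gain, Eq. (6)
        a = a - K * (xk * a - yk)                      # estimate update, Eq. (5)
        p = p - p * p * xk * xk / (1.0 + p * xk * xk)  # Eq. (7): p shrinks
    return a, p

x = [1.0, 2.0, 3.0, 4.0]
y_meas = [2.1, 3.9, 6.2, 7.8]
a_rls, p_final = rls_scalar(x, y_meas)

# Batch answer from Eq. (2) for comparison
a_batch = sum(xi * yi for xi, yi in zip(x, y_meas)) / sum(xi * xi for xi in x)
```

The final p is much smaller than p_0, illustrating the monotone decrease noted under Eq. (7).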
Estimating Multiple Parameters (Steady State Model)

$$y = x^T a = a_1 x_1 + a_2 x_2 + \cdots + a_n x_n$$

$$\min_{a} \sum_{i=1}^{k} \left( x_i^T a - y_i^* \right)^2 \qquad (8)$$

(non-sequential solution requires an $n \times n$ matrix inverse)

To obtain a recursive form for $\hat{a}_k$, let

$$P_k^{-1} = P_{k-1}^{-1} + x_k x_k^T \qquad (9)$$

$$B_k = B_{k-1} + x_k y_k^* \qquad (10)$$
Recursive Solution

$$\hat{a}_k = \hat{a}_{k-1} - \underbrace{P_{k-1} x_k \left( 1 + x_k^T P_{k-1} x_k \right)^{-1}}_{K_k} \underbrace{\left( x_k^T \hat{a}_{k-1} - y_k^* \right)}_{\text{estimation error}} \qquad (11)$$

$$P_k = P_{k-1} - P_{k-1} x_k \left( 1 + x_k^T P_{k-1} x_k \right)^{-1} x_k^T P_{k-1} \qquad (12)$$

Need to assume $\hat{a}_0$ (vector) and $P_0$ (diagonal matrix):

$$P_0 = \begin{bmatrix} P_{11} & & & 0 \\ & P_{22} & & \\ & & \ddots & \\ 0 & & & P_{nn} \end{bmatrix}, \qquad P_{ii} \text{ large}$$
Simple Example (Estimate Slope & Intercept)

Linear parametric model:  y = a_1 + a_2 u

input, u:   0     1   2    3    4    10   12   18
output, y:  5.71  9   15   19   20   45   55   78
[Figure: Sequential vs. Non-sequential Estimation of a_2 Only (a_1 = 0)]

[Figure: Covariance Matrix vs. Number of Samples Using Eq. (12)]
[Figure: Sequential Estimation of a_1 and a_2 Using Eq. (11)]
Application to Digital Model and Feedback Control

Linear Discrete Model with Time Delay:

$$y(t) = a_1 y(t-1) + a_2 y(t-2) + \cdots + a_n y(t-n) + b_1 u(t-1-N) + b_2 u(t-2-N) + \cdots + b_r u(t-r-N) + d \qquad (13)$$

y: output   u: input   d: disturbance   N: time delay
Recursive Least Squares Solution

$$y(t) = \theta^T(t-1)\,\phi(t-1) + \epsilon(t) \qquad (14)$$

where

$$\phi^T(t-1) = \left[ y(t-1),\, y(t-2),\, \ldots,\, y(t-n),\, u(t-1-N),\, \ldots,\, u(t-r-N),\, 1 \right]$$

$$\theta^T(t-1) = \left[ a_1,\, a_2,\, \ldots,\, a_n,\, b_1,\, \ldots,\, b_r,\, d \right] \qquad (15)$$

"least squares":

$$\min_{\theta} \sum_{i=1}^{t} \left[ \phi^T(i-1)\,\theta(i) - y(i) \right]^2$$

$$\hat{\theta}(t) = \hat{\theta}(t-1) + P(t)\,\phi(t-1) \left[ y(t) - \underbrace{\phi^T(t-1)\,\hat{\theta}(t-1)}_{\text{predicted value of } y} \right] \qquad (16)$$
Recursive Least Squares Solution (continued)

$$P(t) = P(t-1) - P(t-1)\,\phi(t-1) \left[ \phi^T(t-1)\,P(t-1)\,\phi(t-1) + 1 \right]^{-1} \phi^T(t-1)\,P(t-1) \qquad (17)$$

$$K(t) = \frac{P(t-1)\,\phi(t-1)}{1 + \phi^T(t-1)\,P(t-1)\,\phi(t-1)} \qquad (18)$$

$$P(t) = \left[ I - K(t)\,\phi^T(t-1) \right] P(t-1) \qquad (19)$$

$$\hat{\theta}(t) = \hat{\theta}(t-1) + K(t) \left[ y(t) - \hat{y}(t) \right] \qquad (20)$$

K: Kalman filter gain
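The gain form of Eqs. (18)-(20) is convenient to code because no matrix inverse is needed, only the scalar denominator. The sketch below identifies an assumed first-order instance of Eq. (13) (n = 1, r = 1, N = 0, with illustrative "true" values a1 = 0.7, b1 = 0.5, d = 1.0) from simulated noise-free data:

```python
# RLS in Kalman-gain form, Eqs. (18)-(20), applied to the model
# y(t) = a1*y(t-1) + b1*u(t-1) + d, i.e. phi = [y(t-1), u(t-1), 1].

import random

def rls_step(theta, P, phi, y):
    """One recursion of Eqs. (18)-(20); phi, theta are lists, P a list of rows."""
    n = len(phi)
    Pphi = [sum(P[i][j] * phi[j] for j in range(n)) for i in range(n)]
    denom = 1.0 + sum(phi[i] * Pphi[i] for i in range(n))  # 1 + phi^T P phi
    K = [v / denom for v in Pphi]                          # Eq. (18)
    y_hat = sum(phi[i] * theta[i] for i in range(n))       # predicted y
    theta = [theta[i] + K[i] * (y - y_hat) for i in range(n)]  # Eq. (20)
    # Eq. (19): P <- (I - K phi^T) P; P is symmetric, so phi^T P = (P phi)^T
    P = [[P[i][j] - K[i] * Pphi[j] for j in range(n)] for i in range(n)]
    return theta, P

random.seed(1)
a1, b1, d = 0.7, 0.5, 1.0                  # assumed "true" plant parameters
theta = [0.0, 0.0, 0.0]                    # estimates of [a1, b1, d]
P = [[1e4 if i == j else 0.0 for j in range(3)] for i in range(3)]

y_prev = 0.0
for _ in range(200):
    u_prev = random.uniform(-1.0, 1.0)     # persistently exciting input
    y = a1 * y_prev + b1 * u_prev + d      # simulate Eq. (13)
    phi = [y_prev, u_prev, 1.0]
    theta, P = rls_step(theta, P, phi, y)
    y_prev = y
```

With noise-free data and an exciting input, theta converges to the true parameters; in practice the excitation and noise issues discussed under closed-loop estimation below become decisive.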
Closed-Loop RLS Estimation

There are three practical considerations in the implementation of parameter estimation algorithms:
- covariance resetting
- variable forgetting factor
- use of a perturbation signal
Enhance the sensitivity of least squares estimation algorithms with a forgetting factor $\lambda$:

$$J(t) = \sum_{i=1}^{t} \lambda^{t-i} \left[ \phi^T(i-1)\,\theta(i) - y(i) \right]^2 \qquad (21)$$

$$P(t) = \frac{1}{\lambda} \left[ P(t-1) - P(t-1)\,\phi(t-1) \left[ \phi^T(t-1)\,P(t-1)\,\phi(t-1) + \lambda \right]^{-1} \phi^T(t-1)\,P(t-1) \right] \qquad (22)$$

$$\hat{\theta}(t) = \hat{\theta}(t-1) + P(t)\,\phi(t-1) \left[ y(t) - \phi^T(t-1)\,\hat{\theta}(t-1) \right] \qquad (23)$$

$\lambda < 1$ prevents elements of $P$ from becoming too small (improves sensitivity), but noise may lead to incorrect parameter estimates.

$\lambda \approx 0.98$ typical; $0 < \lambda < 1.0$; $\lambda = 1.0$: all data weighted equally.
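A scalar sketch of Eqs. (22)-(23), where the bracketed inverse reduces to a division. The "true" parameter is assumed (for illustration) to jump from 2 to 3 halfway through the data; with λ = 0.95 the estimate follows the change, while λ = 1.0 settles near the overall average:

```python
# Scalar forgetting-factor RLS, Eqs. (22)-(23): old data are discounted
# by lam each step, so p stays away from zero and the estimate can track
# a time-varying parameter.

def rls_forget(xs, ys, lam, a0=0.0, p0=1e6):
    a, p = a0, p0
    for x, y in zip(xs, ys):
        p = (p - p * p * x * x / (x * x * p + lam)) / lam   # Eq. (22), scalar
        a = a + p * x * (y - x * a)                         # Eq. (23), scalar
    return a

# Hypothetical data: y = a*x with x = 1 and a stepping from 2.0 to 3.0
xs = [1.0] * 100
ys = [2.0] * 50 + [3.0] * 50

a_forget = rls_forget(xs, ys, lam=0.95)   # tracks the change (near 3)
a_equal = rls_forget(xs, ys, lam=1.0)     # weights all data equally (near 2.5)
```

This is exactly the sensitivity trade-off stated above: discounting old data lets the estimator follow real parameter changes, at the price of amplifying the effect of noise.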
Closed-Loop Estimation (RLS)

A perturbation signal is added to the process input (via the set point) to excite the process dynamics:
- large signal: good parameter estimates but large errors in the process output
- small signal: better control but more sensitivity to noise

Guidelines (Vogel and Edgar, Comp. Chem. Engr., Vol. 12, pp. 15-26, 1988):
1. Set the forgetting factor $\lambda$ = 1.0.
2. Use covariance resetting (add a diagonal matrix D to P when tr(P) becomes small).
3. Use a PRBS perturbation signal only when the estimation error is large and P is not small. Vary the PRBS amplitude with size proportional to tr(P).
4. Set P(0) = 10^4 I.
5. Filter new parameter estimates:

$$\hat{\theta}_c(k) = \gamma\,\hat{\theta}_c(k-1) + (1-\gamma)\,\hat{\theta}(k)$$

$\gamma$: tuning parameter ($\hat{\theta}_c$ is used by the controller)
6. Use other diagnostic checks, such as the sign of the computed process gain.
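Guideline 5 is a first-order (exponential) filter on the estimates. A minimal sketch; the value of γ and the numbers are illustrative, not from the source:

```python
# Guideline 5: filter raw RLS estimates before the controller uses them.
# theta_c(k) = gamma*theta_c(k-1) + (1 - gamma)*theta_hat(k)

def filter_estimates(theta_c_prev, theta_hat, gamma):
    """Blend the previous controller parameters with the newest estimates."""
    return [gamma * c + (1.0 - gamma) * h
            for c, h in zip(theta_c_prev, theta_hat)]

theta_c = [0.70, 0.50]       # parameters currently used by the controller
theta_hat = [0.90, 0.30]     # hypothetical new (noisy) RLS estimates
theta_c = filter_estimates(theta_c, theta_hat, gamma=0.8)
# only 20% of the jump in the raw estimates is passed through
```

A larger γ smooths more aggressively but makes the controller slower to respond to genuine parameter changes.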