
LMS & more

1
i.i.d. and stationarity assumptions, as before!

2
Want to minimize the instantaneous value of the cost function:

[Figure: instantaneous cost plotted against time]
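The instantaneous cost referred to here is presumably the usual single-sample squared error (notation assumed, in the style of standard LMS derivations):

```latex
\mathcal{E}(n) \;=\; \tfrac{1}{2}\, e^{2}(n),
\qquad
e(n) \;=\; d(n) \;-\; \mathbf{w}^{\mathsf T}(n)\,\mathbf{x}(n)
```

i.e. the expectation in the mean-squared-error cost is replaced by the value at the current time step n.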

3
Using steepest descent

Instantaneous (single-sample) estimate of the gradient; the recursion gives LMS its memory of past data

Have you heard of stochastic gradient descent? What is the difference?

Instantaneous error minimization is achieved by a stochastic
approximation of steepest descent (the trajectory towards the
minimum is no longer smooth)
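The steepest-descent step with the single-sample gradient estimate can be sketched as follows (a minimal sketch; the function name and the step-size symbol eta are my own):

```python
import numpy as np

def lms_step(w, x, d, eta):
    """One LMS iteration: w(n+1) = w(n) + eta * e(n) * x(n),
    where e(n) = d(n) - w(n)^T x(n) is the instantaneous error."""
    e = d - w @ x              # instantaneous error on the current sample
    w_next = w + eta * e * x   # stochastic (single-sample) gradient step
    return w_next, e
```

Unlike exact steepest descent, which needs the true gradient of the mean-squared error, each step here uses only the current sample, so successive steps jitter around the steepest-descent direction.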
4
The distinct advantage of LMS:
With a large number of samples for adaptation, LMS stochastically
approaches the Wiener solution, even without
• the need to know the environment's statistics
• the need to store an infinitely large number of samples

It works with random initialization (or even an arbitrary starting
point); the choice of step size, however, is still an issue

5
In terms of the input X and the desired response d

Time-delay (storage) elements form the tapped delay line
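Expressing the filter in terms of X and d requires building the regressor x(n) from delayed copies of a scalar input stream; a minimal sketch (the helper name and the assumption of zero initial conditions are mine):

```python
import numpy as np

def regressor(u, n, M):
    """Tapped-delay-line input vector
    x(n) = [u(n), u(n-1), ..., u(n-M+1)]^T.
    The M-1 delay (storage) elements hold past samples;
    samples before time 0 are taken as zero."""
    x = np.zeros(M)
    for k in range(M):
        if n - k >= 0:
            x[k] = u[n - k]
    return x
```

Each call reads M stored samples, which is the "time delay, storage" cost per iteration noted on the slide.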

6
LMS vs. Wiener
Difference in the weights:

the optimum Wiener solution vs. the LMS approximation (estimate)

Wiener's error
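The "difference in the weights" can be written as a weight-error vector (a sketch in standard notation, with w_o the optimum Wiener solution and ŵ(n) the LMS estimate):

```latex
\boldsymbol{\varepsilon}(n) \;=\; \hat{\mathbf{w}}(n) \;-\; \mathbf{w}_o,
\qquad
e_o(n) \;=\; d(n) \;-\; \mathbf{w}_o^{\mathsf T}\,\mathbf{x}(n)
```

where e_o(n) is the Wiener (estimation) error, i.e. the error that remains even at the optimum.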
7
Noise perturbation
Update / transition matrix

Markov model of the LMS algorithm
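Substituting the weight-error vector ε(n) = ŵ(n) − w_o into the LMS update gives a stochastic difference equation (a sketch in standard notation, with e_o(n) the Wiener error):

```latex
\boldsymbol{\varepsilon}(n+1)
\;=\;
\underbrace{\bigl[\mathbf{I} \;-\; \eta\,\mathbf{x}(n)\mathbf{x}^{\mathsf T}(n)\bigr]}_{\text{transition matrix}}
\,\boldsymbol{\varepsilon}(n)
\;+\;
\underbrace{\eta\,\mathbf{x}(n)\,e_o(n)}_{\text{noise perturbation}}
```

The next state depends only on the current state and the current (random) input, which is what makes this a Markov model of the LMS algorithm.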

8
LMS: Statistical Analysis

Ergodicity
9
Diagonal matrix of eigenvalues
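This refers, presumably, to the eigendecomposition of the input correlation matrix, which decouples the analysis into the eigenvector directions (a sketch in standard notation):

```latex
\mathbf{R} \;=\; \mathbf{Q}\,\boldsymbol{\Lambda}\,\mathbf{Q}^{\mathsf T},
\qquad
\boldsymbol{\Lambda} \;=\; \operatorname{diag}(\lambda_1, \ldots, \lambda_M),
\qquad
0 \;<\; \eta \;<\; \frac{2}{\lambda_{\max}}
```

with the step-size bound on the right being the usual mean-convergence condition obtained mode by mode.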

10
Discrete-time version of the Langevin equation (from nonequilibrium thermodynamics)

The driving force is zero-mean and white

Nonequilibrium behavior of LMS: a Brownian motion around the
Wiener solution (even for small η)
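In the rotated coordinates ν(n) = Qᵀ ε(n) (with ε(n) the weight-error vector and Q the eigenvector matrix of R), each mode obeys a discrete-time Langevin equation (a sketch in standard notation):

```latex
\nu_k(n+1) \;=\; (1 - \eta\lambda_k)\,\nu_k(n) \;+\; \phi_k(n),
\qquad
\mathbb{E}[\phi_k(n)] = 0,\;\; \phi_k(n)\ \text{white}
```

The damping term (1 − ηλ_k) pulls each mode toward zero, while the white driving force φ_k(n) keeps perturbing it, so the weights never settle exactly at the Wiener solution but wander around it like a Brownian particle.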

Convergence characteristics of LMS

11
