
Adaptive Filters

Introduction

Dr. Yogananda Isukapalli


Introduction
The principal topic is adaptive filters. This introduction focuses
on the key issues, the background of adaptive filters, and some
of their applications.
Contents

• Filters
• Adaptive filters
• Linear filter structures
• Adaptive algorithms
• Applications
• Historical notes
Filter
According to Merriam-Webster, a filter is
a device or material for suppressing or minimizing waves
or oscillations of certain frequencies (as of electricity, light,
or sound).

In the signal processing context,

a filter is a system designed to extract information
about a prescribed quantity of interest from corrupted
data. A filter is also called an estimator, for the reasons
explained below.
Common forms of data corruption
Intersymbol interference: occurs if the channel impulse
response is not an impulse.

Noise: some form of noise is present at the output of every
communication channel.

Estimation (or filter) theory is statistical in nature due to the
unavoidable presence of noise.
Basic forms of Estimation
Filtering, Smoothing and Prediction
• Filtering involves the use of data samples up to
the time of interest t (including the sample at t).
• A smoother also uses future samples, i.e., samples at
t′ + 1, t′ + 2, and so on, for estimating the quantity of
interest at time t′.
• A predictor uses only the past samples, up to
discrete time t, for estimating the quantity of
interest at the future time t + τ.
The difference between the three kinds of estimation is
clearly illustrated in the figure above.
Types of filters
Linear Filters
• A filter is linear if its output is a linear function of the
input, i.e., if
Input → Output
x1 → y1
x2 → y2
a1 x1 + a2 x2 → a1 y1 + a2 y2
This is the principle of superposition.

If a filter doesn't obey the above principle, it is called a
"nonlinear filter". The course mainly focuses on linear
filters. A numerical check of superposition is sketched below.
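As a quick illustration, here is a minimal sketch in Python (the impulse response and test signals are arbitrary examples chosen for this note, not taken from the slides) verifying that a fixed FIR filter obeys superposition:

```python
import numpy as np

# Arbitrary fixed FIR impulse response and two random test inputs.
h = np.array([0.5, 0.3, 0.2])
x1 = np.random.randn(100)
x2 = np.random.randn(100)
a1, a2 = 2.0, -3.0

y1 = np.convolve(x1, h)
y2 = np.convolve(x2, h)
y_combined = np.convolve(a1 * x1 + a2 * x2, h)

# For a linear filter the two results agree (up to floating-point error).
assert np.allclose(y_combined, a1 * y1 + a2 * y2)
```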
The linear optimum filters
• In the statistical approach to the problem, parameters
such as the mean and correlation are assumed to be known.
• The filter is designed according to some statistical
criterion, such as minimizing the mean square error.
• For a stationary environment, the resulting filter is
known as the "Wiener filter".
• A highly successful solution for the nonstationary
environment (Markov model) is the "Kalman filter".
The course deals with discrete-time signals only; the extension
to continuous-time signals is straightforward in most cases.
Adaptive Filters
An adaptive filter is essentially a digital filter with self-
adjusting characteristics. It adapts, automatically, to
changes in its input signals.
Why adaptive filters?
Contamination of a signal of interest by other unwanted,
often larger signals or noise is a problem encountered in
many applications. When the signal and noise occupy fixed
and separate frequency bands, conventional linear FIR filters
with fixed coefficients can be used to extract the signal.
But when there is a spectral overlap between the signal
and the noise, as shown in the figure below, or if the band
occupied by the noise is unknown or varies with time,
fixed-coefficient filters are inappropriate.

Fig 2. Spectral overlap between the desired signal spectrum and the interference spectrum.


Adaptive filters are generally used in the following contexts:
• When it is necessary for the filter characteristics to be
variable, adapting to changing conditions.
• When there is a spectral overlap between the signal
and the noise.
• When the band occupied by the noise is unknown or
varies with time.
Strictly speaking, adaptive filters are nonlinear: their
coefficients are adjusted in a way that depends on the input
data, so the overall input-output mapping does not obey
superposition.
Another important topic of the course, "neural networks",
deals with nonlinear adaptive filters.
Performance Measures of Adaptive Filters
• Rate of convergence
• Misadjustment
• Tracking
-- Performance in a nonstationary environment
• Robustness
-- Impact of small disturbances
• Computational complexity
• Structure
-- Modularity, parallelism
• Numerical properties
-- Numerical stability and accuracy
-- Impact of quantization.
Filter Structures
• Conventional adaptive filters are linear.
• Nonlinear adaptive filters:
-- Volterra-based
-- Neural networks
• Structure of the filter:
Finite impulse response (FIR)
-- Transversal
-- Lattice
-- Systolic array
Infinite impulse response (IIR)
• Adaptive algorithm.
Transversal Filter
A transversal filter, or tapped-delay-line filter, consists of
• Unit-delay elements
• Multipliers
• Adders

Filter output: $y(n) = \sum_{k=0}^{M-1} w_k^* \, u(n-k) = \mathbf{w}^H \mathbf{u}(n)$

where $\mathbf{w} = [w_0, w_1, \ldots, w_{M-1}]^T$ and
$\mathbf{u}(n) = [u(n), u(n-1), \ldots, u(n-M+1)]^T$.

The transversal filter is the most commonly used structure;
a minimal implementation is sketched below.


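The following is a minimal sketch of the transversal structure in Python, computing y(n) = w^H u(n) directly (the weights and input here are hypothetical examples):

```python
import numpy as np

def transversal_filter(w, u_vec):
    """One output sample of a transversal filter: y(n) = w^H u(n)."""
    # np.vdot conjugates its first argument, so this computes w^H u(n).
    return np.vdot(w, u_vec)

# Slide the tapped-delay line over a sequence (hypothetical weights).
M = 3
w = np.array([0.4, 0.25, 0.1])
u = np.random.randn(50)
y = [transversal_filter(w, u[n - M + 1:n + 1][::-1])  # [u(n), ..., u(n-M+1)]
     for n in range(M - 1, len(u))]
```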
Multistage Lattice Filter

• A predictor consisting of several individual stages (lattices).

$\kappa_m$ is the mth reflection coefficient and $u(n)$ is the
predictor input at the nth instant.
Forward prediction error: $f_m(n) = f_{m-1}(n) + \kappa_m^* \, b_{m-1}(n-1)$
Backward prediction error: $b_m(n) = b_{m-1}(n-1) + \kappa_m \, f_{m-1}(n)$
• A correlated input sequence drawn from a statistical process
gives uncorrelated backward prediction errors, which
reduces the number of computations required; a sketch of
the stage recursion follows.
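A minimal sketch of the stage recursion above, assuming the reflection coefficients are already known (in an adaptive lattice they would themselves be updated):

```python
import numpy as np

def lattice_prediction_errors(u, kappa):
    """Propagate f_m(n) and b_m(n) through M lattice stages.

    u     : input sequence u(n)
    kappa : reflection coefficients [kappa_1, ..., kappa_M]
    """
    u = np.asarray(u, dtype=complex)
    f = u.copy()                       # stage 0: f_0(n) = u(n)
    b = u.copy()                       # stage 0: b_0(n) = u(n)
    for k in kappa:
        b_delayed = np.concatenate(([0.0], b[:-1]))   # b_{m-1}(n-1)
        # f_m(n) = f_{m-1}(n) + conj(kappa_m) * b_{m-1}(n-1)
        # b_m(n) = b_{m-1}(n-1) + kappa_m * f_{m-1}(n)
        f, b = f + np.conj(k) * b_delayed, b_delayed + k * f
    return f, b
```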
Systolic Array

A triangular systolic array gives the optimum solution.
• A structure oriented toward practical applications
• Capable of implementing matrix-vector products
Applications:
-- Matrix triangularization
-- Matrix equation solving (back-substitution)
• Efficient VLSI implementation due to
-- Modularity
-- Local interconnections
-- Pipelined and synchronized processing.
IIR Filter
FIR filters are usually preferred in adaptive filtering because
they are inherently stable, whereas the poles of an adaptive
IIR filter can wander outside the unit circle during adaptation.
The figure below shows a general IIR structure.
Adaptive Algorithms
Stochastic gradient algorithms
• This approach uses a transversal filter (tapped-delay line).
• The cost function is based on a statistical model (the mean-
squared-error sense).
• Iterative minimization proceeds in the direction of the
negative gradient.
– Solution to the filter computation problem
• A stochastic approximation is used for the gradient.
– Solution to the statistical parameter estimation problem
– The simplest approximation (the LMS algorithm) is based on
a reference signal (also called a training sequence); a sketch
of LMS follows.
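A minimal LMS sketch in Python (real-valued case; the step size and filter length are illustrative choices, not prescribed by the slides):

```python
import numpy as np

def lms(u, d, M, mu):
    """LMS adaptation of an M-tap transversal filter.

    u  : input sequence
    d  : desired (reference/training) sequence
    mu : step size of the stochastic-gradient descent
    """
    u, d = np.asarray(u, dtype=float), np.asarray(d, dtype=float)
    w = np.zeros(M)
    e = np.zeros(len(u))
    for n in range(M - 1, len(u)):
        u_n = u[n - M + 1:n + 1][::-1]   # tap inputs [u(n), ..., u(n-M+1)]
        y = w @ u_n                      # filter output
        e[n] = d[n] - y                  # estimation error
        w = w + mu * e[n] * u_n          # gradient-descent weight update
    return w, e
```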
Least squares estimation
• The cost function is defined as the sum of weighted
error squares.
• Minimizes the error of the filter output with respect to
a reference signal (training).
– No statistical model is directly involved.
• Recursive computation simplifies the implementation:
recursive least-squares (RLS) algorithms.
– Standard RLS
-- Based on the matrix inversion lemma; numerically unstable
– Square-root RLS
-- Based on QR decomposition; numerically stable
– Fast RLS
-- Less computation, by exploiting the matrix structure.
A sketch of the standard RLS recursion follows.
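A minimal sketch of the standard RLS recursion (matrix-inversion-lemma form, real-valued case; the forgetting factor and initialization below are common but illustrative choices):

```python
import numpy as np

def rls(u, d, M, lam=0.99, delta=100.0):
    """Standard RLS for an M-tap transversal filter.

    lam   : exponential forgetting factor weighting past errors
    delta : initial scale of P(0) = delta * I, the inverse correlation matrix
    """
    u, d = np.asarray(u, dtype=float), np.asarray(d, dtype=float)
    w = np.zeros(M)
    P = delta * np.eye(M)
    for n in range(M - 1, len(u)):
        u_n = u[n - M + 1:n + 1][::-1]
        k = P @ u_n / (lam + u_n @ P @ u_n)    # gain vector
        e = d[n] - w @ u_n                     # a priori error
        w = w + k * e
        P = (P - np.outer(k, u_n @ P)) / lam   # rank-one update of P(n)
    return w
```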
Applications
Four basic classes of adaptive filtering applications
I. Identification
In this class, adaptive filters are used to provide a linear
model that represents the best fit to an unknown plant.

Applications: System identification and Layered-earth modeling
II. Inverse modeling
In a linear system, the inverse model has a transfer
function equal to the reciprocal of the plant's transfer
function, such that the combination gives an ideal
transmission system.

Application: Equalization
III. Prediction
In this class, the adaptive filter is used to
provide the best prediction of the present value of a
random signal.

Applications: Predictive coding and Spectrum analysis.


IV. Interference Cancellation
In this class, adaptive filters are used to cancel
unknown interference contained in a primary signal,
with the cancellation being optimized in some sense.
The primary signal serves as the desired response for
the adaptive filter.

Applications: Noise cancellation and Beamforming


Adaptive Equalization
• Adaptive equalization is used to remove intersymbol
interference (ISI).
• Adaptation is based on a reference signal obtained from
• Training (a sketch follows this list)
– Unimodal error surface
• Decision direction
– Multimodal error surface
• Blind equalization
– Higher-order statistics
– Cyclostationarity.
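A training-mode sketch (the channel, noise level, equalizer length, and decision delay below are illustrative assumptions, not values from the slides): a known symbol sequence is passed through a dispersive channel, and the equalizer adapts via LMS so its output matches a delayed copy of the transmitted symbols.

```python
import numpy as np

np.random.seed(0)
symbols = np.sign(np.random.randn(2000))            # BPSK training sequence
channel = np.array([0.3, 1.0, 0.3])                 # toy ISI channel
received = np.convolve(symbols, channel)[:len(symbols)]
received += 0.01 * np.random.randn(len(symbols))    # channel noise

M, mu, delay = 11, 0.01, 6    # equalizer length, step size, decision delay
w = np.zeros(M)
for n in range(M - 1, len(received)):
    u_n = received[n - M + 1:n + 1][::-1]
    e = symbols[n - delay] - w @ u_n    # error against the training symbol
    w += mu * e * u_n                   # LMS update
```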
Adaptive Differential Pulse-Code Modulation
• Predicts the new value of the signal waveform.
• Quantizes only the uncorrelated information (the innovation
process), as sketched below:
-- Reduction in the number of bits
-- Data compression.
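A minimal DPCM-style sketch of this idea (with a fixed predictor for brevity; in ADPCM the predictor coefficients and quantizer are themselves adapted):

```python
import numpy as np

def dpcm_encode(x, a, step):
    """Predict each sample from past decoded samples; quantize the residual.

    a    : fixed predictor coefficients (hypothetical values)
    step : uniform quantizer step size
    """
    M = len(a)
    decoded = np.zeros(len(x))            # reconstruction the decoder also sees
    codes = np.zeros(len(x), dtype=int)
    for n in range(M, len(x)):
        pred = a @ decoded[n - M:n][::-1]            # prediction from the past
        codes[n] = int(round((x[n] - pred) / step))  # quantized innovation
        decoded[n] = pred + codes[n] * step
    return codes, decoded
```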
Adaptive Spectrum Estimation
• A parametric model of a stochastic process.
• Linear filter (autoregressive) model:
input: white noise; output: the observed signal.
The model parameters (filter coefficients) are found by
an adaptive algorithm, as sketched below.
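A minimal sketch, using LMS to adapt a one-step linear predictor (the order and step size are illustrative assumptions); the fitted coefficients parameterize the AR model spectrum $\sigma^2 / |1 - \sum_k a_k e^{-j\omega k}|^2$:

```python
import numpy as np

def ar_lms(u, order, mu):
    """Adapt AR coefficients a so that u(n) ≈ sum_k a_k u(n-k)."""
    u = np.asarray(u, dtype=float)
    a = np.zeros(order)
    for n in range(order, len(u)):
        past = u[n - order:n][::-1]    # [u(n-1), ..., u(n-order)]
        e = u[n] - a @ past            # prediction error (innovation)
        a += mu * e * past             # LMS update of the model parameters
    return a
```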
Adaptive Noise Cancellation
• The reference output is uncorrelated with the desired signal,
but correlated with the noise.
• If the correlation is unknown, an adaptive algorithm is needed
(a two-sensor sketch follows the application list below).
• Sensors are separated in space, time, or frequency.
Applications
• Electrocardiography (ECG)
• Acoustic noise in speech
• Adaptive speech enhancement.
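A two-sensor sketch (the signal, noise path, filter length, and step size are illustrative assumptions): the primary sensor carries signal plus noise, the reference sensor a correlated noise-only measurement; LMS shapes the reference to match the noise in the primary, and the error output is the cleaned signal.

```python
import numpy as np

np.random.seed(1)
N, M, mu = 5000, 8, 0.005
s = np.sin(2 * np.pi * 0.01 * np.arange(N))         # desired signal
v = np.random.randn(N)                              # noise source
primary = s + np.convolve(v, [0.8, 0.4], 'same')    # signal + filtered noise
reference = v                                       # noise-only reference

w = np.zeros(M)
cleaned = np.zeros(N)
for n in range(M - 1, N):
    r = reference[n - M + 1:n + 1][::-1]
    cleaned[n] = primary[n] - w @ r    # error = estimate of the clean signal
    w += mu * cleaned[n] * r           # LMS update toward the noise path
```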
Adaptive Beamforming
• Multiple sensors in space
-- Result in spatial sampling.
• By appropriate selection of the combining gains, the beam
can be steered (see the sketch after the application list below).
-- An alternative to mechanical steering
• In a changing or unknown environment, adaptive
algorithms are needed.
Application areas:
• Radar, sonar, radio communications
• Geophysical exploration
• Astrophysical exploration
• Biomedical signal processing.
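A narrowband sketch for a uniform linear array (the element count and spacing are illustrative assumptions), showing how a choice of combining weights steers the beam electronically rather than mechanically:

```python
import numpy as np

num_sensors, d_over_lambda = 8, 0.5    # elements, spacing in wavelengths

def steering_vector(theta_rad):
    """Array response vector a(theta) for a plane wave from angle theta."""
    k = np.arange(num_sensors)
    return np.exp(-2j * np.pi * d_over_lambda * k * np.sin(theta_rad))

def beam_gain(w, theta_rad):
    """|w^H a(theta)|: gain of combining weights w toward angle theta."""
    return abs(np.vdot(w, steering_vector(theta_rad)))

w = steering_vector(np.deg2rad(20)) / num_sensors   # steer toward 20 degrees
print(beam_gain(w, np.deg2rad(20)))    # = 1: unity gain in the look direction
print(beam_gain(w, np.deg2rad(-40)))   # attenuated off the look direction
```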
Historical Notes
• Linear estimation theory
-Method of least squares by Gauss in 1795
-Minimum mean-squared-error estimation in the late 1930s and early
1940s
-Discrete-time Wiener-Hopf equation by Levinson in 1947
-Kalman filter by Swerling in 1958 and by Kalman in 1960.
• Stochastic gradient algorithms in late 1950s
-Stochastic approximation by Robbins and Monro in 1951
-LMS algorithm by Widrow and Hoff in 1959
-Gradient adaptive lattice (GAL) algorithm by Griffiths in 1977-78
• Recursive least-squares algorithms
-Standard RLS algorithm by Plackett in 1950
-Kalman-filter-based Godard algorithm by Godard in 1974
-Exact relationship between RLS and the Kalman filter by Sayed &
Kailath in 1994
-QR-decomposition-based systolic array by Gentleman and
Kung in 1981
-Fast RLS algorithms in the 1970s, in particular by Morf in 1974
• Neural networks
-Logical calculus for neural networks by McCulloch & Pitts in
1943
-Perceptron by Rosenblatt in 1958
-Back-propagation algorithm to train multilayer perceptrons by
Rumelhart, Hinton & Williams in 1986
– Actually already in Werbos's PhD thesis in 1974
-Radial basis function network by Broomhead & Lowe in 1988
– Idea already by Bashkirov, Braverman & Muchnick in
1964.
• Applications
-Adaptive equalization in the 1960s
– Zero-forcing equalizer by Lucky in 1965
– MMSE equalizer by Gersho in 1969 and Proakis & Miller in
1969
-- LMS analysis by Ungerboeck in 1972
– Godard algorithm by Godard in 1974
– Fractionally spaced equalizer (FSE) by Brady in 1970
– Decision-feedback equalizer by Austin in 1967 and MMSE by
Monsen in 1971
• Speech coding
– Maximum-likelihood speech prediction by Saito & Itakura in
1966
– Linear predictive coding (LPC) by Atal in 1970 and Atal &
Hanauer in 1971
– Adaptive lattice predictor by Makhoul & Cosell in 1981
• Spectrum analysis
– Basics in the early 1900s
– Maximum entropy method by Burg in 1967
– Method of multiple windows by Thomson in 1982
• Adaptive noise cancellation started around 1965
• Adaptive beamforming
– Intermediate-frequency (IF) sidelobe canceler by Howells in the
late 1950s
– Control law (maximum SINR) for adaptive array antennas by
Applebaum in 1966
– Application of the LMS algorithm by Widrow et al. in 1967
– Minimum variance distortionless response (MVDR) beamformer
by Capon in 1969
– Simplified Gentleman-Kung systolic array for RLS estimation by
McWhirter in 1983.
