MTH3230 Exam Sheet

Sequence Definitions
\delta = (\ldots, 0, \underbrace{1}_{\delta_0}, 0, \ldots), \qquad U = (\ldots, 0, \underbrace{1}_{U_0}, 1, \ldots)

Periodic Inputs \implies \exists (a \in \mathbb{C}^N):
x_n = \sum_{k=0}^{N-1} a_k e^{i 2\pi k n / N} \implies y_n = \sum_{k=0}^{N-1} H(e^{i 2\pi k / N})\, a_k e^{i 2\pi k n / N}

Identity System
(I \circ x)_n = x_n

Backshift Operator
(B \circ x)_n = x_{n-1}

ARMA(p, q)
X_n - \sum_{j=1}^{p} \phi_j X_{n-j} = Z_n + \sum_{k=1}^{q} \theta_k Z_{n-k}

Weak Stationarity
E[X_n] = \text{constant} \quad \forall (n \in \mathbb{Z})
E[X_n X_{n+h}] = \gamma(h) \quad \forall (n, h \in \mathbb{Z})
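The ARMA recursion above is easy to simulate directly. A minimal numpy sketch (not part of the original sheet; phi, theta, sigma and the sample size are illustrative choices) that checks the weak-stationarity properties empirically:

```python
import numpy as np

# Sketch: simulate an ARMA(1,1) process
#   X_n - phi*X_{n-1} = Z_n + theta*Z_{n-1},  Z_n ~ N(0, sigma^2),
# and check weak stationarity empirically: constant mean, and a lag-1
# autocovariance that does not depend on where in the series it is measured.
rng = np.random.default_rng(0)
phi, theta, sigma, N = 0.5, 0.3, 1.0, 20_000

Z = rng.normal(0.0, sigma, N)
X = np.zeros(N)
for n in range(1, N):
    X[n] = phi * X[n - 1] + Z[n] + theta * Z[n - 1]
X = X[1000:]                                   # drop burn-in

half = len(X) // 2
print("sample mean (should be near 0):", X.mean())
print("lag-1 autocovariance, first half :", np.cov(X[:half - 1], X[1:half])[0, 1])
print("lag-1 autocovariance, second half:", np.cov(X[half:-1], X[half + 1:])[0, 1])
```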

x_n \in \mathbb{R} \implies X(e^{i\lambda}) = \overline{X(e^{-i\lambda})}

Convolution
(x * \hat{x})_n = \sum_{k \in \mathbb{Z}} x_k \hat{x}_{n-k}

Impulse Response
h_n = (T \circ \delta)_n
y_n = (T \circ x)_n = (x * h)_n

H(e^{i\lambda}) = \dfrac{b_0 + b_1 e^{-i\lambda} + \cdots + b_p e^{-i\lambda p}}{1 + a_1 e^{-i\lambda} + \cdots + a_q e^{-i\lambda q}}

Linearity
T \circ (a x + b \hat{x}) = a\,(T \circ x) + b\,(T \circ \hat{x})
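As a quick illustration of y_n = (x * h)_n, a hedged numpy sketch with made-up sequences (not from the sheet):

```python
import numpy as np

# Sketch: for an LTI system the output is the convolution of the input with
# the impulse response, y_n = (x * h)_n = sum_k x_k h_{n-k}.
x = np.array([1.0, 2.0, 3.0, 4.0])   # illustrative input sequence
h = np.array([0.5, 0.25])            # illustrative impulse response h_n = (T o delta)_n

y = np.convolve(x, h)                # y_n = (x * h)_n
print(y)                             # [0.5, 1.25, 2.0, 2.75, 1.0]
```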

Orthogonality. E[XY] = 0

Projection
\hat{Y} = a_1 X_1 + \cdots + a_n X_n
E[(Y - \hat{Y}) X_j] = 0, \quad j = 1, \ldots, n

Fundamental Theorem of Algebra
p(z) = \sum_{j=0}^{k} c_j z^j,\ c_j \in \mathbb{R} \implies \exists \lambda_1, \ldots, \lambda_k \in \mathbb{C} : p(z) = d(z - \lambda_1)(z - \lambda_2) \cdots (z - \lambda_k)

Mean-Square-Error
MSE = E[(Y - \hat{Y})^2]

Stability
z^q Q(z) = 0 \implies |z| < 1, \quad \text{or} \quad \sum_{n \in \mathbb{Z}} |h_n| < \infty

Partial Fraction Decomposition
\dfrac{1}{p(z)} = \dfrac{A_1}{1 - a_1 z^{-1}} + \cdots + \dfrac{A_k}{1 - a_k z^{-1}}

Difference of Squares
a^2 - b^2 = (a + b)(a - b)

Uniqueness. If \forall (z \in \mathrm{ROC}),\ X(z) = \hat{X}(z), then x_n = \hat{x}_n
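A small numpy sketch (illustrative coefficients, not from the sheet) that checks the pole condition z^q Q(z) = 0 \implies |z| < 1 numerically:

```python
import numpy as np

# Sketch: test stability by locating the roots of z^q*Q(z) = z^q + a1*z^{q-1} + ... + aq
# (the poles of H(z) = P(z)/Q(z)) and checking |z| < 1.
# np.roots expects coefficients in descending powers of z; values are illustrative.
coeffs = [1.0, -0.9, 0.2]                  # z^2 - 0.9 z + 0.2
poles = np.roots(coeffs)
print(poles, "stable:", bool(np.all(np.abs(poles) < 1)))
```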
Convolution Theorem
y_n = (x * h)_n \implies Y(z) = H(z) X(z)

Transfer Function
H(z) = \sum_{n \in \mathbb{N}} h_n z^{-n} = Y(z)/X(z)
H(z) = \dfrac{b_0 + b_1 z^{-1} + \cdots + b_p z^{-p}}{1 + a_1 z^{-1} + \cdots + a_q z^{-q}} = \dfrac{P(z)}{Q(z)}

Frequency Response = H(e^{i\lambda})
If x_n = \sum_{k=0}^{N} a_k e^{i r_k n}, then y_n = \sum_{k=0}^{N} H(e^{i r_k})\, a_k e^{i r_k n}
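A sketch (not from the sheet) that evaluates the rational frequency response above on a frequency grid; the b and a coefficients are illustrative choices:

```python
import numpy as np

# Sketch: evaluate H(e^{i*lam}) = (b0 + b1*e^{-i*lam} + ... + bp*e^{-i*lam*p})
#                              / (1 + a1*e^{-i*lam} + ... + aq*e^{-i*lam*q}).
b = np.array([1.0, 0.5])                   # b_0, b_1 (illustrative)
a = np.array([1.0, -0.4])                  # leading 1, then a_1 (illustrative)

lam = np.linspace(-np.pi, np.pi, 9)
num = np.exp(-1j * np.outer(lam, np.arange(len(b)))) @ b
den = np.exp(-1j * np.outer(lam, np.arange(len(a)))) @ a
H = num / den
print(np.round(np.abs(H), 3))              # gain |H(e^{i*lam})| at each frequency
```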

Variance
\mathrm{var}(X) = E[X^2] - E[X]^2

Covariance
\mathrm{cov}(X, Y) = E[XY] - E[X]E[Y]

Correlation
\rho(X, Y) = \dfrac{\mathrm{cov}(X, Y)}{\sqrt{\mathrm{var}(X)\,\mathrm{var}(Y)}}

Autocovariance
\gamma(m, n) = \mathrm{cov}(X_n, X_m), \qquad \gamma(h) = \gamma(-h)

Autocovariance of X_n = \sum_{k \in \mathbb{Z}} \psi_k Z_{n-k}:
\gamma(h) = \sigma^2 \sum_{k \in \mathbb{Z}} \psi_k \psi_{k+h}

General ARMA
Y_n = \sum_{k \in \mathbb{Z}} \psi_k X_{n-k}
\gamma_Y(h) = \sum_{k \in \mathbb{Z}} \sum_{j \in \mathbb{Z}} \psi_k \psi_j \gamma_X(h + k - j)
f_Y(\lambda) = |\Psi(e^{-i\lambda})|^2 f_X(\lambda)

Inverse Fourier Transform
\gamma(h) = \int_{-\pi}^{\pi} e^{ih\lambda} f(\lambda)\, d\lambda = \int_{-\pi}^{\pi} \cos(h\lambda) f(\lambda)\, d\lambda

Z–Transform: X(z) = \sum_{n \in \mathbb{Z}} x_n z^{-n}
Discrete F–Transform: X(e^{i\lambda}) = \sum_{n \in \mathbb{Z}} x_n e^{-i\lambda n}
Inversion: x_n = \dfrac{1}{2\pi i} \oint_C X(z) z^{n-1}\, dz, \qquad x_n = \dfrac{1}{2\pi} \int_{-\pi}^{\pi} X(e^{i\lambda}) e^{i\lambda n}\, d\lambda

Transform table (Sequence x_n | Z–Transform X(z) | Discrete F–Transform X(e^{i\lambda})):
Delta Function:  \delta_n  |  1  |  1
Step Function:   U_n  |  \frac{1}{1 - z^{-1}}  |  \frac{1}{1 - e^{-i\lambda}}
                 a^n U_n  |  \frac{1}{1 - a z^{-1}}  |  \frac{1}{1 - a e^{-i\lambda}}
                 \binom{n+k-1}{k-1} a^n U_n  |  \left(\frac{1}{1 - a z^{-1}}\right)^k  |  \left(\frac{1}{1 - a e^{-i\lambda}}\right)^k
Trigonometric:   \cos(\omega n) U_n  |  \frac{1 - z^{-1}\cos\omega}{(z^{-1} - \cos\omega)^2 + \sin^2\omega}  |  \frac{1 - e^{-i\lambda}\cos\omega}{(e^{-i\lambda} - \cos\omega)^2 + \sin^2\omega}
                 \sin(\omega n) U_n  |  \frac{z^{-1}\sin\omega}{(z^{-1} - \cos\omega)^2 + \sin^2\omega}  |  \frac{e^{-i\lambda}\sin\omega}{(e^{-i\lambda} - \cos\omega)^2 + \sin^2\omega}
Linearity:       a x_n + b \hat{x}_n  |  a X(z) + b \hat{X}(z)  |  a X(e^{i\lambda}) + b \hat{X}(e^{i\lambda})
Shifting:        x_{n-k}  |  z^{-k} X(z)  |  e^{-ik\lambda} X(e^{i\lambda})
Modulation:      a^n x_n  |  X(z/a)  |  X(e^{i\lambda}/a)
                 n x_n  |  -z \frac{d}{dz} X(z)  |  i \frac{d}{d\lambda} X(e^{i\lambda})
Convolution:     (x * \hat{x})_n  |  X(z)\hat{X}(z)  |  X(e^{i\lambda})\hat{X}(e^{i\lambda})
                 (x \cdot \hat{x})_n  |   |  \frac{1}{2\pi} \int_{-\pi}^{\pi} X(e^{is}) \hat{X}(e^{i(\lambda - s)})\, ds
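As a numerical sanity check on one row of the table above, a sketch (not from the sheet) comparing the truncated defining sum for a^n U_n with the closed form 1/(1 - a e^{-i\lambda}); a, the frequency and the truncation length are illustrative:

```python
import numpy as np

# Sketch: check the table pair  a^n U_n  <->  1/(1 - a e^{-i*lam})
# by truncating the defining sum X(e^{i*lam}) = sum_n x_n e^{-i*lam*n}.
a, N = 0.7, 200
lam = 1.3                                   # any frequency in (-pi, pi)
n = np.arange(N)
truncated = np.sum(a**n * np.exp(-1j * lam * n))
closed_form = 1.0 / (1.0 - a * np.exp(-1j * lam))
print(abs(truncated - closed_form))         # tiny truncation error, of order |a|^N
```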

Spectral Density (f(\lambda) \in \mathbb{R},\ f(\lambda) \ge 0)
f(\lambda) = \dfrac{1}{2\pi} \Gamma(e^{i\lambda}) = \dfrac{1}{2\pi} \sum_{h \in \mathbb{Z}} \gamma(h) e^{-ih\lambda} = \dfrac{1}{2\pi} \sum_{h \in \mathbb{Z}} \gamma(h) \cos(\lambda h)
f(\lambda) = f(\lambda + 2n\pi), \qquad f(\lambda) = f(-\lambda)

Cauchy-Schwarz Inequality
E[XY]^2 \le E[X^2] E[Y^2]

Expectation
E[g(X)] = \sum_{x \in \mathbb{Z}} g(x) P(X = x), \qquad E[g(X)] = \int_{\mathbb{R}} g(x) f_X(x)\, dx

Euler’s Formulæ
e^{ix} = \cos x + i \sin x, \quad \cos x = \tfrac{1}{2}(e^{ix} + e^{-ix}), \quad \sin x = \tfrac{1}{2i}(e^{ix} - e^{-ix})

Yule-Walker Equations
E\Big[\Big(Y - \sum_{k=0}^{n} a_k X_k\Big) X_j\Big] = 0 \iff \sum_{k=0}^{n} a_k E[X_k X_j] = E[Y X_j], \quad j = 0, \ldots, n

Causality
No dependence on future values, or \forall (n < 0),\ h_n = 0 \vee x_n = 0
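A sketch (not from the sheet) of solving the Yule-Walker (normal) equations above with a linear solve; the covariance matrix and right-hand side are illustrative numbers:

```python
import numpy as np

# Sketch: solve  sum_k a_k E[X_k X_j] = E[Y X_j]  for each j,
# given G[j, k] = E[X_j X_k] and g[j] = E[Y X_j].  The numbers are illustrative.
G = np.array([[1.0, 0.5, 0.25],
              [0.5, 1.0, 0.5],
              [0.25, 0.5, 1.0]])
g = np.array([0.5, 0.25, 0.125])

a = np.linalg.solve(G, g)   # projection coefficients a_k
print(a)
```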

General Difference Equation
y_n = -\sum_{j=1}^{q} a_j y_{n-j} + \sum_{k=0}^{p} b_k x_{n-k}

For \Phi(B) X_n = \Theta(B) Z_n:
Causality. \forall (|z| \le 1),\ \Phi(z) \neq 0
Invertibility. \forall (|z| \le 1),\ \Theta(z) \neq 0

Geometric Series (|a| < 1)
\sum_{n \in \mathbb{N}} a^n = \dfrac{1}{1 - a}

Time Invariance
T \circ B \circ x = B \circ T \circ x

If X_n = \Psi(B) Z_n, then f(\lambda) = \dfrac{\sigma^2}{2\pi} |\Psi(e^{-i\lambda})|^2

Ideal Low-Pass Filter
H(e^{i\lambda}) = 1_{\{|\lambda| < C\}} = \begin{cases} 1 & \text{if } |\lambda| < C \\ 0 & \text{if } |\lambda| > C \end{cases}
h_n = \begin{cases} C/\pi & \text{if } n = 0 \\ \sin(Cn)/\pi n & \text{if } n \neq 0 \end{cases} = \dfrac{C}{\pi}\, \mathrm{sinc}\!\left(\dfrac{C}{\pi} n\right)

Ideal High-Pass Filter
H(e^{i\lambda}) = 1_{\{|\lambda| > C\}} = \begin{cases} 0 & \text{if } |\lambda| < C \\ 1 & \text{if } |\lambda| > C \end{cases}
h_n = \begin{cases} 1 - C/\pi & \text{if } n = 0 \\ -\sin(Cn)/\pi n & \text{if } n \neq 0 \end{cases}

H_{LP}(e^{i\lambda}) + H_{HP}(e^{i\lambda}) = 1_{\{|\lambda| < C\}} + 1_{\{|\lambda| > C\}} = 1 \implies h_n^{LP} + h_n^{HP} = \delta_n
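A sketch (not from the sheet) that truncates the ideal low-pass impulse response above and evaluates its frequency response at one passband and one stopband frequency; the cutoff C and the truncation length M are illustrative:

```python
import numpy as np

# Sketch: truncate h_0 = C/pi, h_n = sin(C*n)/(pi*n) for n != 0, and check that
# the resulting frequency response is ~1 inside |lam| < C and ~0 outside.
C, M = np.pi / 4, 200
n = np.arange(-M, M + 1)
h = np.sin(C * n) / (np.pi * np.where(n == 0, 1, n))   # placeholder value at n = 0
h[M] = C / np.pi                                       # h_0 = C/pi

def H(lam):
    return np.sum(h * np.exp(-1j * lam * n))

print(abs(H(0.1)), abs(H(1.5)))   # ~1 in the passband, ~0 in the stopband
```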

Statistically Designed Filters
W_n \sim N(0, \nu^2)

Wiener Filter  Y_n = X_n + W_n
A(e^{i\lambda}) = \dfrac{f_{XY}(\lambda)}{f_Y(\lambda)} = \dfrac{f_X(\lambda)}{f_X(\lambda) + \nu^2/2\pi}
Mean-Square-Error
E(X_n - \hat{X}_n)^2 = \gamma_X(0) - \sum_{j=0}^{N} a_j \gamma_{XY}(j) = \int_{-\pi}^{\pi} \Big[ f_X(\lambda) - \dfrac{|f_{XY}(\lambda)|^2}{f_Y(\lambda)} \Big]\, d\lambda

Adaptive Projection
L_n = \langle Y_n, Y_{n-1}, \ldots, Y_0 \rangle
P(X \mid L_{n+1}) = P(X \mid L_n) + \beta\,\big(Y_{n+1} - P(Y_{n+1} \mid L_n)\big)
where \beta = \dfrac{E[(X - P(X \mid L_n))(Y_{n+1} - P(Y_{n+1} \mid L_n))]}{E[(Y_{n+1} - P(Y_{n+1} \mid L_n))^2]}
Mean-Square-Error
E(X - P(X \mid L_{n+1}))^2 = E[(X - P(X \mid L_n))^2] - \dfrac{E[(X - P(X \mid L_n))(Y_{n+1} - P(Y_{n+1} \mid L_n))]^2}{E[(Y_{n+1} - P(Y_{n+1} \mid L_n))^2]}

Statistical Mean
\bar{x} = \dfrac{x_1 + \cdots + x_N}{N}
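A sketch (not from the sheet) computing the Wiener gain A(e^{i\lambda}) = f_X/(f_X + \nu^2/2\pi) for an illustrative AR(1) signal, whose spectrum follows from f(\lambda) = (\sigma^2/2\pi)|\Psi(e^{-i\lambda})|^2:

```python
import numpy as np

# Sketch: Wiener gain for an AR(1) signal with Psi(z) = 1/(1 - phi*z),
# observed in white noise of variance nu^2.  All parameter values are illustrative.
phi, sigma2, nu2 = 0.9, 1.0, 0.5

lam = np.linspace(0, np.pi, 5)
f_X = sigma2 / (2 * np.pi) / np.abs(1 - phi * np.exp(-1j * lam))**2
A = f_X / (f_X + nu2 / (2 * np.pi))
print(np.round(A, 3))   # gain near 1 where the signal dominates, smaller elsewhere
```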

Sample Autocovariance
\hat{\gamma}(h) = \dfrac{1}{N} \sum_{n=1}^{N-|h|} (x_n - \bar{x})(x_{n+|h|} - \bar{x})

Sample Autocorrelation
\hat{\rho}(h) = \dfrac{\hat{\gamma}(h)}{\hat{\gamma}(0)}

Periodogram
I_N(\lambda) = \dfrac{1}{N} \left| \sum_{k=1}^{N} x_k e^{-i\lambda k} \right|^2

L2 Norm, Inner Product
\langle X, Y \rangle = E[XY] = E[YX] = \langle Y, X \rangle
\langle aX, Y \rangle = a E[XY] = \langle X, aY \rangle
\langle X_1 + X_2, Y \rangle = \langle X_1, Y \rangle + \langle X_2, Y \rangle
\|X\| = \sqrt{E[X^2]} = \sqrt{\langle X, X \rangle}
\|X\|^2 = E[X^2] = \langle X, X \rangle
|\langle X, Y \rangle| = |E[XY]| \le \|X\| \times \|Y\|
\|X + Y\| \le \|X\| + \|Y\| (Triangle Inequality)
\|X + Y\|^2 = \|X\|^2 + \|Y\|^2 when E[XY] = 0

R^2 Norm
v \cdot w = v_1 w_1 + v_2 w_2 = |v||w|\cos\theta = w \cdot v
(av) \cdot w = v \cdot (aw) = a(v \cdot w)
(u + v) \cdot w = u \cdot w + v \cdot w
|v|^2 = v_1^2 + v_2^2 = v \cdot v
|v \cdot w| = |v||w||\cos\theta| \le |v||w|
|v + w| \le |v| + |w| (Triangle Inequality)
|v + w|^2 = |v|^2 + |w|^2 when v \cdot w = 0

Projection Theorem
\exists \hat{Y} (unique): \|Y - \hat{Y}\| \le \|Y - X\| \quad \forall X \in K
If X \in K \wedge \|Y - \hat{Y}\| = \|Y - X\|, then X = \hat{Y}
P(aY + bZ \mid K) = a P(Y \mid K) + b P(Z \mid K)
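A sketch (not from the sheet) computing \hat{\gamma}(h), \hat{\rho}(h) and I_N(\lambda) directly from the definitions above, on an illustrative white-noise series:

```python
import numpy as np

# Sketch: sample autocovariance, sample autocorrelation and periodogram,
# computed directly from their definitions.
rng = np.random.default_rng(1)
x = rng.normal(size=500)          # illustrative series
N, xbar = len(x), x.mean()

def gamma_hat(h):
    h = abs(h)
    return np.sum((x[:N - h] - xbar) * (x[h:] - xbar)) / N

def rho_hat(h):
    return gamma_hat(h) / gamma_hat(0)

def periodogram(lam):
    k = np.arange(1, N + 1)
    return abs(np.sum(x * np.exp(-1j * lam * k)))**2 / N

print(gamma_hat(1), rho_hat(1), periodogram(0.5))
```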

Kalman Filter  Y_n = X_n + W_n
P(X_{n+1} \mid L_{n+1}) = (1 - \beta_n)\, P(X_{n+1} \mid L_n) + \beta_n Y_{n+1}
where the Kalman Gain is
\beta_n = \dfrac{E[(X_{n+1} - P(X_{n+1} \mid L_n))^2]}{E[(X_{n+1} - P(X_{n+1} \mid L_n))^2] + \nu^2}
and
V_{n+1} = E[(X_{n+1} - P(X_{n+1} \mid L_{n+1}))^2] = \dfrac{E[(X_{n+1} - P(X_{n+1} \mid L_n))^2]\,\nu^2}{E[(X_{n+1} - P(X_{n+1} \mid L_n))^2] + \nu^2}

Steady State Kalman Estimation
V_\infty is determined by the unique positive root of
V_\infty = \dfrac{(a^2 V_\infty + \sigma^2)\,\nu^2}{a^2 V_\infty + \sigma^2 + \nu^2}
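A sketch (not from the sheet) that finds the steady-state value V_\infty by iterating the fixed-point equation above; a, \sigma^2 and \nu^2 are illustrative:

```python
import numpy as np

# Sketch: find the steady-state estimation error V_inf as the fixed point of
#   V = (a^2*V + sigma^2)*nu^2 / (a^2*V + sigma^2 + nu^2)
# by simple fixed-point iteration (a contraction for these parameter values).
a, sigma2, nu2 = 0.8, 1.0, 0.5
V = 1.0
for _ in range(200):
    V = (a**2 * V + sigma2) * nu2 / (a**2 * V + sigma2 + nu2)
print(V)
```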

Steady State Kalman Prediction
V_\infty is determined by the unique positive root of
V_\infty = \dfrac{(a^2 \nu^2 + \sigma^2) V_\infty + \nu^2 \sigma^2}{V_\infty + \nu^2}

Process: X_n = a X_{n-1} + Z_n, \qquad Observation Process: Y_n = X_n + W_n

Kalman Equations
P(X_{n+1} \mid L_{n+1}) = \dfrac{\nu^2}{E[(X_{n+1} - P(X_{n+1} \mid L_n))^2] + \nu^2}\, P(X_{n+1} \mid L_n) + \dfrac{E[(X_{n+1} - P(X_{n+1} \mid L_n))^2]}{E[(X_{n+1} - P(X_{n+1} \mid L_n))^2] + \nu^2}\, Y_{n+1}
V_{n+1} = E[(X_{n+1} - P(X_{n+1} \mid L_{n+1}))^2] = \dfrac{E[(X_{n+1} - P(X_{n+1} \mid L_n))^2]\,\nu^2}{E[(X_{n+1} - P(X_{n+1} \mid L_n))^2] + \nu^2}

Kalman Estimation
\hat{X}_{n+1} = \dfrac{\nu^2}{a^2 V_n + \sigma^2 + \nu^2}\, a \hat{X}_n + \dfrac{a^2 V_n + \sigma^2}{a^2 V_n + \sigma^2 + \nu^2}\, Y_{n+1}
V_{n+1} = \dfrac{(a^2 V_n + \sigma^2)\,\nu^2}{a^2 V_n + \sigma^2 + \nu^2}
with \hat{X}_0 = P(X_0 \mid Y_0) and V_0 = E[(X_0 - \hat{X}_0)^2]

Kalman Prediction
\hat{X}_{n+1} = \dfrac{\nu^2}{V_n + \nu^2}\, a \hat{X}_n + \dfrac{V_n}{V_n + \nu^2}\, a Y_n
V_{n+1} = \dfrac{(a^2 \nu^2 + \sigma^2) V_n + \nu^2 \sigma^2}{V_n + \nu^2}
with \hat{X}_1 = P(X_1 \mid Y_0) and V_1 = E[(X_1 - \hat{X}_1)^2]
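A sketch (not from the sheet) that runs the Kalman Estimation recursion above on simulated data; the parameter values and the rough initialisation are illustrative choices:

```python
import numpy as np

# Sketch: scalar Kalman estimation for X_n = a*X_{n-1} + Z_n, Y_n = X_n + W_n,
# using the recursion from the sheet with P = a^2*V_n + sigma^2.
rng = np.random.default_rng(2)
a, sigma, nu, N = 0.8, 1.0, 0.7, 300

X = np.zeros(N)
Y = np.zeros(N)
for n in range(1, N):
    X[n] = a * X[n - 1] + rng.normal(0, sigma)   # state:       X_n = a X_{n-1} + Z_n
    Y[n] = X[n] + rng.normal(0, nu)              # observation: Y_n = X_n + W_n

X_hat, V = Y[0], nu**2                           # rough, illustrative initialisation
for n in range(N - 1):
    P = a**2 * V + sigma**2                      # one-step prediction error variance
    X_hat = (nu**2 * a * X_hat + P * Y[n + 1]) / (P + nu**2)
    V = P * nu**2 / (P + nu**2)

print("final estimate:", X_hat, "  true state:", X[-1], "  V:", V)
```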

