
y_k = C(r_k) x_k + D(r_k) v_k   (2)

where

• The properly sized matrices A(·), B(·), C(·) and D(·) are known functions of the mode state.

• The mode state r_k is a finite-state, time-invariant Markov chain with transition probability matrix (TPM) \Pi = [\pi_{ij}], where \pi_{ij} \triangleq P(r_k = j | r_{k-1} = i).

The problem is to estimate or approximate the posterior distribution p(x_k | y_{0:k}) in a computationally tractable way.
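As a concrete illustration, a system of this type can be simulated as follows. This is a minimal sketch: the state equation x_k = A(r_k) x_{k-1} + B(r_k) w_k and the noise covariances Q and R are assumptions on my part (equation (1) is not shown in this excerpt), and all numeric values are illustrative, not from the notes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two-mode jump Markov linear system (illustrative scalar values).
A = [np.array([[1.0]]), np.array([[0.5]])]   # A(i)
B = [np.array([[1.0]]), np.array([[1.0]])]   # B(i)
C = [np.array([[1.0]]), np.array([[1.0]])]   # C(i)
D = [np.array([[1.0]]), np.array([[1.0]])]   # D(i)
Q = np.array([[0.1]])                        # assumed process noise covariance
R = np.array([[0.2]])                        # assumed measurement noise covariance
Pi = np.array([[0.95, 0.05],                 # TPM: Pi[i, j] = P(r_k = j | r_{k-1} = i)
               [0.10, 0.90]])

T = 50
x = np.zeros(1)
r = 0
xs, rs, ys = [], [], []
for k in range(T):
    r = rng.choice(2, p=Pi[r])               # Markov mode transition
    w = rng.multivariate_normal(np.zeros(1), Q)
    v = rng.multivariate_normal(np.zeros(1), R)
    x = A[r] @ x + B[r] @ w                  # assumed state equation (1)
    y = C[r] @ x + D[r] @ v                  # measurement equation (2)
    xs.append(x); rs.append(r); ys.append(y)
```

The measurement sequence `ys` is what the filter below sees; the mode sequence `rs` is hidden from it.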

We have seen in class that the optimal p(x_k | y_{0:k}) is a mixture of Gaussians with an exponentially growing number of components. Hence, approximations are necessary. The so-called interacting multiple model (IMM) filter [1] makes the approximation

p(x_k | y_{0:k}) \approx \sum_{i=1}^{N_r} \mu^i_k \mathcal{N}(x_k; \hat{x}^i_{k|k}, \Sigma^i_{k|k})   (3)

where

\mu^i_k \triangleq P(r_k = i | y_{0:k})   (4)

are the posterior mode probabilities. When such an approximation is given, one can calculate the overall posterior mean \hat{x}_{k|k} and covariance \Sigma_{k|k} using the standard Gaussian mixture mean and covariance formulas as

\hat{x}_{k|k} = \sum_{i=1}^{N_r} \mu^i_k \hat{x}^i_{k|k}   (5)

\Sigma_{k|k} = \sum_{i=1}^{N_r} \mu^i_k [ \Sigma^i_{k|k} + (\hat{x}^i_{k|k} - \hat{x}_{k|k})(\hat{x}^i_{k|k} - \hat{x}_{k|k})^T ]   (6)
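The mixture moment formulas (5)–(6) are a weighted average of the mode-conditional means plus a spread-of-the-means correction to the covariance. A minimal numerical sketch (all values illustrative):

```python
import numpy as np

# Mode-conditional posterior means, covariances, and mode probabilities
# (illustrative two-mode, 2-D example).
mu = np.array([0.3, 0.7])                          # mode probabilities mu_k^i
xh = [np.array([1.0, 0.0]), np.array([2.0, 1.0])]  # mode-conditional means
Sg = [np.eye(2), 2.0 * np.eye(2)]                  # mode-conditional covariances

# Overall mean (5): weighted average of the mode-conditional means.
x_hat = sum(mu[i] * xh[i] for i in range(2))

# Overall covariance (6): weighted within-mode covariances
# plus the spread-of-the-means term.
Sigma = sum(mu[i] * (Sg[i] + np.outer(xh[i] - x_hat, xh[i] - x_hat))
            for i in range(2))
```

Note that the spread term makes the overall covariance larger than the weighted average of the mode covariances whenever the mode means disagree.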

This estimate and covariance can be given to the user as the output. The mode-conditional means \{\hat{x}^i_{k|k}\}_{i=1}^{N_r}, covariances \{\Sigma^i_{k|k}\}_{i=1}^{N_r} and mode probabilities \{\mu^i_k\}_{i=1}^{N_r} must be calculated recursively from their previous values \{\hat{x}^i_{k-1|k-1}, \Sigma^i_{k-1|k-1}, \mu^i_{k-1}\}_{i=1}^{N_r}. For deriving such a recursion, we write

p(x_k | y_{0:k}) = \sum_{i=1}^{N_r} p(x_k | y_{0:k}, r_k = i) \underbrace{P(r_k = i | y_{0:k})}_{\triangleq \mu^i_k}   (7)

= \sum_{i=1}^{N_r} \mu^i_k \, p(x_k | y_{0:k}, r_k = i)   (8)

= \sum_{i=1}^{N_r} \mu^i_k \frac{p(y_k | x_k, r_k = i)}{p(y_k | y_{0:k-1}, r_k = i)} \, p(x_k | y_{0:k-1}, r_k = i)   (9)

= \sum_{i=1}^{N_r} \mu^i_k \frac{p(y_k | x_k, r_k = i)}{p(y_k | y_{0:k-1}, r_k = i)} \int p(x_k | x_{k-1}, r_k = i) \, p(x_{k-1} | y_{0:k-1}, r_k = i) \, dx_{k-1}   (10)

= \sum_{i=1}^{N_r} \mu^i_k \frac{p(y_k | x_k, r_k = i)}{p(y_k | y_{0:k-1}, r_k = i)} \int p(x_k | x_{k-1}, r_k = i) \sum_{j=1}^{N_r} p(x_{k-1} | y_{0:k-1}, r_{k-1} = j) \underbrace{P(r_{k-1} = j | r_k = i, y_{0:k-1})}_{\mu^{ji}_{k-1|k-1}} \, dx_{k-1}   (11)

= \sum_{i=1}^{N_r} \mu^i_k \frac{p(y_k | x_k, r_k = i)}{p(y_k | y_{0:k-1}, r_k = i)} \int p(x_k | x_{k-1}, r_k = i) \sum_{j=1}^{N_r} \mu^{ji}_{k-1|k-1} \, p(x_{k-1} | y_{0:k-1}, r_{k-1} = j) \, dx_{k-1}   (12)

If we assume that each mode-conditional posterior at time k-1 is Gaussian, i.e.,

p(x_{k-1} | y_{0:k-1}, r_{k-1} = j) \approx \mathcal{N}(x_{k-1}; \hat{x}^j_{k-1|k-1}, \Sigma^j_{k-1|k-1}),   (13)

then substituting this into (12) would clearly yield a Gaussian mixture with N_r^2 components for p(x_k | y_{0:k}). Therefore, we make the following approximation

p(x_{k-1} | y_{0:k-1}, r_k = i) = \sum_{j=1}^{N_r} \mu^{ji}_{k-1|k-1} \, p(x_{k-1} | y_{0:k-1}, r_{k-1} = j)   (14)

= \sum_{j=1}^{N_r} \mu^{ji}_{k-1|k-1} \mathcal{N}(x_{k-1}; \hat{x}^j_{k-1|k-1}, \Sigma^j_{k-1|k-1})   (15)

\approx \mathcal{N}(x_{k-1}; \hat{x}^{0i}_{k-1|k-1}, \Sigma^{0i}_{k-1|k-1})   (16)

where the mixed estimate \hat{x}^{0i}_{k-1|k-1} and covariance \Sigma^{0i}_{k-1|k-1} are obtained by moment matching as

\hat{x}^{0i}_{k-1|k-1} = \sum_{j=1}^{N_r} \mu^{ji}_{k-1|k-1} \hat{x}^j_{k-1|k-1},   (17)

\Sigma^{0i}_{k-1|k-1} = \sum_{j=1}^{N_r} \mu^{ji}_{k-1|k-1} [ \Sigma^j_{k-1|k-1} + (\hat{x}^j_{k-1|k-1} - \hat{x}^{0i}_{k-1|k-1})(\hat{x}^j_{k-1|k-1} - \hat{x}^{0i}_{k-1|k-1})^T ].   (18)
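The moment matching in (17)–(18) is a Gaussian-mixture mean/covariance computation carried out once per target mode i. A minimal sketch with illustrative scalar values:

```python
import numpy as np

Nr = 2
mu_mix = np.array([[0.9, 0.2],               # mu_mix[j, i] = mu_{k-1|k-1}^{ji}
                   [0.1, 0.8]])              # (columns sum to one)
xh = [np.array([0.0]), np.array([1.0])]      # x-hat^j_{k-1|k-1}
Sg = [np.array([[1.0]]), np.array([[2.0]])]  # Sigma^j_{k-1|k-1}

x0 = []   # mixed estimates x-hat^{0i}
S0 = []   # mixed covariances Sigma^{0i}
for i in range(Nr):
    # (17): mixed estimate is the mixing-probability-weighted average.
    xi = sum(mu_mix[j, i] * xh[j] for j in range(Nr))
    # (18): mixed covariance adds the spread-of-the-means correction.
    Si = sum(mu_mix[j, i] * (Sg[j] + np.outer(xh[j] - xi, xh[j] - xi))
             for j in range(Nr))
    x0.append(xi)
    S0.append(Si)
```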

In order to separate the merging in (17) and (18) from the merging in the output calculation (i.e., in (5) and (6)), the merging in (17) and (18) is called "mixing" in the literature [1]. The estimate \hat{x}^{0i}_{k-1|k-1} and covariance \Sigma^{0i}_{k-1|k-1} are accordingly called the mixed estimate and covariance. Now, substituting the approximation (16) into (12), we get

p(x_k | y_{0:k}) = \sum_{i=1}^{N_r} \mu^i_k \frac{p(y_k | x_k, r_k = i)}{p(y_k | y_{0:k-1}, r_k = i)} \int p(x_k | x_{k-1}, r_k = i) \mathcal{N}(x_{k-1}; \hat{x}^{0i}_{k-1|k-1}, \Sigma^{0i}_{k-1|k-1}) \, dx_{k-1}   (19)

Noticing that

\int p(x_k | x_{k-1}, r_k = i) \mathcal{N}(x_{k-1}; \hat{x}^{0i}_{k-1|k-1}, \Sigma^{0i}_{k-1|k-1}) \, dx_{k-1} = \mathcal{N}(x_k; \hat{x}^i_{k|k-1}, \Sigma^i_{k|k-1}),   (20)

the integral in (19) represents a prediction update with the ith model. Hence, we see that

p(x_k | y_{0:k}) = \sum_{i=1}^{N_r} \mu^i_k \frac{p(y_k | x_k, r_k = i)}{p(y_k | y_{0:k-1}, r_k = i)} \mathcal{N}(x_k; \hat{x}^i_{k|k-1}, \Sigma^i_{k|k-1})   (21)

where

\hat{x}^i_{k|k-1} = A(i) \hat{x}^{0i}_{k-1|k-1}   (22)

\Sigma^i_{k|k-1} = A(i) \Sigma^{0i}_{k-1|k-1} A^T(i) + B(i) Q B^T(i)   (23)
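Equations (22)–(23) are the standard Kalman time-update run once per model. A sketch with illustrative scalar matrices:

```python
import numpy as np

A = [np.array([[1.0]]), np.array([[0.5]])]     # A(i)
B = [np.array([[1.0]]), np.array([[1.0]])]     # B(i)
Q = np.array([[0.1]])                          # process noise covariance
x0 = [np.array([0.1]), np.array([0.8])]        # mixed estimates x-hat^{0i}
S0 = [np.array([[1.19]]), np.array([[1.96]])]  # mixed covariances Sigma^{0i}

xp, Sp = [], []
for i in range(2):
    xp.append(A[i] @ x0[i])                            # (22) predicted estimate
    Sp.append(A[i] @ S0[i] @ A[i].T + B[i] @ Q @ B[i].T)  # (23) predicted covariance
```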

Similarly, noticing that

\frac{p(y_k | x_k, r_k = i)}{p(y_k | y_{0:k-1}, r_k = i)} \mathcal{N}(x_k; \hat{x}^i_{k|k-1}, \Sigma^i_{k|k-1}) = \mathcal{N}(x_k; \hat{x}^i_{k|k}, \Sigma^i_{k|k}),   (24)

the multiplication in (21) represents a measurement update with the ith model. Hence

p(x_k | y_{0:k}) = \sum_{i=1}^{N_r} \mu^i_k \mathcal{N}(x_k; \hat{x}^i_{k|k}, \Sigma^i_{k|k})   (25)

where

\hat{x}^i_{k|k} = \hat{x}^i_{k|k-1} + K^i_k (y_k - \hat{y}^i_{k|k-1})   (26)

\Sigma^i_{k|k} = \Sigma^i_{k|k-1} - K^i_k S^i_k (K^i_k)^T   (27)

\hat{y}^i_{k|k-1} = C(i) \hat{x}^i_{k|k-1}   (28)

S^i_k = C(i) \Sigma^i_{k|k-1} C^T(i) + D(i) R D^T(i)   (29)

K^i_k = \Sigma^i_{k|k-1} C^T(i) (S^i_k)^{-1}   (30)
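Equations (26)–(30) are the standard Kalman measurement-update run once per model. A sketch with illustrative scalar values:

```python
import numpy as np

C = [np.array([[1.0]]), np.array([[1.0]])]     # C(i)
D = [np.array([[1.0]]), np.array([[1.0]])]     # D(i)
R = np.array([[0.2]])                          # measurement noise covariance
xp = [np.array([0.1]), np.array([0.4])]        # predicted estimates
Sp = [np.array([[1.29]]), np.array([[0.59]])]  # predicted covariances
y = np.array([0.5])                            # current measurement

xu, Su, yp, S, K = [], [], [], [], []
for i in range(2):
    yp.append(C[i] @ xp[i])                              # (28) predicted measurement
    S.append(C[i] @ Sp[i] @ C[i].T + D[i] @ R @ D[i].T)  # (29) innovation covariance
    K.append(Sp[i] @ C[i].T @ np.linalg.inv(S[i]))       # (30) Kalman gain
    xu.append(xp[i] + K[i] @ (y - yp[i]))                # (26) updated estimate
    Su.append(Sp[i] - K[i] @ S[i] @ K[i].T)              # (27) updated covariance
```

As expected, each updated covariance is smaller than the corresponding predicted one.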

We have obtained the recursions for the estimates and covariances. In order to complete the recursion, all we need to do is give recursions for the probabilities \{\mu^i_k\}_{i=1}^{N_r} in terms of \{\mu^j_{k-1}\}_{j=1}^{N_r} and \{\hat{x}^j_{k-1|k-1}, \Sigma^j_{k-1|k-1}\}. The following gives a recursion for \{\mu^i_k\}_{i=1}^{N_r}:

\mu^i_k \triangleq P(r_k = i | y_{0:k})   (31)

\propto \underbrace{p(y_k | y_{0:k-1}, r_k = i)}_{= \mathcal{N}(y_k; \hat{y}^i_{k|k-1}, S^i_k)} P(r_k = i | y_{0:k-1})   (32)

= \mathcal{N}(y_k; \hat{y}^i_{k|k-1}, S^i_k) \sum_{j=1}^{N_r} \underbrace{P(r_k = i | r_{k-1} = j)}_{\pi_{ji}} \underbrace{P(r_{k-1} = j | y_{0:k-1})}_{\mu^j_{k-1}}   (33)

= \mathcal{N}(y_k; \hat{y}^i_{k|k-1}, S^i_k) \sum_{j=1}^{N_r} \pi_{ji} \mu^j_{k-1}   (34)

Hence, we have

\mu^i_k = \frac{\mathcal{N}(y_k; \hat{y}^i_{k|k-1}, S^i_k) \sum_{j=1}^{N_r} \pi_{ji} \mu^j_{k-1}}{\sum_{\ell=1}^{N_r} \mathcal{N}(y_k; \hat{y}^\ell_{k|k-1}, S^\ell_k) \sum_{j=1}^{N_r} \pi_{j\ell} \mu^j_{k-1}}.   (35)
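Equation (35) weighs each mode's predicted probability \sum_j \pi_{ji} \mu^j_{k-1} by its measurement likelihood and normalizes. A sketch with illustrative values; `gauss_pdf` is a helper of mine for the 1-D Gaussian density, not something from the notes:

```python
import numpy as np

def gauss_pdf(y, m, S):
    """Density N(y; m, S) for a scalar measurement."""
    d = np.asarray(y - m).item()
    s = np.asarray(S).item()
    return np.exp(-0.5 * d * d / s) / np.sqrt(2.0 * np.pi * s)

Pi = np.array([[0.95, 0.05],    # Pi[j, i] = pi_{ji} = P(r_k = i | r_{k-1} = j)
               [0.10, 0.90]])
mu_prev = np.array([0.6, 0.4])  # mu^j_{k-1}
yp = [0.1, 0.4]                 # predicted measurements y-hat^i_{k|k-1}
S = [1.49, 0.79]                # innovation covariances S^i_k
y = 0.5                         # current measurement

# (35): likelihood of each mode times its predicted probability, normalized.
lik = np.array([gauss_pdf(y, yp[i], S[i]) for i in range(2)])
pred = Pi.T @ mu_prev           # pred[i] = sum_j pi_{ji} mu^j_{k-1}
mu = lik * pred / (lik * pred).sum()
```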

Similarly, we can derive an expression for the mixing probabilities \mu^{ji}_{k-1|k-1} as

\mu^{ji}_{k-1|k-1} \triangleq P(r_{k-1} = j | r_k = i, y_{0:k-1})   (36)

\propto \underbrace{P(r_k = i | r_{k-1} = j, y_{0:k-1})}_{\pi_{ji}} \underbrace{P(r_{k-1} = j | y_{0:k-1})}_{\mu^j_{k-1}}   (37)

Hence, we have

\mu^{ji}_{k-1|k-1} = \frac{\pi_{ji} \mu^j_{k-1}}{\sum_{\ell=1}^{N_r} \pi_{\ell i} \mu^\ell_{k-1}}.   (40)
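Equation (40) can be evaluated for all (j, i) pairs at once as a column normalization of the TPM scaled by the previous mode probabilities. A sketch with an illustrative TPM:

```python
import numpy as np

Pi = np.array([[0.95, 0.05],    # Pi[j, i] used as pi_{ji} = P(r_k = i | r_{k-1} = j)
               [0.10, 0.90]])
mu_prev = np.array([0.6, 0.4])  # mu^j_{k-1}

# (40): mu_mix[j, i] = pi_{ji} mu^j_{k-1} / sum_l pi_{li} mu^l_{k-1}
num = Pi * mu_prev[:, None]                    # numerator pi_{ji} * mu^j_{k-1}
mu_mix = num / num.sum(axis=0, keepdims=True)  # normalize over j (columns)
```

Each column i of `mu_mix` is a valid probability distribution over the previous mode j.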

Now we can give a verbal description of one step of the IMM filter as follows.

Algorithm 1 (Single Step of the IMM Filter): Suppose we have the previous sufficient statistics \{\hat{x}^j_{k-1|k-1}, \Sigma^j_{k-1|k-1}, \mu^j_{k-1}\}_{j=1}^{N_r}. Then a single step of the IMM algorithm to obtain the current sufficient statistics \{\hat{x}^i_{k|k}, \Sigma^i_{k|k}, \mu^i_k\}_{i=1}^{N_r} is given as follows.

• Mixing:

  – Calculate the mixing probabilities \{\mu^{ji}_{k-1|k-1}\}_{i,j=1}^{N_r} as

    \mu^{ji}_{k-1|k-1} = \frac{\pi_{ji} \mu^j_{k-1}}{\sum_{\ell=1}^{N_r} \pi_{\ell i} \mu^\ell_{k-1}}.   (41)

  – Calculate the mixed estimates \{\hat{x}^{0i}_{k-1|k-1}\}_{i=1}^{N_r} and covariances \{\Sigma^{0i}_{k-1|k-1}\}_{i=1}^{N_r} as

    \hat{x}^{0i}_{k-1|k-1} = \sum_{j=1}^{N_r} \mu^{ji}_{k-1|k-1} \hat{x}^j_{k-1|k-1},   (42)

    \Sigma^{0i}_{k-1|k-1} = \sum_{j=1}^{N_r} \mu^{ji}_{k-1|k-1} [ \Sigma^j_{k-1|k-1} + (\hat{x}^j_{k-1|k-1} - \hat{x}^{0i}_{k-1|k-1})(\hat{x}^j_{k-1|k-1} - \hat{x}^{0i}_{k-1|k-1})^T ].   (43)

• Mode Matched Prediction Update: For the ith model, i = 1, ..., N_r, calculate the predicted estimate \hat{x}^i_{k|k-1} and covariance \Sigma^i_{k|k-1} from the mixed estimate \hat{x}^{0i}_{k-1|k-1} and covariance \Sigma^{0i}_{k-1|k-1} as

  \hat{x}^i_{k|k-1} = A(i) \hat{x}^{0i}_{k-1|k-1},   (44)

  \Sigma^i_{k|k-1} = A(i) \Sigma^{0i}_{k-1|k-1} A^T(i) + B(i) Q B^T(i).   (45)


• Mode Matched Measurement Update: For the ith model, i = 1, ..., N_r:

  – Calculate the updated estimate \hat{x}^i_{k|k} and covariance \Sigma^i_{k|k} from the predicted estimate \hat{x}^i_{k|k-1} and covariance \Sigma^i_{k|k-1} as

    \hat{x}^i_{k|k} = \hat{x}^i_{k|k-1} + K^i_k (y_k - \hat{y}^i_{k|k-1}),   (46)

    \Sigma^i_{k|k} = \Sigma^i_{k|k-1} - K^i_k S^i_k (K^i_k)^T,   (47)

    \hat{y}^i_{k|k-1} = C(i) \hat{x}^i_{k|k-1},   (48)

    S^i_k = C(i) \Sigma^i_{k|k-1} C^T(i) + D(i) R D^T(i),   (49)

    K^i_k = \Sigma^i_{k|k-1} C^T(i) (S^i_k)^{-1}.   (50)

• Mode Probability Update: Calculate the mode probabilities \{\mu^i_k\}_{i=1}^{N_r} as

  \mu^i_k = \frac{\mathcal{N}(y_k; \hat{y}^i_{k|k-1}, S^i_k) \sum_{j=1}^{N_r} \pi_{ji} \mu^j_{k-1}}{\sum_{\ell=1}^{N_r} \mathcal{N}(y_k; \hat{y}^\ell_{k|k-1}, S^\ell_k) \sum_{j=1}^{N_r} \pi_{j\ell} \mu^j_{k-1}}.   (51)

• Output Estimate Calculation: Calculate the overall estimate \hat{x}_{k|k} and covariance \Sigma_{k|k} as

  \hat{x}_{k|k} = \sum_{i=1}^{N_r} \mu^i_k \hat{x}^i_{k|k},   (52)

  \Sigma_{k|k} = \sum_{i=1}^{N_r} \mu^i_k [ \Sigma^i_{k|k} + (\hat{x}^i_{k|k} - \hat{x}_{k|k})(\hat{x}^i_{k|k} - \hat{x}_{k|k})^T ].   (53)

Remark 1 The output calculation step is done only for output purposes and therefore does not

affect the sufficient statistics.
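A full step of Algorithm 1 can be sketched as one self-contained function. This is a minimal sketch: `imm_step` and `gauss_pdf` are names of mine, not from the notes, and all numeric values below are illustrative.

```python
import numpy as np

def gauss_pdf(y, m, S):
    """Density N(y; m, S) for a scalar measurement."""
    d = np.asarray(y - m).item()
    s = np.asarray(S).item()
    return np.exp(-0.5 * d * d / s) / np.sqrt(2.0 * np.pi * s)

def imm_step(xh, Sg, mu_prev, y, A, B, C, D, Q, R, Pi):
    """One IMM step: mixing, mode-matched KFs, mode probabilities, output."""
    Nr = len(xh)
    # Mixing probabilities (41): columns of mu_mix sum to one.
    num = Pi * mu_prev[:, None]
    mu_mix = num / num.sum(axis=0, keepdims=True)
    xu, Su, lik = [], [], np.zeros(Nr)
    for i in range(Nr):
        # Mixed moments (42)-(43).
        x0 = sum(mu_mix[j, i] * xh[j] for j in range(Nr))
        S0 = sum(mu_mix[j, i] * (Sg[j] + np.outer(xh[j] - x0, xh[j] - x0))
                 for j in range(Nr))
        # Prediction update (44)-(45).
        xp = A[i] @ x0
        Sp = A[i] @ S0 @ A[i].T + B[i] @ Q @ B[i].T
        # Measurement update (46)-(50).
        ypi = C[i] @ xp
        Si = C[i] @ Sp @ C[i].T + D[i] @ R @ D[i].T
        Ki = Sp @ C[i].T @ np.linalg.inv(Si)
        xu.append(xp + Ki @ (y - ypi))
        Su.append(Sp - Ki @ Si @ Ki.T)
        lik[i] = gauss_pdf(y, ypi, Si)
    # Mode probability update (51).
    pred = Pi.T @ mu_prev
    mu = lik * pred / (lik * pred).sum()
    # Output estimate calculation (52)-(53).
    x_out = sum(mu[i] * xu[i] for i in range(Nr))
    S_out = sum(mu[i] * (Su[i] + np.outer(xu[i] - x_out, xu[i] - x_out))
                for i in range(Nr))
    return xu, Su, mu, x_out, S_out

# Illustrative two-mode scalar example.
A = [np.array([[1.0]]), np.array([[0.5]])]
B = [np.array([[1.0]]), np.array([[1.0]])]
C = [np.array([[1.0]]), np.array([[1.0]])]
D = [np.array([[1.0]]), np.array([[1.0]])]
Q, R = np.array([[0.1]]), np.array([[0.2]])
Pi = np.array([[0.95, 0.05], [0.10, 0.90]])
xh = [np.array([0.0]), np.array([1.0])]
Sg = [np.array([[1.0]]), np.array([[2.0]])]
mu0 = np.array([0.6, 0.4])

xu, Su, mu, x_out, S_out = imm_step(xh, Sg, mu0, np.array([0.5]),
                                    A, B, C, D, Q, R, Pi)
```

The returned `xu`, `Su`, and `mu` are the sufficient statistics carried to the next step; `x_out` and `S_out` are only the user-facing output, in line with Remark 1.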

A block diagram of a single step of the IMM filter is given in Figure 1 below.

References

[1] Y. Bar-Shalom, X. R. Li, and T. Kirubarajan, Estimation with Applications to Tracking and Navigation. New York: Wiley, 2001.

[2] Y. Bar-Shalom, S. Challa, and H. A. P. Blom, "IMM estimator versus optimal estimator for hybrid systems," IEEE Trans. Aerosp. Electron. Syst., vol. 41, no. 3, pp. 986–991, Jul. 2005.


[Figure 1 (block diagram): the previous mode probabilities \{\mu^i_{k-1}\}_{i=1}^N feed a mixing probability calculation block; the mixing block combines \{\hat{x}^i_{k-1|k-1}, \Sigma^i_{k-1|k-1}\} into the mixed statistics \{\hat{x}^{0i}_{k-1|k-1}, \Sigma^{0i}_{k-1|k-1}\}, which are processed by the mode-matched Kalman filters KF-1, ..., KF-N; the measurement y_k drives a mode probability calculation block producing \{\mu^i_k\}_{i=1}^N; finally, an output estimate calculation block forms \hat{x}_{k|k} and \Sigma_{k|k}.]

Figure 1: Block diagram of a single step of the IMM algorithm for N models.
