Kalman filter
From Wikipedia, the free encyclopedia

Kalman filtering, also known as linear quadratic estimation (LQE), is an algorithm that uses a series of measurements observed over time, containing statistical noise and other inaccuracies, and produces estimates of unknown variables that tend to be more precise than those based on a single measurement alone. The filter is named after Rudolf E. Kálmán, one of the primary developers of its theory.
The Kalman filter has numerous applications in technology. A common application is for guidance, navigation and control of vehicles, particularly aircraft and spacecraft. Furthermore, the Kalman filter is a widely applied concept in time series analysis used in fields such as signal processing and econometrics. Kalman filters also are one of the main topics in the field of robotic motion planning and control, and they are sometimes included in trajectory optimization. The multifractional order estimator is a simple and practical alternative to the Kalman filter for tracking targets.

[Figure caption: The Kalman filter keeps track of the estimated state of the system and the variance or uncertainty of the estimate. The estimate is updated using a state transition model and measurements. \hat{\mathbf{x}}_{k\mid k-1} denotes the estimate of the system's state at time step k before the k-th measurement y_k has been taken into account; \mathbf{P}_{k\mid k-1} is the corresponding uncertainty.]

The algorithm works in a two-step process. In the prediction step, the Kalman filter produces estimates of the current state variables, along with their uncertainties. Once the outcome of the next measurement (necessarily corrupted with some amount of error, including random noise) is observed, these estimates are updated using a weighted average, with more weight being given to estimates with higher certainty. The algorithm is recursive. It can run in real time, using only the present input measurements and the previously calculated state and its uncertainty matrix; no additional past information is required.

The Kalman filter does not require any assumption that the errors are Gaussian.[1] However, the filter yields the exact conditional probability estimate in the special case that all errors are Gaussian-distributed.

Extensions and generalizations to the method have also been developed, such as the extended Kalman filter and the unscented Kalman filter which work on nonlinear systems. The underlying model is a Bayesian model similar to a hidden Markov model but where the state space of the latent variables is continuous and where all latent and observed variables have Gaussian distributions.

Contents
1 Naming and historical development
2 Overview of the calculation

3 Example application
4 Technical description and context
5 Underlying dynamic system model
6 Details
6.1 Predict
6.2 Update
6.3 Invariants
6.4 Optimality and performance
7 Example application, technical
8 Derivations
8.1 Deriving the a posteriori estimate covariance matrix
8.2 Kalman gain derivation
8.3 Simplification of the a posteriori error covariance formula
9 Sensitivity analysis
10 Square root form
11 Relationship to recursive Bayesian estimation
12 Marginal likelihood
13 Information filter
14 Fixed-lag smoother
15 Fixed-interval smoothers
15.1 Rauch-Tung-Striebel
15.2 Modified Bryson-Frazier smoother
15.3 Minimum-variance smoother
16 Frequency Weighted Kalman filters
17 Nonlinear filters
17.1 Extended Kalman filter
17.2 Unscented Kalman filter
18 Kalman-Bucy filter
19 Hybrid Kalman filter
20 Variants for the recovery of sparse signals
21 Applications
22 See also
23 References
24 Further reading
25 External links

Namingandhistoricaldevelopment
ThefilterisnamedafterHungarianmigrRudolfE.Klmn,althoughThorvaldNicolaiThiele[2][3]and
PeterSwerlingdevelopedasimilaralgorithmearlier.RichardS.BucyoftheUniversityofSouthern
Californiacontributedtothetheory,leadingtoitoftenbeingcalledtheKalmanBucyfilter.StanleyF.
SchmidtisgenerallycreditedwithdevelopingthefirstimplementationofaKalmanfilter.Herealizedthat
thefiltercouldbedividedintotwodistinctparts,withonepartfortimeperiodsbetweensensoroutputsand
anotherpartforincorporatingmeasurements.[4]ItwasduringavisitbyKalmantotheNASAAmes
ResearchCenterthathesawtheapplicabilityofhisideastotheproblemoftrajectoryestimationforthe

Apollo program, leading to its incorporation in the Apollo navigation computer. This Kalman filter was first described and partially developed in technical papers by Swerling (1958), Kalman (1960) and Kalman and Bucy (1961).

Kalman filters have been vital in the implementation of the navigation systems of U.S. Navy nuclear ballistic missile submarines, and in the guidance and navigation systems of cruise missiles such as the U.S. Navy's Tomahawk missile and the U.S. Air Force's Air-Launched Cruise Missile. It is also used in the guidance and navigation systems of the NASA Space Shuttle and the attitude control and navigation systems of the International Space Station.

[Photo caption: Rudolf Emil Kalman, co-inventor and developer of the Kalman filter.]

This digital filter is sometimes called the Stratonovich-Kalman-Bucy filter because it is a special case of a more general, non-linear filter developed somewhat earlier by the Soviet mathematician Ruslan Stratonovich.[5][6][7][8] In fact, some of the special case linear filter's equations appeared in these papers by Stratonovich that were published before summer 1960, when Kalman met with Stratonovich during a conference in Moscow.

Overview of the calculation
The Kalman filter uses a system's dynamics model (e.g., physical laws of motion), known control inputs to that system, and multiple sequential measurements (such as from sensors) to form an estimate of the system's varying quantities (its state) that is better than the estimate obtained by using any one measurement alone. As such, it is a common sensor fusion and data fusion algorithm.

All measurements and calculations based on models are estimates to some degree. Noisy sensor data, approximations in the equations that describe how a system changes, and external factors that are not accounted for introduce some uncertainty about the inferred values for a system's state. The Kalman filter averages a prediction of a system's state with a new measurement using a weighted average. The purpose of the weights is that values with better (i.e., smaller) estimated uncertainty are "trusted" more. The weights are calculated from the covariance, a measure of the estimated uncertainty of the prediction of the system's state. The result of the weighted average is a new state estimate that lies between the predicted and measured state, and has a better estimated uncertainty than either alone. This process is repeated every time step, with the new estimate and its covariance informing the prediction used in the following iteration. This means that the Kalman filter works recursively and requires only the last "best guess", rather than the entire history, of a system's state to calculate a new state.

Because the certainty of the measurements is often difficult to measure precisely, it is common to discuss the filter's behavior in terms of gain. The Kalman gain is a function of the relative certainty of the measurements and current state estimate, and can be "tuned" to achieve particular performance. With a high gain, the filter places more weight on the measurements, and thus follows them more closely. With a low gain, the filter follows the model predictions more closely, smoothing out noise but decreasing the responsiveness. At the extremes, a gain of one causes the filter to ignore the state estimate entirely, while a gain of zero causes the measurements to be ignored.
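A tiny scalar sketch (not from the article; values and names are purely illustrative) shows how the gain blends prediction and measurement:

    # Scalar illustration of how the Kalman gain weights prediction vs. measurement.
    def blend(prediction, measurement, gain):
        # gain = 1.0 -> trust the measurement completely
        # gain = 0.0 -> ignore the measurement, keep the prediction
        return prediction + gain * (measurement - prediction)

    print(blend(10.0, 12.0, 1.0))   # 12.0, follows the measurement
    print(blend(10.0, 12.0, 0.0))   # 10.0, follows the model prediction
    print(blend(10.0, 12.0, 0.5))   # 11.0, a weighted average of the two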

When performing the actual calculations for the filter (as discussed below), the state estimate and covariances are coded into matrices to handle the multiple dimensions involved in a single set of calculations. This allows for representation of linear relationships between different state variables (such as position, velocity, and acceleration) in any of the transition models or covariances.

Example application
As an example application, consider the problem of determining the precise location of a truck. The truck can be equipped with a GPS unit that provides an estimate of the position within a few meters. The GPS estimate is likely to be noisy; readings 'jump around' rapidly, though always remaining within a few meters of the real position. In addition, since the truck is expected to follow the laws of physics, its position can also be estimated by integrating its velocity over time, determined by keeping track of wheel revolutions and the angle of the steering wheel. This is a technique known as dead reckoning. Typically, dead reckoning will provide a very smooth estimate of the truck's position, but it will drift over time as small errors accumulate.

In this example, the Kalman filter can be thought of as operating in two distinct phases: predict and update. In the prediction phase, the truck's old position will be modified according to the physical laws of motion (the dynamic or "state transition" model) plus any changes produced by the accelerator pedal and steering wheel. Not only will a new position estimate be calculated, but a new covariance will be calculated as well. Perhaps the covariance is proportional to the speed of the truck because we are more uncertain about the accuracy of the dead reckoning position estimate at high speeds but very certain about the position estimate when moving slowly. Next, in the update phase, a measurement of the truck's position is taken from the GPS unit. Along with this measurement comes some amount of uncertainty, and its covariance relative to that of the prediction from the previous phase determines how much the new measurement will affect the updated prediction. Ideally, if the dead reckoning estimates tend to drift away from the real position, the GPS measurement should pull the position estimate back towards the real position but not disturb it to the point of becoming rapidly changing and noisy.

Technical description and context
The Kalman filter is an efficient recursive filter that estimates the internal state of a linear dynamic system from a series of noisy measurements. It is used in a wide range of engineering and econometric applications from radar and computer vision to estimation of structural macroeconomic models,[9][10] and is an important topic in control theory and control systems engineering. Together with the linear-quadratic regulator (LQR), the Kalman filter solves the linear-quadratic-Gaussian control problem (LQG). The Kalman filter, the linear-quadratic regulator and the linear-quadratic-Gaussian controller are solutions to what arguably are the most fundamental problems in control theory.

In most applications, the internal state is much larger (more degrees of freedom) than the few "observable" parameters which are measured. However, by combining a series of measurements, the Kalman filter can estimate the entire internal state.

In Dempster-Shafer theory, each state equation or observation is considered a special case of a linear belief function and the Kalman filter is a special case of combining linear belief functions on a join-tree or Markov tree. Additional approaches include belief filters which use Bayes or evidential updates to the state equations.
A wide variety of Kalman filters have now been developed, from Kalman's original formulation, now called the "simple" Kalman filter, the Kalman-Bucy filter, Schmidt's "extended" filter, the information filter, and a variety of "square-root" filters that were developed by Bierman, Thornton and many others. Perhaps the most commonly used type of very simple Kalman filter is the phase-locked loop, which is now ubiquitous in radios, especially frequency modulation (FM) radios, television sets, satellite communications receivers, outer space communications systems, and nearly any other electronic communications equipment.

Underlying dynamic system model
The Kalman filters are based on linear dynamic systems discretized in the time domain. They are modelled on a Markov chain built on linear operators perturbed by errors that may include Gaussian noise. The state of the system is represented as a vector of real numbers. At each discrete time increment, a linear operator is applied to the state to generate the new state, with some noise mixed in, and optionally some information from the controls on the system if they are known. Then, another linear operator mixed with more noise generates the observed outputs from the true ("hidden") state. The Kalman filter may be regarded as analogous to the hidden Markov model, with the key difference that the hidden state variables take values in a continuous space (as opposed to a discrete state space as in the hidden Markov model). There is a strong duality between the equations of the Kalman Filter and those of the hidden Markov model. A review of this and other models is given in Roweis and Ghahramani (1999)[11] and Hamilton (1994), Chapter 13.[12]

In order to use the Kalman filter to estimate the internal state of a process given only a sequence of noisy observations, one must model the process in accordance with the framework of the Kalman filter. This means specifying the following matrices: F_k, the state transition model; H_k, the observation model; Q_k, the covariance of the process noise; R_k, the covariance of the observation noise; and sometimes B_k, the control-input model, for each time step k, as described below.

[Figure caption: Model underlying the Kalman filter. Squares represent matrices. Ellipses represent multivariate normal distributions (with the mean and covariance matrix enclosed). Unenclosed values are vectors. In the simple case, the various matrices are constant with time, and thus the subscripts are dropped, but the Kalman filter allows any of them to change each time step.]

The Kalman filter model assumes the true state at time k is evolved from the state at (k - 1) according to

\mathbf{x}_k = \mathbf{F}_k \mathbf{x}_{k-1} + \mathbf{B}_k \mathbf{u}_k + \mathbf{w}_k

where

F_k is the state transition model which is applied to the previous state x_{k-1};
B_k is the control-input model which is applied to the control vector u_k;
w_k is the process noise which is assumed to be drawn from a zero mean multivariate normal distribution with covariance Q_k:

\mathbf{w}_k \sim N(0, \mathbf{Q}_k)

At time k an observation (or measurement) z_k of the true state x_k is made according to

\mathbf{z}_k = \mathbf{H}_k \mathbf{x}_k + \mathbf{v}_k

where H_k is the observation model which maps the true state space into the observed space and v_k is the observation noise which is assumed to be zero mean Gaussian white noise with covariance R_k:

\mathbf{v}_k \sim N(0, \mathbf{R}_k)

The initial state, and the noise vectors at each step {x_0, w_1, ..., w_k, v_1, ..., v_k} are all assumed to be mutually independent.
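A minimal NumPy sketch of this generative model (the matrices and noise levels below are illustrative assumptions, not values from the article) draws a state trajectory and the corresponding noisy observations:

    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative constant-velocity model: state = [position, velocity]
    dt = 1.0
    F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition model
    H = np.array([[1.0, 0.0]])              # observe position only
    Q = 0.01 * np.eye(2)                    # process noise covariance (assumed)
    R = np.array([[0.5]])                   # observation noise covariance (assumed)

    x = np.array([0.0, 1.0])                # true initial state
    states, observations = [], []
    for _ in range(20):
        # x_k = F x_{k-1} + w_k,   w_k ~ N(0, Q)
        x = F @ x + rng.multivariate_normal(np.zeros(2), Q)
        # z_k = H x_k + v_k,       v_k ~ N(0, R)
        z = H @ x + rng.multivariate_normal(np.zeros(1), R)
        states.append(x)
        observations.append(z)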
Many real dynamical systems do not exactly fit this model. In fact, unmodelled dynamics can seriously degrade the filter performance, even when it was supposed to work with unknown stochastic signals as inputs. The reason for this is that the effect of unmodelled dynamics depends on the input, and, therefore, can bring the estimation algorithm to instability (it diverges). On the other hand, independent white noise signals will not make the algorithm diverge. The problem of distinguishing between measurement noise and unmodelled dynamics is a difficult one and is treated in control theory under the framework of robust control.[13][14]

Details
The Kalman filter is a recursive estimator. This means that only the estimated state from the previous time step and the current measurement are needed to compute the estimate for the current state. In contrast to batch estimation techniques, no history of observations and/or estimates is required. In what follows, the notation \hat{\mathbf{x}}_{n\mid m} represents the estimate of \mathbf{x} at time n given observations up to, and including at, time m <= n.

The state of the filter is represented by two variables:

\hat{\mathbf{x}}_{k\mid k}, the a posteriori state estimate at time k given observations up to and including at time k;
\mathbf{P}_{k\mid k}, the a posteriori error covariance matrix (a measure of the estimated accuracy of the state estimate).

The Kalman filter can be written as a single equation, however it is most often conceptualized as two distinct phases: "Predict" and "Update". The predict phase uses the state estimate from the previous timestep to produce an estimate of the state at the current timestep. This predicted state estimate is also known as the a priori state estimate because, although it is an estimate of the state at the current timestep, it
does not include observation information from the current timestep. In the update phase, the current a priori prediction is combined with current observation information to refine the state estimate. This improved estimate is termed the a posteriori state estimate.

Typically, the two phases alternate, with the prediction advancing the state until the next scheduled observation, and the update incorporating the observation. However, this is not necessary; if an observation is unavailable for some reason, the update may be skipped and multiple prediction steps performed. Likewise, if multiple independent observations are available at the same time, multiple update steps may be performed (typically with different observation matrices H_k).[15][16]

Predict

Predicted (a priori) state estimate:
\hat{\mathbf{x}}_{k\mid k-1} = \mathbf{F}_k \hat{\mathbf{x}}_{k-1\mid k-1} + \mathbf{B}_k \mathbf{u}_k

Predicted (a priori) estimate covariance:
\mathbf{P}_{k\mid k-1} = \mathbf{F}_k \mathbf{P}_{k-1\mid k-1} \mathbf{F}_k^\text{T} + \mathbf{Q}_k

Update

Innovation or measurement residual:
\tilde{\mathbf{y}}_k = \mathbf{z}_k - \mathbf{H}_k \hat{\mathbf{x}}_{k\mid k-1}

Innovation (or residual) covariance:
\mathbf{S}_k = \mathbf{H}_k \mathbf{P}_{k\mid k-1} \mathbf{H}_k^\text{T} + \mathbf{R}_k

Optimal Kalman gain:
\mathbf{K}_k = \mathbf{P}_{k\mid k-1} \mathbf{H}_k^\text{T} \mathbf{S}_k^{-1}

Updated (a posteriori) state estimate:
\hat{\mathbf{x}}_{k\mid k} = \hat{\mathbf{x}}_{k\mid k-1} + \mathbf{K}_k \tilde{\mathbf{y}}_k

Updated (a posteriori) estimate covariance:
\mathbf{P}_{k\mid k} = (\mathbf{I} - \mathbf{K}_k \mathbf{H}_k) \mathbf{P}_{k\mid k-1}

The formula for the updated estimate covariance above is only valid for the optimal Kalman gain. Usage of other gain values requires a more complex formula found in the derivations section.
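The two phases can be written down directly from the equations above. The following NumPy sketch is a straightforward, minimal transcription (not an optimized or numerically hardened implementation; function and variable names are illustrative):

    import numpy as np

    def kf_predict(x, P, F, Q, B=None, u=None):
        """Predict step: a priori state estimate and covariance."""
        x_pred = F @ x if B is None else F @ x + B @ u
        P_pred = F @ P @ F.T + Q
        return x_pred, P_pred

    def kf_update(x_pred, P_pred, z, H, R):
        """Update step: fold the measurement z into the prediction."""
        y = z - H @ x_pred                     # innovation
        S = H @ P_pred @ H.T + R               # innovation covariance
        K = P_pred @ H.T @ np.linalg.inv(S)    # optimal Kalman gain
        x_new = x_pred + K @ y                 # a posteriori state estimate
        P_new = (np.eye(len(x_pred)) - K @ H) @ P_pred  # valid for the optimal gain only
        return x_new, P_new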

Invariants
If the model is accurate, and the values for \hat{\mathbf{x}}_{0\mid 0} and \mathbf{P}_{0\mid 0} accurately reflect the distribution of the initial state values, then the following invariants are preserved (all estimates have a mean error of zero):

\mathrm{E}[\mathbf{x}_k - \hat{\mathbf{x}}_{k\mid k}] = \mathrm{E}[\mathbf{x}_k - \hat{\mathbf{x}}_{k\mid k-1}] = 0
\mathrm{E}[\tilde{\mathbf{y}}_k] = 0

where \mathrm{E}[\xi] is the expected value of \xi, and covariance matrices accurately reflect the covariance of estimates:

\mathbf{P}_{k\mid k} = \mathrm{cov}(\mathbf{x}_k - \hat{\mathbf{x}}_{k\mid k})
\mathbf{P}_{k\mid k-1} = \mathrm{cov}(\mathbf{x}_k - \hat{\mathbf{x}}_{k\mid k-1})
\mathbf{S}_k = \mathrm{cov}(\tilde{\mathbf{y}}_k)

Optimality and performance
It follows from theory that the Kalman filter is optimal in cases where a) the model perfectly matches the real system, b) the entering noise is white and c) the covariances of the noise are exactly known. Several methods for the noise covariance estimation have been proposed during past decades, including the autocovariance least-squares (ALS) technique. After the covariances are estimated, it is useful to evaluate the performance of the filter, i.e. whether it is possible to improve the state estimation quality. If the Kalman filter works optimally, the innovation sequence (the output prediction error) is a white noise, therefore the whiteness property of the innovations measures filter performance. Several different methods can be used for this purpose.[17]

Example application, technical
Consider a truck on frictionless, straight rails. Initially the truck is stationary at position 0, but it is buffeted this way and that by random uncontrolled forces. We measure the position of the truck every Δt seconds, but these measurements are imprecise; we want to maintain a model of where the truck is and what its velocity is. We show here how we derive the model from which we create our Kalman filter.

Since F, H, R, Q are constant, their time indices are dropped.

[Figure caption: Black: truth, green: filtered process, red: observations]
The position and velocity of the truck are described by the linear state space

\mathbf{x}_k = \begin{bmatrix} x \\ \dot{x} \end{bmatrix}

where \dot{x} is the velocity, that is, the derivative of position with respect to time.

We assume that between the (k - 1) and k timestep uncontrolled forces cause a constant acceleration of a_k that is normally distributed, with mean 0 and standard deviation σ_a. From Newton's laws of motion we conclude that

\mathbf{x}_k = \mathbf{F}\mathbf{x}_{k-1} + \mathbf{G}a_k

(note that there is no \mathbf{B}\mathbf{u} term since we have no known control inputs. Instead, we assume that a_k is the effect of an unknown input and \mathbf{G} applies that effect to the state vector) where

\mathbf{F} = \begin{bmatrix} 1 & \Delta t \\ 0 & 1 \end{bmatrix}

and

\mathbf{G} = \begin{bmatrix} \frac{\Delta t^2}{2} \\ \Delta t \end{bmatrix}
so that

\mathbf{x}_k = \mathbf{F}\mathbf{x}_{k-1} + \mathbf{w}_k

where

\mathbf{w}_k \sim N(0, \mathbf{Q})

and

\mathbf{Q} = \mathbf{G}\mathbf{G}^\text{T}\sigma_a^2 = \begin{bmatrix} \frac{\Delta t^4}{4} & \frac{\Delta t^3}{2} \\ \frac{\Delta t^3}{2} & \Delta t^2 \end{bmatrix}\sigma_a^2

At each time step, a noisy measurement of the true position of the truck is made. Let us suppose the measurement noise v_k is also normally distributed, with mean 0 and standard deviation σ_z.

\mathbf{z}_k = \mathbf{H}\mathbf{x}_k + \mathbf{v}_k

where

\mathbf{H} = \begin{bmatrix} 1 & 0 \end{bmatrix}

and

\mathbf{R} = \mathrm{E}[\mathbf{v}_k\mathbf{v}_k^\text{T}] = \begin{bmatrix} \sigma_z^2 \end{bmatrix}

We know the initial starting state of the truck with perfect precision, so we initialize

\hat{\mathbf{x}}_{0\mid 0} = \begin{bmatrix} 0 \\ 0 \end{bmatrix}

and to tell the filter that we know the exact position and velocity, we give it a zero covariance matrix:

\mathbf{P}_{0\mid 0} = \begin{bmatrix} 0 & 0 \\ 0 & 0 \end{bmatrix}

If the initial position and velocity are not known perfectly the covariance matrix should be initialized with a suitably large number, say L, on its diagonal:

\mathbf{P}_{0\mid 0} = \begin{bmatrix} L & 0 \\ 0 & L \end{bmatrix}

The filter will then prefer the information from the first measurements over the information already in the model.

Derivations
Deriving the a posteriori estimate covariance matrix
Starting with our invariant on the error covariance P_{k|k} as above

\mathbf{P}_{k\mid k} = \mathrm{cov}(\mathbf{x}_k - \hat{\mathbf{x}}_{k\mid k})

substitute in the definition of \hat{\mathbf{x}}_{k\mid k}

\mathbf{P}_{k\mid k} = \mathrm{cov}(\mathbf{x}_k - (\hat{\mathbf{x}}_{k\mid k-1} + \mathbf{K}_k(\mathbf{z}_k - \mathbf{H}_k\hat{\mathbf{x}}_{k\mid k-1})))

and substitute \mathbf{z}_k

\mathbf{P}_{k\mid k} = \mathrm{cov}(\mathbf{x}_k - (\hat{\mathbf{x}}_{k\mid k-1} + \mathbf{K}_k(\mathbf{H}_k\mathbf{x}_k + \mathbf{v}_k - \mathbf{H}_k\hat{\mathbf{x}}_{k\mid k-1})))

and by collecting the error vectors we get

\mathbf{P}_{k\mid k} = \mathrm{cov}((\mathbf{I} - \mathbf{K}_k\mathbf{H}_k)(\mathbf{x}_k - \hat{\mathbf{x}}_{k\mid k-1}) - \mathbf{K}_k\mathbf{v}_k)

Since the measurement error v_k is uncorrelated with the other terms, this becomes

\mathbf{P}_{k\mid k} = \mathrm{cov}((\mathbf{I} - \mathbf{K}_k\mathbf{H}_k)(\mathbf{x}_k - \hat{\mathbf{x}}_{k\mid k-1})) + \mathrm{cov}(\mathbf{K}_k\mathbf{v}_k)

by the properties of vector covariance this becomes

\mathbf{P}_{k\mid k} = (\mathbf{I} - \mathbf{K}_k\mathbf{H}_k)\,\mathrm{cov}(\mathbf{x}_k - \hat{\mathbf{x}}_{k\mid k-1})\,(\mathbf{I} - \mathbf{K}_k\mathbf{H}_k)^\text{T} + \mathbf{K}_k\,\mathrm{cov}(\mathbf{v}_k)\,\mathbf{K}_k^\text{T}

which, using our invariant on P_{k|k-1} and the definition of R_k, becomes

\mathbf{P}_{k\mid k} = (\mathbf{I} - \mathbf{K}_k\mathbf{H}_k)\mathbf{P}_{k\mid k-1}(\mathbf{I} - \mathbf{K}_k\mathbf{H}_k)^\text{T} + \mathbf{K}_k\mathbf{R}_k\mathbf{K}_k^\text{T}

This formula (sometimes known as the "Joseph form" of the covariance update equation) is valid for any value of K_k. It turns out that if K_k is the optimal Kalman gain, this can be simplified further as shown below.
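Because it is valid for any gain, the Joseph form is often the preferred way to code the covariance update in practice. A minimal NumPy sketch (illustrative, with generic names):

    import numpy as np

    def joseph_update(P_pred, K, H, R):
        """Covariance update in Joseph form: valid for any gain K, and it
        preserves symmetry and positive semi-definiteness better than the
        simplified (I - K H) P form."""
        I = np.eye(P_pred.shape[0])
        A = I - K @ H
        return A @ P_pred @ A.T + K @ R @ K.T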

Kalman gain derivation
The Kalman filter is a minimum mean-square-error estimator. The error in the a posteriori state estimation is
\mathbf{x}_k - \hat{\mathbf{x}}_{k\mid k}

We seek to minimize the expected value of the square of the magnitude of this vector, \mathrm{E}[\|\mathbf{x}_k - \hat{\mathbf{x}}_{k\mid k}\|^2]. This is equivalent to minimizing the trace of the a posteriori estimate covariance matrix \mathbf{P}_{k\mid k}. By expanding out the terms in the equation above and collecting, we get:

\begin{align}
\mathbf{P}_{k\mid k} &= \mathbf{P}_{k\mid k-1} - \mathbf{K}_k\mathbf{H}_k\mathbf{P}_{k\mid k-1} - \mathbf{P}_{k\mid k-1}\mathbf{H}_k^\text{T}\mathbf{K}_k^\text{T} + \mathbf{K}_k(\mathbf{H}_k\mathbf{P}_{k\mid k-1}\mathbf{H}_k^\text{T} + \mathbf{R}_k)\mathbf{K}_k^\text{T} \\
&= \mathbf{P}_{k\mid k-1} - \mathbf{K}_k\mathbf{H}_k\mathbf{P}_{k\mid k-1} - \mathbf{P}_{k\mid k-1}\mathbf{H}_k^\text{T}\mathbf{K}_k^\text{T} + \mathbf{K}_k\mathbf{S}_k\mathbf{K}_k^\text{T}
\end{align}

The trace is minimized when its matrix derivative with respect to the gain matrix is zero. Using the gradient matrix rules and the symmetry of the matrices involved we find that

\frac{\partial\,\mathrm{tr}(\mathbf{P}_{k\mid k})}{\partial\,\mathbf{K}_k} = -2(\mathbf{H}_k\mathbf{P}_{k\mid k-1})^\text{T} + 2\mathbf{K}_k\mathbf{S}_k = 0.

Solving this for K_k yields the Kalman gain:

\mathbf{K}_k = \mathbf{P}_{k\mid k-1}\mathbf{H}_k^\text{T}\mathbf{S}_k^{-1}

This gain, which is known as the optimal Kalman gain, is the one that yields MMSE estimates when used.

Simplification of the a posteriori error covariance formula
The formula used to calculate the a posteriori error covariance can be simplified when the Kalman gain equals the optimal value derived above. Multiplying both sides of our Kalman gain formula on the right by S_k K_k^T, it follows that

\mathbf{K}_k\mathbf{S}_k\mathbf{K}_k^\text{T} = \mathbf{P}_{k\mid k-1}\mathbf{H}_k^\text{T}\mathbf{K}_k^\text{T}

Referring back to our expanded formula for the a posteriori error covariance,

\mathbf{P}_{k\mid k} = \mathbf{P}_{k\mid k-1} - \mathbf{K}_k\mathbf{H}_k\mathbf{P}_{k\mid k-1} - \mathbf{P}_{k\mid k-1}\mathbf{H}_k^\text{T}\mathbf{K}_k^\text{T} + \mathbf{K}_k\mathbf{S}_k\mathbf{K}_k^\text{T}

we find the last two terms cancel out, giving

\mathbf{P}_{k\mid k} = \mathbf{P}_{k\mid k-1} - \mathbf{K}_k\mathbf{H}_k\mathbf{P}_{k\mid k-1} = (\mathbf{I} - \mathbf{K}_k\mathbf{H}_k)\mathbf{P}_{k\mid k-1}

This formula is computationally cheaper and thus nearly always used in practice, but is only correct for the optimal gain. If arithmetic precision is unusually low causing problems with numerical stability, or if a non-optimal Kalman gain is deliberately used, this simplification cannot be applied; the a posteriori error
covariance formula as derived above (Joseph form) must be used.

Sensitivity analysis
The Kalman filtering equations provide an estimate of the state \hat{\mathbf{x}}_{k\mid k} and its error covariance \mathbf{P}_{k\mid k} recursively. The estimate and its quality depend on the system parameters and the noise statistics fed as inputs to the estimator. This section analyzes the effect of uncertainties in the statistical inputs to the filter.[18] In the absence of reliable statistics or the true values of the noise covariance matrices \mathbf{Q}_k and \mathbf{R}_k, the expression

\mathbf{P}_{k\mid k} = \mathrm{cov}(\mathbf{x}_k - \hat{\mathbf{x}}_{k\mid k})

no longer provides the actual error covariance. In other words, \mathbf{P}_{k\mid k} \neq \mathrm{E}[(\mathbf{x}_k - \hat{\mathbf{x}}_{k\mid k})(\mathbf{x}_k - \hat{\mathbf{x}}_{k\mid k})^\text{T}].

In most real-time applications the covariance matrices that are used in designing the Kalman filter are different from the actual (true) noise covariance matrices. This sensitivity analysis describes the behavior of the estimation error covariance when the noise covariances as well as the system matrices \mathbf{F}_k and \mathbf{H}_k that are fed as inputs to the filter are incorrect. Thus, the sensitivity analysis describes the robustness (or sensitivity) of the estimator to misspecified statistical and parametric inputs to the estimator.

This discussion is limited to the error sensitivity analysis for the case of statistical uncertainties. Here the actual noise covariances are denoted by \mathbf{Q}^a_k and \mathbf{R}^a_k respectively, whereas the design values used in the estimator are \mathbf{Q}_k and \mathbf{R}_k respectively. The actual error covariance is denoted by \mathbf{P}^a_{k\mid k} and \mathbf{P}_{k\mid k} as computed by the Kalman filter is referred to as the Riccati variable. When \mathbf{Q}_k \equiv \mathbf{Q}^a_k and \mathbf{R}_k \equiv \mathbf{R}^a_k, this means that \mathbf{P}_{k\mid k} = \mathbf{P}^a_{k\mid k}. While computing the actual error covariance using \mathbf{P}^a_{k\mid k} = \mathrm{E}[(\mathbf{x}_k - \hat{\mathbf{x}}_{k\mid k})(\mathbf{x}_k - \hat{\mathbf{x}}_{k\mid k})^\text{T}], substituting for \hat{\mathbf{x}}_{k\mid k} and using the fact that \mathrm{E}[\mathbf{w}_k\mathbf{w}_k^\text{T}] = \mathbf{Q}^a_k and \mathrm{E}[\mathbf{v}_k\mathbf{v}_k^\text{T}] = \mathbf{R}^a_k, results in the following recursive equations for \mathbf{P}^a_{k\mid k}:

\mathbf{P}^a_{k\mid k-1} = \mathbf{F}_k\mathbf{P}^a_{k-1\mid k-1}\mathbf{F}_k^\text{T} + \mathbf{Q}^a_k

and

\mathbf{P}^a_{k\mid k} = (\mathbf{I} - \mathbf{K}_k\mathbf{H}_k)\mathbf{P}^a_{k\mid k-1}(\mathbf{I} - \mathbf{K}_k\mathbf{H}_k)^\text{T} + \mathbf{K}_k\mathbf{R}^a_k\mathbf{K}_k^\text{T}

While computing \mathbf{P}_{k\mid k}, by design the filter implicitly assumes that \mathrm{E}[\mathbf{w}_k\mathbf{w}_k^\text{T}] = \mathbf{Q}_k and \mathrm{E}[\mathbf{v}_k\mathbf{v}_k^\text{T}] = \mathbf{R}_k. Note that the recursive expressions for \mathbf{P}^a_{k\mid k} and \mathbf{P}_{k\mid k} are identical except for the presence of \mathbf{Q}^a_k and \mathbf{R}^a_k in place of the design values \mathbf{Q}_k and \mathbf{R}_k respectively.

Square root form
One problem with the Kalman filter is its numerical stability. If the process noise covariance Q_k is small, round-off error often causes a small positive eigenvalue to be computed as a negative number. This renders the numerical representation of the state covariance matrix P indefinite, while its true form is positive-definite.

Positive definite matrices have the property that they have a triangular matrix square root P = S·S^T. This can be computed efficiently using the Cholesky factorization algorithm, but more importantly, if the covariance is kept in this form, it can never have a negative diagonal or become asymmetric. An equivalent form, which avoids many of the square root operations required by the matrix square root yet preserves the desirable numerical properties, is the U-D decomposition form, P = U·D·U^T, where U is a unit triangular matrix (with unit diagonal), and D is a diagonal matrix.

Between the two, the U-D factorization uses the same amount of storage, and somewhat less computation, and is the most commonly used square root form. (Early literature on the relative efficiency is somewhat misleading, as it assumed that square roots were much more time-consuming than divisions,[19]:69 while on 21st-century computers they are only slightly more expensive.)
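As a small illustration of the factored representation (a sketch of the idea only, not the Bierman-Thornton algorithms themselves), NumPy's Cholesky routine gives the lower-triangular factor S with P = S S^T; propagating S instead of P keeps the represented covariance symmetric and positive semi-definite by construction:

    import numpy as np

    P = np.array([[4.0, 1.0],
                  [1.0, 2.0]])        # a symmetric positive-definite covariance
    S = np.linalg.cholesky(P)         # lower-triangular factor, P = S @ S.T
    print(np.allclose(S @ S.T, P))    # True: the factorization reproduces P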
Efficient algorithms for the Kalman prediction and update steps in the square root form were developed by G. J. Bierman and C. L. Thornton.[19][20]

The L·D·L^T decomposition of the innovation covariance matrix S_k is the basis for another type of numerically efficient and robust square root filter.[21] The algorithm starts with the LU decomposition as implemented in the Linear Algebra PACKage (LAPACK). These results are further factored into the L·D·L^T structure with methods given by Golub and Van Loan (algorithm 4.1.2) for a symmetric nonsingular matrix.[22] Any singular covariance matrix is pivoted so that the first diagonal partition is nonsingular and well-conditioned. The pivoting algorithm must retain any portion of the innovation covariance matrix directly corresponding to observed state variables H_k·x_{k|k-1} that are associated with auxiliary observations in y_k. The L·D·L^T square root filter requires orthogonalization of the observation vector.[20][21] This may be done with the inverse square root of the covariance matrix for the auxiliary variables using Method 2 in Higham (2002, p. 263).[23]

Relationship to recursive Bayesian estimation
The Kalman filter can be presented as one of the most simple dynamic Bayesian networks. The Kalman filter calculates estimates of the true values of states recursively over time using incoming measurements and a mathematical process model. Similarly, recursive Bayesian estimation calculates estimates of an unknown probability density function (PDF) recursively over time using incoming measurements and a mathematical process model.[24]

In recursive Bayesian estimation, the true state is assumed to be an unobserved Markov process, and the measurements are the observed states of a hidden Markov model (HMM).

Because of the Markov assumption, the true state is conditionally independent of all earlier states given the immediately previous state.

p(\mathbf{x}_k \mid \mathbf{x}_0, \dots, \mathbf{x}_{k-1}) = p(\mathbf{x}_k \mid \mathbf{x}_{k-1})

Similarly the measurement at the k-th timestep is dependent only upon the current state and is conditionally independent of all other states given the current state.

p(\mathbf{z}_k \mid \mathbf{x}_0, \dots, \mathbf{x}_k) = p(\mathbf{z}_k \mid \mathbf{x}_k)

Using these assumptions the probability distribution over all states of the hidden Markov model can be written simply as:

p(\mathbf{x}_0, \dots, \mathbf{x}_k, \mathbf{z}_1, \dots, \mathbf{z}_k) = p(\mathbf{x}_0)\prod_{i=1}^{k} p(\mathbf{z}_i \mid \mathbf{x}_i)\,p(\mathbf{x}_i \mid \mathbf{x}_{i-1})

However, when the Kalman filter is used to estimate the state x, the probability distribution of interest is that associated with the current states conditioned on the measurements up to the current timestep. This is achieved by marginalizing out the previous states and dividing by the probability of the measurement set.

This leads to the predict and update steps of the Kalman filter written probabilistically. The probability distribution associated with the predicted state is the sum (integral) of the products of the probability distribution associated with the transition from the (k - 1)-th timestep to the k-th and the probability distribution associated with the previous state, over all possible x_{k-1}.

p(\mathbf{x}_k \mid \mathbf{Z}_{k-1}) = \int p(\mathbf{x}_k \mid \mathbf{x}_{k-1})\,p(\mathbf{x}_{k-1} \mid \mathbf{Z}_{k-1})\,d\mathbf{x}_{k-1}

The measurement set up to time t is

\mathbf{Z}_t = \left\{\mathbf{z}_1, \dots, \mathbf{z}_t\right\}

The probability distribution of the update is proportional to the product of the measurement likelihood and the predicted state.

p(\mathbf{x}_k \mid \mathbf{Z}_k) = \frac{p(\mathbf{z}_k \mid \mathbf{x}_k)\,p(\mathbf{x}_k \mid \mathbf{Z}_{k-1})}{p(\mathbf{z}_k \mid \mathbf{Z}_{k-1})}
The denominator

p(\mathbf{z}_k \mid \mathbf{Z}_{k-1}) = \int p(\mathbf{z}_k \mid \mathbf{x}_k)\,p(\mathbf{x}_k \mid \mathbf{Z}_{k-1})\,d\mathbf{x}_k

is a normalization term.

The remaining probability density functions are

p(\mathbf{x}_k \mid \mathbf{x}_{k-1}) = \mathcal{N}(\mathbf{F}_k\mathbf{x}_{k-1}, \mathbf{Q}_k)
p(\mathbf{z}_k \mid \mathbf{x}_k) = \mathcal{N}(\mathbf{H}_k\mathbf{x}_k, \mathbf{R}_k)
p(\mathbf{x}_{k-1} \mid \mathbf{Z}_{k-1}) = \mathcal{N}(\hat{\mathbf{x}}_{k-1}, \mathbf{P}_{k-1})

Note that the PDF at the previous timestep is inductively assumed to be the estimated state and covariance. This is justified because, as an optimal estimator, the Kalman filter makes best use of the measurements, therefore the PDF for \mathbf{x}_k given the measurements is the Kalman filter estimate.

Marginal likelihood
Related to the recursive Bayesian interpretation described above, the Kalman filter can be viewed as a generative model, i.e., a process for generating a stream of random observations z = (z_0, z_1, z_2, ...). Specifically, the process is:

1. Sample a hidden state \mathbf{x}_0 from the Gaussian prior distribution p(\mathbf{x}_0).
2. Sample an observation \mathbf{z}_0 from the observation model p(\mathbf{z}_0 \mid \mathbf{x}_0).
3. For k = 1, 2, 3, ..., do
   1. Sample the next hidden state \mathbf{x}_k from the transition model p(\mathbf{x}_k \mid \mathbf{x}_{k-1}).
   2. Sample an observation \mathbf{z}_k from the observation model p(\mathbf{z}_k \mid \mathbf{x}_k).

Note that this process has identical structure to the hidden Markov model, except that the discrete state and observations are replaced with continuous variables sampled from Gaussian distributions.

In some applications, it is useful to compute the probability that a Kalman filter with a given set of parameters (prior distribution, transition and observation models, and control inputs) would generate a particular observed signal. This probability is known as the marginal likelihood because it integrates over
("marginalizes out") the values of the hidden state variables, so it can be computed using only the observed signal. The marginal likelihood can be useful to evaluate different parameter choices, or to compare the Kalman filter against other models using Bayesian model comparison.

It is straightforward to compute the marginal likelihood as a side effect of the recursive filtering computation. By the chain rule, the likelihood can be factored as the product of the probability of each observation given previous observations,

p(\mathbf{z}) = \prod_{k=0}^{T} p(\mathbf{z}_k \mid \mathbf{z}_{k-1}, \ldots, \mathbf{z}_0),

and because the Kalman filter describes a Markov process, all relevant information from previous observations is contained in the current state estimate \hat{\mathbf{x}}_{k\mid k-1}, \mathbf{P}_{k\mid k-1}. Thus the marginal likelihood is given by

p(\mathbf{z}) = \prod_{k=0}^{T} \mathcal{N}(\mathbf{z}_k;\, \mathbf{H}_k\hat{\mathbf{x}}_{k\mid k-1},\, \mathbf{S}_k),

i.e., a product of Gaussian densities, each corresponding to the density of one observation z_k under the current filtering distribution \mathcal{N}(\mathbf{H}_k\hat{\mathbf{x}}_{k\mid k-1}, \mathbf{S}_k). This can easily be computed as a simple recursive update; however, to avoid numeric underflow, in a practical implementation it is usually desirable to compute the log marginal likelihood \ell = \log p(\mathbf{z}) instead. Adopting the convention \ell^{(-1)} = 0, this can be done via the recursive update rule

\ell^{(k)} = \ell^{(k-1)} - \frac{1}{2}\left(\tilde{\mathbf{y}}_k^\text{T}\mathbf{S}_k^{-1}\tilde{\mathbf{y}}_k + \log\left|\mathbf{S}_k\right| + d_y \log 2\pi\right),

where d_y is the dimension of the measurement vector.[25]
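A minimal NumPy sketch of this recursion, accumulating the log marginal likelihood from one innovation at a time (function and variable names are illustrative):

    import numpy as np

    def log_likelihood_step(ell_prev, y_tilde, S):
        """Add the log-density of one innovation y_tilde ~ N(0, S) to the
        running log marginal likelihood ell_prev."""
        d_y = y_tilde.shape[0]                       # dimension of the measurement
        sign, logdet = np.linalg.slogdet(S)          # log |S_k|, computed stably
        quad = y_tilde @ np.linalg.solve(S, y_tilde) # y^T S^{-1} y
        return ell_prev - 0.5 * (quad + logdet + d_y * np.log(2.0 * np.pi))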

An important application where such a (log) likelihood of the observations (given the filter parameters) is used is multi-target tracking. For example, consider an object tracking scenario where a stream of observations is the input, however, it is unknown how many objects are in the scene (or, the number of objects is known but is greater than one). In such a scenario, it can be unknown a priori which observations/measurements were generated by which object. A multiple hypothesis tracker (MHT) typically will form different track association hypotheses, where each hypothesis can be viewed as a Kalman filter
(in the linear Gaussian case) with a specific set of parameters associated with the hypothesized object. Thus, it is important to compute the likelihood of the observations for the different hypotheses under consideration, such that the most likely one can be found.

Information filter
In the information filter, or inverse covariance filter, the estimated covariance and estimated state are replaced by the information matrix and information vector respectively. These are defined as:

\mathbf{Y}_{k\mid k} = \mathbf{P}_{k\mid k}^{-1}
\hat{\mathbf{y}}_{k\mid k} = \mathbf{P}_{k\mid k}^{-1}\hat{\mathbf{x}}_{k\mid k}

Similarly the predicted covariance and state have equivalent information forms, defined as:

\mathbf{Y}_{k\mid k-1} = \mathbf{P}_{k\mid k-1}^{-1}
\hat{\mathbf{y}}_{k\mid k-1} = \mathbf{P}_{k\mid k-1}^{-1}\hat{\mathbf{x}}_{k\mid k-1}

as have the measurement covariance and measurement vector, which are defined as:

\mathbf{I}_k = \mathbf{H}_k^\text{T}\mathbf{R}_k^{-1}\mathbf{H}_k
\mathbf{i}_k = \mathbf{H}_k^\text{T}\mathbf{R}_k^{-1}\mathbf{z}_k

The information update now becomes a trivial sum.

\mathbf{Y}_{k\mid k} = \mathbf{Y}_{k\mid k-1} + \mathbf{I}_k
\hat{\mathbf{y}}_{k\mid k} = \hat{\mathbf{y}}_{k\mid k-1} + \mathbf{i}_k

The main advantage of the information filter is that N measurements can be filtered at each timestep simply by summing their information matrices and vectors.

\mathbf{Y}_{k\mid k} = \mathbf{Y}_{k\mid k-1} + \sum_{j=1}^{N}\mathbf{I}_{k,j}
\hat{\mathbf{y}}_{k\mid k} = \hat{\mathbf{y}}_{k\mid k-1} + \sum_{j=1}^{N}\mathbf{i}_{k,j}

To predict the information filter the information matrix and vector can be converted back to their state space equivalents, or alternatively the information space prediction can be used.

\mathbf{M}_k = [\mathbf{F}_k^{-1}]^\text{T}\mathbf{Y}_{k-1\mid k-1}\mathbf{F}_k^{-1}
\mathbf{C}_k = \mathbf{M}_k[\mathbf{M}_k + \mathbf{Q}_k^{-1}]^{-1}
\mathbf{L}_k = \mathbf{I} - \mathbf{C}_k
\mathbf{Y}_{k\mid k-1} = \mathbf{L}_k\mathbf{M}_k\mathbf{L}_k^\text{T} + \mathbf{C}_k\mathbf{Q}_k^{-1}\mathbf{C}_k^\text{T}

\hat{\mathbf{y}}_{k\mid k-1} = \mathbf{L}_k[\mathbf{F}_k^{-1}]^\text{T}\hat{\mathbf{y}}_{k-1\mid k-1}

Note that if F and Q are time invariant these values can be cached. Note also that F and Q need to be invertible.
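A compact NumPy sketch of the information-form update (the trivial sum above), assuming the predicted information matrix and vector are already available; names are illustrative:

    import numpy as np

    def information_update(Y_pred, y_pred, z, H, R):
        """Fold one measurement z into the predicted information matrix Y_pred
        and information vector y_pred."""
        R_inv = np.linalg.inv(R)
        I_k = H.T @ R_inv @ H        # measurement information matrix
        i_k = H.T @ R_inv @ z        # measurement information vector
        return Y_pred + I_k, y_pred + i_k

    # Several independent measurements at the same time step are just more sums:
    # Y, y = Y_pred + sum(I_list), y_pred + sum(i_list)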

Fixed-lag smoother
The optimal fixed-lag smoother provides the optimal estimate of \hat{\mathbf{x}}_{k-N\mid k} for a given fixed lag N using the measurements from \mathbf{z}_1 to \mathbf{z}_k. It can be derived using the previous theory via an augmented state, and the main equation of the filter is the following:

\begin{bmatrix} \hat{\mathbf{x}}_{t\mid t} \\ \hat{\mathbf{x}}_{t-1\mid t} \\ \vdots \\ \hat{\mathbf{x}}_{t-N+1\mid t} \end{bmatrix} = \begin{bmatrix} \mathbf{I} \\ 0 \\ \vdots \\ 0 \end{bmatrix}\hat{\mathbf{x}}_{t\mid t-1} + \begin{bmatrix} 0 & \ldots & 0 \\ \mathbf{I} & 0 & \vdots \\ \vdots & \ddots & \vdots \\ 0 & \ldots & \mathbf{I} \end{bmatrix}\begin{bmatrix} \hat{\mathbf{x}}_{t-1\mid t-1} \\ \hat{\mathbf{x}}_{t-2\mid t-1} \\ \vdots \\ \hat{\mathbf{x}}_{t-N+1\mid t-1} \end{bmatrix} + \begin{bmatrix} \mathbf{K}^{(0)} \\ \mathbf{K}^{(1)} \\ \vdots \\ \mathbf{K}^{(N-1)} \end{bmatrix}\mathbf{y}_{t\mid t-1}

where:

\hat{\mathbf{x}}_{t\mid t-1} is estimated via a standard Kalman filter;
\mathbf{y}_{t\mid t-1} = \mathbf{z}(t) - \mathbf{H}\hat{\mathbf{x}}_{t\mid t-1} is the innovation produced considering the estimate of the standard Kalman filter;
the various \hat{\mathbf{x}}_{t-i\mid t} with i = 1, \ldots, N-1 are new variables, i.e. they do not appear in the standard Kalman filter;
the gains are computed via the following scheme:

\mathbf{K}^{(i)} = \mathbf{P}^{(i)}\mathbf{H}^\text{T}\left[\mathbf{H}\mathbf{P}\mathbf{H}^\text{T} + \mathbf{R}\right]^{-1}

and

\mathbf{P}^{(i)} = \mathbf{P}\left[\left(\mathbf{F} - \mathbf{K}\mathbf{H}\right)^\text{T}\right]^{i}

where \mathbf{P} and \mathbf{K} are the prediction error covariance and the gains of the standard Kalman filter (i.e., \mathbf{P}_{t\mid t-1}).

If the estimation error covariance is defined so that

\mathbf{P}_i := \mathrm{E}\left[\left(\mathbf{x}_{t-i} - \hat{\mathbf{x}}_{t-i\mid t}\right)^{*}\left(\mathbf{x}_{t-i} - \hat{\mathbf{x}}_{t-i\mid t}\right) \mid z_1 \ldots z_t\right],
then we have that the improvement on the estimation of \mathbf{x}_{t-i} is given by:

\mathbf{P} - \mathbf{P}_i = \sum_{j=0}^{i}\left[\mathbf{P}^{(j)}\mathbf{H}^\text{T}\left[\mathbf{H}\mathbf{P}\mathbf{H}^\text{T} + \mathbf{R}\right]^{-1}\mathbf{H}\left(\mathbf{P}^{(i)}\right)^\text{T}\right]

Fixed-interval smoothers
The optimal fixed-interval smoother provides the optimal estimate of \hat{\mathbf{x}}_{k\mid n} (k < n) using the measurements from a fixed interval \mathbf{z}_1 to \mathbf{z}_n. This is also called "Kalman Smoothing". There are several smoothing algorithms in common use.

Rauch-Tung-Striebel
The Rauch-Tung-Striebel (RTS) smoother is an efficient two-pass algorithm for fixed interval smoothing.[26]

The forward pass is the same as the regular Kalman filter algorithm. These filtered state estimates \hat{\mathbf{x}}_{k\mid k} and covariances \mathbf{P}_{k\mid k} are saved for use in the backwards pass.

In the backwards pass, we compute the smoothed state estimates \hat{\mathbf{x}}_{k\mid n} and covariances \mathbf{P}_{k\mid n}. We start at the last time step and proceed backwards in time using the following recursive equations:

\hat{\mathbf{x}}_{k\mid n} = \hat{\mathbf{x}}_{k\mid k} + \mathbf{C}_k(\hat{\mathbf{x}}_{k+1\mid n} - \hat{\mathbf{x}}_{k+1\mid k})
\mathbf{P}_{k\mid n} = \mathbf{P}_{k\mid k} + \mathbf{C}_k(\mathbf{P}_{k+1\mid n} - \mathbf{P}_{k+1\mid k})\mathbf{C}_k^\text{T}

where

\mathbf{C}_k = \mathbf{P}_{k\mid k}\mathbf{F}_{k+1}^\text{T}\mathbf{P}_{k+1\mid k}^{-1}

Modified Bryson-Frazier smoother
An alternative to the RTS algorithm is the modified Bryson-Frazier (MBF) fixed interval smoother developed by Bierman.[20] This also uses a backward pass that processes data saved from the Kalman filter forward pass. The equations for the backward pass involve the recursive computation of data which are used at each observation time to compute the smoothed state and covariance.

The recursive equations are

\tilde{\Lambda}_k = \mathbf{H}_k^\text{T}\mathbf{S}_k^{-1}\mathbf{H}_k + \hat{\mathbf{C}}_k^\text{T}\hat{\Lambda}_k\hat{\mathbf{C}}_k
\hat{\Lambda}_{k-1} = \mathbf{F}_k^\text{T}\tilde{\Lambda}_k\mathbf{F}_k
\hat{\Lambda}_n = 0
\tilde{\lambda}_k = -\mathbf{H}_k^\text{T}\mathbf{S}_k^{-1}\mathbf{y}_k + \hat{\mathbf{C}}_k^\text{T}\hat{\lambda}_k
\hat{\lambda}_{k-1} = \mathbf{F}_k^\text{T}\tilde{\lambda}_k
\hat{\lambda}_n = 0

where \mathbf{S}_k is the residual covariance and \hat{\mathbf{C}}_k = \mathbf{I} - \mathbf{K}_k\mathbf{H}_k. The smoothed state and covariance can then be found by substitution in the equations

\mathbf{P}_{k\mid n} = \mathbf{P}_{k\mid k} - \mathbf{P}_{k\mid k}\hat{\Lambda}_k\mathbf{P}_{k\mid k}
\mathbf{x}_{k\mid n} = \mathbf{x}_{k\mid k} - \mathbf{P}_{k\mid k}\hat{\lambda}_k

or

\mathbf{P}_{k\mid n} = \mathbf{P}_{k\mid k-1} - \mathbf{P}_{k\mid k-1}\tilde{\Lambda}_k\mathbf{P}_{k\mid k-1}
\mathbf{x}_{k\mid n} = \mathbf{x}_{k\mid k-1} - \mathbf{P}_{k\mid k-1}\tilde{\lambda}_k.

An important advantage of the MBF is that it does not require finding the inverse of the covariance matrix.

Minimum-variance smoother
The minimum-variance smoother can attain the best-possible error performance, provided that the models are linear, their parameters and the noise statistics are known precisely.[27] This smoother is a time-varying state-space generalization of the optimal non-causal Wiener filter.

The smoother calculations are done in two passes. The forward calculations involve a one-step-ahead predictor and are given by

\hat{\mathbf{x}}_{k+1\mid k} = (\mathbf{F}_k - \mathbf{K}_k\mathbf{H}_k)\hat{\mathbf{x}}_{k\mid k-1} + \mathbf{K}_k\mathbf{z}_k
\alpha_k = -\mathbf{S}_k^{-1/2}\mathbf{H}_k\hat{\mathbf{x}}_{k\mid k-1} + \mathbf{S}_k^{-1/2}\mathbf{z}_k

The above system is known as the inverse Wiener-Hopf factor. The backward recursion is the adjoint of the above forward system. The result of the backward pass \beta_k may be calculated by operating the forward equations on the time-reversed \alpha_k and time reversing the result. In the case of output estimation, the smoothed estimate is given by
\hat{\mathbf{y}}_{k\mid N} = \mathbf{z}_k - \mathbf{R}_k\beta_k

Taking the causal part of this minimum-variance smoother yields

\hat{\mathbf{y}}_{k\mid k} = \mathbf{z}_k - \mathbf{R}_k\mathbf{S}_k^{-1/2}\alpha_k

which is identical to the minimum-variance Kalman filter. The above solutions minimize the variance of the output estimation error. Note that the Rauch-Tung-Striebel smoother derivation assumes that the underlying distributions are Gaussian, whereas the minimum-variance solutions do not. Optimal smoothers for state estimation and input estimation can be constructed similarly.

A continuous-time version of the above smoother is described in.[28][29]

Expectation-maximization algorithms may be employed to calculate approximate maximum likelihood estimates of unknown state-space parameters within minimum-variance filters and smoothers. Often uncertainties remain within problem assumptions. A smoother that accommodates uncertainties can be designed by adding a positive definite term to the Riccati equation.[30]

In cases where the models are nonlinear, step-wise linearizations may be used within the minimum-variance filter and smoother recursions (extended Kalman filtering).

Frequency Weighted Kalman filters
Pioneering research on the perception of sounds at different frequencies was conducted by Fletcher and Munson in the 1930s. Their work led to a standard way of weighting measured sound levels within investigations of industrial noise and hearing loss. Frequency weightings have since been used within filter and controller designs to manage performance within bands of interest.

Typically, a frequency shaping function is used to weight the average power of the error spectral density in a specified frequency band. Let \mathbf{y} - \hat{\mathbf{y}} denote the output estimation error exhibited by a conventional Kalman filter. Also, let \mathbf{W} denote a causal frequency weighting transfer function. The optimum solution which minimizes the variance of \mathbf{W}(\mathbf{y} - \hat{\mathbf{y}}) arises by simply constructing \mathbf{W}^{-1}\hat{\mathbf{y}}.

The design of \mathbf{W} remains an open question. One way of proceeding is to identify a system which generates the estimation error and setting \mathbf{W} equal to the inverse of that system.[31] This procedure may be iterated to obtain mean-square error improvement at the cost of increased filter order. The same technique can be applied to smoothers.

Nonlinear filters
The basic Kalman filter is limited to a linear assumption. More complex systems, however, can be nonlinear. The nonlinearity can be associated either with the process model or with the observation model or with both.
Extended Kalman filter
In the extended Kalman filter (EKF), the state transition and observation models need not be linear functions of the state but may instead be nonlinear functions. These functions are of differentiable type.

\mathbf{x}_k = f(\mathbf{x}_{k-1}, \mathbf{u}_k) + \mathbf{w}_k
\mathbf{z}_k = h(\mathbf{x}_k) + \mathbf{v}_k

The function f can be used to compute the predicted state from the previous estimate and similarly the function h can be used to compute the predicted measurement from the predicted state. However, f and h cannot be applied to the covariance directly. Instead a matrix of partial derivatives (the Jacobian) is computed.

At each timestep the Jacobian is evaluated with current predicted states. These matrices can be used in the Kalman filter equations. This process essentially linearizes the nonlinear function around the current estimate.
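A minimal NumPy sketch of one EKF cycle for given nonlinear functions f and h and their user-supplied Jacobians (all names are illustrative; this is a sketch of the idea, not a hardened implementation):

    import numpy as np

    def ekf_step(x, P, u, z, f, h, F_jac, H_jac, Q, R):
        """One extended Kalman filter predict/update cycle.
        f(x, u), h(x) are the nonlinear models; F_jac(x, u), H_jac(x) return
        their Jacobian matrices evaluated at the current estimate."""
        # Predict: propagate the estimate through f, the covariance through the Jacobian
        F = F_jac(x, u)
        x_pred = f(x, u)
        P_pred = F @ P @ F.T + Q
        # Update: linearize h around the predicted state
        H = H_jac(x_pred)
        y = z - h(x_pred)
        S = H @ P_pred @ H.T + R
        K = P_pred @ H.T @ np.linalg.inv(S)
        x_new = x_pred + K @ y
        P_new = (np.eye(len(x)) - K @ H) @ P_pred
        return x_new, P_new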

Unscented Kalman filter
When the state transition and observation models, that is, the predict and update functions f and h, are highly nonlinear, the extended Kalman filter can give particularly poor performance.[32] This is because the covariance is propagated through linearization of the underlying nonlinear model. The unscented Kalman filter (UKF)[32] uses a deterministic sampling technique known as the unscented transform to pick a minimal set of sample points (called sigma points) around the mean. These sigma points are then propagated through the nonlinear functions, from which the mean and covariance of the estimate are then recovered. The result is a filter which more accurately captures the true mean and covariance. (This can be verified using Monte Carlo sampling or through a Taylor series expansion of the posterior statistics.) In addition, this technique removes the requirement to explicitly calculate Jacobians, which for complex functions can be a difficult task in itself (i.e., requiring complicated derivatives if done analytically or being computationally costly if done numerically).

Predict
As with the EKF, the UKF prediction can be used independently from the UKF update, in combination with a linear (or indeed EKF) update, or vice versa.

The estimated state and covariance are augmented with the mean and covariance of the process noise.

\mathbf{x}_{k-1\mid k-1}^{a} = [\hat{\mathbf{x}}_{k-1\mid k-1}^\text{T} \quad \mathrm{E}[\mathbf{w}_k^\text{T}]]^\text{T}
\mathbf{P}_{k-1\mid k-1}^{a} = \begin{bmatrix} \mathbf{P}_{k-1\mid k-1} & 0 \\ 0 & \mathbf{Q}_k \end{bmatrix}

A set of 2L + 1 sigma points is derived from the augmented state and covariance where L is the dimension of the augmented state.
\chi_{k-1\mid k-1}^{0} = \mathbf{x}_{k-1\mid k-1}^{a}
\chi_{k-1\mid k-1}^{i} = \mathbf{x}_{k-1\mid k-1}^{a} + \left(\sqrt{(L+\lambda)\mathbf{P}_{k-1\mid k-1}^{a}}\right)_{i}, \qquad i = 1, \ldots, L
\chi_{k-1\mid k-1}^{i} = \mathbf{x}_{k-1\mid k-1}^{a} - \left(\sqrt{(L+\lambda)\mathbf{P}_{k-1\mid k-1}^{a}}\right)_{i-L}, \qquad i = L+1, \ldots, 2L

where

\left(\sqrt{(L+\lambda)\mathbf{P}_{k-1\mid k-1}^{a}}\right)_{i}

is the i-th column of the matrix square root of

(L+\lambda)\mathbf{P}_{k-1\mid k-1}^{a}

using the definition: the square root \mathbf{A} of matrix \mathbf{B} satisfies \mathbf{B} \triangleq \mathbf{A}\mathbf{A}^\text{T}.

The matrix square root should be calculated using numerically efficient and stable methods such as the Cholesky decomposition.

The sigma points are propagated through the transition function f.

\chi_{k\mid k-1}^{i} = f(\chi_{k-1\mid k-1}^{i}), \quad i = 0, \ldots, 2L

where f: \mathbb{R}^{L} \rightarrow \mathbb{R}^{|\mathbf{x}|}. The weighted sigma points are recombined to produce the predicted state and covariance.

\hat{\mathbf{x}}_{k\mid k-1} = \sum_{i=0}^{2L} W_s^{i}\chi_{k\mid k-1}^{i}
\mathbf{P}_{k\mid k-1} = \sum_{i=0}^{2L} W_c^{i}\,[\chi_{k\mid k-1}^{i} - \hat{\mathbf{x}}_{k\mid k-1}][\chi_{k\mid k-1}^{i} - \hat{\mathbf{x}}_{k\mid k-1}]^\text{T}

where the weights for the state and covariance are given by:

W_s^{0} = \frac{\lambda}{L+\lambda}
W_c^{0} = \frac{\lambda}{L+\lambda} + (1 - \alpha^2 + \beta)
W_s^{i} = W_c^{i} = \frac{1}{2(L+\lambda)}
\lambda = \alpha^2(L+\kappa) - L

\alpha and \kappa control the spread of the sigma points. \beta is related to the distribution of x. Normal values are \alpha = 10^{-3}, \kappa = 0 and \beta = 2. If the true distribution of x is Gaussian, \beta = 2 is optimal.[33]
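A NumPy sketch of the sigma-point generation and weights described above (a minimal, illustrative implementation; the augmented state and covariance are assumed to have been formed already):

    import numpy as np

    def sigma_points_and_weights(x_aug, P_aug, alpha=1e-3, beta=2.0, kappa=0.0):
        """Generate 2L+1 sigma points around the augmented mean x_aug with
        covariance P_aug, plus the state (Ws) and covariance (Wc) weights."""
        L = x_aug.shape[0]
        lam = alpha**2 * (L + kappa) - L
        # Column-wise square root of (L + lambda) P_aug via Cholesky
        sqrt_P = np.linalg.cholesky((L + lam) * P_aug)
        chi = np.zeros((2 * L + 1, L))
        chi[0] = x_aug
        for i in range(L):
            chi[i + 1] = x_aug + sqrt_P[:, i]
            chi[L + i + 1] = x_aug - sqrt_P[:, i]
        Ws = np.full(2 * L + 1, 1.0 / (2 * (L + lam)))
        Wc = Ws.copy()
        Ws[0] = lam / (L + lam)
        Wc[0] = lam / (L + lam) + (1 - alpha**2 + beta)
        return chi, Ws, Wc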

Update
The predicted state and covariance are augmented as before, except now with the mean and covariance of the measurement noise.

\mathbf{x}_{k\mid k-1}^{a} = [\hat{\mathbf{x}}_{k\mid k-1}^\text{T} \quad \mathrm{E}[\mathbf{v}_k^\text{T}]]^\text{T}
\mathbf{P}_{k\mid k-1}^{a} = \begin{bmatrix} \mathbf{P}_{k\mid k-1} & 0 \\ 0 & \mathbf{R}_k \end{bmatrix}

As before, a set of 2L + 1 sigma points is derived from the augmented state and covariance where L is the dimension of the augmented state.

\begin{align}
\chi_{k\mid k-1}^{0} &= \mathbf{x}_{k\mid k-1}^{a} \\
\chi_{k\mid k-1}^{i} &= \mathbf{x}_{k\mid k-1}^{a} + \left(\sqrt{(L+\lambda)\mathbf{P}_{k\mid k-1}^{a}}\right)_{i}, \qquad i = 1, \ldots, L \\
\chi_{k\mid k-1}^{i} &= \mathbf{x}_{k\mid k-1}^{a} - \left(\sqrt{(L+\lambda)\mathbf{P}_{k\mid k-1}^{a}}\right)_{i-L}, \qquad i = L+1, \ldots, 2L
\end{align}

Alternatively if the UKF prediction has been used the sigma points themselves can be augmented along the following lines

\chi_{k\mid k-1} := [\chi_{k\mid k-1}^\text{T} \quad \mathrm{E}[\mathbf{v}_k^\text{T}]]^\text{T} \pm \sqrt{(L+\lambda)\mathbf{R}_k^{a}}

where

\mathbf{R}_k^{a} = \begin{bmatrix} 0 & 0 \\ 0 & \mathbf{R}_k \end{bmatrix}

The sigma points are projected through the observation function h.

\gamma_k^{i} = h(\chi_{k\mid k-1}^{i}), \quad i = 0, \ldots, 2L

The weighted sigma points are recombined to produce the predicted measurement and predicted measurement covariance.

\hat{\mathbf{z}}_k = \sum_{i=0}^{2L} W_s^{i}\gamma_k^{i}
\mathbf{P}_{z_k z_k} = \sum_{i=0}^{2L} W_c^{i}\,[\gamma_k^{i} - \hat{\mathbf{z}}_k][\gamma_k^{i} - \hat{\mathbf{z}}_k]^\text{T}

The state-measurement cross-covariance matrix,

\mathbf{P}_{x_k z_k} = \sum_{i=0}^{2L} W_c^{i}\,[\chi_{k\mid k-1}^{i} - \hat{\mathbf{x}}_{k\mid k-1}][\gamma_k^{i} - \hat{\mathbf{z}}_k]^\text{T}
is used to compute the UKF Kalman gain.

\mathbf{K}_k = \mathbf{P}_{x_k z_k}\mathbf{P}_{z_k z_k}^{-1}

As with the Kalman filter, the updated state is the predicted state plus the innovation weighted by the Kalman gain,

\hat{\mathbf{x}}_{k\mid k} = \hat{\mathbf{x}}_{k\mid k-1} + \mathbf{K}_k(\mathbf{z}_k - \hat{\mathbf{z}}_k)

And the updated covariance is the predicted covariance, minus the predicted measurement covariance, weighted by the Kalman gain.

\mathbf{P}_{k\mid k} = \mathbf{P}_{k\mid k-1} - \mathbf{K}_k\mathbf{P}_{z_k z_k}\mathbf{K}_k^\text{T}

Kalman-Bucy filter
The Kalman-Bucy filter (named after Richard Snowden Bucy) is a continuous time version of the Kalman filter.[34][35]

It is based on the state space model

\frac{d}{dt}\mathbf{x}(t) = \mathbf{F}(t)\mathbf{x}(t) + \mathbf{B}(t)\mathbf{u}(t) + \mathbf{w}(t)
\mathbf{z}(t) = \mathbf{H}(t)\mathbf{x}(t) + \mathbf{v}(t)

where \mathbf{Q}(t) and \mathbf{R}(t) represent the intensities of the two white noise terms \mathbf{w}(t) and \mathbf{v}(t), respectively.

The filter consists of two differential equations, one for the state estimate and one for the covariance:

\frac{d}{dt}\hat{\mathbf{x}}(t) = \mathbf{F}(t)\hat{\mathbf{x}}(t) + \mathbf{B}(t)\mathbf{u}(t) + \mathbf{K}(t)(\mathbf{z}(t) - \mathbf{H}(t)\hat{\mathbf{x}}(t))
\frac{d}{dt}\mathbf{P}(t) = \mathbf{F}(t)\mathbf{P}(t) + \mathbf{P}(t)\mathbf{F}^\text{T}(t) + \mathbf{Q}(t) - \mathbf{K}(t)\mathbf{R}(t)\mathbf{K}^\text{T}(t)

where the Kalman gain is given by

\mathbf{K}(t) = \mathbf{P}(t)\mathbf{H}^\text{T}(t)\mathbf{R}^{-1}(t)

Note that in this expression for \mathbf{K}(t) the covariance of the observation noise \mathbf{R}(t) represents at the same time the covariance of the prediction error (or innovation) \tilde{\mathbf{y}}(t) = \mathbf{z}(t) - \mathbf{H}(t)\hat{\mathbf{x}}(t); these covariances are equal only in the case of continuous time.[36]

The distinction between the prediction and update steps of discrete-time Kalman filtering does not exist in continuous time.

The second differential equation, for the covariance, is an example of a Riccati equation.
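The two differential equations can be integrated numerically; a simple forward-Euler sketch (illustrative only; step size and models are assumptions, and a real implementation would use a better integrator):

    import numpy as np

    def kalman_bucy_step(x_hat, P, z, F, B, u, H, Q, R, dt):
        """Advance the continuous-time estimate and covariance by one Euler step."""
        K = P @ H.T @ np.linalg.inv(R)                   # Kalman gain K(t)
        dx = F @ x_hat + B @ u + K @ (z - H @ x_hat)     # d/dt x_hat
        dP = F @ P + P @ F.T + Q - K @ R @ K.T           # Riccati equation
        return x_hat + dt * dx, P + dt * dP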

Hybrid Kalman filter
Most physical systems are represented as continuous-time models while discrete-time measurements are frequently taken for state estimation via a digital processor. Therefore, the system model and measurement model are given by

\begin{align}
\dot{\mathbf{x}}(t) &= \mathbf{F}(t)\mathbf{x}(t) + \mathbf{B}(t)\mathbf{u}(t) + \mathbf{w}(t), & \mathbf{w}(t) &\sim N\bigl(\mathbf{0}, \mathbf{Q}(t)\bigr) \\
\mathbf{z}_k &= \mathbf{H}_k\mathbf{x}_k + \mathbf{v}_k, & \mathbf{v}_k &\sim N(\mathbf{0}, \mathbf{R}_k)
\end{align}

where \mathbf{x}_k = \mathbf{x}(t_k).

Initialize

\hat{\mathbf{x}}_{0\mid 0} = \mathrm{E}\bigl[\mathbf{x}(t_0)\bigr], \quad \mathbf{P}_{0\mid 0} = \mathrm{Var}\bigl[\mathbf{x}(t_0)\bigr]

Predict

\begin{align}
&\dot{\hat{\mathbf{x}}}(t) = \mathbf{F}(t)\hat{\mathbf{x}}(t) + \mathbf{B}(t)\mathbf{u}(t)\text{, with } \hat{\mathbf{x}}(t_{k-1}) = \hat{\mathbf{x}}_{k-1\mid k-1} \\
\Rightarrow\ &\hat{\mathbf{x}}_{k\mid k-1} = \hat{\mathbf{x}}(t_k) \\
&\dot{\mathbf{P}}(t) = \mathbf{F}(t)\mathbf{P}(t) + \mathbf{P}(t)\mathbf{F}(t)^\text{T} + \mathbf{Q}(t)\text{, with } \mathbf{P}(t_{k-1}) = \mathbf{P}_{k-1\mid k-1} \\
\Rightarrow\ &\mathbf{P}_{k\mid k-1} = \mathbf{P}(t_k)
\end{align}

The prediction equations are derived from those of the continuous-time Kalman filter without update from measurements, i.e., \mathbf{K}(t) = 0. The predicted state and covariance are calculated respectively by solving a set of differential equations with the initial value equal to the estimate at the previous step.

Update

\mathbf{K}_k = \mathbf{P}_{k\mid k-1}\mathbf{H}_k^\text{T}\bigl(\mathbf{H}_k\mathbf{P}_{k\mid k-1}\mathbf{H}_k^\text{T} + \mathbf{R}_k\bigr)^{-1}
\hat{\mathbf{x}}_{k\mid k} = \hat{\mathbf{x}}_{k\mid k-1} + \mathbf{K}_k(\mathbf{z}_k - \mathbf{H}_k\hat{\mathbf{x}}_{k\mid k-1})
\mathbf{P}_{k\mid k} = (\mathbf{I} - \mathbf{K}_k\mathbf{H}_k)\mathbf{P}_{k\mid k-1}

The update equations are identical to those of the discrete-time Kalman filter.
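A sketch of the hybrid scheme for time-invariant matrices, integrating the prediction ODEs with small Euler steps between measurements and then applying the discrete update (illustrative; a production implementation would integrate the ODEs more carefully):

    import numpy as np

    def hybrid_predict(x, P, F, B, u, Q, dt, n_steps=10):
        """Integrate d/dt x = F x + B u and the covariance ODE from t_{k-1} to t_k."""
        h = dt / n_steps
        for _ in range(n_steps):
            x = x + h * (F @ x + B @ u)
            P = P + h * (F @ P + P @ F.T + Q)
        return x, P

    def hybrid_update(x_pred, P_pred, z, H, R):
        """Discrete-time measurement update, identical to the standard Kalman filter."""
        S = H @ P_pred @ H.T + R
        K = P_pred @ H.T @ np.linalg.inv(S)
        x_new = x_pred + K @ (z - H @ x_pred)
        P_new = (np.eye(len(x_pred)) - K @ H) @ P_pred
        return x_new, P_new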

Variants for the recovery of sparse signals
Recently the traditional Kalman filter has been employed for the recovery of sparse, possibly dynamic, signals from noisy observations. Both works[37] and[38] utilize notions from the theory of compressed sensing/sampling, such as the restricted isometry property and related probabilistic recovery arguments, for sequentially estimating the sparse state in intrinsically low-dimensional systems.

Applications
Attitude and Heading Reference Systems
Autopilot
Battery state of charge (SoC) estimation[39][40]
Brain-computer interface
Chaotic signals
Tracking and Vertex Fitting of charged particles in Particle Detectors[41]
Tracking of objects in computer vision
Dynamic positioning
Economics, in particular macroeconomics, time series analysis, and econometrics[42]
Inertial guidance system
Orbit Determination
Power system state estimation
Radar tracker
Satellite navigation systems
Seismology[43]
Sensorless control of AC motor variable-frequency drives
Simultaneous localization and mapping
Speech enhancement
Visual odometry
Weather forecasting
Navigation system
3D modeling
Structural health monitoring
Human sensorimotor processing[44]

See also
Alpha beta filter
Bayesian MMSE estimator
Covariance intersection
Data assimilation
Ensemble Kalman filter
Extended Kalman filter
Fast Kalman filter
Filtering problem (stochastic processes)
Generalized filtering
Invariant extended Kalman filter
Kernel adaptive filter
Linear-quadratic-Gaussian control
Moving horizon estimation
Nonlinear filter
Particle filter estimator
Predictor-corrector
Recursive least squares
Schmidt-Kalman filter
Separation principle
Sliding mode control
Stochastic differential equations
Volterra series
Wiener filter
Zakai equation

References
1. Kalman, R. E. (1960). "A New Approach to Linear Filtering and Prediction Problems". Journal of Basic Engineering 82: 35. doi:10.1115/1.3662552 (https://dx.doi.org/10.1115%2F1.3662552).
2. Steffen L. Lauritzen (http://www.stats.ox.ac.uk/~steffen/). "Time series analysis in 1880. A discussion of contributions made by T. N. Thiele". International Statistical Review 49, 1981, 319-333. JSTOR 1402616 (http://www.jstor.org/stable/1402616)
3. Steffen L. Lauritzen, Thiele: Pioneer in Statistics (http://www.oup.com/uk/catalogue/?ci=9780198509721), Oxford University Press, 2002. ISBN 0198509723.
4. Mohinder S. Grewal and Angus P. Andrews (http://ieeecss.org/CSM/library/2010/june10/11HistoricalPerspectives.pdf)
5. Stratonovich, R. L. (1959). Optimum nonlinear systems which bring about a separation of a signal with constant parameters from noise. Radiofizika, 2:6, pp. 892-901.
6. Stratonovich, R. L. (1959). On the theory of optimal non-linear filtering of random functions. Theory of Probability and its Applications, 4, pp. 223-225.
7. Stratonovich, R. L. (1960). Application of the Markov processes theory to optimal filtering. Radio Engineering and Electronic Physics, 5:11, pp. 1-19.
8. Stratonovich, R. L. (1960). Conditional Markov Processes. Theory of Probability and its Applications, 5, pp. 156-178.
9. Ingvar Strid; Karl Walentin (April 2009). "Block Kalman Filtering for Large-Scale DSGE Models" (http://www.riksbank.se/en/Pressandpublished/PublishedfromtheRiksbank/Otherreports/WorkingPaperSeries/2008/No224BlockKalmanfilteringforlargescaleDSGEmodels/). Computational Economics (Springer) 33 (3): 277-304. doi:10.1007/s1061400891604 (https://dx.doi.org/10.1007%2Fs1061400891604).
10. Martin Møller Andreasen (2008). "Nonlinear DSGE Models, The Central Difference Kalman Filter, and The Mean Shifted Particle Filter" (ftp://ftp.econ.au.dk/creates/rp/08/rp08_33.pdf) (PDF).
11. Roweis, S.; Ghahramani, Z. (1999). "A unifying review of linear gaussian models". Neural Computation 11 (2): 305-45. doi:10.1162/089976699300016674 (https://dx.doi.org/10.1162%2F089976699300016674). PMID 9950734 (https://www.ncbi.nlm.nih.gov/pubmed/9950734).
12. Hamilton, J. (1994), Time Series Analysis, Princeton University Press. Chapter 13, 'The Kalman Filter'.
13. Ishihara, J. Y.; Terra, M. H.; Campos, J. C. T. (2006). "Robust Kalman Filter for Descriptor Systems". IEEE Transactions on Automatic Control 51 (8): 1354. doi:10.1109/TAC.2006.878741 (https://dx.doi.org/10.1109%2FTAC.2006.878741).
14. Terra, Marco H.; Cerri, Joao P.; Ishihara, Joao Y. (2014). "Optimal Robust Linear Quadratic Regulator for Systems Subject to Uncertainties". IEEE Transactions on Automatic Control 59 (9): 2586. doi:10.1109/TAC.2014.2309282 (https://dx.doi.org/10.1109%2FTAC.2014.2309282).
15. Kelly, Alonzo (1994). "A 3D state space formulation of a navigation Kalman filter for autonomous vehicles" (http://www.dtic.mil/dtic/tr/fulltext/u2/a282853.pdf) (PDF). DTIC Document: 13. 2006 Corrected Version (http://www.frc.ri.cmu.edu/~alonzo/pubs/reports/kalman_V2.pdf)
16. Reid, Ian; Term, Hilary. "Estimation II" (http://www.robots.ox.ac.uk/~ian/Teaching/Estimation/LectureNotes2.pdf) (PDF). www.robots.ox.ac.uk. Oxford University. Retrieved 6 August 2014.
17. Three optimality tests with numerical examples are described in Peter, Matisko (2012). "Optimality Tests and Adaptive Kalman Filter". 16th IFAC Symposium on System Identification. p. 1523. doi:10.3182/201207113BE2027.00011 (https://dx.doi.org/10.3182%2F201207113BE2027.00011). ISBN 9783902823069.
18. Anderson, Brian D. O.; Moore, John B. (1979). Optimal Filtering. New York: Prentice Hall. pp. 129-133. ISBN 0136381227.
19. Thornton, Catherine L. (15 October 1976). "Triangular Covariance Factorizations for Kalman Filtering" (http://ntrs.nasa.gov/archive/nasa/casi.ntrs.nasa.gov/19770005172_1977005172.pdf) (PDF). (PhD thesis). NASA. NASA Technical Memorandum 33798.
20. Bierman, G. J. (1977). "Factorization Methods for Discrete Sequential Estimation". Factorization Methods for Discrete Sequential Estimation (Academic Press). Bibcode:1977fmds.book.....B (http://adsabs.harvard.edu/abs/1977fmds.book.....B).
21. Bar-Shalom, Yaakov; Li, X. Rong; Kirubarajan, Thiagalingam (July 2001). Estimation with Applications to Tracking and Navigation. New York: John Wiley & Sons. pp. 308-317. ISBN 9780471416555.
22. Golub, Gene H.; Van Loan, Charles F. (1996). Matrix Computations. Johns Hopkins Studies in the Mathematical Sciences (Third ed.). Baltimore, Maryland: Johns Hopkins University. p. 139. ISBN 9780801854149.
23. Higham, Nicholas J. (2002). Accuracy and Stability of Numerical Algorithms (Second ed.). Philadelphia, PA: Society for Industrial and Applied Mathematics. p. 680. ISBN 9780898715217.
24. Masreliez, C. Johan; Martin, R. D. (1977). "Robust Bayesian estimation for the linear model and robustifying the Kalman filter" (http://ieeexplore.ieee.org/xpl/freeabs_all.jsp?arnumber=1101538). IEEE Transactions on Automatic Control 22 (3): 361. doi:10.1109/TAC.1977.1101538 (https://dx.doi.org/10.1109%2FTAC.1977.1101538).
25. Lütkepohl, Helmut (1991). Introduction to Multiple Time Series Analysis. Heidelberg: Springer-Verlag Berlin. p. 435.
26. Rauch, H. E.; Tung, F.; Striebel, C. T. (August 1965). "Maximum likelihood estimates of linear dynamic systems" (http://pdf.aiaa.org/getfile.cfm?urlX=7%3CWIG7D%2FQKU%3E6B5%3AKF2Z%5CD%3A%2B82%2A%40%24%5E%3F%40%20%0A&urla=%25%2ARL%2F%220L%20%0A&urlb=%21%2A%20%20%20%0A&urlc=%21%2A0%20%20%0A&urld=%21%2A0%20%20%0A&urle=%27%2BB%2C%27%22%20%22KT0%20%20%0A). AIAA J 3 (8): 1445-1450. Bibcode:1965AIAAJ...3.1445 (http://adsabs.harvard.edu/abs/1965AIAAJ...3.1445). doi:10.2514/3.3166 (https://dx.doi.org/10.2514%2F3.3166).
27. Einicke, G. A. (March 2006). "Optimal and Robust Noncausal Filter Formulations". IEEE Trans. Signal Processing 54 (3): 1069-1077. Bibcode:2006ITSP...54.1069E (http://adsabs.harvard.edu/abs/2006ITSP...54.1069E). doi:10.1109/TSP.2005.863042 (https://dx.doi.org/10.1109%2FTSP.2005.863042).
28. Einicke, G. A. (April 2007). "Asymptotic Optimality of the Minimum-Variance Fixed-Interval Smoother". IEEE Trans. Signal Processing 55 (4): 1543-1547. Bibcode:2007ITSP...55.1543E (http://adsabs.harvard.edu/abs/2007ITSP...55.1543E). doi:10.1109/TSP.2006.889402 (https://dx.doi.org/10.1109%2FTSP.2006.889402).
29. Einicke, G. A.; Ralston, J. C.; Hargrave, C. O.; Reid, D. C.; Hainsworth, D. W. (December 2008). "Longwall Mining Automation. An Application of Minimum-Variance Smoothing". IEEE Control Systems Magazine 28 (6):
2837.doi:10.1109/MCS.2008.929281(https://dx.doi.org/10.1109%2FMCS.2008.929281).
Einicke,G.A.(December2009)."AsymptoticOptimalityoftheMinimumVarianceFixedIntervalSmoother".
IEEETrans.AutomaticControl54(12):29042908.Bibcode:2007ITSP...55.1543E
(http://adsabs.harvard.edu/abs/2007ITSP...55.1543E).doi:10.1109/TSP.2006.889402

https://en.wikipedia.org/wiki/Kalman_filter

29/33

7/27/2015

31.
32.

33.

34.
35.
36.
37.

38.
39.

40.
41.
42.
43.

44.

KalmanfilterWikipedia,thefreeencyclopedia

(http://adsabs.harvard.edu/abs/2007ITSP...55.1543E).doi:10.1109/TSP.2006.889402
(https://dx.doi.org/10.1109%2FTSP.2006.889402).
Einicke,G.A.(December2014)."IterativeFrequencyWeightedFilteringandSmoothingProcedures".IEEE
SignalProcessingLetters21(12):14671470.doi:10.1109/LSP.2014.2341641
(https://dx.doi.org/10.1109%2FLSP.2014.2341641).
Julier,SimonJ.Uhlmann,JeffreyK.(1997)."AnewextensionoftheKalmanfiltertononlinearsystems"
(http://www.cs.unc.edu/~welch/kalman/media/pdf/Julier1997_SPIE_KF.pdf)(PDF).Int.Symp.Aerospace/Defense
Sensing,Simul.andControls.SignalProcessing,SensorFusion,andTargetRecognitionVI3:182.
Bibcode:1997SPIE.3068..182J(http://adsabs.harvard.edu/abs/1997SPIE.3068..182J).doi:10.1117/12.280797
(https://dx.doi.org/10.1117%2F12.280797).Retrieved20080503.
Wan,E.A.VanDerMerwe,R.(2000)."TheunscentedKalmanfilterfornonlinearestimation".Proceedingsof
theIEEE2000AdaptiveSystemsforSignalProcessing,Communications,andControlSymposium(Cat.
No.00EX373)(http://www.lara.unb.br/~gaborges/disciplinas/efe/papers/wan2000.pdf)(PDF).p.153.
doi:10.1109/ASSPCC.2000.882463(https://dx.doi.org/10.1109%2FASSPCC.2000.882463).ISBN078035800
7.
Bucy,R.S.andJoseph,P.D.,FilteringforStochasticProcesseswithApplicationstoGuidance,JohnWiley&
Sons,19682ndEdition,AMSChelseaPubl.,2005.ISBN0821837826
Jazwinski,AndrewH.,Stochasticprocessesandfilteringtheory,AcademicPress,NewYork,1970.ISBN012
3815509
Kailath,T.(1968)."AninnovationsapproachtoleastsquaresestimationPartI:Linearfilteringinadditivewhite
noise".IEEETransactionsonAutomaticControl13(6):646655.doi:10.1109/TAC.1968.1099025
(https://dx.doi.org/10.1109%2FTAC.1968.1099025).
Carmi,AvishyGurfil,PiniKanevsky,Dimitri(2010)."MethodsforsparsesignalrecoveryusingKalman
filteringwithembeddedpseudomeasurementnormsandquasinorms".IEEETransactionsonSignalProcessing
58(4):24052409.Bibcode:2010ITSP...58.2405C(http://adsabs.harvard.edu/abs/2010ITSP...58.2405C).
doi:10.1109/TSP.2009.2038959(https://dx.doi.org/10.1109%2FTSP.2009.2038959).
Vaswani,Namrata(2008)."KalmanfilteredCompressedSensing".200815thIEEEInternationalConferenceon
ImageProcessing.p.893.doi:10.1109/ICIP.2008.4711899(https://dx.doi.org/10.1109%2FICIP.2008.4711899).
ISBN9781424417650.
Vasebi,AmirPartovibakhsh,MaralBathaee,S.MohammadTaghi(2007)."Anovelcombinedbatterymodelfor
stateofchargeestimationinleadacidbatteriesbasedonextendedKalmanfilterforhybridelectricvehicle
applications".JournalofPowerSources174:30.doi:10.1016/j.jpowsour.2007.04.011
(https://dx.doi.org/10.1016%2Fj.jpowsour.2007.04.011).
Vasebi,A.Bathaee,S.M.T.Partovibakhsh,M.(2008)."Predictingstateofchargeofleadacidbatteriesfor
hybridelectricvehiclesbyextendedKalmanfilter".EnergyConversionandManagement49:75.
doi:10.1016/j.enconman.2007.05.017(https://dx.doi.org/10.1016%2Fj.enconman.2007.05.017).
Fruhwirth,R.(1987)."ApplicationofKalmanfilteringtotrackandvertexfitting".Nucl.Instrum.Meth.A262
(23):444450.Bibcode:1987NIMPA.262..444F(http://adsabs.harvard.edu/abs/1987NIMPA.262..444F).
doi:10.1016/01689002(87)908874(https://dx.doi.org/10.1016%2F01689002%2887%29908874).
Harvey,AndrewC.(1994)."ApplicationsoftheKalmanfilterineconometrics".InBewley,Truman.Advances
inEconometrics.NewYork:CambridgeUniversityPress.pp.285f.ISBN0521467268.
Bock,Y.Crowell,B.Webb,F.Kedar,S.Clayton,R.Miyahara,B.(2008)."FusionofHighRateGPSand
SeismicData:ApplicationstoEarlyWarningSystemsforMitigationofGeologicalHazards".American
GeophysicalUnion43:01.Bibcode:2008AGUFM.G43B..01B
(http://adsabs.harvard.edu/abs/2008AGUFM.G43B..01B).
Wolpert,D.M.Miall,R.C.(1996)."ForwardModelsforPhysiologicalMotorControl".NeuralNetw.9(8):
12651279.PMID12662535(https://www.ncbi.nlm.nih.gov/pubmed/12662535).

Further reading
Einicke, G. A. (2012). Smoothing, Filtering and Prediction: Estimating the Past, Present and Future (http://www.intechopen.com/books/smoothingfilteringandpredictionestimatingthepastpresentandfuture). Rijeka, Croatia: Intech. ISBN 9789533077529.
Jinya Su; Baibing Li; Wen-Hua Chen (2015). "On existence, optimality and asymptotic stability of the Kalman filter with partially observed inputs". Automatica 53: 149–154. doi:10.1016/j.automatica.2014.12.044.
Gelb, A. (1974). Applied Optimal Estimation. MIT Press.
Kalman, R. E. (1960). "A new approach to linear filtering and prediction problems" (http://www.elo.utfsm.cl/~ipd481/Papers%20varios/kalman1960.pdf) (PDF). Journal of Basic Engineering 82 (1): 35–45. doi:10.1115/1.3662552. Retrieved 2008-05-03.
Kalman, R. E.; Bucy, R. S. (1961). "New Results in Linear Filtering and Prediction Theory" (http://www.dtic.mil/srch/doc?collection=t2&id=ADD518892). Retrieved 2008-05-03.
Harvey, A. C. (1990). Forecasting, Structural Time Series Models and the Kalman Filter. Cambridge University Press.
Roweis, S.; Ghahramani, Z. (1999). "A Unifying Review of Linear Gaussian Models". Neural Computation 11 (2): 305–345. doi:10.1162/089976699300016674. PMID 9950734.
Simon, D. (2006). Optimal State Estimation: Kalman, H Infinity, and Nonlinear Approaches (http://academic.csuohio.edu/simond/estimation/). Wiley-Interscience.
Stengel, R. F. (1994). Optimal Control and Estimation (http://www.princeton.edu/~stengel/OptConEst.html). Dover Publications. ISBN 0486682005.
Warwick, K. (1987). "Optimal observers for ARMA models" (http://www.informaworld.com/index/779885789.pdf) (PDF). International Journal of Control 46 (5): 1493–1503. doi:10.1080/00207178708933989. Retrieved 2008-05-03.
Bierman, G. J. (1977). Factorization Methods for Discrete Sequential Estimation. Mathematics in Science and Engineering 128 (Mineola, N.Y.: Dover Publications). ISBN 9780486449814.
Bozic, S. M. (1994). Digital and Kalman filtering. Butterworth-Heinemann.
Haykin, S. (2002). Adaptive Filter Theory. Prentice Hall.
Liu, W.; Principe, J. C. and Haykin, S. (2010). Kernel Adaptive Filtering: A Comprehensive Introduction. John Wiley.
Manolakis, D. G. (1999). Statistical and Adaptive Signal Processing. Artech House.
Welch, Greg; Bishop, Gary (1997). "SCAAT: incremental tracking with incomplete information" (http://www.cs.unc.edu/~welch/media/pdf/scaat.pdf) (PDF). SIGGRAPH '97 Proceedings of the 24th annual conference on Computer graphics and interactive techniques. ACM Press/Addison-Wesley Publishing Co. pp. 333–344. doi:10.1145/258734.258876. ISBN 0897918967.
Jazwinski, Andrew H. (1970). Stochastic Processes and Filtering. Mathematics in Science and Engineering. New York: Academic Press. p. 376. ISBN 0123815509.
Maybeck, Peter S. (1979). Stochastic Models, Estimation, and Control. Mathematics in Science and Engineering 141-1. New York: Academic Press. p. 423. ISBN 0124807011.
Moriya, N. (2011). Primer to Kalman Filtering: A Physicist Perspective. New York: Nova Science Publishers, Inc. ISBN 9781616683115.
Dunik, J.; Simandl, M.; Straka, O. (2009). "Methods for estimating state and measurement noise covariance matrices: Aspects and comparisons". Proceedings of 15th IFAC Symposium on System Identification (France): 372–377.
Chui, Charles K.; Chen, Guanrong (2009). Kalman Filtering with Real-Time Applications. Springer Series in Information Sciences 17 (4th ed.). New York: Springer. p. 229. ISBN 9783540878483.
Spivey, Ben; Hedengren, J. D. and Edgar, T. F. (2010). "Constrained Nonlinear Estimation for Industrial Process Fouling" (http://pubs.acs.org/doi/abs/10.1021/ie9018116). Industrial & Engineering Chemistry Research 49 (17): 7824–7831. doi:10.1021/ie9018116.
Thomas Kailath; Ali H. Sayed; Babak Hassibi (2000). Linear Estimation. NJ: Prentice-Hall. ISBN 9780130224644.
Ali H. Sayed (2008). Adaptive Filters. NJ: Wiley. ISBN 9780470253885.

External links
A New Approach to Linear Filtering and Prediction Problems (http://www.cs.unc.edu/~welch/kalman/kalmanPaper.html), by R. E. Kalman, 1960
Kalman–Bucy Filter (http://www.eng.tau.ac.il/~liptser/lectures1/lect6.pdf), a good derivation of the Kalman–Bucy Filter
MIT Video Lecture on the Kalman filter (https://www.youtube.com/watch?v=d0D3VwBh5UQ) on YouTube
An Introduction to the Kalman Filter (http://www.cs.unc.edu/~tracker/media/pdf/SIGGRAPH2001_CoursePack_08.pdf), SIGGRAPH 2001 Course, Greg Welch and Gary Bishop
Kalman filtering chapter (http://www.cs.unc.edu/~welch/kalman/media/pdf/maybeck_ch1.pdf) from Stochastic Models, Estimation, and Control, vol. 1, by Peter S. Maybeck
Kalman Filter (http://www.cs.unc.edu/~welch/kalman/) webpage, with lots of links
"Kalman Filtering" (https://web.archive.org/web/20130623214223/http://www.innovatia.com/software/papers/kalman.htm). Archived from the original (http://www.innovatia.com/software/papers/kalman.htm) on 2013-06-23.
Kalman Filters, a thorough introduction to several types, together with applications to Robot Localization (http://www.negenborn.net/kal_loc/)
Kalman filters used in Weather models (http://www.siam.org/pdf/news/362.pdf), SIAM News, Volume 36, Number 8, October 2003.
Critical Evaluation of Extended Kalman Filtering and Moving-Horizon Estimation (http://pubs.acs.org/cgibin/abstract.cgi/iecred/2005/44/i08/abs/ie034308l.html), Ind. Eng. Chem. Res., 44 (8), 2451–2460, 2005.
Kalman and Bayesian Filters in Python (http://github.com/rlabbe/KalmanandBayesianFiltersinPython). Free book on Kalman filtering implemented in IPython Notebook.
Source code for the propeller microprocessor (http://obex.parallax.com/object/326): well-documented source code written for the Parallax propeller processor.
Gerald J. Bierman's Estimation Subroutine Library (http://netlib.org/a/esl.tgz): corresponds to the code in the research monograph "Factorization Methods for Discrete Sequential Estimation", originally published by Academic Press in 1977. Republished by Dover.
Matlab Toolbox implementing parts of Gerald J. Bierman's Estimation Subroutine Library (http://www.mathworks.com/matlabcentral/fileexchange/32537): UD/UDU' and LD/LDL' factorization with associated time and measurement updates making up the Kalman filter.
Matlab Toolbox of Kalman Filtering applied to Simultaneous Localization and Mapping (http://eia.udg.es/~qsalvi/Slam.zip): vehicle moving in 1D, 2D and 3D
Derivation of a 6D EKF solution to Simultaneous Localization and Mapping (http://www.mrpt.org/6DSLAM) (in old version PDF (http://mapir.isa.uma.es/~jlblanco/papers/RangeBearingSLAM6D.pdf)). See also the tutorial on implementing a Kalman Filter (http://www.mrpt.org/Kalman_Filters) with the MRPT C++ libraries.
The Kalman Filter Explained (http://www.tristanfletcher.co.uk/LDS.pdf), a very simple tutorial.
The Kalman Filter in Reproducing Kernel Hilbert Spaces (http://www.cnel.ufl.edu/~weifeng/publication.htm), a comprehensive introduction.
Matlab code to estimate the Cox–Ingersoll–Ross interest rate model with the Kalman Filter (http://www.mathfinance.cn/kalmanfilterfinancerevisited/): corresponds to the paper "estimating and testing exponential-affine term structure models by kalman filter" published by Review of Quantitative Finance and Accounting in 1999.
Extended Kalman Filters (http://apmonitor.com/wiki/index.php/Main/Background) explained in the context of Simulation, Estimation, Control, and Optimization
Online demo of the Kalman Filter (http://www.dataassimilation.net/Tools/AssimDemo/?method=KF). Demonstration of the Kalman Filter (and other data assimilation methods) using twin experiments.
Botella, Guillermo; Martín h., José Antonio; Santos, Matilde; Meyer-Baese, Uwe (2011). "FPGA-Based Multimodal Embedded Sensor System Integrating Low- and Mid-Level Vision". Sensors 11 (12): 1251–1259. doi:10.3390/s110808164.
Hooke's Law and the Kalman Filter (http://finmathblog.blogspot.com/2013/10/hookeslawandkalmanfilterlittle.html), a little "spring theory" emphasizing the connection between statistics and physics.
Examples and how-to on using Kalman Filters with MATLAB (http://www.mathworks.com/discovery/kalmanfilter.html), a tutorial on filtering and estimation
Explaining Filtering (Estimation) in One Hour, Ten Minutes, One Minute, and One Sentence (http://blog.sciencenet.cn/home.php?mod=space&uid=1565&do=blog&id=851754), by Yu-Chi Ho