estimates = [measurement]          # start from the initial measurement
predictions = []
prediction = measurement           # initial state estimate
for m in measurements:
    # prediction step: project the state forward by one time step
    prediction = prediction + behavior_model * time_step
    # behavior_model is assumed constant during the prediction
    predictions.append(prediction)
    # update step: blend the prediction with the measurement
    residual = m - prediction
    behavior_model = behavior_model + H * (residual / time_step)
    prediction = prediction + G * residual
    estimates.append(prediction)
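The loop above can be packaged as a self-contained function; the gain values and the weight readings below are hypothetical, chosen only to illustrate the behavior:

```python
def g_h_filter(measurements, x0, dx0, G, H, time_step=1.0):
    """g-h filter: x0 is the initial state guess, dx0 the initial
    rate of change (behavior model); G and H are the two gains."""
    prediction = x0
    behavior_model = dx0
    estimates = [prediction]
    for m in measurements:
        # prediction step: project the state forward
        prediction = prediction + behavior_model * time_step
        # update step: blend prediction and measurement
        residual = m - prediction
        behavior_model = behavior_model + H * (residual / time_step)
        prediction = prediction + G * residual
        estimates.append(prediction)
    return estimates

# hypothetical noisy readings trending upward by roughly 1 per step
weights = [158.0, 164.2, 160.3, 159.9, 162.1,
           164.6, 169.6, 167.4, 166.4, 171.0]
est = g_h_filter(weights, x0=160.0, dx0=1.0, G=0.6, H=0.1)
print(est)
```

Larger G trusts the measurements more; larger H lets the behavior model react faster to residuals.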
How to describe a measurement?
X ~ N(μ, σ²)
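In other words, a sensor reading is modeled as the true value plus Gaussian noise. A minimal sketch, with an assumed true position of 10 and sensor standard deviation of 2:

```python
import random

random.seed(42)
true_position = 10.0
sigma = 2.0  # assumed sensor standard deviation
# each measurement is a draw from X ~ N(true_position, sigma**2)
measurements = [random.gauss(true_position, sigma) for _ in range(1000)]
mean = sum(measurements) / len(measurements)
print(mean)  # close to the true position for many samples
```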
Example for understanding
Terrible!
What is different?
Multiplying the two Gaussians (the prediction and the measurement) is where all the magic happens.
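Concretely, the product of two Gaussian densities is (up to normalization) another Gaussian: the mean is a variance-weighted average, and the variance is smaller than either input. A sketch with illustrative numbers:

```python
def gaussian_multiply(mu1, var1, mu2, var2):
    # product of two Gaussian densities: the new mean is pulled
    # toward the more certain input, and the variance shrinks
    mean = (var2 * mu1 + var1 * mu2) / (var1 + var2)
    var = (var1 * var2) / (var1 + var2)
    return mean, var

# prediction N(10, 4) combined with measurement N(12, 1)
mean, var = gaussian_multiply(10.0, 4.0, 12.0, 1.0)
print(mean, var)
```

The result sits closer to the measurement (the tighter Gaussian), and the combined variance is smaller than both inputs: this is exactly why fusing a prediction with a measurement reduces uncertainty.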
Velocity is an unobserved (hidden) variable
Prediction step
Update step
Legend
x - state
P - uncertainty covariance
Q - process uncertainty
u - motion vector
B - control transition matrix
F - state transition matrix
H - measurement function
R - measurement uncertainty
S - innovation covariance (system uncertainty mapped into measurement space)
K - Kalman gain
Prediction step
Predicted (a priori) state estimate
Predicted (a priori) estimate covariance
Update step
Innovation or measurement residual
Innovation (or residual) covariance
Map system uncertainty into optimal Kalman gain
Updated (a posteriori) state estimate
Updated (a posteriori) estimate covariance
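The two steps above can be written out with NumPy for a constant-velocity model where only position is measured (control input u is omitted; all matrix values and readings are illustrative):

```python
import numpy as np

dt = 1.0
x = np.array([[0.0], [0.0]])           # state: [position, velocity]
P = np.eye(2) * 500.0                  # uncertainty covariance (broad init)
F = np.array([[1.0, dt], [0.0, 1.0]])  # state transition matrix
H = np.array([[1.0, 0.0]])             # measurement function: position only
R = np.array([[5.0]])                  # measurement uncertainty
Q = np.eye(2) * 0.01                   # process uncertainty

for z in [1.0, 2.1, 2.9, 4.2, 5.1]:    # hypothetical position readings
    # prediction step
    x = F @ x                          # predicted (a priori) state estimate
    P = F @ P @ F.T + Q                # predicted (a priori) estimate covariance
    # update step
    y = np.array([[z]]) - H @ x        # innovation or measurement residual
    S = H @ P @ H.T + R                # innovation (or residual) covariance
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    x = x + K @ y                      # updated (a posteriori) state estimate
    P = (np.eye(2) - K @ H) @ P        # updated (a posteriori) estimate covariance

print(x.ravel())  # velocity was never measured, yet it gets estimated
```

Note how the filter recovers a velocity of roughly 1 per step purely from the sequence of position measurements: this is the unobserved-variable point made earlier.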
predict the next value for x
adjust covariance for x for uncertainty caused by prediction
get measurement for x
compute residual as: "measurement - x_prediction"
compute kalman gain based on noise levels
compute new position as "x_prediction + residual * kalman gain"
compute covariance for x to account for additional information the measurement provides
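The steps above, in scalar (1D) form, with a static motion model and assumed noise values chosen only for illustration:

```python
def kalman_1d(measurements, x0, var0, process_var, meas_var):
    x, var = x0, var0
    estimates = []
    for z in measurements:
        # predict the next value for x (static model: x is unchanged)
        x_prediction = x
        # adjust covariance for the uncertainty the prediction adds
        var = var + process_var
        # compute residual between the measurement and the prediction
        residual = z - x_prediction
        # compute Kalman gain from the relative noise levels
        gain = var / (var + meas_var)
        # compute the new estimate by moving along the residual
        x = x_prediction + gain * residual
        # shrink covariance: the measurement added information
        var = (1 - gain) * var
        estimates.append(x)
    return estimates

est = kalman_1d([1.2, 0.8, 1.1, 0.9],
                x0=0.0, var0=100.0, process_var=0.01, meas_var=0.25)
print(est)
```

With a deliberately bad init (x0=0, huge var0), the first gain is close to 1, so the estimate snaps to the first measurement and then settles near the true value.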
1D vs 2D
Process noise
Measurement noise and bad init
Progress
Animated progress
Disadvantages:
Advantages: