Extended Kalman Filter based SLAM for orientation

In summary, the original poster is implementing an Extended Kalman Filter based SLAM for orientation and angular rates, having discarded the estimation of position and linear velocities to simplify the problem. They use a quaternion-based approach with translation vectors of 3D points/landmarks as measurements. However, the filter does not behave as expected: in certain orientation ranges the innovation is the inverse of the correct one. The problem appears to lie in the correction part of the filter, and the poster asks for help, providing the formulas and Matlab code they are using. They later realize that the derivative of the conjugation step was missing from the measurement Jacobian; after correcting this, everything works as intended.
  • #1
slamo
Dear all,

I'm trying to implement an Extended Kalman Filter based SLAM for orientation and angular rates.
To simplify things I have discarded the estimation of position and linear velocities. I'm using a quaternion-based approach with translation vectors of 3D points/landmarks as measurements.

For tests I'm only using one landmark/measurement. Unfortunately the filter is not behaving as expected. If the orientation state (rotation only around the z-axis) is within the range 90°–270° or multiples thereof, everything works just fine. If the orientation is in the other 180° range, I get an innovation inverse to the correct one (the predicted state rotates in the wrong direction). The problem seems to be in the correction part of the filter; I'm pretty sure the prediction part has nothing to do with it. The innovation vectors themselves are also correct.
Changing the direction of the simulated rotation or moving the landmark to the other side of the robot makes no difference.

Here are some of the formulas and Matlab code I use:

Code:
% Predict measurement -> measurement model h
%
% Landmark translation in world coordinates (feature position in the state)
dx = x(fpos);
dy = x(fpos+1);
dz = x(fpos+2);
%
% Rotation matrix from the conjugate of the orientation quaternion
R = quat2RotMat( quatConjungate(x(1:4)) );
%
% Predicted measurement: landmark position rotated into the sensor frame
zp = R * [dx; dy; dz];


% Calculate H_clm for this landmark ( H_c 0 ... 0 H_lm 0 ... 0 )
% -> h differentiated with respect to the state
%
qr = x(1);
qx = -x(2);
qy = -x(3);
qz = -x(4);
%
%
% dht_dqc
%
dht_dqrc = 2* [ qr -qz  qy;
                qz  qr  -qx;
               -qy  qx  qr ] * [dx; dy; dz];
%
dht_dqxc = 2* [ qx  qy  qz;
                qy -qx  -qr;
                qz  qr  -qx ] * [dx; dy; dz];
%
dht_dqyc = 2* [-qy  qx  qr;
                qx  qy  qz;
               -qr  qz  -qy ] * [dx; dy; dz];
%          
dht_dqzc = 2* [-qz -qr  qx;
                qr -qz  qy;
                qx  qy  qz ] * [dx; dy; dz];
%
% dht_dyi
%
dht_dyi = R;

% Part of the camera Hc
%
Hcy(:,1:7) = [dht_dqrc dht_dqxc dht_dqyc dht_dqzc zeros(3)];

% Part of the landmark Hy
%
Hcy(:,fpos:fpos+Nxf-1) = dht_dyi;

with

Code:
function R = quat2RotMat(q_in)
% Rotation matrix from a unit quaternion q_in = [qr; qx; qy; qz] (scalar first)

qzqz = q_in(4)^2;
qyqy = q_in(3)^2;
qxqx = q_in(2)^2;
qrqr = q_in(1)^2;
qrqz = q_in(1)*q_in(4);
qxqy = q_in(2)*q_in(3);
qrqy = q_in(1)*q_in(3);
qxqz = q_in(2)*q_in(4);
qyqz = q_in(3)*q_in(4);
qrqx = q_in(1)*q_in(2);

R = [ qrqr+qxqx-qyqy-qzqz -2*qrqz+2*qxqy      2*qrqy+2*qxqz;
      2*qxqy+2*qrqz       qrqr-qxqx+qyqy-qzqz -2*qrqx+2*qyqz;
      -2*qrqy+2*qxqz      2*qyqz+2*qrqx       qrqr-qxqx-qyqy+qzqz ];
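
For reference, here is a quick sanity check of quat2RotMat against a known rotation (a minimal sketch; the 90° rotation about the z-axis and the tolerance are arbitrary test choices):

Code:
% Quaternion for a 90° rotation about the z-axis, scalar first
theta = pi/2;
q = [cos(theta/2); 0; 0; sin(theta/2)];
R = quat2RotMat(q);
% Rotating the x-axis by 90° about z should give the y-axis
assert( norm(R * [1; 0; 0] - [0; 1; 0]) < 1e-12 );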

The EKF-update is:

Code:
% Filter gain (note: R here is the measurement-noise covariance,
% not the rotation matrix from the measurement model above)
K = PX * H' / (H*PX*H' + R);

% Updated state and covariance
XX = XX + K * v;
PX = (eye(size(PX,1)) - K*H) * PX;

% Normalize the quaternion and propagate the normalization
% Jacobian through the covariance
XX(1:4) = XX(1:4) / norm(XX(1:4));
Jnorm = eye(size(XX,1));
Jnorm(1:4,1:4) = normJac(XX(1:4));
PX = Jnorm * PX * Jnorm';
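
As an aside, if numerical issues are ever suspected in the covariance update, the symmetric Joseph form is a standard alternative (a generic sketch, not part of the original code; R is again the measurement-noise covariance):

Code:
% Joseph-form covariance update: stays symmetric and positive
% semi-definite for any gain K, at the cost of extra multiplications
I = eye(size(PX,1));
PX = (I - K*H) * PX * (I - K*H)' + K * R * K';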


Is there something I'm missing? I'm really stuck on this problem, so any help would be welcome.
 
  • #2
Ok, I found it:

Code:
% dht_dqc
%
dht_dqc = zeros(3,4);
%
dht_dqrc = 2* [ qr -qz  qy;
                qz  qr  -qx;
               -qy  qx  qr ] * [dx; dy; dz];
%
dht_dqxc = 2* [ qx  qy  qz;
                qy -qx  -qr;
                qz  qr  -qx ] * [dx; dy; dz];
%
dht_dqyc = 2* [-qy  qx  qr;
                qx  qy  qz;
               -qr  qz  -qy ] * [dx; dy; dz];
%          
dht_dqzc = 2* [-qz -qr  qx;
                qr -qz  qy;
                qx  qy  qz ] * [dx; dy; dz];
%
dht_dqc = [dht_dqrc dht_dqxc dht_dqyc dht_dqzc] * diag([1 -1 -1 -1]);
%
%
% dht_dyi
%
dht_dyi = Rcw;

% Part of the camera Hc
%
Hcy(:,1:13) = [dht_dtc dht_dqc zeros(3) zeros(3)];

The derivative of the conjugation step was missing. Now everything works fine :)
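
To spell out why the diag([1 -1 -1 -1]) factor appears: the measurement model is h(q) = R(q*)·t, so by the chain rule dh/dq = (dh/dq*)·(dq*/dq), and since q* = (qr, -qx, -qy, -qz), the inner Jacobian dq*/dq is exactly diag(1, -1, -1, -1). A finite-difference check would have caught the missing factor (a minimal sketch using quat2RotMat from post #1; the test quaternion and landmark position are arbitrary):

Code:
% Finite-difference check of dh/dq for h(q) = R(conj(q)) * t
q = [0.8; 0.2; -0.3; 0.4];  q = q / norm(q);  % arbitrary unit quaternion
t = [1; 2; 3];                                 % arbitrary landmark position
conj_q = @(p) [p(1); -p(2:4)];
h = @(p) quat2RotMat(conj_q(p)) * t;
%
step = 1e-7;
J_num = zeros(3,4);
for i = 1:4
    dq = zeros(4,1);  dq(i) = step;
    J_num(:,i) = (h(q + dq) - h(q - dq)) / (2*step);
end
J_num  % column signs should match the corrected analytic dht_dqc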
 

FAQ: Extended Kalman Filter based SLAM for orientation

What is Extended Kalman Filter based SLAM for orientation?

Extended Kalman Filter based SLAM for orientation is a method used in robotics and autonomous systems to simultaneously map the environment and estimate the orientation or pose of the robot within that environment. It combines the Extended Kalman Filter algorithm, which is a recursive estimation algorithm, with Simultaneous Localization and Mapping (SLAM) techniques to provide real-time updates of the robot's position and orientation.

How does Extended Kalman Filter based SLAM for orientation work?

Extended Kalman Filter based SLAM for orientation works by using sensor measurements, such as odometry and inertial measurements, to estimate the robot's pose. These measurements are then combined with a model of the environment to create a map. The Extended Kalman Filter algorithm uses this information to update the robot's pose estimate, taking into account any uncertainty in the sensor measurements and the environment model.
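
For a rough illustration of that predict/update cycle, here is one generic textbook EKF iteration (not the original poster's filter; f, h, F, H, Q, and Rm are placeholder models, Jacobians, and noise covariances):

Code:
% One generic EKF cycle: predict with the motion model,
% then correct with a measurement
%
% Prediction
x = f(x, u);                 % nonlinear motion model
P = F * P * F' + Q;          % F = Jacobian of f, Q = process noise
%
% Update
v = z - h(x);                % innovation (measurement residual)
S = H * P * H' + Rm;         % H = Jacobian of h, Rm = measurement noise
K = P * H' / S;              % Kalman gain
x = x + K * v;
P = (eye(size(P,1)) - K*H) * P;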

What are the benefits of using Extended Kalman Filter based SLAM for orientation?

Extended Kalman Filter based SLAM for orientation has several benefits, including real-time updates of the robot's position and orientation, the ability to handle non-linear sensor measurements, and the ability to handle uncertainties in the sensor measurements and environment model. It is also a popular choice for SLAM in robotics due to its efficiency and accuracy.

What are some applications of Extended Kalman Filter based SLAM for orientation?

Extended Kalman Filter based SLAM for orientation has a wide range of applications, including autonomous vehicles, mobile robots, and drones. It is also used in augmented reality and virtual reality systems to track the position and orientation of users in real-time. Additionally, it is used in mapping and exploration tasks, such as creating maps of unknown environments or inspecting hazardous areas.

How accurate is Extended Kalman Filter based SLAM for orientation?

The accuracy of Extended Kalman Filter based SLAM for orientation depends on various factors, such as the quality of sensor measurements, the environment model, and the presence of external disturbances. In general, it can provide accurate estimates of the robot's position and orientation, with errors typically in the range of a few centimeters or degrees. However, the accuracy may decrease in challenging environments with limited sensor information or in the presence of strong external disturbances.
