# Multiple sensor fusion for mobile robot localization and navigation using the Extended Kalman Filter
https://ieeexplore.ieee.org/document/7373480
## Introduction
Three different approaches were used to localize a mobile robot with an Extended Kalman Filter (EKF). The robot carries four sensors:
* Encoder
* Compass
* IMU
* GPS
An input-output state feedback linearization (I-O SFL) method was used to control the robot along the desired trajectory.
Two types of positioning measurements were distinguished:
* Relative Position Measurements - high update rate but subject to drifting bias
* Absolute Position Measurements - accurate but low update rate
Fusing both types of positioning sensors provides high-accuracy estimates at a high update rate.
The three approaches used in the filter's measurement equation are:
* encoder, compass and GPS receiver fusion
* gyroscope, accelerometer and GPS fusion
* all sensors
## Kinematic Model of Differential Drive
Three variables $x$, $y$ and $\theta$ specify the pose (position and heading) of the robot.
$\alpha_g = \begin{bmatrix} x \\ y \\ \theta \end{bmatrix}$
$R(\theta) = \begin{bmatrix} \cos\theta & -\sin\theta & 0 \\ \sin\theta & \cos\theta & 0 \\ 0 & 0 & 1 \end{bmatrix}$
$\alpha_r = R(\theta)\,\alpha_g$
$\dot\alpha_g = \begin{bmatrix} V\cos\theta \\ V\sin\theta \\ \omega \end{bmatrix}$
The robot has two inputs: the linear velocity $V$ and the angular velocity $\omega$.

The state vector of the robot is given as
$\beta = \begin{bmatrix} x \\ y \\ \theta \\ V \\ \omega \end{bmatrix}$
The kinematic model will be
$\dot\beta = \begin{bmatrix} \dot x \\ \dot y \\ \dot\theta \\ \dot V \\ \dot\omega \end{bmatrix} = \begin{bmatrix} V\cos\theta \\ V\sin\theta \\ \omega \\ 0 \\ 0 \end{bmatrix}$
The model is discretized with sampling time $\Delta t$ for use in the filter.
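A first-order Euler discretization of the kinematic model gives (assuming the inputs are held constant over each sampling interval; the paper's exact form may differ):
$\beta_k = \beta_{k-1} + \Delta t \begin{bmatrix} V_{k-1}\cos\theta_{k-1} \\ V_{k-1}\sin\theta_{k-1} \\ \omega_{k-1} \\ 0 \\ 0 \end{bmatrix}$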

## Extended Kalman Filter
The motion and measurement models are given as
$\beta_k$ = $f(\beta_{k-1}) + w_{k-1}$
$z_k$ = $h(\beta_k) +v_k$
The linearized form follows the standard EKF structure.
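A sketch of the Jacobians and the predict/update recursion under this linearization (standard EKF notation; the paper's symbols may differ):
$F_{k-1} = \frac{\partial f}{\partial \beta}\Big|_{\hat\beta_{k-1}}, \quad H_k = \frac{\partial h}{\partial \beta}\Big|_{\hat\beta_{k|k-1}}$
Prediction:
$\hat\beta_{k|k-1} = f(\hat\beta_{k-1}), \quad P_{k|k-1} = F_{k-1} P_{k-1} F_{k-1}^T + Q_{k-1}$
Update:
$K_k = P_{k|k-1} H_k^T \left(H_k P_{k|k-1} H_k^T + R_k\right)^{-1}$
$\hat\beta_k = \hat\beta_{k|k-1} + K_k \left(z_k - h(\hat\beta_{k|k-1})\right)$
$P_k = (I - K_k H_k) P_{k|k-1}$
Here $Q$ and $R$ are the process and measurement noise covariances.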



Measurement matrices were calculated for the different sensors. Each sensor provides the following measurements:
* GPS - $x_k,y_k$
* Odometry - $V_r,V_l$
* Compass - heading $\theta$
* IMU - $\omega,\theta, V$
From these measurement models, the Jacobians were calculated and the EKF was run for localization.
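As an illustration, here is a minimal Python sketch of one EKF cycle for the first approach (encoder + compass + GPS), assuming Euler discretization, a linear measurement model, and user-supplied noise covariances `Q` and `R`; this is a sketch of the technique, not the paper's implementation:

```python
import numpy as np

def f(beta, dt):
    """Discrete motion model for the state beta = [x, y, theta, V, omega]."""
    x, y, theta, V, omega = beta
    return np.array([x + dt * V * np.cos(theta),
                     y + dt * V * np.sin(theta),
                     theta + dt * omega,
                     V,
                     omega])

def F_jacobian(beta, dt):
    """Jacobian of f with respect to the state, evaluated at beta."""
    _, _, theta, V, _ = beta
    F = np.eye(5)
    F[0, 2] = -dt * V * np.sin(theta)
    F[0, 3] =  dt * np.cos(theta)
    F[1, 2] =  dt * V * np.cos(theta)
    F[1, 3] =  dt * np.sin(theta)
    F[2, 4] =  dt
    return F

# Measurement model for approach 1: GPS gives (x, y), the compass gives theta,
# and the encoders give V and omega (e.g. V = (V_r + V_l)/2,
# omega = (V_r - V_l)/track_width -- an assumption about the robot geometry).
# The resulting measurement is linear in the state, so H is constant.
H = np.eye(5)

def ekf_step(beta, P, z, Q, R, dt):
    """One EKF predict/update cycle."""
    # Predict
    beta_pred = f(beta, dt)
    F = F_jacobian(beta, dt)
    P_pred = F @ P @ F.T + Q

    # Update
    innovation = z - H @ beta_pred
    S = H @ P_pred @ H.T + R                 # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
    beta_new = beta_pred + K @ innovation
    P_new = (np.eye(5) - K @ H) @ P_pred
    return beta_new, P_new

# Example usage: beta, P = ekf_step(beta, P, z, Q, R, dt=0.1)
# called each time a new set of measurements z arrives.
```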
## Kinematic Controller
The input-output state feedback linearization was implemented by taking a point $B$ off the wheel axle of the mobile robot, at a distance $b$ along the heading direction, as the output (reference point) of the system. The position of $B$ is:
$x_B = x + b\cos\theta$
$y_B = y + b\sin\theta$

The control law for a desired trajectory $(x_{des}, y_{des})$ is given as
$\dot x_B$ = $\dot x_{des} +k_1e_x$
$\dot y_B$ = $\dot y_{des} +k_2e_y$
where $e_x = x_{des} - x_B$ and $e_y = y_{des} - y_B$ are the tracking errors between the desired and current positions of point $B$, and $k_1, k_2$ are positive gains.
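The commanded velocities of $B$ are then mapped back to the robot inputs by inverting the point-$B$ kinematics $\dot x_B = V\cos\theta - b\,\omega\sin\theta$, $\dot y_B = V\sin\theta + b\,\omega\cos\theta$ (this inversion is the standard I-O SFL step; the paper's notation may differ):
$\begin{bmatrix} V \\ \omega \end{bmatrix} = \begin{bmatrix} \cos\theta & \sin\theta \\ -\frac{\sin\theta}{b} & \frac{\cos\theta}{b} \end{bmatrix} \begin{bmatrix} \dot x_B \\ \dot y_B \end{bmatrix}$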
The whole system forms a closed loop: the EKF fuses the sensor measurements into a state estimate, and the I-O SFL controller uses that estimate to track the desired trajectory.
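A minimal Python sketch of this control law, assuming the pose estimate comes from the EKF above and that `b`, `k1`, `k2` are user-chosen constants (hypothetical values):

```python
import numpy as np

def io_sfl_control(est_pose, des_pos, des_vel, b=0.1, k1=1.0, k2=1.0):
    """I-O SFL control: returns (V, omega) that drive point B toward the
    desired trajectory. est_pose = (x, y, theta) from the EKF estimate."""
    x, y, theta = est_pose
    x_des, y_des = des_pos
    xd_des, yd_des = des_vel              # desired trajectory velocity

    # Position of the reference point B ahead of the wheel axle
    x_B = x + b * np.cos(theta)
    y_B = y + b * np.sin(theta)

    # Tracking errors and commanded velocity of B
    e_x, e_y = x_des - x_B, y_des - y_B
    xd_B = xd_des + k1 * e_x
    yd_B = yd_des + k2 * e_y

    # Invert the point-B kinematics to recover the robot inputs
    V     =   np.cos(theta) * xd_B + np.sin(theta) * yd_B
    omega = (-np.sin(theta) * xd_B + np.cos(theta) * yd_B) / b
    return V, omega
```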

## Results

Approach 3 gives the best result of the three because it uses all of the sensor data.
For the circular path, an encoder fault was introduced after half of the simulation time. Even in this case, approach 3 performed best because the remaining sensors continued to operate properly.
