# State Estimation using Force and Vision
[Back to the homepage](https://hackmd.io/jUNv2qU_TKmuVxHbMwMeWA?both)
### **Sensor Fusion and Probabilistic Methods in Contact Detection for Robot Assembly**
In robot assembly, **accurate contact detection** is critical for ensuring **precise positioning, force-controlled assembly, and adaptability to variations** in object alignment and surface properties. Unlike quadruped robots that switch between swing and stance phases, an **assembly robot switches between free motion and constrained contact** with its environment. To **reliably detect contact**, robots integrate **contact force measurements and vision-based localization** using **sensor fusion and probabilistic inference**.
---
## **1. Free Motion and Contact Phase Detection**
A robotic manipulator or end-effector alternates between two key phases:
- **Free motion phase**: The tool or gripper moves freely without external constraints.
- **Contact phase**: The tool makes contact with an object or surface, requiring force control.
### **Challenges in Contact Detection:**
1. **Uncertain contact conditions**: The force required for contact varies based on material properties.
2. **Sensor noise**: Force-torque sensors and vision systems have inherent noise and drift.
3. **Slippage and misalignment**: Contact may be unstable due to misalignment, surface roughness, or lubrication.
4. **External disturbances**: Unexpected forces (e.g., vibrations, mispositioning) can influence detection.
---
## **2. Sensor Fusion for Contact Detection**
To accurately detect contact, robots integrate multiple sensors:
### **(1) Force-Torque Sensors (FTS)**
- Measure **contact forces and torques** at the end-effector.
- If the measured **force $F_c > F_{\text{threshold}}$**, the robot is likely in **contact**.
- Useful, but **may drift or have low sensitivity to small forces**.
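A bare threshold on $F_c$ chatters when the force hovers near $F_{\text{threshold}}$. A common fix is hysteresis: require a higher force to enter contact than to leave it. The sketch below illustrates this in Python; the thresholds (10 N on, 6 N off) are illustrative values, not from this note:

```python
def update_contact(in_contact, force, f_on=10.0, f_off=6.0):
    """Hysteresis threshold on measured force (N).
    Enter contact only above f_on; leave contact only below f_off."""
    if in_contact:
        return force > f_off   # stay in contact until force clearly drops
    return force > f_on        # require a clearly higher force to enter

# A force trace hovering near the threshold does not chatter:
states = []
state = False
for f in [2.0, 11.0, 9.0, 7.0, 5.0, 11.0]:
    state = update_contact(state, f)
    states.append(state)
# states: [False, True, True, True, False, True]
```

The readings 9 N and 7 N stay classified as contact because the exit threshold is lower than the entry threshold.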
### **(2) Vision-Based Localization**
- **Stereo cameras, depth sensors, or LiDAR** estimate object pose and contact location.
- Tracks the **position and orientation** of the tool relative to the assembly object.
- Helps anticipate **contact regions and predict required force adjustments**.
### **(3) Joint Encoders & Motor Current Feedback**
- Measure **joint angles and motor torques**.
- Detect whether the robot is exerting resistance against an obstacle (**contact**) or moving freely.
- **Higher torque → likely contact phase**.
### **(4) Acoustic or Tactile Sensors (Optional)**
- **Microphones or contact microphones** detect impact sounds.
- **Tactile arrays** provide additional surface feedback.
### **Sensor Fusion Process:**
By fusing these measurements, the system improves contact detection accuracy:
1. **Extended Kalman Filter (EKF)**
- Fuses **force sensor + vision + encoders** to estimate contact state.
2. **Bayesian Inference**
- Uses probability models to infer contact likelihood based on noisy data.
3. **Machine Learning / Neural Networks**
- Uses **historical force and vision data** to predict contact transitions.
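For the fusion step itself, a useful minimal picture is how two noisy estimates of the same quantity combine. For a static scalar state, a Kalman measurement update reduces to inverse-variance weighting; the sketch below fuses a vision-based and an encoder-based estimate of tool position (the numbers are illustrative):

```python
def fuse(x_vision, var_vision, x_encoder, var_encoder):
    """Inverse-variance weighted fusion of two scalar estimates.
    Returns the fused estimate and its (reduced) variance."""
    w_v = 1.0 / var_vision
    w_e = 1.0 / var_encoder
    x = (w_v * x_vision + w_e * x_encoder) / (w_v + w_e)
    var = 1.0 / (w_v + w_e)
    return x, var

# Vision reads 10.2 mm (noisy, var 4.0); encoders read 10.0 mm (precise, var 1.0):
x, var = fuse(10.2, 4.0, 10.0, 1.0)
# Fused estimate is pulled toward the more reliable sensor: x = 10.04, var = 0.8
```

Note the fused variance (0.8) is smaller than either input variance, which is why fusing force, vision, and encoder data improves detection accuracy over any single sensor.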
---
## **3. Probabilistic Models for Contact Phase Switching**
Since sensor data is noisy and contact conditions vary, **probabilistic models** help infer whether the robot is in free motion or contact.
### **(1) Hidden Markov Model (HMM) for Contact Detection**
- The **free/contact phase** is treated as a **hidden state** $S_t$.
- Observations $O_t$ come from fused sensor data.
- The transition probability $P(S_t | S_{t-1})$ depends on:
- Force readings.
- Vision-based position error.
- Joint torque.
**HMM State Transition Matrix Example**:
$$P(S_t | S_{t-1}) =
\begin{bmatrix}
P(\text{contact}|\text{contact}) & P(\text{free}|\text{contact}) \\
P(\text{contact}|\text{free}) & P(\text{free}|\text{free})
\end{bmatrix}$$
Each row conditions on the previous state $S_{t-1}$, so each row sums to 1.
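This transition matrix drives the HMM forward (filtering) recursion: predict the state with the transition model, then correct with the observation likelihoods. A minimal two-state sketch, with illustrative placeholder numbers:

```python
def hmm_step(p_contact, trans, lik_contact, lik_free):
    """One forward step of a two-state HMM.
    p_contact: prior belief of being in contact.
    trans[i][j] = P(next=j | current=i), with states 0=contact, 1=free.
    lik_*: observation likelihoods P(O_t | state) from the fused sensors."""
    p_free = 1.0 - p_contact
    # Predict using the transition model
    pred_contact = p_contact * trans[0][0] + p_free * trans[1][0]
    pred_free    = p_contact * trans[0][1] + p_free * trans[1][1]
    # Correct with observation likelihoods and normalize
    post_contact = pred_contact * lik_contact
    post_free    = pred_free * lik_free
    return post_contact / (post_contact + post_free)

# "Sticky" transitions (phases persist), observation strongly favoring contact:
trans = [[0.9, 0.1], [0.2, 0.8]]
belief = hmm_step(0.5, trans, lik_contact=0.8, lik_free=0.1)
# belief rises from 0.5 to about 0.91
```

Because the transition model makes phases sticky, single noisy observations cannot flip the state; the belief moves smoothly toward contact as evidence accumulates.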
### **(2) Gaussian Mixture Models (GMM)**
- **Clusters force-vision data** into **contact** and **free-motion** phases.
- Uses **likelihood estimation** to classify the current phase.
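At run time, a fitted GMM classifies a new measurement via its component responsibilities. A minimal sketch on a 1-D force feature, assuming two Gaussian components fitted offline (all weights, means, and standard deviations below are illustrative):

```python
import math

def gauss(x, mu, sigma):
    """Gaussian pdf, used as the per-component likelihood."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def gmm_responsibility(force, p):
    """Posterior probability (responsibility) of the 'contact' component,
    given mixture weights w_*, means mu_*, and std devs sig_*."""
    lc = p["w_contact"] * gauss(force, p["mu_contact"], p["sig_contact"])
    lf = p["w_free"] * gauss(force, p["mu_free"], p["sig_free"])
    return lc / (lc + lf)

params = dict(w_contact=0.4, mu_contact=50.0, sig_contact=10.0,
              w_free=0.6, mu_free=5.0, sig_free=2.0)
r = gmm_responsibility(40.0, params)  # 40 N sits far inside the contact cluster
```

In practice the parameters come from fitting the mixture (e.g. by EM) to logged force-vision data, and the feature would be a vector rather than a scalar; the classification step is the same.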
### **(3) Reinforcement Learning for Contact Adaptation**
- **State $s_t$ → Sensor measurements.**
- **Action $a_t$ → Adjust force control strategy.**
- The model **learns optimal contact force profiles** for various assembly tasks.
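One concrete way to realize this learning loop is tabular Q-learning over discretized sensor states and force-adjustment actions. The sketch below shows a single Q-update; the states, actions, reward, and hyperparameters are all illustrative assumptions, not from this note:

```python
def q_update(Q, s, a, reward, s_next, alpha=0.1, gamma=0.9):
    """One tabular Q-learning update: move Q[s][a] toward the TD target."""
    best_next = max(Q[s_next])
    Q[s][a] += alpha * (reward + gamma * best_next - Q[s][a])
    return Q

# Two discretized sensor states (0=free, 1=contact),
# two actions (0=keep commanded force, 1=reduce commanded force).
Q = [[0.0, 0.0], [0.0, 0.0]]
# While in contact, reducing force avoided a jam and earned reward 1.0:
Q = q_update(Q, s=1, a=1, reward=1.0, s_next=1)
# Q[1][1] is now 0.1: the policy starts preferring force reduction in contact
```

Repeated over many insertion attempts, updates like this shape a force profile per task; real systems typically use continuous-action methods, but the update structure is the same.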
---
## **4. Example: Bayesian Decision Rule for Contact Detection**
To determine whether the robot is in contact, we compute:
$$P(\text{contact} | O_t) = \frac{P(O_t | \text{contact}) P(\text{contact})}{P(O_t)}$$
Where:
- $P(O_t | \text{contact})$ is the **likelihood** of observing the sensor data given contact.
- $P(\text{contact})$ is the **prior probability** of contact (e.g., during insertion, contact probability is high).
- $P(O_t)$ is the **normalization factor**.
**Decision Rule:**
- If $P(\text{contact} | O_t) > 0.5$, switch to **contact mode**.
- Else, remain in **free motion**.
---
## **5. Implementation in MATLAB**
Here’s a **MATLAB example** applying the **Bayesian decision rule above with Gaussian likelihood models** (a naive-Bayes classifier over the fused sensor channels):
```matlab
% Example usage (run as a script; the local function is defined below)
Force = 55; VisionError = 0.4; Torque = 9;
contact_state = contact_detection(Force, VisionError, Torque);
disp(['Predicted state: ', contact_state]);

function contact_state = contact_detection(Force, VisionError, Torque)
    % Prior probabilities of each phase
    P_contact = 0.7;
    P_free    = 0.3;

    % Gaussian models for each sensor channel in each phase.
    % These means and standard deviations are typically estimated from experiments.
    mu_Force_contact  = 50;  sigma_Force_contact  = 10;
    mu_Force_free     = 5;   sigma_Force_free     = 2;
    mu_Vision_contact = 0.5; sigma_Vision_contact = 0.2;
    mu_Vision_free    = 2.0; sigma_Vision_free    = 0.5;
    mu_Torque_contact = 8;   sigma_Torque_contact = 3;
    mu_Torque_free    = 2;   sigma_Torque_free    = 1;

    % Likelihoods P(O_t | state), assuming independent sensor channels
    L_contact = normpdf(Force, mu_Force_contact, sigma_Force_contact) * ...
                normpdf(VisionError, mu_Vision_contact, sigma_Vision_contact) * ...
                normpdf(Torque, mu_Torque_contact, sigma_Torque_contact);
    L_free    = normpdf(Force, mu_Force_free, sigma_Force_free) * ...
                normpdf(VisionError, mu_Vision_free, sigma_Vision_free) * ...
                normpdf(Torque, mu_Torque_free, sigma_Torque_free);

    % Posterior probabilities (Bayes' rule; the priors belong in the evidence term)
    evidence = L_contact * P_contact + L_free * P_free;
    P_contact_given_O = (L_contact * P_contact) / evidence;
    P_free_given_O    = (L_free * P_free) / evidence;

    % Decision rule
    if P_contact_given_O > P_free_given_O
        contact_state = 'Contact';
    else
        contact_state = 'Free Motion';
    end
end
```
---
## **6. Summary**
✅ **Sensor Fusion:** Combines **force sensors, vision, and torque feedback** for accurate contact detection.
✅ **Probability Models:** Uses **HMM, Bayesian Inference, and GMM** to infer contact likelihood.
✅ **Adaptive Control:** Machine Learning or Reinforcement Learning adjusts force during assembly.
✅ **Practical Use:** Implemented in **ABB YuMi, KUKA iiwa, and Universal Robots** for high-precision tasks.
With **sensor fusion and probabilistic models**, robotic assembly systems achieve **stable, precise, and efficient contact detection**, crucial for automation in manufacturing and assembly. 🚀🔧