---
title: 'OCMR documentation'
disqus: hackmd
---
# **OCMR - Open Source Cost-Effective Mobile Robot Platform**
[TOC]
---
### **Index**
1. Introduction
2. Motivation
3. Parts Used
4. Theory
5. Methodology
6. Experimentation
7. Software Stack
8. Documents and Reference Links
---
### **Introduction**:
#### v1.0 ( Current Version )
- A modifiable robot chassis (3D printed, cardboard, PVC foam sheet, metal, etc.)
- Tools to help you select and buy parts for your project
- Make your own **LIDAR from scratch**.
- Connect everything together and power up using ROS.
- Teleop using a **PS3 joystick**
- **SLAM capabilities**
#### v2.0 ( Upcoming )
- An open robotic arm software package to integrate your own arm
- An MPU for improving localization
- An upgraded LIDAR
- A camera for running vision algorithms
#### What can you do with this project?
Build AMRs (autonomous mobile robots), experiment with algorithms, learn how real robots work, add your own sensors, contribute to this project so others can learn, or make your own version of this platform.
---
### **Motivation**:
I built this robot because I wanted to see a real-world implementation of mobile robots. When I started searching for robot parts, however, they turned out to be too costly, so I decided to make some of them, such as the LIDAR and the chassis, from scratch.
---
### **Parts Used**:
1. VL53L0X Time of Flight Sensor
2. 12 V DC Motor with Quadrature Encoder
3. Arduino Mega
4. Arduino Nano
5. SparkFun Monster Moto Shield (dual channel)
6. Servo Motor or Stepper Motor
7. Cardboard or 3D Printed Robot chassis
8. Mechanical Parts (Screws, Clamps, etc.)
9. Wires
10. Wheels
11. Raspberry Pi or Jetson Nano (Optional)
12. PS3 Controller (Optional)
13. Li-ion Battery
---
### **Theory**:
#### 1. LIDAR
LIDAR stands for **Light Detection and Ranging**, also referred to as laser scanning. LIDAR data represents the distance from the sensor to the objects in its environment: it is simply distance data associated with where each object lies with respect to the sensor.
**How does LIDAR work?**
A typical LIDAR sensor emits pulsed light waves from a laser into the environment. These pulses bounce off surrounding objects and return to the sensor, which uses the time each pulse took to come back to calculate the distance it traveled. Repeating this process over time creates a virtual map of the environment, which an onboard computer can use for navigation.
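Concretely, for a time-of-flight sensor such as the VL53L0X, the distance follows directly from the round-trip time of the light pulse:

$$
d = \frac{c \, \Delta t}{2}
$$

where $c$ is the speed of light and $\Delta t$ is the measured round-trip time; the factor of 2 accounts for the pulse travelling to the object and back.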

We use this principle by measuring ranges at positions from 0 to 180 degrees in uniform steps, emulating the sweep of a LIDAR.
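To see why a swept range sensor behaves like a 2D LIDAR, note that each (angle, range) pair projects to a Cartesian point in the sensor frame. Here is a minimal sketch; the function and its names are illustrative, not part of the project code:

```python
import math

def sweep_to_points(ranges, angle_min=0.0, angle_increment_deg=3.0):
    """Project a 0-180 degree servo sweep of range readings (in metres)
    into (x, y) points in the sensor frame."""
    points = []
    for i, r in enumerate(ranges):
        theta = angle_min + math.radians(i * angle_increment_deg)
        points.append((r * math.cos(theta), r * math.sin(theta)))
    return points

# e.g. a 61-reading sweep taken every 3 degrees:
# points = sweep_to_points(readings)
```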
#### 2. Closed Loop Control of Motor
To control the motor we use a **PID controller, with the encoder data as feedback to the PID algorithm**. This method is effective because it is independent of the type of motor or encoder. For more detailed information, refer to one of my projects on GitHub:
https://github.com/JeelChatrola/Control-of-motor-with-ROS
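For reference, here is a minimal sketch of a discrete PID loop of the kind described above; the class and the gain values are illustrative placeholders, not the project's actual implementation:

```python
class PID:
    """Textbook discrete PID controller."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement, dt):
        """One control step: returns the effort to apply to the motor."""
        error = setpoint - measurement
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Each loop iteration compares the commanded wheel velocity with the
# velocity measured from the encoder and drives the motor with the result:
# pid = PID(kp=1.0, ki=0.1, kd=0.01)
# effort = pid.update(target_velocity, encoder_velocity, dt=0.01)
```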
#### 3. Design
For the design of this robot I took inspiration from the **KUKA youBot**, a mobile robotic platform equipped with a robotic arm. I have kept the design such that the robot can be modified to use different drive mechanisms according to the use case, such as:
**1. Differential drive
2. Ackermann drive
3. 4-wheel Mecanum drive**
Inside the robot there are two compartments: one for the motors and battery, and one for the electronics.
Image of Robot.
#### 4. Mobile Robots
I chose to build a mobile robot because it is one of the easiest robots for beginners to make and program. Specifically, it is a **wheeled mobile robot**. The basic functions of a mobile robot include the ability to move and explore, transport payloads, and operate in hazardous environments such as nuclear power plants, where factors like high radiation make the area too dangerous for humans to inspect and monitor themselves. **Mobile robots can be autonomous or non-autonomous**, the latter being controlled externally by humans.
Here is a resource to help you understand the math behind wheeled robots:
http://ais.informatik.uni-freiburg.de/teaching/ss17/robotics/slides/03-locomotion.pdf
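As a small taste of that math, the standard differential-drive kinematics give the robot's linear velocity $v$ and angular velocity $\omega$ from the left and right wheel angular speeds $\omega_l$ and $\omega_r$, the wheel radius $r$, and the wheel separation $L$:

$$
v = \frac{r\,(\omega_r + \omega_l)}{2}, \qquad \omega = \frac{r\,(\omega_r - \omega_l)}{L}
$$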
---
### **Methodology and Problems Faced**:
The first step in building a robot is to buy the required parts.
**1. Motors**
First we need to select the motors for the project. The most important factor is how much weight you want your robot to carry, followed by how fast you want it to go. Here are links to tutorials on this:
https://www.robotshop.com/community/tutorials/series/robot-design-basics
https://www.robotshop.com/community/blog/show/drive-motor-sizing-tool
**2. Battery**
Once you have decided on the number of motors and the exact motor you want to use, the next step is to select a battery for your robot. For this you need to understand the various battery types and what their specifications mean.
{%youtube arOXg7y6r8k%}
https://robu.in/lithium-ion-battery-vs-li-po-battery/
**3. Chassis**
Next is finding a chassis or making one yourself. You can use PVC foam sheet, acrylic sheet, metal sheet, cardboard, etc., based on the application of the robot.
Now you can select the products you wish to use; I have linked the ones I used, and I will provide the design of my robot as an STL file.
Next we will move on to the **software configuration issues**.
The biggest issue **I faced when making the LIDAR is that the buffer size of the Arduino Mega is very small**, so we cannot publish LaserScan messages directly. Instead, I send individual range readings to ROS and convert the incoming range messages into a LaserScan message via a node (see the sketch in the Software Stack section). Another issue I faced was that the TF tree was not in sync, so SLAM initially did not generate a map. To solve this I set the update frequency of the robot state publisher to match the LaserScan update frequency. For more information, refer to the TF tree given below.
---
### **Experimentation**:
#### 1. LIDAR
**Field of View - 0 to 180 degrees
Max Range - 1.2 meters**
This LIDAR only measures from 0 to 180 degrees, since a 0 to 360 degree sweep takes longer to process and results in a longer scan time. As a workaround we can use **two VL53L0X sensors, but that requires an additional Arduino.**
This LIDAR can be modified according to your own requirements. Below are experiments carried out by changing parameters such as the angle increment of the motor and the distance-calculation mode of the VL53L0X (high accuracy, high speed, long range). We have limited the max range to 1.2 meters, as this robot is made for indoor environments, although the sensor is capable of reading ranges of up to 2 meters.
* ##### High Speed
| No of Points | Scan Time | Accuracy (out of 5) | Scan Image |
| -------- | -------- | -------- | -------- |
| 61 | 3.2 secs | 3.5 ||
| 37 | 1.9 secs | 3 ||
| 19 | 0.9 secs | 1.5 ||
* ##### High Accuracy
| No of Points | Scan Time | Accuracy (out of 5) | Scan Image |
| -------- | -------- | -------- | -------- |
| 61 | 10.2 secs | 4.5 ||
| 37 | 6 secs | 4 ||
| 19 | 2.9 secs | 2.5 ||
* ##### Long Range
| No of Points | Scan Time | Accuracy (out of 5) | Scan Image |
| -------- | -------- | -------- | -------- |
| 61 | 3.4 secs | 4 ||
| 37 | 2.1 secs | 3.5 ||
| 19 | 1.02 secs | 2 ||
The above tables are a comparison of scans of the same scene at different resolutions: 61 points corresponds to a 3-degree increment of the servo position between readings, 37 points to a 5-degree increment, and 19 points to a 10-degree increment.
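The point counts follow directly from the angular increment, and a rough scan-time estimate can be sanity-checked the same way. In the sketch below the per-reading and servo-settling budgets are illustrative assumptions, not measured values:

```python
def scan_points(fov_deg=180, increment_deg=3):
    """Readings per sweep, with both endpoints included."""
    return fov_deg // increment_deg + 1

def scan_time_estimate(points, measure_s, servo_settle_s=0.02):
    """Lower bound on sweep time: ranging budget plus servo settling."""
    return points * (measure_s + servo_settle_s)

print(scan_points(increment_deg=3))    # 61
print(scan_points(increment_deg=5))    # 37
print(scan_points(increment_deg=10))   # 19
```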
* ##### Conclusion
**This LIDAR is usable as a learning project rather than in a real-world use case.** From the experimentation I found that the **Arduino Mega is limited in how fast it can process the range readings for the laser scanner**. To solve this we could use a more powerful microcontroller board such as a Teensy, which has faster processing capacity. Using multiple range sensors as an array would also reduce the scan time considerably, but I chose not to, as cost was the factor I was most concerned about for v1.0.
#### 2. SLAM
I used the **gmapping package** available in ROS to perform SLAM and generate a map. The map generated is not usable if we want to move the robot autonomously, but from building and testing this robot I found that the map quality can be improved by tuning multiple SLAM and LIDAR parameters. Given the cost of this setup compared to a real LIDAR, it is definitely worth making for the sake of learning.
**gmapping uses a Rao-Blackwellized particle filter** to perform SLAM. As an alternative we can use other SLAM methods such as Hector SLAM, or use a camera to perform visual SLAM.
Here is a Sample Map Generated

**Analysis**
The problem here is that we have no good feedback from the servo to know its exact position, so using a stepper motor would definitely improve the results thanks to its more accurate and finer motion.
Another issue is that the sampling rate of the LIDAR is the bottleneck in generating a map; the solution is a more powerful microcontroller or a microprocessor. I plan to try this with a Jetson Nano, and I am sure the generated map will be better.
---
### **Software Stack**
In this section we will take a look at the overall setup of the robot, including the hardware and software components.
I have used two Arduinos with a laptop to run the robot, communicating with the Arduinos over the rosserial protocol. Here is the overall structure of the robot.

The first step is writing the PID algorithm to move the robot according to our required velocity commands. For this we stream the encoder data from the motors into ROS, calculate the wheel velocities from it, and apply the PID algorithm. We also use the encoder data to calculate the odometry, which helps us localize the robot in the environment.
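As a sketch of that odometry step, here is standard differential-drive dead reckoning from encoder ticks; the geometry constants are placeholders to be replaced with your own robot's measurements:

```python
import math

# Placeholder robot geometry -- substitute your own measurements
TICKS_PER_REV = 360        # encoder ticks per wheel revolution
WHEEL_RADIUS = 0.03        # metres
WHEEL_SEPARATION = 0.20    # metres

x, y, theta = 0.0, 0.0, 0.0  # pose integrated in the odom frame

def update_odometry(dticks_left, dticks_right):
    """Integrate one step of differential-drive odometry from
    the change in encoder ticks since the last update."""
    global x, y, theta
    d_left = 2 * math.pi * WHEEL_RADIUS * dticks_left / TICKS_PER_REV
    d_right = 2 * math.pi * WHEEL_RADIUS * dticks_right / TICKS_PER_REV
    d_center = (d_left + d_right) / 2.0
    d_theta = (d_right - d_left) / WHEEL_SEPARATION
    # Midpoint integration: advance along the average heading
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    theta += d_theta
```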
I am also using a PS3 controller to drive the robot wirelessly.
Once we are able to control the robot, we move on to making our LIDAR.
This is the tricky part: we get the data from the VL53L0X through ROS Range messages, which also carry the current position of the servo motor. When we receive this data in ROS, a node converts the Range messages into a LaserScan message so that RViz can understand and visualize the data.
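Here is a minimal sketch of such a bridge node, assuming the readings arrive in sweep order; the topic names, frame id, and the 61-point sweep are assumptions to adapt to your setup, not the project's exact node:

```python
#!/usr/bin/env python
# Minimal Range -> LaserScan bridge sketch (illustrative, not the real node).
import math
import rospy
from sensor_msgs.msg import LaserScan, Range

POINTS_PER_SCAN = 61                  # 0-180 deg sweep at 3 deg steps
ANGLE_INCREMENT = math.radians(3.0)

readings = []

def range_callback(msg):
    readings.append(msg.range)
    if len(readings) == POINTS_PER_SCAN:
        scan = LaserScan()
        scan.header.stamp = rospy.Time.now()
        scan.header.frame_id = "laser"       # must match your TF tree
        scan.angle_min = 0.0
        scan.angle_max = math.pi
        scan.angle_increment = ANGLE_INCREMENT
        scan.range_min = msg.min_range
        scan.range_max = msg.max_range
        scan.ranges = list(readings)
        scan_pub.publish(scan)
        del readings[:]

rospy.init_node("range_to_laserscan")
scan_pub = rospy.Publisher("/scan", LaserScan, queue_size=1)
rospy.Subscriber("/range", Range, range_callback)
rospy.spin()
```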
For experimentation purposes I have used gmapping SLAM. You can also integrate the robot with the navigation stack, but that will require some tuning of the LIDAR and other mapping parameters.
Below is the **rqt_graph** of the nodes running

Here is the **overall structure** of the package.

Below is what an ideal setup should look like.
**Topic List**

**TF TREE**

**RVIZ**

**Overall Software structure.**

**Circuit Diagram**

---
### Setting up the robot with ROS
**Requirements:**
1. ROS Melodic on Ubuntu 18.04
2. Gmapping package - http://wiki.ros.org/gmapping
3. Teleop package - http://wiki.ros.org/teleop_twist_joy
4. tf2, robot_state_publisher, rosserial, Arduino IDE
5. Basic knowledge of ROS
**Steps to set up your own robot**
1. Download the package and check that you have all the necessary dependencies (i.e. rosserial, the navigation stack, and the gmapping package).
2. Tune the parameters of the PID algorithm according to your robot (encoder ticks, wheel radius, wheel thickness, P-I-D parameters, wheel separation).
3. Test your robot after tuning the parameters, using the launch file provided. (Tip: use the `rostopic echo` command to monitor the specific topics you are testing.)
4. If you wish to make this robot autonomous, use the official navigation stack documentation on the ROS wiki.
**Commands**
1. Running the OCMR setup files
```shell
roslaunch mobile_robot robot_mobile.launch
```
2. Launching the teleop nodes to control the robot using a PS3 controller
```shell
roslaunch teleop_twist_joy teleop.launch
```
3. Launching rviz to see the output
```shell
rviz
```
4. Running the gmapping SLAM node to perform SLAM.
```shell
rosrun gmapping slam_gmapping
```
---
### **Documents and Reference Links**:
1. VL53L0x - https://www.pololu.com/product/2490
2. DC motor with encoder - https://robokits.download/downloads/RMCS%205032.pdf
3. Motor Shield Datasheet - https://robu.in/wp-content/uploads/2017/04/MONSTER-MOTO-SHIELD-VNH2SP30-MOTOR-DRIVER.pdf
4. Arduino Mega Wifi - https://robu.in/product/wemos-mega-wifi-r3-atmega2560nodemcu-esp8266-32mb-memory-usb-ttl-ch340g-compatible-for-arduino-mega/
5. Mesh file of Chassis - in progress, will be uploaded soon
6. Github Link - https://github.com/JeelChatrola/OCMR
7. Demo Video of Teleop Robot -
{%youtube EltNoBsaq5w %}
---
### Appendix and FAQ
:::info
**Find this document incomplete?** Leave a comment!
Suggestions are always welcome.
:::
###### Ignore the Tags: `Closed Loop Control` `ROS` `Arduino` `Tutorial` `python` `VNH2SP30 Motor shield` `DIY LIDAR` `Mobile ROBOT` `Autonomous Mobile Robot` `VL53L0X`
---