AprilTag Localization
Every Monday at 8 PM
Every Wednesday at 1:30 PM (w/ prof.)
| Environment Setup | Survey |
|---|---|
| Andrew | Vivian |
| Jakey | Leo |
| Shaw | |
| Baseline | Our Work |
|---|---|
| Andrew | Vivian |
| Jakey | Leo |
| Chan | Shaw |
AprilTag_Localization
Note:
This GitHub repo contains the entire essential workspace, not just the packages; there is no need to create a catkin workspace yourself.
robot_localization
is a collection of state estimation nodes
ekf_localization_node
Mind that the AprilTag model has a size of 0.3 m x 0.3 m; make sure to change the standalone tag info in tags.yaml in apriltag_ros!
⇒ The tag size for tag36h11 is defined as the width of the black border (24 cm), not the full model size (30 cm).
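As a hedged illustration (the file layout follows the apriltag_ros `standalone_tags` convention; the ID range 0–9 matches the downward/forward tags listed later in these notes), tags.yaml would declare the 0.24 m border size, not the 0.3 m model size:

```yaml
standalone_tags:
  [
    {id: 0, size: 0.24},
    {id: 1, size: 0.24},
    {id: 2, size: 0.24},
    {id: 3, size: 0.24},
    {id: 4, size: 0.24},
    {id: 5, size: 0.24},
    {id: 6, size: 0.24},
    {id: 7, size: 0.24},
    {id: 8, size: 0.24},
    {id: 9, size: 0.24}
  ]
```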
Using RGB image
Downward Error Table
Error = (uav GT z position - 0.05) - (/tag_detections z position + 0.01)
| Height | 0.8 m | 1.4 m | 2 m | 3 m |
|---|---|---|---|---|
| Error | 0.046 m | 0.046 m | 0.042 m | 0.017 m |
Forward Error Table
Error = tag GT x position - (uav GT x position + 0.1 + /tag_detections z position)
| Height | 0.8 m | 1.4 m | 2 m | 3 m |
|---|---|---|---|---|
| Error | 0.008 m | 0.005 m | 0.007 m | 0.001 m |
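The two error formulas above can be written out as a small Python sketch (the 0.05, 0.01, and 0.1 constants are the fixed offsets from the formulas; the sample inputs below are illustrative values, not measurements from the tables):

```python
def downward_error(uav_gt_z, tag_detection_z):
    """Error = (UAV GT z - 0.05) - (/tag_detections z + 0.01)."""
    return (uav_gt_z - 0.05) - (tag_detection_z + 0.01)


def forward_error(tag_gt_x, uav_gt_x, tag_detection_z):
    """Error = tag GT x - (UAV GT x + 0.1 + /tag_detections z)."""
    return tag_gt_x - (uav_gt_x + 0.1 + tag_detection_z)


# Illustrative values: UAV hovers at z = 2.0 m, tag detected at 1.908 m
print(round(downward_error(2.0, 1.908), 3))  # → 0.032
```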
| Detection image publishing rate (Hz) | fmin | fmax |
|---|---|---|
| camera_forward/infra1/image_raw | 7.87 | 25 |
| camera_forward/color/image_raw | 11.36 | 21.74 |
| /forward/tag_detections | 8.33 | 12.05 |

Expected rate: > 30 Hz
Problem / Note:
⚠️ Problem: Iris cannot take off.
✔️ Solution: Remember to add librotors_gazebo_ros_interface_plugin.so to test_room.world (or any other world where you want to simulate UAVs).
Note that tags.yaml should be modified according to your usage.
Both the downward and forward cameras see 5 AprilTags with different IDs:
Downward: IDs 0 - 4
Forward: IDs 5 - 9
Reference: Using Apriltags with ROS
Error = abs(odometry - (x, y, z transformed from camera_frame to tag_frame (middle of tag_bundles), then transformed to world_frame))
| Height | 3 m | 2.5 m | 2 m |
|---|---|---|---|
| Error x | 0.0275 m | 0.0163 m | 0.007 m |
| Error y | 0.0364 m | 0.023 m | 0.019 m |
| Error z | 0.023 m | 0.025 m | 0.025 m |
| Height | 3 m | 2.5 m | 2 m |
|---|---|---|---|
| Error x | 0.097 m | 0.0956 m | 0.0958 m |
| Error y | 0.0354 m | 0.026 m | 0.018 m |
| Error z | 0.026 m | 0.007 m | 0.012 m |
Download the robot_localization package: robot_localization
Output is on the topic /odometry/filtered
robot_localization wiki
Tutorials for users: see "01 - ROS and Sensor Fusion Tutorial"
ROS Developers Live-Class #51: How to fuse Odometry & IMU using Robot Localization Package
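A hedged sketch of what an ekf_localization_node parameter file might look like (the parameter names follow the robot_localization documentation; the topic names and the choice of which fields to fuse are illustrative assumptions, not our final configuration):

```yaml
frequency: 30          # filter update rate (Hz)
sensor_timeout: 0.1
two_d_mode: false

map_frame: map
odom_frame: odom
base_link_frame: base_link
world_frame: odom

# Fuse x/y/z position from tag-based odometry (illustrative topic name).
# The 15 booleans select: x y z, roll pitch yaw, vx vy vz,
# vroll vpitch vyaw, ax ay az.
odom0: /tag_odometry
odom0_config: [true,  true,  true,
               false, false, false,
               false, false, false,
               false, false, false,
               false, false, false]

# Fuse orientation and angular velocity from the IMU (illustrative topic name).
imu0: /imu/data
imu0_config: [false, false, false,
              true,  true,  true,
              false, false, false,
              true,  true,  true,
              false, false, false]
```

The filtered estimate is then published on /odometry/filtered.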
7/11:
7/16 Test:
7/17 Test:
7/18:
multi-tag_bundle detection
fusion w/ multi-tag_bundle and imu
flying w/o motion planning
scenario 1: quadcopter fly to (0, 0, 2) from the origin (0, 0, 0) w/ one waypoint (0, 0, 2)
fusion result
fusing only IMU data
scenario 2: quadcopter fly to (0, 0, 2) from the origin (0, 0, 0) w/ two points (0, 0, 0.5) and (0, 0, 2)
fusion result
fusing only IMU data
tag tf
tag detected result
| Vivian | Shaw | Leo |
|---|---|---|
| Evaluation of tag_bundle & robot_localization package | Code optimization | Evaluation of tag_bundle & robot_localization package |
| Experiment | Experiment | Experiment |
7/20:
7/25
7/29:
Environment
Detail: https://docs.google.com/presentation/d/1osLHdvbiBsqHqRepMad7XYyr-Qt42LXiAthbbVHfIgE/edit#slide=id.p
Control yaw of Iris
According to rotors_control/lee_position_controller.cpp, in LeePositionController::ComputeDesiredMoment, assigning the direction of the desired velocity changes the heading of the Iris.
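A minimal sketch of the idea (the function name is ours, not from rotors_control): the yaw implied by a desired velocity is just the heading of its horizontal component.

```python
import math


def yaw_from_velocity(vx, vy):
    """Heading (rad) implied by the horizontal velocity direction."""
    return math.atan2(vy, vx)


# Flying diagonally (+x, +y) implies a 45-degree heading
print(round(yaw_from_velocity(1.0, 1.0), 4))  # → 0.7854 (pi / 4)
```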
Waypoint
Demo I: Square Path w/o Turn
Demo II: Square Path w/ Turn
Bootloader installation (on the laptop)
ST-Link installation (on the laptop)
Bootloader flashing (flash only the px4fmuv2_bl part)
Connect the ST-Link to the laptop via USB, and confirm the connection with lsusb
Flash the program
MAVROS installation (on the Upboard)
Installation steps
https://hackmd.io/iOtCxeZtSSqi2rCyHt2dyg#Husky-setting-in-ROS-melodic
Test
Connect the PX4 and the Upboard
Run the command
Result
Topic
Receiving IMU data
Result
IMU coordinate axes
Official: PX4 User Guide
Basic Assembly
Basic Configuration
Flying
Control Mode: Offboard Control
Usage: On-board processor and Serial radios link to ROS
Image
Control Objective:
Parameter:
The QGroundControl Parameters screen allows you to find and modify any of the parameters associated with the vehicle.
Inspection before flying
Hardware:
Software
Calibration
Build
Serial Setting
First Flight (Manual Mode)
Connection Interface
Demo Video
TagSLAM itself does not provide fusion with an IMU. Thus, fusion of the camera and the IMU must be carried out by ourselves, either via the mocap survey or via the robot_localization package (ekf_localization_node).
Real-time Restriction
Real-time usage is limited to situations where either the graph is so small that it can be solved quickly (maybe a thousand frames maximum!), or the pose of the tags can be determined beforehand, offline, with a “mapping run” that builds the full graph, and then stores the optimized tag poses for a subsequent real-time run. To get real-time performance, use the “amnesia” feature such that old information gets thrown away and the graph stays small.
Test Environment (3 tags in triangular form)
Downward detection
Forward detection
Configuration
camera_pose.yaml
camera.yaml
Intrinsics can be found in camera_info
Tagslam only Test Result
Hovering
Given tag0 (downward) & tag1 (forward) poses
Odom = /tagslam/odom/body_iris1 (Extremely low publishing rate)
GT = /iris1/ground_truth/position
| | Odom | GT |
|---|---|---|
| x | 0.126 m | 0.126 m |
| y | 0.017 m | 0 m |
| z | 1.972 m | 1.978 m |
Error = |Odom - GT| = 0.018m
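The scalar error above matches the Euclidean distance between the two positions; a quick check using the values from the table:

```python
import math

odom = (0.126, 0.017, 1.972)  # /tagslam/odom/body_iris1
gt = (0.126, 0.0, 1.978)      # /iris1/ground_truth/position

# Euclidean distance |Odom - GT|
error = math.sqrt(sum((a - b) ** 2 for a, b in zip(odom, gt)))
print(round(error, 3))  # → 0.018
```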
Hallway (straight trajectory)
| Start | Odom | GT | Error |
|---|---|---|---|
| x | -6.874 | -6.842 | 0.032 |
| y | 1.348e-08 | 0.016 | 0.016 |
| z | 1.978 | 1.97 | 0.008 |
Error during motion
https://drive.google.com/file/d/1jb-hImTk7QwZ8OKWgtO5uOA21vUUXlxm/view?usp=sharing
| End | Odom | GT | Error |
|---|---|---|---|
| x | -3.865 | -3.874 | 0.009 |
| y | 0.022 | -7.62e-09 | 0.022 |
| z | 1.974 | 1.9779 | 0.0039 |
Tagslam and IMU fusion
The computed coordinate of the UAV is relative to the odom frame. Therefore, the static transform from map to odom should be adjusted according to your purposes and scenario.
Flow chart
The IMU filter is the Madgwick filter.
tf tree
sensor_fusion.launch
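A hedged sketch of what a launch file like sensor_fusion.launch might contain (the imu_filter_madgwick and robot_localization node/package names are real; the topic remappings, frame arguments, and the hypothetical your_pkg path are illustrative assumptions):

```xml
<launch>
  <!-- Madgwick filter: turns raw IMU gyro/accel into a filtered orientation -->
  <node pkg="imu_filter_madgwick" type="imu_filter_node" name="imu_filter">
    <param name="use_mag" value="false"/>
    <remap from="imu/data_raw" to="/iris1/imu"/>  <!-- illustrative topic -->
  </node>

  <!-- Static map -> odom transform; adjust to your scenario as noted above -->
  <node pkg="tf2_ros" type="static_transform_publisher" name="map_to_odom"
        args="0 0 0 0 0 0 map odom"/>

  <!-- EKF fusing TagSLAM odometry with the filtered IMU -->
  <node pkg="robot_localization" type="ekf_localization_node" name="ekf_se">
    <!-- your_pkg and ekf.yaml are hypothetical names -->
    <rosparam command="load" file="$(find your_pkg)/config/ekf.yaml"/>
    <param name="odom0" value="/tagslam/odom/body_iris1"/>
    <param name="imu0" value="/imu/data"/>
  </node>
</launch>
```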
Commands
Hovering
Error after fusion

| Hovering | Odom | GT | Error |
|---|---|---|---|
| x | -6.845 | -6.874 | 0.029 |
| y | 0.016 | 2.22e-09 | 0.016 |
| z | 1.971 | 1.978 | 0.007 |
Troubleshooting
All topics are correctly connected and all nodes are fully up, but TagSLAM is not providing any map or tf.
Make sure the timestamps of all sensors and of the external odometry are synchronized; if not, set use_approximate_sync to true in tagslam.yaml.
The relative positions of the fixed tags and cameras are supposedly configured correctly, yet the position TagSLAM provides is wrong. Possible reasons:
The cameras' axes are defined differently from Gazebo's. The X and Y axes are swapped because the coordinate frame is relative to iris1's, whose z axis points downward.
Every known tag pose needs to be rotated around its normal vector by -90 degrees.
TagSLAM's rotation is defined as an axis scaled by the angle magnitude in radians (axis-angle). Therefore, the XYZ Euler angle rotation [1.57, 1.57, 0] becomes [1.2091996, 1.2091996, 1.2091996]; the latter is the correct value to put in camera_pose.yaml for cam_0. Rotation Converter
Beware that the intrinsics and image sizes of the stereo and color cameras of the d435 are different.
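To sanity-check the Euler-to-axis-angle conversion described above, here is a pure-Python sketch (the function names are ours; treating [1.57, 1.57, 0] as intrinsic XYZ Euler angles is an assumption):

```python
import math


def mat_mul(a, b):
    """3x3 matrix product."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]


def rot_x(t):
    c, s = math.cos(t), math.sin(t)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]


def rot_y(t):
    c, s = math.cos(t), math.sin(t)
    return [[c, 0, s], [0, 1, 0], [-s, 0, c]]


def rot_z(t):
    c, s = math.cos(t), math.sin(t)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]


def euler_xyz_to_axis_angle(rx, ry, rz):
    """Intrinsic XYZ Euler angles -> axis-angle vector (axis * angle, rad)."""
    r = mat_mul(mat_mul(rot_x(rx), rot_y(ry)), rot_z(rz))
    trace = r[0][0] + r[1][1] + r[2][2]
    angle = math.acos(max(-1.0, min(1.0, (trace - 1) / 2)))
    # Axis from the skew-symmetric part of R (valid away from 0 and pi)
    s = 2 * math.sin(angle)
    axis = ((r[2][1] - r[1][2]) / s,
            (r[0][2] - r[2][0]) / s,
            (r[1][0] - r[0][1]) / s)
    return tuple(angle * a for a in axis)


# Each component comes out ≈ 1.2091996, matching the value quoted above
print(euler_xyz_to_axis_angle(math.pi / 2, math.pi / 2, 0))
```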
TO-DO
Test in different environments and evaluate the performance
Fusion of TagSLAM's odometry and the IMU data
Mapping Output
Work distribution
| Andrew | Jakey | Chan |
|---|---|---|
| Error evaluation for hovering scenario | Visual-Inertial Motion Capture System environment testing | Setup & test TagSLAM |
| Create hallway scenario Gazebo simulation & tagslam config | Error evaluation for hallway scenario | |
| Create hover scenario Gazebo simulation & tagslam config | Error evaluation for hallway scenario after fusion | |
| Write config for our camera and UAV model setup with parameter tuning | | |
| Sensor fusion of TagSLAM & IMU using ekf_localization_node | | |
| Setup & test TagSLAM | | |
| Add yaml ROS service | | |
Camera test
Camera calibration
Camera Parameters
cameras.yaml
d435i test result
mapping
3D mapping
Step1
Step2
Go to the address shown in the terminal
Step3
Choose GUI.html
Transform poses.yaml into the format that apriltag_ros can understand, with a user-defined tag_bundles.yaml.
Downward tags are excluded
Andrew → Vivian → Jakey → Leo → Shaw → Chan
Last week progress:
This week expected progress:
Shaw will be absent this week.
Requirements
Position for this week
Any other business
List out what each member is responsible for in detail during weekly report.
robot_localization
PPT outline
Position for this week
Any other business
Week progress
Project schedule planning
| | Monday | Tuesday | Wednesday | Thursday | Friday |
|---|---|---|---|---|---|
| Andrew | ✔ | ✔ | ✗ | ✗ | ✗ |
| Jakey | ✔ | ✔ | ✔ | ✔ | ✔ |
| Chan | ✔ | ✔ | ✔ | ✔ | ✔ |
| Vivian | ✔ | ✗ | ✔ | ✗ | ✗ |
| Shaw | ✔ | ✔ | ✔ | ✔ | ✔ |
| Leo | ✔ | ✔ | ✔ | ✔ | ✔ |
First attend two consecutive days (Wed 7/27 and Thu 7/28), then plan the rest based on progress.
| Camera calibration & AprilTag testing | UAV turning simulation | Other |
|---|---|---|
| 竣智: | 博慧: environment setup | 晏誠: web API |
| 禹維: | 家豪: simulate two types of turns | |
| 劭暐: code optimization | | |
| 3D mapping | Upboard & PX4 setup | Other |
|---|---|---|
| 竣智: | 博慧: Upboard-PX4 communication test | 晏誠: web API |
| 禹維: | 家豪: Upboard & PX4 flashing | |
| 劭暐: Upboard & PX4 flashing | | |
Week progress
PPT outline
Position for this week
Work distribution

| d435i tag detection | Upboard & PX4 setup | Web GUI user guide |
|---|---|---|
| 竣智: | 博慧: flight control parameter tuning | 晏誠 |
| 禹維: | 家豪: flight control parameter tuning & test flight | |
| 劭暐: flight control parameter tuning & test flight | | |
Week progress
Web front-end/back-end connection
PPT outline
Position for this week
Paper
Work distribution
| 竣智: | 博慧: PX4 control program fixes & paper (minimum snap) | 晏誠 |
| 禹維: | 家豪: PX4 control program fixes & paper (ECBF) | |
| 劭暐: PX4 control program fixes & paper (ECBF) | | |
Week progress
PPT outline
Position for this week
Paper
Work distribution
| 竣智: camera & tripod setup, paper | 博慧: NCRL link & web testing | 晏誠: web integration |
| 禹維: camera, paper | 家豪: NCRL link & web testing & paper (ECBF) | |
| 劭暐: NCRL link & web testing & paper (ECBF) | | |
Week progress
PPT outline
Position for this week
Work distribution
| 竣智: camera & tripod setup, paper | 博慧: NCRL link & web testing | 晏誠: web integration |
| 禹維: camera, paper | 家豪: NCRL link & web testing & paper (ECBF) | |
| 劭暐: NCRL link & web testing & paper (ECBF) | | |