# AprilTag Localization Project -- Note
###### tags: `apriltag localization`
[TOC]

## Environment
* Ubuntu 18.04 (ROS Melodic)
* apriltag_ros
* RotorS

## Weekly Meeting
* Every Monday, 8:00 pm
* Every Wednesday, 1:30 pm (w/ prof.)

## Work Distribution
### Phase I: Setup & Survey
| Environment Setup | Survey |
| ----------------- | ------- |
| Andrew            | Vivian  |
| Jakey             | Leo     |
|                   | Shaw    |

### Phase II: Utilize open source as a baseline and construct our own work
| Baseline | Our work |
| -------- | -------- |
| Andrew   | Vivian   |
| Jakey    | Leo      |
| Chan     | Shaw     |

## GitHub URL
[AprilTag_Localization](https://github.com/Andreew9504089/AprilTag_Localization)

**<u>Note:</u>** &emsp; This GitHub repo contains the entire essential workspace, not individual packages, so there is no need to create a catkin workspace yourself.

## Survey
1. [TagSlam](https://arxiv.org/pdf/1910.00679.pdf) [[Github]](https://berndpfrommer.github.io/tagslam_web/)
    + Visual-inertial
    + GTSAM (factor graph optimizer)
2. [An Open Source, Fiducial Based, Visual-Inertial Motion Capture System](https://arxiv.org/pdf/1507.02081.pdf) [[Github]](https://github.com/clydemcqueen/rcars) [[Doc]](https://bitbucket.org/adrlab/rcars/wiki/browse/)
    + Visual-inertial
    + EKF
3. [AprilTag array-aided extrinsic calibration of camera–laser multi-sensor system](https://link.springer.com/content/pdf/10.1186/s40638-016-0044-0.pdf)
    + Used for static estimation
    + Bundled with an EKF to achieve dynamic estimation
    + Useful for calibrating the two cameras before our experiments
4. [Analysis and improvements in AprilTag based state estimation](https://www.mdpi.com/594480)
5. [Improved Tag-based Indoor Localization of UAVs Using Extended Kalman Filter](https://www.researchgate.net/publication/333633015_Improved_Tag-based_Indoor_Localization_of_UAVs_Using_Extended_Kalman_Filter), ISARC 2019
    * multiple AprilTags + IMU
    * uses an EKF to fuse the sensor data
    * `robot_localization` is a collection of state estimation nodes
        * [robot_localization wiki](https://docs.ros.org/en/noetic/api/robot_localization/html/index.html)
        * [ROS Package](https://github.com/cra-ros-pkg/robot_localization)
        * ROS node: `ekf_localization_node`

## Simulation setup
:heavy_check_mark: Create a Gazebo environment for testing
![](https://i.imgur.com/CHFXFtG.png =300x200)

:heavy_check_mark: Mount downward- & forward-facing D435 cameras (w/o embedded IMU)
![](https://i.imgur.com/Q3cYp2P.png =300x200) ![](https://i.imgur.com/NYr5rVZ.jpg =250x200)

:heavy_check_mark: Add bottom and front AprilTags
![](https://i.imgur.com/7TovuAf.png =300x250)

***Note that the AprilTag model measures 0.3 m x 0.3 m; make sure to update the standalone tag info in tags.yaml of apriltag_ros accordingly! &rArr; The tag size of tag36h11 is defined as the edge length of the black border (24 cm), not the full model size (30 cm).
![](https://i.imgur.com/To2Vxgp.png)***

:heavy_check_mark: Test /tag_detections correctness
![](https://i.imgur.com/fbAIWoS.gif =300x200)
* Using the RGB image
* ++Downward Error Table++
  Error = (UAV GT z position - 0.05) - (/tag_detections z position + 0.01) (see the sketch at the end of this section)

| Height | 0.8m   | 1.4m   | 2m     | 3m     |
| ------ | ------ | ------ | ------ | ------ |
| Error  | 0.046m | 0.046m | 0.042m | 0.017m |

* ++Forward Error Table++
  Error = tag GT x position - (UAV GT x position + 0.1 + /tag_detections z position)

| Height | 0.8m   | 1.4m   | 2m     | 3m     |
| ------ | ------ | ------ | ------ | ------ |
| Error  | 0.008m | 0.005m | 0.007m | 0.001m |

| Detection image publish rate (Hz)  | fmin  | fmax  |
| ---------------------------------- | ----- | ----- |
| camera_forward/infra1/image_raw    | 7.87  | 25    |
| camera_forward/color/image_raw     | 11.36 | 21.74 |
| /forward/tag_detections            | 8.33  | 12.05 |

Expected rate: > 30 Hz

* Problem / Note:
    1. Forward and downward AprilTag detection run as two independent nodes; might this reduce processing speed?
    2. Static detection is precise, with error below 0.1 m.
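A minimal sketch of the static-error computation behind the tables above, written directly from the two formulas; the ground-truth values come from Gazebo and the detected distances from apriltag_ros, and the example numbers below are illustrative only, not taken from the tables:

```python
# Minimal sketch of the static-error formulas above. The 0.05 / 0.01 / 0.1
# constants are the fixed offsets that appear in those formulas.

def downward_error(uav_gt_z, tag_det_z):
    """Error = (UAV GT z - 0.05) - (/tag_detections z + 0.01)."""
    return (uav_gt_z - 0.05) - (tag_det_z + 0.01)

def forward_error(tag_gt_x, uav_gt_x, tag_det_z):
    """Error = tag GT x - (UAV GT x + 0.1 + /tag_detections z)."""
    return tag_gt_x - (uav_gt_x + 0.1 + tag_det_z)

if __name__ == "__main__":
    # Illustrative values: hovering at 0.8 m with a 0.70 m downward reading.
    print(downward_error(uav_gt_z=0.8, tag_det_z=0.70))   # -> 0.04
    print(forward_error(tag_gt_x=4.0, uav_gt_x=0.0, tag_det_z=3.89))
```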
## Usage
### Spawn Iris and Gazebo world
```console
$ export GAZEBO_MODEL_PATH=[path to Apriltag_Localization]/AprilTag_Localization/src/Env/multirotor_geometry_control/rotors_simulator/rotors_gazebo/models:$GAZEBO_MODEL_PATH
$ roslaunch rotors_gazebo iris_one.launch
```
### Iris take off
```console
$ roslaunch rotors_gazebo controller_geometry_iris.launch
```
&#9888;&#65039; **Problem:** Iris cannot take off.
&#10004;&#65039; **Solution:** Remember to add ==librotors_gazebo_ros_interface_plugin.so== to test_room.world (or any other world in which you want to simulate UAVs).
```sdf
<plugin name="ros_interface_plugin" filename="librotors_gazebo_ros_interface_plugin.so"/>
<spherical_coordinates>
  <surface_model>EARTH_WGS84</surface_model>
  <latitude_deg>47.3667</latitude_deg>
  <longitude_deg>8.5500</longitude_deg>
  <elevation>500.0</elevation>
  <heading_deg>0</heading_deg>
</spherical_coordinates>
<physics type='ode'>
  <ode>
    <solver>
      <type>quick</type>
      <iters>1000</iters>
      <sor>1.3</sor>
    </solver>
    <constraints>
      <cfm>0</cfm>
      <erp>0.2</erp>
      <contact_max_correcting_vel>100</contact_max_correcting_vel>
      <contact_surface_layer>0.001</contact_surface_layer>
    </constraints>
  </ode>
  <max_step_size>0.01</max_step_size>
  <real_time_factor>1</real_time_factor>
  <real_time_update_rate>100</real_time_update_rate>
  <gravity>0 0 -9.8</gravity>
</physics>
```
### Start Apriltag Detection
```console
$ roslaunch apriltag_ros continuous_detection.launch
```
**Note that tags.yaml should be modified according to your usage.**

## Tutorials
### How to add Apriltags in Gazebo
1. Change directory to /src/Env/multirotor_geometry_control/rotors_simulator/rotors_gazebo/models
2. Copy the desired AprilTag .png image file to media/materials/textures
3. Add the material in media/materials/scripts/Apriltag.material by copying an existing entry and pointing its texture to your image
4. Find the 0.3mSquarePlate template in /models and copy it
5. Change the name in model.config and model.sdf to ApriltagIDx
6. In model.sdf, find the name tag under the material tag and change it to Apriltag/IDx

#### ++Usage++
##### **A. Insert Apriltag as an external model from the Insert tab in the Gazebo GUI**
1. Insert the AprilTag in Gazebo by selecting the model in the Insert panel

##### **B. Directly add Apriltag to the Gazebo world sdf file**
1. In a terminal, type
```console
$ export GAZEBO_MODEL_PATH=[path to Apriltag_Localization]/AprilTag_Localization/src/Env/multirotor_geometry_control/rotors_simulator/rotors_gazebo/models:$GAZEBO_MODEL_PATH
```
2. In test_room.world, add
```xml
<include>
  <uri>model://ApriltagIDx</uri>
  <pose>x y z roll pitch yaw</pose>
</include>
```
3. Change the pose of the tag to the desired position (a script that generates several such blocks is sketched below)
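When a world needs many tags, writing the `<include>` blocks by hand gets tedious. A minimal sketch of a generator is below; the model names (ApriltagID0, ...) follow the naming convention above, while the tag count and 1 m spacing are illustrative assumptions:

```python
# Minimal sketch: generate <include> blocks for a row of AprilTag models so
# they can be pasted into test_room.world. The 1 m spacing along y and the
# five-tag count are illustrative, not fixed by the project.

def include_block(tag_id, x, y, z, roll=0, pitch=0, yaw=0):
    return (
        "<include>\n"
        f"  <uri>model://ApriltagID{tag_id}</uri>\n"
        f"  <pose>{x} {y} {z} {roll} {pitch} {yaw}</pose>\n"
        "</include>"
    )

if __name__ == "__main__":
    # Five tags lined up along the y axis, 1 m apart, on the floor.
    for i in range(5):
        print(include_block(tag_id=i, x=0, y=i - 2, z=0))
```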
#### ++Problems++
1. The AprilTag's texture is gray in Gazebo &rarr; set the lighting tag to 0 in model.sdf
2. Cannot find the AprilTag texture &rarr; change the uri to model:// instead of file://
3. Cannot find the AprilTag model &rarr; run Gazebo only from within the rotors_gazebo package
4. The AprilTag cannot stay fixed at the desired position and follows the laws of physics &rarr; set the static tag to true

### How to mount D435 on UAV in Gazebo
1. Download [realsense_gazebo_plugin](https://github.com/pal-robotics/realsense_gazebo_plugin) to your workspace
2. Add [_d435.gazebo.xacro](https://github.com/pal-robotics-forks/realsense/blob/upstream/realsense2_description/urdf/_d435.gazebo.xacro) and [_d435.urdf.xacro](https://github.com/pal-robotics-forks/realsense/blob/upstream/realsense2_description/urdf/_d435.urdf.xacro) to the urdf folder under the simulation environment ([/rotors_description/urdf](https://github.com/Andreew9504089/AprilTag_Localization/tree/master/src/Env/multirotor_geometry_control/rotors_simulator/rotors_description/urdf))
3. In _d435.urdf.xacro, change
![](https://i.imgur.com/QDycJFH.png)
to
![](https://i.imgur.com/XBp6l97.png)
4. In iris_base.urdf (or whichever URDF you want to attach the D435 to), add the following lines.
![](https://i.imgur.com/in51Vs3.png)

Localization with multiple apriltags
---
#### Environment setup
* Both the downward and forward cameras see 5 AprilTags with distinct IDs
![](https://i.imgur.com/ICvf9VP.png)
* Downward: IDs 0-4
![](https://i.imgur.com/kIOj6fa.png)
* Forward: IDs 5-9
![](https://i.imgur.com/tailAeo.png)

#### Create tag_bundles in tags.yaml
>Reference: [Using Apriltags with ROS](https://optitag.io/blogs/news/using-your-apriltag-with-ros)

![](https://i.imgur.com/3AscwOt.png)

#### Calculate Error
* Error = abs(odometry - (the x, y, z transformed from camera_frame to tag_frame (the middle of the tag bundle) and then to world_frame)); see the sketch after the tables
* Downward

| Height | 3 m         | 2.5 m       | 2 m        |
| ------ | ----------- | ----------- | ---------- |
| Error  | x: 0.0275 m | x: 0.0163 m | x: 0.007 m |
|        | y: 0.0364 m | y: 0.023 m  | y: 0.019 m |
|        | z: 0.023 m  | z: 0.025 m  | z: 0.025 m |

* Forward

| Height | 3 m         | 2.5 m       | 2 m         |
| ------ | ----------- | ----------- | ----------- |
| Error  | x: 0.097 m  | x: 0.0956 m | x: 0.0958 m |
|        | y: 0.0354 m | y: 0.026 m  | y: 0.018 m  |
|        | z: 0.026 m  | z: 0.007 m  | z: 0.012 m  |
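A minimal sketch of this evaluation, assuming the tf tree already carries the world and camera transforms, ground truth is available on /iris1/ground_truth/odometry, and detections arrive on /tag_detections; the frame and topic names are assumptions, not the project's exact configuration:

```python
#!/usr/bin/env python
# Minimal sketch of the bundle-error evaluation above: transform the detected
# bundle pose from the camera frame into the world frame via tf2 and compare
# it with ground-truth odometry.
import rospy
import tf2_ros
import tf2_geometry_msgs  # registers PoseStamped support for buf.transform
from geometry_msgs.msg import PoseStamped
from nav_msgs.msg import Odometry
from apriltag_ros.msg import AprilTagDetectionArray

rospy.init_node("bundle_error_eval")
buf = tf2_ros.Buffer()
tf2_ros.TransformListener(buf)

def on_detections(msg):
    for det in msg.detections:
        ps = PoseStamped(header=det.pose.header, pose=det.pose.pose.pose)
        try:
            pose_world = buf.transform(ps, "world", rospy.Duration(0.1))
        except tf2_ros.TransformException:
            continue
        gt = rospy.wait_for_message("/iris1/ground_truth/odometry", Odometry)
        p, g = pose_world.pose.position, gt.pose.pose.position
        rospy.loginfo("error x/y/z: %.3f %.3f %.3f",
                      abs(g.x - p.x), abs(g.y - p.y), abs(g.z - p.z))

rospy.Subscriber("/tag_detections", AprilTagDetectionArray, on_detections)
rospy.spin()
```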
Robot_Localization
---
:::info
**Download the robot_localization package:** [robot_localization](https://github.com/cra-ros-pkg/robot_localization/tree/melodic-devel)
:::
#### Fusion of pose data from two different cameras & an IMU
* Prepare your data (a republisher sketch that converts tag detections into the pose type follows this section)
    * pose datatype: [geometry_msgs/PoseWithCovarianceStamped](http://docs.ros.org/en/api/geometry_msgs/html/msg/PoseWithCovarianceStamped.html)
    * imu datatype: [sensor_msgs/Imu](http://docs.ros.org/en/api/sensor_msgs/html/msg/Imu.html)
* Transformation: odom &rarr; base_link (the relationship between the robot's current position and its origin)
    * Accounts for the robot's local pose.
    * Data is continuous but drifts over time due to sensor drift.
* Coordinate Frame
    * map: world (if there is no map, you can simply remove it)
    * odom_frame: iris1/odometry_sensor1
    * base_link_frame: iris1/base_link
    * world_frame: iris1/odometry_sensor1
* Configuring robot_localization
    * Reason for using the EKF: although the UKF is more accurate for nonlinear transformations, according to the reference the EKF is just as effective for this sensor fusion.
    * Check ==ekf_template.yaml== & ==ekf_template.launch==
    * Settings in ==ekf_template.yaml==
        * coordinate frame
        ![](https://i.imgur.com/PANiBiW.png)
        * pose sensor
        ![](https://i.imgur.com/CgCJg9W.png)
        * imu sensor
        ![](https://i.imgur.com/E9nTfwP.png)
        * process_noise_covariance, initial_estimate_covariance, and the other parameters are left at their default values.
* Validation
```bash=
roslaunch robot_localization ekf_template.launch
```
The output is published on the topic ==/odometry/filtered==
![](https://i.imgur.com/HPk9Blm.png)

#### Reference
[robot_localization wiki](https://docs.ros.org/en/noetic/api/robot_localization/html/index.html)
[Tutorials for users](https://github.com/methylDragon/ros-sensor-fusion-tutorial) **See 01 - ROS and Sensor Fusion Tutorial.**
[ROS Developers Live-Class #51: How to fuse Odometry & IMU using Robot Localization Package](https://www.youtube.com/watch?v=QZ5q59H2qaI)
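Since ekf_localization_node consumes PoseWithCovarianceStamped while apriltag_ros publishes AprilTagDetectionArray, some bridging node is needed. A minimal sketch of one possible republisher is below; the /tag_pose output topic and the single-detection handling are assumptions, not the project's actual wiring:

```python
#!/usr/bin/env python
# Minimal sketch: republish the first tag detection as a
# PoseWithCovarianceStamped so ekf_localization_node can consume it.
import rospy
from apriltag_ros.msg import AprilTagDetectionArray
from geometry_msgs.msg import PoseWithCovarianceStamped

rospy.init_node("tag_pose_republisher")
pub = rospy.Publisher("/tag_pose", PoseWithCovarianceStamped, queue_size=10)

def on_detections(msg):
    if not msg.detections:
        return
    out = PoseWithCovarianceStamped()
    out.header = msg.detections[0].pose.header
    out.pose = msg.detections[0].pose.pose   # already a PoseWithCovariance
    pub.publish(out)

rospy.Subscriber("/tag_detections", AprilTagDetectionArray, on_detections)
rospy.spin()
```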
#### Test
* 7/11:
    * [doc](https://docs.google.com/document/d/1XmHVnF8dkrGiqNl1YXQf-NiO5mYv35ZC9dRz4tRfJiE/edit)
* 7/16 Test:
    * **AprilTag hallway: fusion of camera & IMU**
    * [Github](https://github.com/Leo125Jan/Apriltag-Localization)
    * {%youtube Ub9gjNUJmf8 %}
* 7/17 Test:
    * **Iris takes off; robot_localization fuses the IMU only**
    * {%youtube efvqqaAKpQU %}
* 7/18:
    1. multi-tag_bundle detection
        - environment:
            * Forward camera - tag_bundle1: (4.01, 0, 2)
            * Forward camera - tag_bundle2: (4.01, 2, 2)
            * Forward camera - tag_bundle3: (4.01, -2, 2)
            * Downward camera - tag_bundle1: (0, 0, 0)
            * Downward camera - tag_bundle2: (0, 2, 0)
            * Downward camera - tag_bundle3: (0, -2, 0)
            ![](https://i.imgur.com/lPmYXd9.png)
        - detection result of the tag bundles
            * quadcopter hovering at (x: 0, y: 0, z: 2)
            * each error < 0.1 m
            * units: m
            ![](https://i.imgur.com/Jty49pm.png)
    2. fusion w/ multi-tag_bundle and IMU
        - flying w/o motion planning
        - scenario 1: the quadcopter flies to (0, 0, 2) from the origin (0, 0, 0) w/ one waypoint (0, 0, 2)
            * fusion result
                * sensors: one IMU + 0-4 poses detected from tag bundles
                * time cost: 4.3 s
                * x-axis: time (s); y-axis: position (m)
                ![](https://i.imgur.com/Qnmz3N0.png)
            * fusing IMU data only
                ![](https://i.imgur.com/1Ggd2pM.png)
        - scenario 2: the quadcopter flies to (0, 0, 2) from the origin (0, 0, 0) w/ two waypoints (0, 0, 0.5) and (0, 0, 2)
            * fusion result
                * sensors: one IMU + 0-4 poses detected from tag bundles
                * time cost: 7.3 s
                * x-axis: time (s); y-axis: position (m)
                ![](https://i.imgur.com/YegLDVO.png)
            * fusing IMU data only
                ![](https://i.imgur.com/7TdLbOi.png)
    3. tag tf
        - the tag detection output is expressed in the tag frame (see the arrows in the picture below)
        - tf shown in RViz
        ![](https://i.imgur.com/GaO7ulY.png)
    4. tag detection result
        - issue: x and y are flipped
    5. [Code - Github](https://github.com/vivianbh/AprilTag_Localization/tree/vivian-dev)
    ![](https://i.imgur.com/j00vtAz.png)

#### Work Distribution
| Vivian | Shaw | Leo |
|--------|------|-----|
| Evaluation of the tag_bundle & robot_localization packages | Code optimization | Evaluation of the tag_bundle & robot_localization packages |
| Experiment | Experiment | Experiment |

---
* 7/20:
    * **Iris flies horizontally, fusing camera & IMU.**
    * {%youtube LlKsxc9_ssw %}
* 7/25
    1. IMU coordinates changed to ENU (a conversion sketch follows this list)
        * original frame: xyz &rarr; SEU (gravity included)
        * fly a square route
        ![](https://i.imgur.com/2SYeVmq.png)
        * new topic name: /iris1/imu/enu
        * [Code Link](https://github.com/vivianbh/AprilTag_Localization/blob/vivian-dev/src/robot_localization/src/imu_coord_enu.cpp)
        * before and after
            * fly vertically to 2.5 m, given five waypoints
            * the two pictures below: acceleration from the IMU and ground-truth position
            * original topic: /iris1/imu
            ![](https://i.imgur.com/zT9ZWKf.png)
            * new topic: /iris1/imu/enu
            ![](https://i.imgur.com/YBdMc6T.png)
    2. fusion of tag_bundles and IMU (ENU frame)
        * fly vertically to 2.5 m, given five waypoints
        * the fusion result is affected by the acceleration variance
        ![](https://i.imgur.com/zbxOzVv.png)
    3. fusion of the IMU (ENU frame) only
        * fly vertically to 2.5 m, given five waypoints
        * the fusion result is affected by drift after integrating twice
        ![](https://i.imgur.com/vk0VnOU.png)
    4. ekf.yaml setup changed
    ![](https://i.imgur.com/DJDddnp.png)
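The actual SEU-to-ENU conversion lives in the linked imu_coord_enu.cpp; below is a hypothetical Python re-expression of the same idea, assuming the axis mapping east = y_SEU, north = -x_SEU, up = z_SEU:

```python
#!/usr/bin/env python
# Minimal sketch of the SEU -> ENU IMU remap described above. The axis
# mapping is an assumption; the repo's imu_coord_enu.cpp is authoritative.
import rospy
from sensor_msgs.msg import Imu

rospy.init_node("imu_coord_enu")
pub = rospy.Publisher("/iris1/imu/enu", Imu, queue_size=50)

def on_imu(msg):
    out = Imu()
    out.header = msg.header
    a, w = msg.linear_acceleration, msg.angular_velocity
    out.linear_acceleration.x = a.y    # east
    out.linear_acceleration.y = -a.x   # north
    out.linear_acceleration.z = a.z    # up (gravity kept, as noted above)
    out.angular_velocity.x = w.y
    out.angular_velocity.y = -w.x
    out.angular_velocity.z = w.z
    out.orientation = msg.orientation  # left untouched in this sketch
    pub.publish(out)

rospy.Subscriber("/iris1/imu", Imu, on_imu)
rospy.spin()
```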
* 7/29:
#### **Iris follows a square path, fusing camera & IMU**
* Environment
![](https://i.imgur.com/3qKaYNB.png)
>Details: https://docs.google.com/presentation/d/1osLHdvbiBsqHqRepMad7XYyr-Qt42LXiAthbbVHfIgE/edit#slide=id.p
* Controlling the yaw of the Iris
  From **rotors_control/lee_position_controller.cpp**, in **LeePositionController::ComputeDesiredMoment**, we know that assigning the direction of the velocity changes the heading of the Iris.
![](https://i.imgur.com/PXfNjTV.png)
* Waypoints (a generation sketch follows this section)
    * Path: (-2, 2, 2) &rarr; (2, 2, 2) &rarr; (2, -2, 2) &rarr; (-2, -2, 2) &rarr; (-2, 2, 2)
    * Each segment has 100 points, 400 points in total. We then modified ==offb/geo.cpp== to compute the position and velocity of each waypoint.
    * Messages are published on ==iris1/desired_trajectory==.
* Demo I: Square Path w/o Turn
{%youtube _QiUByngv7Q %}
* Demo II: Square Path w/ Turn
{%youtube -AoKKMnz1AA %}
> [Github](https://github.com/Leo125Jan/Apriltag-Localization)
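A minimal sketch of the square-path waypoint generation described above (the real implementation is the modified offb/geo.cpp); the corner order and 100-points-per-segment count come from the text, while the cruise speed is an illustrative assumption:

```python
# Minimal sketch: sample the square path (-2,2,2) -> (2,2,2) -> (2,-2,2)
# -> (-2,-2,2) -> (-2,2,2) with 100 points per segment (400 in total) and
# attach a velocity to each point. speed = 0.5 m/s is assumed.
import numpy as np

corners = np.array([[-2, 2, 2], [2, 2, 2], [2, -2, 2],
                    [-2, -2, 2], [-2, 2, 2]], dtype=float)
speed = 0.5  # m/s, assumed

waypoints, velocities = [], []
for a, b in zip(corners[:-1], corners[1:]):
    direction = (b - a) / np.linalg.norm(b - a)
    for t in np.linspace(0.0, 1.0, 100, endpoint=False):
        waypoints.append(a + t * (b - a))
        velocities.append(speed * direction)  # heading follows the velocity

print(len(waypoints))  # -> 400
```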
Pixhawk 4 & Upboard Setup
---
### Pixhawk 4
* Bootloader installation (on the laptop)
    * Download the bootloader and build it following the instructions on GitHub
>https://github.com/PX4/PX4-Bootloader
>Reference: https://hackmd.io/VYuIrmeMRSy2pcKGMvT15Q?view
* st-link installation (on the laptop)
    1. Download st-link
```bash=
git clone http://github.com/texane/stlink.git
cd stlink
mkdir build
cd build
cmake ..
make -j4
sudo make install
```
    2. Copy 49-stlinkv2.rules
```bash=
cd stlink/config/udev/rules.d
sudo cp 49-stlinkv2.rules /etc/udev/rules.d/
```
>Reference: https://github.com/shengwen-tw/stm32f4-examples
* Flashing the bootloader (only the px4fmuv2_bl part)
    1. Connect the st-link to the laptop's USB port and confirm the connection with lsusb
    ![](https://i.imgur.com/fJo2JTy.jpg)
    2. Flash the program
```bash=
cd PX4-Bootloader/build/px4fmuv2_bl
st-flash write px4fmuv2_bl.bin 0x8000000
```
>Reference: https://hackmd.io/VYuIrmeMRSy2pcKGMvT15Q?view

### Upboard
* MAVROS installation (on the Upboard)
    1. Installation steps
>https://hackmd.io/iOtCxeZtSSqi2rCyHt2dyg#Husky-setting-in-ROS-melodic
    2. Test
        * Connect the PX4 to the Upboard
        ![](https://i.imgur.com/IaH9qRS.jpg)
        * Run the command
```bash=
roslaunch mavros px4.launch
```
        * Result
        ![](https://i.imgur.com/Z0DWPkQ.jpg)
        * Topics
        ![](https://i.imgur.com/wjvwU8Y.jpg)
        ![](https://i.imgur.com/HV439FC.jpg)
    3. Receiving IMU data
        * Run the command
```bash=
rostopic echo /mavros/imu/data
```
        * Result
        ![](https://i.imgur.com/EA7cDFu.jpg)
        * IMU coordinate axes
        ![](https://i.imgur.com/T8GUJIX.png)

PX4 Original
---
:::info
**Official: [PX4 User Guide](https://docs.px4.io/main/en/)**
:::
* Basic Assembly
    * Frame: Done
    * Motors: Done
    * Flight controller (PX4): Done
    * Radio receiver: Done
    * Xbee: Done
    * Upboard x 2 (one for each camera): Not yet
    * Wiring: Done
* Basic Configuration
    * RC system
    * Xbee connection setup
    * OptiTrack setup
    * Flight controller calibration
    * MAVROS installation
        * Ground station (laptop)
        * Vehicle (Upboard)
    * PID tuning
    * Flight mode parameter setup
* Flying
    * Control mode: **Offboard Control**
    * Usage: on-board processor and serial radio link to ROS
        * Ground station: laptop (ROS + mavros)
        * Communication device: Xbee
        * Protocol: MAVLink
        * Vehicle: Upboard & PX4; the Upboard runs ROS & mavros and connects to the PX4 through USB.
        ![](https://i.imgur.com/I5Tfk4h.jpg)
    * Image
    ![](https://i.imgur.com/axL85Vo.png)
    ![](https://i.imgur.com/zzQ1Vp6.png)
    * Control objectives:
        * Position: x, y, z
        * Velocity: vx, vy, vz
        * Attitude: quaternion, roll/pitch/yaw rate, thrust
    * Parameters:
:::info
**The QGroundControl Parameters screen allows you to find and modify any of the parameters associated with the vehicle.**
:::
![](https://i.imgur.com/nyt65M7.png)
    * Example Code:
      [Official Example Code](https://github.com/PX4/PX4-Devguide/blob/master/en/ros/mavros_offboard.md)
      [mavros_offboard_control](https://github.com/raaslab/mavros_offboard_control)
      [offboard_control](https://github.com/TIERS/offboard-control)
    * Reference:
      [Offboard Control](https://docs.px4.io/main/en/flight_modes/offboard.html)
      [Offboard Control from Linux](https://docs.px4.io/main/en/ros/offboard_control.html)
      [MAVROS Offboard control example](https://docs.px4.io/main/en/ros/mavros_offboard_cpp.html)
      [MAVROS Document](http://wiki.ros.org/mavros)
* Inspection before flying
    * Hardware:
        * Battery
        * Propellers
        * Radio connection
        * Xbee connection
        * Upboard & PX4 connection
        * LED
* GPS / pose estimation
    * [Using Vision or Motion Capture Systems for Position Estimation](https://docs.px4.io/main/en/ros/external_position_estimation.html#specific-system-setups)
    * [Indoor flight environment setup based on PX4, Raspberry Pi, and Mocap (OptiTrack)](https://blog.csdn.net/qq_42879437/article/details/117514082?fbclid=IwAR0ST33CtXGXvtWUF2k2PrrCwiGSPqj0ZcK3VUGFmPOsyznJo8lpeypD7Yw#t14)
* Software
    * Firmware:
        * Follow the instructions above
        * [Firmware](https://docs.px4.io/main/en/config/firmware.html)
    * Airframe:
        * [Airframe](https://docs.px4.io/main/en/config/airframe.html)
        * Choose [Quadrotor-X](https://docs.px4.io/main/en/airframes/airframe_reference.html#quadrotor-x)
    * Power module:
        * [Battery and Power Module Setup](https://docs.px4.io/main/en/config/battery.html)
    * Safety:
        * [Low Battery](https://docs.px4.io/main/en/config/safety.html#low-battery-failsafe)
        * [RC Loss](https://docs.px4.io/main/en/config/safety.html#rc-loss-failsafe)
    * Actuators:
        * Choose the appropriate ==**airframe**== per the instructions above.
        * [Motor Geometry](https://docs.px4.io/main/en/config/actuators.html#motor-geometry)
        * [Bidirectional Motors](https://docs.px4.io/main/en/config/actuators.html#bidirectional-motors)
        * [Control Surfaces Geometry](https://docs.px4.io/main/en/config/actuators.html#control-surfaces-geometry)
        * [Actuator Outputs](https://docs.px4.io/main/en/config/actuators.html#actuator-outputs)
            * Prefer the AUX outputs
        * [Actuator Testing](https://docs.px4.io/main/en/config/actuators.html#actuator-testing)
        * [Multicopter PWM: Motor Assignment](https://docs.px4.io/main/en/config/actuators.html#multicopter-pwm-motor-assignment)
        * [Reversing Motors](https://docs.px4.io/main/en/config/actuators.html#reversing-motors)
        * [Other Notes](https://docs.px4.io/main/en/config/actuators.html#other-notes)
    * Tuning
        * [Auto-tuning](https://docs.px4.io/main/en/config/autotune.html#auto-tuning)
    * Calibration
        * [Compass Calibration](https://docs.px4.io/main/en/config/compass.html#compass-calibration)
        * [Gyroscope Calibration](https://docs.px4.io/main/en/config/gyroscope.html#gyroscope-calibration)
        * [Accelerometer](https://docs.px4.io/main/en/config/accelerometer.html#accelerometer)
        * ==**The three above are done; click Level to update.**==
        * [ESC Calibration](https://docs.px4.io/main/en/advanced_config/esc_calibration.html#esc-calibration)
    * Build
        * [F450](https://docs.px4.io/main/en/frames_multicopter/dji_f450_cuav_5plus.html)
    * Serial settings
        * [Serial Port Configuration](https://docs.px4.io/main/en/peripherals/serial_configuration.html#serial-port-configuration)
        * [MAVLink Peripherals (GCS/OSD/Companion)](https://docs.px4.io/main/en/peripherals/mavlink_peripherals.html#mavlink-peripherals-gcs-osd-companion)
* First flight (manual mode)
    * {%youtube Kml9Gl0oOMA %}

NCRL link & Web Test
---
* Connection Interface
![](https://i.imgur.com/tuyPFyR.png)
* Demo Video
{%youtube Fjl2wFfWzXE %}
> [Web_Control_Square_Demo](https://drive.google.com/file/d/1zOsqin8TvYIw1V6SHDzuyIk3OqPuzXXw/view?usp=sharing)
{%youtube RSIzx6iRbxU %}
> [Web_Control_frame mounted_Demo](https://drive.google.com/file/d/1ik0JsEIYGDNpJ9mTLafmqRjOTsveJpzT/view?usp=sharing)

Implement
---
### Get IMU Data From D435i
#### Start With D435i (IMU)
1. Download the following packages
>[IntelRealSense](https://github.com/IntelRealSense/realsense-ros.git)
```shell=
sudo apt-get install ros-melodic-realsense2-camera
sudo apt-get install ros-melodic-realsense2-description
```
2. Command for testing
```shell=
roslaunch realsense2_camera rs_camera.launch
```
#### Obtaining IMU Data
1. Go to **realsense-ros/realsense2_camera/launch** and modify **rs_camera.launch** with the following arguments.
```xml=
<arg name="enable_sync" value="true"/>
<arg name="enable_gyro" value="true"/>
<arg name="enable_accel" value="true"/>
<arg name="unite_imu_method" default="linear_interpolation"/>
```
2. Download imu_filter_madgwick
```shell=
sudo apt-get install ros-melodic-imu-filter-madgwick
```
3. Modify **opensource_tracking.launch**.
```xml=
<launch>
  <arg name="offline" default="false"/>
  <include unless="$(arg offline)" file="$(find realsense2_camera)/launch/rs_camera.launch">
    <arg name="align_depth" value="true"/>
    <arg name="linear_accel_cov" value="1.0"/>
    <arg name="unite_imu_method" value="linear_interpolation"/>
  </include>
  <node pkg="imu_filter_madgwick" type="imu_filter_node" name="ImuFilter">
    <param name="use_mag" type="bool" value="false" />
    <param name="_publish_tf" type="bool" value="false" />
    <param name="_world_frame" type="string" value="enu" />
    <remap from="/imu/data_raw" to="/camera/imu"/>
  </node>
</launch>
```
4. Command for testing
```shell=
roslaunch realsense2_camera opensource_tracking.launch
```
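To sanity-check the filtered IMU stream from imu_filter_madgwick, a small subscriber can print the message count and acceleration magnitude; a minimal sketch, with the /imu/data topic name taken from the launch file above and the reporting interval chosen arbitrarily:

```python
#!/usr/bin/env python
# Minimal sketch: verify the filtered IMU stream by printing the running
# message count and |acceleration| (should be ~9.8 m/s^2 at rest).
import math
import rospy
from sensor_msgs.msg import Imu

rospy.init_node("imu_check")
count = [0]

def on_imu(msg):
    count[0] += 1
    a = msg.linear_acceleration
    norm = math.sqrt(a.x ** 2 + a.y ** 2 + a.z ** 2)
    if count[0] % 100 == 0:  # report every 100 messages
        rospy.loginfo("msgs: %d, |acc| = %.2f m/s^2", count[0], norm)

rospy.Subscriber("/imu/data", Imu, on_imu)
rospy.spin()
```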
Tagslam (Baseline I)
---
:heavy_check_mark: Run the example provided on GitHub
* Example 1
![](https://i.imgur.com/UXXtwtm.gif)

:heavy_check_mark: Tagslam documentation notes
* [Camera Calibration](https://berndpfrommer.github.io/tagslam_web/intrinsic_calibration/)
* [Input Files and Parameters](https://berndpfrommer.github.io/tagslam_web/input_files/)
* [Tutorial on using your own camera](https://blog.csdn.net/leida_wt/article/details/109690955)

:::danger
Tagslam itself does not fuse IMU data, so the fusion of camera and IMU must be carried out by ourselves, either via the MoCap survey paper or via the robot_localization package (ekf_localization_node).
:::

:heavy_check_mark: Test our two-camera model with Tagslam
:::warning
**++Real-time Restriction++**
&emsp; Real-time usage is limited to situations where either the graph is small enough to be solved quickly (maybe a thousand frames maximum!), or the tag poses can be determined beforehand, offline, with a "mapping run" that builds the full graph and then stores the optimized tag poses for a subsequent real-time run. To get real-time performance, use the "amnesia" feature so that old information gets thrown away and the graph stays small.
:::
* Test environment (3 tags in a triangular layout)
![](https://i.imgur.com/AEYx0qf.png)
![](https://i.imgur.com/zuEPmHh.png) **Downward detection**
![](https://i.imgur.com/paAK1di.png) **Forward detection**
* Configuration
    * camera_pose.yaml
    ![](https://i.imgur.com/5Qoxhfy.png)
    ![](https://i.imgur.com/w8yy4X8.png)
    * camera.yaml
    ![](https://i.imgur.com/KW5jBPz.png)
    The intrinsics can be found in camera_info
* Tagslam-only test results
    * Hovering
        * Given the tag0 (downward) & tag1 (forward) poses
        ![](https://i.imgur.com/SwHYQIU.png)
        Odom = /tagslam/odom/body_iris1 (extremely low publishing rate)
        GT = /iris1/ground_truth/position

| | Odom   | GT     |
| -------- | -------- | -------- |
| x | 0.126m | 0.126m |
| y | 0.017m | 0m     |
| z | 1.972m | 1.978m |

Error = |Odom - GT| = 0.018 m
    * Hallway (straight trajectory)
    ![](https://i.imgur.com/eM0579m.gif)

| Start | Odom      | GT     | Error |
| ----- | --------- | ------ | ----- |
| x     | -6.874    | -6.842 | 0.032 |
| y     | 1.348e-08 | 0.016  | 0.016 |
| z     | 1.978     | 1.97   | 0.008 |

Error while moving: https://drive.google.com/file/d/1jb-hImTk7QwZ8OKWgtO5uOA21vUUXlxm/view?usp=sharing

| End | Odom   | GT        | Error  |
| --- | ------ | --------- | ------ |
| x   | -3.865 | -3.874    | 0.009  |
| y   | 0.022  | -7.62e-09 | 0.022  |
| z   | 1.974  | 1.9779    | 0.0039 |

* Tagslam and IMU fusion
:::warning
The computed UAV coordinates are relative to the odom frame, so the static transform from map to odom should be adjusted according to your purpose and scenario (a broadcaster sketch follows).
:::
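A minimal sketch of publishing that static map-to-odom transform with tf2_ros; the identity transform is an illustrative placeholder, and the frame names follow the project's configuration above:

```python
#!/usr/bin/env python
# Minimal sketch: broadcast the static map -> odom transform mentioned
# above. Replace the identity transform with whatever offset your
# scenario requires.
import rospy
import tf2_ros
from geometry_msgs.msg import TransformStamped

rospy.init_node("map_to_odom_static_tf")
br = tf2_ros.StaticTransformBroadcaster()

t = TransformStamped()
t.header.stamp = rospy.Time.now()
t.header.frame_id = "map"
t.child_frame_id = "iris1/odometry_sensor1"  # odom frame used in this project
t.transform.rotation.w = 1.0                 # identity rotation, zero offset
br.sendTransform(t)
rospy.spin()
```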
* Flow chart
![](https://i.imgur.com/97UxC52.png)
**The IMU filter is the Madgwick filter.**
* tf tree
![](https://i.imgur.com/x37qdw5.png)
* sensor_fusion.launch
```xml
<launch>
  <node pkg="robot_localization" type="ekf_localization_node" name="ekf_localization" clear_params="true">
    <param name="frequency" value="30"/>
    <param name="sensor_timeout" value="0.1"/>
    <param name="two_d_mode" value="false"/>
    <param name="map_frame" value="map"/>
    <param name="odom_frame" value="iris1/odometry_sensor1"/>
    <param name="base_link_frame" value="iris1/base_link"/>
    <param name="world_frame" value="iris1/odometry_sensor1"/>
    <param name="odom0" value="/tagslam/odom/body_iris1"/>
    <param name="imu0" value="/imu/data"/>
    <rosparam param="odom0_config">[true, true, true,
                                    false, false, false,
                                    false, false, false,
                                    false, false, false,
                                    false, false, false]</rosparam>
    <rosparam param="imu0_config">[false, false, false,
                                   false, false, false,
                                   false, false, false,
                                   false, false, false,
                                   true, true, true]</rosparam>
    <param name="odom0_differential" value="false"/>
    <param name="imu0_differential" value="false"/>
    <param name="imu0_relative" value="false"/>
    <param name="imu0_remove_gravitational_acceleration" value="true"/>
    <param name="odom0_queue_size" value="0"/>
    <param name="imu0_queue_size" value="50"/>
  </node>
</launch>
```
* Commands
```shell
~/NCRL/AprilTag_Localization$ roslaunch rotors_gazebo iris_one.launch
~/NCRL/AprilTag_Localization$ roslaunch rotors_gazebo controller_geometry_iris.launch
~/NCRL/AprilTag_Localization$ rosrun imu_filter_madgwick imu_filter_node _use_mag:=false _publish_tf:=false _world_frame:="enu" /imu/data_raw:=/iris1/imu
~/NCRL/AprilTag_Localization$ roslaunch rotors_gazebo sensor_fusion.launch
~/NCRL/tagslam_root$ roslaunch tagslam tagslam_d435.launch
~/NCRL/tagslam_root$ roslaunch tagslam apriltag_d435.launch
```
* Hovering
    * GT (-6.87, 0, 1.97)
    * Measurement
        * Fusion
        ![](https://i.imgur.com/zBj3RmY.png)
        **Large error in the x direction, probably because of poor IMU initialization.**
        * Tagslam only
        ![](https://i.imgur.com/NLEGMMY.png)
    * Error after fusion

| Hovering | Odom   | GT       | Error |
| -------- | ------ | -------- | ----- |
| x        | -6.845 | -6.874   | 0.029 |
| y        | 0.016  | 2.22e-09 | 0.016 |
| z        | 1.971  | 1.978    | 0.007 |

    * Error while moving: https://drive.google.com/file/d/19DejQdk8aUWCVUArmf0c9_T6O9HDsQBi/view?usp=sharing
* Troubleshooting
    1. All topics are correctly connected and all nodes are up, but Tagslam provides no map or tf.
    **Make sure the timestamps of all sensors and of the external odometry are synchronized; if they are not, set use_approximate_sync to true in tagslam.yaml.**
    2. The relative positions of the fixed tags and cameras are supposedly configured correctly, yet the position Tagslam outputs looks wrong. Possible reasons:
        1. Insufficient tags
        2. Some parameters are still configured incorrectly
        ![](https://i.imgur.com/vuWHV3e.png =500x250)
    3. The cameras' axes are defined differently from Gazebo's: the x and y axes are reversed because the coordinates are expressed with respect to iris1, whose z axis points downward.
:::danger
Every known tag pose needs to be rotated by -90 degrees around its normal vector.
:::
* tag with its z axis along the +z direction (world frame)
![](https://i.imgur.com/3KTZS7t.png)
* tag with its z axis along the -x direction (world frame)
![](https://i.imgur.com/VHggGQI.png)
:::warning
Tagslam defines rotations as axis-angle (the axis scaled by the angle magnitude in radians). Therefore the XYZ Euler angle rotation [1.57, 1.57, 0] becomes [1.2091996, 1.2091996, 1.2091996]; the latter is the correct value to put into camera_pose.yaml for cam_0 (see the conversion sketch below).
[Rotation Converter](https://www.andre-gaschler.com/rotationconverter/)
:::
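The conversion can be checked programmatically instead of via the online converter; a minimal sketch using SciPy, where the uppercase "XYZ" selects intrinsic Euler angles:

```python
# Minimal sketch verifying the Euler -> axis-angle conversion above:
# intrinsic XYZ Euler angles [pi/2, pi/2, 0] equal a 120-degree rotation
# about (1,1,1)/sqrt(3), i.e. the rotation vector [1.2092, 1.2092, 1.2092].
import numpy as np
from scipy.spatial.transform import Rotation

rot = Rotation.from_euler("XYZ", [np.pi / 2, np.pi / 2, 0])
print(rot.as_rotvec())  # -> [1.2092 1.2092 1.2092]
```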
:::warning
Beware that the intrinsics and image sizes of the D435's stereo and color cameras are different.
:::
* TO-DO
    - [x] Test different environments and evaluate the performance
        - [x] Hovering
        - [x] Straight trajectory (hallway)
    - [x] Fusion of Tagslam's odometry & the IMU's data
* Mapping Output
    * poses.yaml
    ![](https://i.imgur.com/nKFRFyM.png)
    * Added a ROS service to obtain tag positions directly from the yaml
    ![](https://i.imgur.com/QzEROjL.png)
    **This lets users deploy apriltag_localization right after mapping, without keying in tag positions manually.**
* Work distribution

| Andrew | Jakey | Chan |
| --------------------------------------------------------------------- | -------------------------------------------------- | -------------------- |
| Error evaluation for the hovering scenario | Visual-Inertial Motion Capture System environment test | Setup & test Tagslam |
| Create the hallway-scenario Gazebo simulation & tagslam config | Error evaluation for the hallway scenario | |
| Create the hover-scenario Gazebo simulation & tagslam config | Error evaluation for the hallway scenario after fusion | |
| Write the config for our camera and UAV model setup, with parameter tuning | | |
| Sensor fusion of tagslam & IMU using ekf_localization_node | | |
| Setup & test Tagslam | | |
| Add the yaml ROS service | | |

* Camera test
    * Camera calibration
    ![](https://i.imgur.com/mQQxD0l.jpg)
    * Camera parameters
        * resolution 640x480
        * using USB 2.0
        ![](https://i.imgur.com/fei59CV.jpg)
    * cameras.yaml
    ![](https://i.imgur.com/IC5eeFJ.jpg)
    * d435i test result
    ![](https://i.imgur.com/XiEzk6m.jpg)
    * Camera1
    ![](https://i.imgur.com/VrsgOBg.jpg)
    * Camera2
    ![](https://i.imgur.com/god0vD6.jpg)
    * mapping
        * 10 tags lined up
        ![](https://i.imgur.com/4mTMjFP.jpg)
        * 5 tags
        ![](https://i.imgur.com/V6TWZk0.jpg)
        * 3D mapping
        ![](https://i.imgur.com/aZJ84xb.jpg)

## Web Interface
![](https://i.imgur.com/8wuWMA8.png)
**++Step 1++**
```shell=
~/NCRL/AprilTag_Localization/src/webPages/single_page_gui $ python -m SimpleHTTPServer
```
**++Step 2++** Go to the address shown in the terminal
**++Step 3++** Choose GUI.html

## Tagslam & Apriltag_ros python API
![](https://i.imgur.com/Rq3eyrL.png)
Transforms poses.yaml into the format apriltag_ros can understand, using a user-defined tag_bundles.yaml (a converter sketch follows).
* tag_bundles.yaml example: the parent is the center tag, while child is a list of the other bundle members' IDs
```yaml=
tag_bundles: [
  {
    name: 'tag_bundles_forward_1',
    parent: 15,
    child: [16, 17, 18, 19, 5, 6, 7, 8, 9]
  },
  {
    name: 'tag_bundles_forward_4',
    parent: 35,
    child: [36, 37, 38, 39, 25, 26, 27, 28, 29]
  }
]
```
**Downward tags are excluded**
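A minimal sketch of the poses.yaml-to-bundle conversion idea: express each child tag's position relative to its bundle's parent tag. The `{id: (x, y, z)}` input is a simplification of TagSlam's poses.yaml (orientation handling is omitted, and the toy positions are invented), and the output fields mirror the apriltag_ros bundle layout without the quaternion entries:

```python
# Minimal sketch: turn world-frame tag positions (as a simplified stand-in
# for TagSlam's poses.yaml) into a parent-relative bundle layout.
import yaml

tag_world = {15: (4.0, 0.0, 2.0), 16: (4.0, 0.5, 2.0), 17: (4.0, -0.5, 2.0)}

def bundle_layout(parent, children, size=0.24):
    """Positions of parent and children, relative to the parent tag."""
    px, py, pz = tag_world[parent]
    layout = [{"id": parent, "size": size, "x": 0.0, "y": 0.0, "z": 0.0}]
    for c in children:
        x, y, z = tag_world[c]
        layout.append({"id": c, "size": size,
                       "x": x - px, "y": y - py, "z": z - pz})
    return layout

print(yaml.safe_dump({"tag_bundles": [{
    "name": "tag_bundles_forward_1",
    "layout": bundle_layout(15, [16, 17]),
}]}))
```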
## System Integration
### Software Block Diagram
![](https://i.imgur.com/WHswUdp.png)
### Read Tag coordinate
* Use a ROS service to read the coordinate of a given tag ID from poses.yaml
![](https://i.imgur.com/RkKqFlj.png)
![](https://i.imgur.com/j5Z4I3n.png)
### Path Planning
* Adjacency list (see the sketch below)
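A minimal sketch of the planned adjacency-list representation, where nodes are tag IDs and edges connect tags with a flyable path between them; the toy graph and the BFS query are hypothetical examples, not the real tag layout:

```python
# Minimal sketch: tag graph as an adjacency list, queried with BFS to find
# the shortest tag-to-tag path. Graph contents are illustrative only.
from collections import deque

adjacency = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}  # toy tag graph

def shortest_tag_path(start, goal):
    """Breadth-first search over the tag graph."""
    queue, visited = deque([[start]]), {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in adjacency[path[-1]]:
            if nxt not in visited:
                visited.add(nxt)
                queue.append(path + [nxt])
    return None

print(shortest_tag_path(0, 3))  # -> [0, 1, 2, 3]
```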
## Meeting
Host rotation: Andrew &rarr; Vivian &rarr; Jakey &rarr; Leo &rarr; Shaw &rarr; Chan

### Week1 (07/04 21:00): Andrew
* Last week's progress:
    * Built the environment
    * Surveyed AprilTag localization
* This week's expected progress:
    * Split into two groups, the Baseline and Experiment teams.
        * Baseline (Tagslam/...)
            * Members: Andrew, Jakey, Chan
        * Experiment (multi-sensor calibration + multi-tag localization with EKF)
            * Members: Vivian, Shaw, Leo
    * Survey and implementation.
* Positions for this week
    * Meeting host: Vivian
    * PPT: Jakey
    * Presentation: Andrew
* PPT outline
    1. Last week's progress
       i. Survey
       ii. Experiment & problems
    2. This week's expected progress
       i. Baseline team
       &emsp; a. Get a comprehensive understanding of how to use the open-source code (i.e., what information to feed the SLAM, etc.)
       &emsp; b. Pull the repo and try it out
       ii. Experiment team
       &emsp; a. Survey
    3. Gantt chart

**Shaw will be absent this week.**
:::warning
++Requirements++
1. Set the goal point by assigning an AprilTag ID instead of exact coordinates.
2. Integrate the CBF obstacle-avoidance system with our system.
3. The error at the final goal should be less than 10 cm (first for the goal point only, then for the whole odometry).
4. Provide an intuitive user interface.
:::

### Week2 (07/11 20:00): Vivian
* Week's progress
    - Baseline: Tagslam
        - Parameter setup
        - Rigid-body setup
        - Read the tag orientation + the four corner positions
        - Next: keep studying the parameters; the localization results still show discrepancies
    - Motion capture system
        - Ran into compilation problems
        - Read the tag orientation + the four corner positions
        - Next: keep working until the environment runs
    - Multi-tag detection
        - Error computation
    - robot_localization
        - Parameter setup
        - Next: test the result when no pose data is available (Shaw); fly a segment and check the final fixed-point result (Vivian)
- PPT outline
    1. Last week's progress
       i. Baseline team: (i) Tagslam: as above; (ii) MoCap: as above
       ii. Experiment team: (i) tag bundles: as above; (ii) robot_localization: as above
    2. This week's expected progress
       i. Baseline team: a. Tagslam: find the cause of the error; b. MoCap: get the environment running
       ii. Experiment team: a. test the result when no pose data is available; b. fly a segment and check the final fixed-point result
    3. Gantt chart
* Positions for this week
    * PPT (Tuesday 10:00 pm): Shaw
    * Presentation: Andrew, Leo
* Extempore motions
    * Next Wednesday's meeting (7/20): Andrew absent
:::warning
List out in detail what each member is responsible for in the weekly report.
:::

### Week3 (07/18 20:00): Jakey
* Week's progress
    - Baseline: Tagslam
        - Computed the Tagslam error
        - Sensor fusion and the error after fusion
    - robot_localization
        - Tested camera-IMU fusion performance during flight
        - Solved the no-pose-data problem (Shaw)
        - Flew a segment and checked the final fixed-point result (Vivian): multi-tag detection; computed the UAV takeoff and hovering errors
- PPT outline
    1. Last week's progress
       i. Baseline team: (i) Tagslam: as above
       ii. Experiment team: (i) tag bundles: as above; (ii) robot_localization: as above
    2. This week's expected progress
       - Assemble the UAV
       - Discuss which localization method to adopt
    3. Gantt chart
* Positions for this week
    * PPT (Tuesday 10:00 pm): 家豪
    * Presentation: 禹維, 博慧
* Extempore motions

### Week4 (07/25 20:00): Leo
* Week's progress
    * At Wednesday's meeting, ask the professor about the project's final application scenario
        * Straight-line flight: Experiment
        * Circling flight: Baseline
    * Finished the form for equipment, materials, and budget
    * Project schedule

| | Monday | Tuesday | Wednesday | Thursday | Friday |
|:------:|:------:|:-------:|:---------:|:--------:|:------:|
| Andrew |✔|✔|✗|✗|✗|
| Jakey  |✔|✔|✔|✔|✔|
| Chan   |✔|✔|✔|✔|✔|
| Vivian |✔|✗|✔|✗|✗|
| Shaw   |✔|✔|✔|✔|✔|
| Leo    |✔|✔|✔|✔|✔|

==**Go in on two consecutive days first (Wed 7/27 and Thu 7/28), then arrange the following schedule based on progress.**==
- PPT outline
    1. Last week's progress
        - Finished assembling the UAV & assembly step diagrams
        - Finished testing the motor thrust curves
        - Test-flew the UAV
    2. This week's expected progress
        - Test the cameras
        - Ask the professor about the final application scenario
    3. Gantt chart
* Positions for this week
    * PPT (Tuesday 10:00 pm): 禹維
    * Presentation: 竣智
    * Print one AprilTag (20 x 20 cm, tentative): 家豪

### Week5 (08/01 20:00): Shaw
* Week's progress
    1. Re-calibrated the cameras, tested AprilTag, and tested TagSlam mapping
    2. The UAV turning simulation the professor asked for: flying a square
    3. Web API
* PPT outline
    * Last week's progress
        * Camera calibration and AprilTag testing: as above
        * UAV turning simulation: include both videos and ask the professor which one to use
        * Web API: as above
    * This week's progress
        * Set up the Upboard and the PX4
        * Keep testing the cameras
* Positions for this week
    * PPT (Tuesday 10:00 pm): 晏誠
    * Presentation
        * Camera calibration: 禹維
        * UAV turning simulation: 家豪
        * Web API: depends on the situation
* Work split

| Camera calibration & AprilTag testing | UAV turning simulation | Other |
| ------------------------ | ------------------------ | ------- |
| 竣智 | 博慧: environment setup | 晏誠: Web API |
| 禹維 | 家豪: simulate the two types of turns | |
| | 劭暐: code optimization | |

### Week6 (08/08 20:00): Chan
* Week's progress
    1. Solved last week's problems and did 3D mapping
    2. Flashed the firmware
    3. Web API
* PPT outline
    * Last week's progress
        * 3D mapping: as above
        * Web API: as above
        * Firmware flashing: as above
    * This week's progress
        * Motor control
* Positions for this week
    * PPT (Tuesday 10:00 pm): 博慧
    * Presentation
        * Firmware flashing: 劭暐
        * 3D mapping: 禹維
        * Web API: 晏誠
* Work split

| 3D mapping | Upboard & PX4 setting | Other |
| ------------------------ | ------------------------ | ------- |
| 竣智 | 博慧: Upboard & PX4 communication test | 晏誠: Web API |
| 禹維 | 家豪: Upboard & PX4 firmware flashing | |
| | 劭暐: Upboard & PX4 firmware flashing | |

### Week7 (08/15 20:00): Andrew
* Week's progress
    * Upboard: as above
    * Upboard + tag detection: detection testing
    * Web guide: [hackmd](https://hackmd.io/OjbGFM3ZQJKNgJ5R46Ze3w)
* PPT outline
    * Last week's progress
        * Upboard: as above
        * Camera detection testing
        * User guide writing
    * This week's progress
        * Upboard: position control (first test offboard mode with a given goal point, then test apriltag_detection on the Upboard)
        * The other two papers
        * Paper
* Positions for this week
    * PPT (Tuesday 10:00 pm): Shaw
        * [paper note](https://hackmd.io/@Andrew95/r1OWO4I0q/edit)
    * Presentation
        * Upboard: Leo
        * Upboard + d435i tag detection: Andrew
        * Guide: Andrew
        * Paper: Andrew & Jakey
* Work split

| d435i tag detection | Upboard & PX4 setting | Web GUI user guide |
| ------------------------ | ------------------------ | ------- |
| 竣智 | 博慧: flight control parameter tuning | 晏誠 |
| 禹維 | 家豪: flight control parameter tuning & test flight | |
| | 劭暐: flight control parameter tuning & test flight | |

### Week8 (08/22 20:00): Vivian
* Week's progress
    * Upboard + tag detection: detection testing
    * API for tagslam & apriltag localization
    * [PX4 stock guide](https://hackmd.io/kaFCDY5BQnCipTrlqiP3vw); flashed the NCRL controller onto the PX4
    * Connected the web front end and back end
        * topic
* PPT outline
    * Last week's progress
        * API for tagslam & apriltag localization
        * Web
        * PX4 stock guide &rarr; manual flight done
        * Flashed the NCRL controller onto the PX4
    * This week's progress
        * Connect the Web GUI to the Upboard
        * Install the camera hardware
* Positions for this week
    * PPT (Tuesday 10:00 pm): Andrew
    * Presentation
        * PX4 stock guide &rarr; manual flight done: Leo
        * NCRL controller on the PX4: Leo
        * API for tagslam & apriltag localization: Andrew
        * Web: Andrew
    * Paper
        * Paper (minimum snap): Vivian
        * Paper (ECBF): Shaw
* Work split

| | | |
| ------------------------ | ------------------------ | ------- |
| 竣智 | 博慧: PX4 control code fixes & paper (minimum snap) | 晏誠 |
| 禹維 | 家豪: PX4 control code fixes & paper (ECBF) | |
| | 劭暐: PX4 control code fixes & paper (ECBF) | |

### Week9 (08/29 20:00): Jakey
* Week's progress
    * Web-NCRL link code
    * Web update and feature integration
* PPT outline
    * Last week's progress
        * Web-NCRL link code
        * Web update and feature integration
    * This week's progress
        * Test UAV localization and test-fly using camera-based localization
        * Preliminary path planning
* Positions for this week
    * PPT (Tuesday 10:00 pm): 博慧
    * Presentation
        * Web-NCRL link code: Leo
        * Web update and feature integration: Andrew
    * Paper
        * Exponential Control Barrier Functions: Leo, Shaw
        * Learning Control Barrier Functions: Jakey, Andrew
* Work split

| | | |
| ------------------------ | ------------------------ | ------- |
| 竣智: camera & tripod setup, paper | 博慧: NCRL link & web testing | 晏誠: web integration |
| 禹維: camera, paper | 家豪: NCRL link & web testing & paper (ECBF) | |
| | 劭暐: NCRL link & web testing & paper (ECBF) | |

### Week10 (09/05 20:00): Leo
* Week's progress
    * Compared AprilTag localization with OptiTrack localization
    * Finished TagSlam mapping
    * Test flight
* PPT outline
    * Last week's progress
        * Compared AprilTag localization with OptiTrack localization
        * Finished TagSlam mapping
        * Test flight
    * This week's progress
        * Find the problems seen during the test flight so that the UAV can fly reliably on AprilTag localization data
* Positions for this week
    * PPT (Tuesday 10:00 pm): Jakey
    * Presentation
        * AprilTag vs. OptiTrack localization comparison: Leo
        * TagSlam mapping: 竣智
* Work split

| | | |
| ------------------------ | ------------------------ | ------- |
| 竣智: camera & tripod setup, paper | 博慧: NCRL link & web testing | 晏誠: web integration |
| 禹維: camera, paper | 家豪: NCRL link & web testing & paper (ECBF) | |
| | 劭暐: NCRL link & web testing & paper (ECBF) | |