# Avetics Robotics
## 1 Drone
### 1

The number of degrees of freedom (DOF) for a quadcopter is 6:
- 3 for translation (Up/Down, Left/Right, Forward/Backwards)
- 3 for rotation (rotate around X axis, Y axis, Z axis)
### 2
The drone is not a holonomic system, i.e. a system that can reach any position and orientation regardless of the path taken: a quadcopter has only 4 directly controllable inputs for its 6 DOF.
Directly controllable DOF in a quadcopter:
- Thrust: the collective motor speed directly controls the vertical movement (up/down).
- Tilt (combined pitch and roll): by adjusting the speed of opposing motors, the drone tilts in a specific direction, leading to movement along the X (forward/backward) and Y (left/right) axes.
- Yaw (rotation about the vertical axis): achieved separately, by varying the relative speeds of the clockwise- and counter-clockwise-spinning rotor pairs, which creates a net reaction torque without tilting the drone.
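As an illustration of how these four inputs drive all the motions above, here is a minimal motor-mixing sketch; the X-frame layout and sign conventions are assumptions, not any specific flight controller's convention:

```python
# Illustrative motor mixer for an X-configuration quadcopter.
# thrust/roll/pitch/yaw are normalized control inputs; the sign pattern
# assumes the front-left and rear-right rotors spin clockwise.

def mix(thrust, roll, pitch, yaw):
    """Return motor commands (front-left, front-right, rear-left, rear-right)."""
    m_fl = thrust + roll + pitch - yaw
    m_fr = thrust - roll + pitch + yaw
    m_rl = thrust + roll - pitch + yaw
    m_rr = thrust - roll - pitch - yaw
    return m_fl, m_fr, m_rl, m_rr
```

A pure thrust command raises all four motors equally; a pitch command speeds up the front pair and slows the rear pair, tilting the drone; a yaw command shifts only the CW/CCW balance, so the drone rotates without tilting.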
### 3
A LiDAR sensor can be added to the drone, provided it is light enough to be installed facing downwards; this would allow the drone to create a detailed 3D map of the ground and keep the ground distance fairly constant, at less than 10 m.
Assuming that the room's walls are uniform, we can install passive infrared sensors on the walls, floor, and ceiling to make sure the drone stays at a fixed position in space. The infrared sensors would act as reference points signalling where the drone should be; if the drone drifts, a correction is triggered to put the drone back in place. A rough 2D sketch can be seen in the screenshot.

### 4
We can estimate the dominant frequency using Welch's method, via the `welch` function from SciPy:
```python
import numpy as np
import pandas as pd
from scipy.signal import welch

def read_data(file_name):
    return pd.read_csv(file_name)

def main():
    data = read_data(path_to_raw_data)  # path_to_raw_data is the file directory and location
    y_acc = data.iloc[:, 1].values      # second column: y-axis acceleration
    fs = some_integer_sampling_frequency
    # Welch's method averages periodograms over overlapping segments of
    # length nperseg to estimate the power spectral density (PSD).
    freqs, psd = welch(y_acc, fs=fs, nperseg=some_other_parameter)
    dominant_freq = freqs[np.argmax(psd)]  # frequency bin with the highest power
    print(dominant_freq)

if __name__ == '__main__':
    main()
```
This snippet prints the dominant frequency value.
### 5
We can design a filter to eliminate 5 kHz and above using Python and SciPy. The `butter` (Butterworth filter design) and `filtfilt` (zero-phase filtering) functions from `scipy.signal` can do this: the crude idea is to set the cutoff threshold at 5 kHz and attenuate every frequency above it, eliminating the high-frequency content.
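A minimal sketch of that idea, assuming a 44.1 kHz sampling rate and a synthetic test signal (both are illustrative choices, not values from the actual recording):

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 44_100    # assumed sampling frequency, Hz (must exceed twice the 5 kHz cutoff)
cutoff = 5_000  # remove 5 kHz and above
order = 4

# Design a low-pass Butterworth filter; Wn is normalized to the Nyquist frequency.
b, a = butter(order, cutoff / (fs / 2), btype='low')

# Synthetic test signal: a 50 Hz component we want to keep plus an
# 8 kHz component the filter should remove.
t = np.arange(0, 1.0, 1 / fs)
signal = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 8_000 * t)

# filtfilt runs the filter forwards and backwards, so there is no phase shift.
filtered = filtfilt(b, a, signal)
```

After filtering, the 8 kHz component is strongly attenuated while the 50 Hz component passes through essentially unchanged.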
### 6
Gravity would be the most likely reason for the offset in the sensor. As the drone constantly tries to hover, the constant gravitational force keeps the offset at around $9.81\ \mathrm{m\,s^{-2}}$, which can be seen in the recorded acceleration data.

This would mean that throughout the time the drone is airborne, it shifts upwards slightly at the peaks where the acceleration rises above 0.
## 2
What we have:
- bounding box software (already trained to draw a bounding box on red cars)
- barometer
- accelerometer
- gyroscope
- GPS
- an exceptionally good camera on a gimbal with pan-tilt-zoom capability
ASSUMPTION:
- The camera is mounted at the bottom of the drone and has a view along the whole stretch of the road from X to Y.

Rough assumed design

**1. What would be your method in deriving the GPS coordinates of the car provided that you have the car in view and you know the centre of the car and the skew angle of the car in the image frame?**
We know the tilt angle θ needed to keep the car centered, from the given software library. The drone's GPS gives its 3D coordinates, and therefore its height above the ground (h). Assuming the drone is roughly in the same vertical plane as the center of the car, we can get the ground distance (t) as follows:
$$ t = h \tan(\theta) $$
As seen in the diagram:

If the drone is not (roughly) in the same plane as the car, then, given that the road is public property, the distance from the center of the car to the center of the road (r) can be measured, and the distance between the drone and the center of the road (l) can be obtained from GPS. We can then find the distance (d) using
$$ d = (l + r) \tan(\theta) $$

(This is a rough and simplified estimation for the tracking; a more accurate approach could use a powerful LiDAR installed on the drone to measure the distance directly.)
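The geometry above can be sketched in code; the function name, the heading input, and the flat-earth conversion of small ground offsets to coordinates are assumptions for illustration:

```python
import math

EARTH_R = 6_371_000.0  # mean Earth radius, metres

def car_position(drone_lat, drone_lon, h, tilt, heading):
    """Estimate the car's GPS coordinates from the drone's fix.

    h: drone height above the ground (m); tilt: camera tilt measured from
    straight down (rad); heading: camera compass bearing (rad).
    Uses the ground distance t = h * tan(tilt) derived above, then a
    flat-earth conversion, which is adequate for such small offsets.
    """
    t = h * math.tan(tilt)          # ground distance to the car
    d_north = t * math.cos(heading)
    d_east = t * math.sin(heading)
    lat = drone_lat + math.degrees(d_north / EARTH_R)
    lon = drone_lon + math.degrees(d_east / (EARTH_R * math.cos(math.radians(drone_lat))))
    return lat, lon
```

For example, at h = 100 m and a 45° tilt, the car is estimated 100 m away along the camera heading.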
**2. Describe what you would do to make sure that you will not lose sight of the car without moving the drone (only adjusting the camera)?**
- Continuously track the center of the car in the image frame using the provided software library.
- Calculate the car's displacement from the image center (delta_x, delta_y).
- Adjust the camera gimbal pan and tilt angles based on the displacement:
- If delta_x is positive, pan the camera right to keep the car centered.
- If delta_x is negative, pan the camera left.
- If delta_y is positive, tilt the camera down (within limits to avoid losing sight).
- If delta_y is negative, tilt the camera up (within limits to avoid tilting too high).
Adjust the zoom if the car appears too small or too large in the frame to maintain a good view for tracking.
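The adjustment rules above can be sketched as a simple proportional controller; the image resolution, gains, and tilt limits are assumed values for illustration, not a real gimbal API:

```python
IMG_W, IMG_H = 1920, 1080        # assumed image resolution
K_PAN, K_TILT = 0.05, 0.05       # assumed proportional gains, degrees per pixel
TILT_MIN, TILT_MAX = -90.0, 0.0  # keep the camera between straight down and level

def clamp(value, lo, hi):
    return max(lo, min(hi, value))

def gimbal_step(pan_deg, tilt_deg, car_cx, car_cy):
    """One tracking update: move pan/tilt so the car drifts back to the image center."""
    delta_x = car_cx - IMG_W / 2   # positive: car is right of center -> pan right
    delta_y = car_cy - IMG_H / 2   # positive: car is below center -> tilt further down
    pan_deg = pan_deg + K_PAN * delta_x
    tilt_deg = clamp(tilt_deg - K_TILT * delta_y, TILT_MIN, TILT_MAX)
    return pan_deg, tilt_deg
```

The clamp enforces the "within limits" caveat in the steps above; zoom adjustment could follow the same proportional pattern on the bounding-box size.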

**3. Describe what you would do to make sure that you will not lose sight of the car and that the drone always maintains a constant distance away from the car?**
- Utilize the drone's onboard sensors (barometer, gyroscope, accelerometer) to estimate its altitude and relative movement (forward/backward, up/down).
- Combine this information with the car's center displacement in the image frame to estimate the car's relative distance.
If the car is moving away (delta_y increasing):
- The drone needs to move closer: accelerate forward while maintaining altitude using GPS and barometer data.
If the car is getting closer (delta_y decreasing):
- The drone needs to move further away: decelerate while maintaining altitude.
This would make it such that the drone will always be relatively at a constant distance and altitude away from the car.
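The accelerate/decelerate logic reduces to a proportional controller on the range error; the target distance, gain, and range input are illustrative assumptions:

```python
# Proportional controller on the range error: a positive output commands the
# drone forward (toward the car), a negative one commands it to back off.

TARGET_RANGE_M = 30.0  # assumed desired constant distance to the car, metres
K_SPEED = 0.5          # assumed gain: commanded speed (m/s) per metre of range error

def forward_speed_command(estimated_range_m):
    """Speed command from the current estimated drone-to-car range."""
    error = estimated_range_m - TARGET_RANGE_M
    return K_SPEED * error
```

If the estimated range grows to 40 m the drone is commanded forward; at 20 m it is commanded backward, so it settles around the 30 m target while the altitude loop holds height.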
**4. If you could choose one sensor on a drone to improve the accuracy and robustness of the system, what sensor would you add, and how would you use it?**
Accuracy and robustness here mean that the drone can more precisely track the movements and relative position of the car. As mentioned, a LiDAR would be a good addition, since it can measure distance far more accurately than the rough geometric estimation, as long as the car is within the LiDAR's range.
**5. If you could choose one sensor on a car to improve the accuracy and robustness of the system, what sensor would you add, and how would you use it, assuming you have a one-way communication link from the car to the drone? (You may not use the sensor you will add together with the sensor added to the drone in question 4)**
A highly accurate, high-rate satellite GPS in the car that constantly sends its coordinates to the drone would improve the accuracy and robustness of this tracking setup. Since both the car's fix and the drone's own GPS use the same satellite reference, the drone can compute its displacement from the car directly, allowing much more accurate tracking.
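Given the car's transmitted fix and the drone's own fix, the displacement can be computed with a simple equirectangular approximation (adequate over the short distances involved; the function name is illustrative):

```python
import math

EARTH_R = 6_371_000.0  # mean Earth radius, metres

def gps_displacement(lat1, lon1, lat2, lon2):
    """Approximate east/north displacement (m) from point 1 to point 2.

    Equirectangular approximation: treats the small patch of Earth between
    the drone and the car as flat, which is fine at these ranges.
    """
    d_north = math.radians(lat2 - lat1) * EARTH_R
    d_east = math.radians(lon2 - lon1) * EARTH_R * math.cos(math.radians((lat1 + lat2) / 2))
    return d_east, d_north
```

The drone can feed this displacement straight into the distance-keeping controller instead of the image-based estimate.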