# Smart Tracing Self-driving Car
**IoT Project**
**Github:** https://github.com/jschen9999/TrancingCar
[**Demo Video1** ](https://drive.google.com/file/d/1iHbgE7BhOogBMD05aesq9NW11m7E7JIo/view?usp=sharing)
[**Demo Video 2**](https://drive.google.com/file/d/1ENs3ucQwAOWbB5dM1LXoz0-QcCmVhPUz/view?usp=sharing)
## Project Functionality Description
**RFID Card Recognition:**
- Operators use an RFID card reader to identify the self-driving car's designated path (blue or green line).
- While the car travels, a green LED is lit, and the camera captures images to locate the colored line segments and adjust the car's direction accordingly.
**Ultrasonic Obstacle Detection:**
- Ultrasonic sensors check for obstacles in front of the car.
- If an obstacle is detected within 20 cm, the car stops, a red LED light is lit, and a warning sound is played until the obstacle is removed.
**Stop at Red Line Zone:**
- A red line zone can be set on the path.
- When the car detects this zone, it stops, a yellow light is lit, and a broadcast announces the current time, followed by recycling truck music.
- After the music ends, the car resumes its journey.
**Remote Control for Temporary Parking:**
- Blue path allows remote control for temporary parking via an IoT platform.
- If an operator requests temporary parking on the green path, the car emits a warning signal and keeps moving instead of stopping.
**Accelerated Return to End Point:**
- When the camera detects a black line, indicating the designated route is complete, the car disables its distance-measurement, parking, and remote-control functions.
- The car accelerates along the black line back to the endpoint recycling facility.
- Upon reaching the designated point, the speaker reminds the operator to press "q" to end the program.
## Project Operation and Execution Flow
1. **RFID Card Recognition:**
   - Two RFID cards were prepared; the reader identifies a card's ID to determine whether the self-driving car should follow the blue or green line.
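This card-to-path lookup can be sketched as below. The UID byte values and the `select_path` helper are illustrative placeholders, not the actual card IDs used in the project; reading the UID from the MFRC522 reader itself is omitted.

```python
# Sketch: map scanned RFID card UIDs to path colors.
# The UID values here are placeholders, not the project's real cards.
CARD_TO_PATH = {
    (0x04, 0xA1, 0x3B, 0x22): "blue",
    (0x04, 0x7F, 0x90, 0x11): "green",
}

def select_path(uid):
    """Return the path color for a scanned card UID, or None if the card is unknown."""
    return CARD_TO_PATH.get(tuple(uid))
```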
2. **Smart Tracing Movement (See Figure 2 and Figure 3):**
- The self-driving car uses a camera to capture images, identifying three specified pixel positions in front to determine the path's color.
   - Depending on which of the three pixels matches the path color:
     - Left pixel: the car turns left for 0.1 seconds.
     - Middle pixel: the car moves straight forward.
     - Right pixel: the car turns right while moving forward, achieving smart line tracing.
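The three-pixel steering decision can be sketched as follows. The BGR thresholds are illustrative assumptions, not the tuned values the team settled on, and issuing the actual motor commands is omitted.

```python
def is_path_color(pixel, target="blue"):
    """Crude BGR threshold check; thresholds are illustrative, not the tuned project values."""
    b, g, r = pixel
    if target == "blue":
        return b > 120 and b > g + 40 and b > r + 40
    return g > 120 and g > b + 40 and g > r + 40  # green path

def steering_command(left, middle, right, target="blue"):
    """Decide motion from the three sampled pixels in front of the car."""
    if is_path_color(left, target):
        return "turn_left"       # pivot left for ~0.1 s
    if is_path_color(right, target):
        return "turn_right"      # turn right while moving forward
    return "forward"             # middle pixel on the line, or keep going by default
```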

3. **Ultrasonic Obstacle Detection (See Figure 4):**
- Uses an ultrasonic distance sensor to check for obstacles in front.
- If an obstacle is within 20 cm, the car stops, a red light is lit, and a warning sound is emitted.
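The distance check behind this step is simple echo-time arithmetic; a sketch under the usual HC-SR04 assumptions (GPIO timing of the echo pulse is omitted):

```python
SPEED_OF_SOUND_CM_S = 34300  # approximate speed of sound at room temperature

def distance_cm(echo_seconds):
    """Convert an ultrasonic echo pulse duration to one-way distance in cm.

    The pulse covers the round trip, so the distance is halved.
    """
    return echo_seconds * SPEED_OF_SOUND_CM_S / 2

def obstacle_detected(echo_seconds, threshold_cm=20):
    """True when an obstacle is within the stopping threshold (20 cm in this project)."""
    return distance_cm(echo_seconds) < threshold_cm
```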
4. **Stop at Red Line Zone (See Figure 5):**
- When the camera reads a red line, the car stops, a yellow light is lit, and music is played.
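A minimal sketch of this step: a red-pixel test (thresholds are assumptions, not the tuned values) and the time announcement played at the stop zone.

```python
import datetime

def is_red(pixel):
    """Illustrative BGR threshold for the red stop-zone line."""
    b, g, r = pixel
    return r > 120 and r > g + 40 and r > b + 40

def stop_zone_announcement(now=None):
    """Compose the spoken time announcement played when the car stops at the red line."""
    now = now or datetime.datetime.now()
    return f"The current time is {now.strftime('%H:%M')}"
```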
5. **IoT Platform Remote Control for Temporary Parking (See Figure 6):**
- The team set the blue line as a zone where temporary parking is allowed; the green line is a no-parking zone.
   - Uses the Ubidots IoT platform for remote control of the self-driving car's temporary parking.
- In parking zones, the car stops, a yellow light is lit, and music is played; in no-parking zones, a warning is issued, and the car continues moving.
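The remote-parking behavior can be sketched as below. `fetch_park_flag` follows the general shape of the Ubidots v1.6 REST API's last-value endpoint, but the token, device, and variable names are placeholders, not the project's actual configuration; `parking_action` captures the zone rules described above.

```python
import urllib.request

UBIDOTS_TOKEN = "YOUR-TOKEN"  # placeholder, not a real token

def fetch_park_flag(device="car", variable="park"):
    """Read the parking switch's last value from Ubidots (v1.6 last-value endpoint)."""
    url = f"https://industrial.api.ubidots.com/api/v1.6/devices/{device}/{variable}/lv"
    req = urllib.request.Request(url, headers={"X-Auth-Token": UBIDOTS_TOKEN})
    with urllib.request.urlopen(req, timeout=2) as resp:
        return float(resp.read()) == 1.0

def parking_action(path, park_requested):
    """Decide the car's response to a remote parking request."""
    if not park_requested:
        return "continue"
    if path == "blue":              # blue line: parking-allowed zone
        return "stop_and_play_music"
    return "warn_and_continue"      # green line: no-parking zone
```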

6. **Accelerated Return to Endpoint (See Figure 7):**
- Upon reaching a black line, the program stops fetching cloud data from the IoT platform, significantly reducing loop execution time.
- The self-driving car accelerates along the black line back to the designated endpoint.
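One way to express this mode switch, assuming a main loop that consults a feature table each iteration (the table keys are illustrative, not the project's actual code):

```python
def loop_features(on_black_line):
    """Select which checks run each loop iteration.

    On the black return line, cloud polling and distance checks are dropped,
    which shortens the loop and lets the car run faster.
    """
    if on_black_line:
        return {"trace": True, "cloud": False, "ultrasonic": False, "speed": "fast"}
    return {"trace": True, "cloud": True, "ultrasonic": True, "speed": "normal"}
```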
7. **Arrival at Endpoint (See Figure 8):**
- The endpoint is designed as a long black line perpendicular to the original route.
- When the camera identifies three specified pixels as black, the car stops, three lights are lit, and the operator is notified to end the program.
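The endpoint test reuses the same three sampled pixels; a sketch, where the darkness limit is an assumption rather than the tuned value:

```python
def is_black(pixel, limit=60):
    """True when all BGR channels fall below a darkness limit (limit is illustrative)."""
    return all(channel < limit for channel in pixel)

def at_endpoint(left, middle, right):
    """Endpoint reached when all three sampled pixels read black (the perpendicular line)."""
    return is_black(left) and is_black(middle) and is_black(right)
```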

## Hardware Circuit Diagram

## Software Program Execution Flowchart

## Most Time-Consuming Part
The most time-consuming part of the project was determining the color and position of the lines so the self-driving car could follow the designated path accurately. Challenges included adjusting color thresholds to accommodate different lighting conditions and fine-tuning the travel distance per step to balance accuracy with smooth movement. Additional time was spent mitigating external light interference by adding a shield above the camera. While these challenges were eventually addressed by adjusting parameters and optimizing the hardware, some manual assistance was still required for the car to move smoothly.