<style>
.reveal pre code {
max-height: 1000px;
font-size: 0.5em;
}
#left {
left:-8.33%;
text-align: left;
float: left;
width:50%;
z-index:-10;
}
#right {
top: 75px;
float: right;
z-index:-10;
width:50%;
}
</style>
<!-- .slide: data-background="https://hackmd.io/_uploads/HJaiXwgya.jpg" -->
#### Dora-drives: Autonomous driving made simple!
<div class="avatar margin-bottom--sm"><div class="avatar__intro" itemprop="author" itemscope="" itemtype="https://schema.org/Person"><div class="avatar__name"><a href="https://github.com/haixuantao" target="_blank" rel="noopener noreferrer" itemprop="url"><span itemprop="name">Haixuan Xavier Tao</span></a></div><small class="avatar__subtitle" itemprop="description">Maintainer of dora-rs</small></div></div>
---
<!-- .slide: data-background="https://hackmd.io/_uploads/BktKQDlyp.png" -->
Let's build an autonomous vehicle together...
---
<!-- .slide: data-background="https://hackmd.io/_uploads/BktKQDlyp.png" -->
Blocks we need:
- Sensors
- Perception
- Localisation
- Planning
- Control
---
<!-- .slide: data-background="https://hackmd.io/_uploads/BktKQDlyp.png" -->
```mermaid
flowchart TB
Control
Planning
Perception
Localisation
Sensors --> Perception
Sensors --> Localisation
Perception --> Planning
Planning --> Control
Localisation --> Planning
```
---
<!-- .slide: data-background="https://hackmd.io/_uploads/BktKQDlyp.png" -->
```mermaid
flowchart TB
subgraph Perception
perception/yolov5[/yolov5\]
end
subgraph Localisation
Localisation/gps[/gps\]
end
subgraph Control
Control/drive_by_wire[/drive_by_wire\]
end
subgraph Planning
Planning/optimizer[/optimizer\]
end
subgraph Sensors
Sensors/Drivers[/Drivers\]
end
Sensors --> Perception
Sensors --> Localisation
Perception --> Planning
Planning --> Control
Localisation --> Planning
```
---
<!-- .slide: data-background="https://hackmd.io/_uploads/BktKQDlyp.png" -->
```mermaid
flowchart TB
subgraph Sensors
Sensors/Drivers[/ros2/sdk\]
end
subgraph Perception
perception/yolov5[/ Python \]
end
subgraph Localisation
Localisation/gps[/ ros2/C++/Python \]
end
subgraph Control
Control/drive_by_wire[/ros2/sdk\]
end
subgraph Planning
Planning/optimizer[/C++/Python\]
end
Sensors --> Perception
Sensors --> Localisation
Perception --> Planning
Planning --> Control
Localisation --> Planning
```
#### A long implementation that takes months
---
<!-- .slide: data-background="https://hackmd.io/_uploads/BktKQDlyp.png" -->
## ROS2
> Honestly, I don't even get what ROS is about? **ros2 seems an "amalgamation" of tools and scripts to create a general API with some ideas of robotics application in mind.**
> [name=HN First Comment ]
> From my perspective as a generalist software engineer who isn't a low-level robotics expert – as an ecosystem, **[ros2] seems to have adopted every bad practice available and invented some more of its own**. [...]. **Many of the architectural design decisions are frankly baffling**, although I appreciate that this is in part down to age, legacy, and the open nature of the platform.
> [name=HN Second Comment ]
---
<!-- .slide: data-background="https://hackmd.io/_uploads/BktKQDlyp.png" -->
### How can `dora-rs` help?
- Simple to use
- Declarative dataflow
- Excellent Python support
- AI friendly
- Can be built in days
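The core abstraction is a node that iterates over typed input events and emits outputs. A minimal sketch of that event loop, using a stub `Node` class in place of `dora.Node` so it runs standalone (the stub and its canned events are illustrative, not the real API):

```python
# Stub standing in for dora.Node, so the event-loop pattern runs standalone.
class Node:
    def __init__(self, events):
        self._events = events
        self.sent = []

    def __iter__(self):
        return iter(self._events)

    def send_output(self, output_id, value, metadata=None):
        self.sent.append((output_id, value))


node = Node([
    {"type": "INPUT", "id": "image", "value": [1, 2, 3]},
    {"type": "STOP"},
])

# The same loop shape as a real dora node: react to inputs, forward outputs.
for event in node:
    if event["type"] == "INPUT" and event["id"] == "image":
        node.send_output("obstacles", event["value"], {})

print(node.sent)  # → [('obstacles', [1, 2, 3])]
```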
---
<!-- .slide: data-background="https://hackmd.io/_uploads/BktKQDlyp.png" -->
## Carla simulator
<img src=https://hackmd.io/_uploads/HJgQIzPJ6.jpg width=650/>
---
<!-- .slide: data-background="https://hackmd.io/_uploads/BktKQDlyp.png" -->
From `carla`:
```python
from carla import Client, VehicleControl
...
vec_control = VehicleControl(steer, throttle, brake, hand_brake=False)
client.apply(vec_control)
```
to `dora-rs`:
```python
from dora import Node
from carla import Client, VehicleControl, get_car_info
import numpy as np

node = Node()
client = Client()

for event in node:
    if event["type"] == "INPUT":
        if event["id"] == "control":
            [throttle, steer, brake] = np.array(event["value"])
            vec_control = VehicleControl(
                steer,
                throttle,
                brake,
                hand_brake=False,
            )
            client.apply(vec_control)
        elif event["id"] == "tick":
            camera_frame, speed = get_car_info()
            node.send_output("image", camera_frame, event["metadata"])
            node.send_output("speed", speed, event["metadata"])
```
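Control commands cross the dataflow as plain arrays, which is why the node above can unpack `event["value"]` positionally. A small sketch of that round trip (the `[throttle, steer, brake]` ordering follows the snippet above; the byte-level serialization shown here is illustrative):

```python
import numpy as np

# Pack a control command as a flat float array, as a planner node might send it.
throttle, steer, brake = 0.5, -0.1, 0.0
control = np.array([throttle, steer, brake], dtype=np.float32)
buf = control.tobytes()  # what travels between nodes is a byte buffer

# On the receiving side, rebuild the array and unpack it positionally.
received = np.frombuffer(buf, dtype=np.float32)
t, s, b = received
```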
---
<!-- .slide: data-background="https://hackmd.io/_uploads/BktKQDlyp.png" -->
`yolov5` example:
```python
import cv2
import numpy as np
import torch
from dora import Node

node = Node()
model = torch.hub.load("ultralytics/yolov5", "yolov5n")

for event in node:
    match event["type"]:
        case "INPUT":
            match event["id"]:
                case "image":
                    frame = event["value"]
                    results = model(frame)  # includes NMS
                    arrays = np.array(results.xyxy[0].cpu())
                    node.send_output("obstacles", arrays, event["metadata"])
```
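Each row of `results.xyxy[0]` is `[x_min, y_min, x_max, y_max, confidence, class]`, so downstream nodes can filter obstacles with plain NumPy. A sketch on a hand-made array (the detections themselves are made up):

```python
import numpy as np

# Two fake detections in yolov5's xyxy layout:
# [x_min, y_min, x_max, y_max, confidence, class]
obstacles = np.array([
    [100.0, 50.0, 220.0, 180.0, 0.91, 2.0],  # confident "car" (class 2)
    [300.0, 40.0, 330.0, 90.0, 0.32, 0.0],   # low-confidence "person"
])

# Keep only confident detections before planning around them.
confident = obstacles[obstacles[:, 4] > 0.5]
print(confident.shape)  # → (1, 6)
```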
---
<!-- .slide: data-background="https://hackmd.io/_uploads/BktKQDlyp.png" -->
Planning example:
```python
from typing import Callable

import fot_wrapper
from dora import DoraStatus


class Operator:
    def __init__(self):
        # ...
        pass

    def on_event(
        self,
        dora_event,
        send_output: Callable[[str, bytes], None],
    ):
        if dora_event["id"] == "position":
            self.position = dora_event["value"]
        elif dora_event["id"] == "speed":
            self.speed = dora_event["value"]
        elif dora_event["id"] == "obstacles":
            obstacles = dora_event["value"]
            waypoints = fot_wrapper.run_fot(
                self.position,
                self.speed,
                obstacles,
            )
            send_output(
                "waypoints",
                waypoints,
                dora_event["metadata"],
            )
        return DoraStatus.CONTINUE
```
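`run_fot` (Frenet Optimal Trajectory) hands the controller its next waypoints as an array of (x, y) points. A toy stand-in that straight-line-interpolates toward a goal, just to show the shape downstream nodes consume (the real planner optimizes over Frenet frames and takes obstacles into account; this sketch does neither, and `fake_run_fot` is a made-up name):

```python
import numpy as np

def fake_run_fot(position, goal, n=5):
    """Toy stand-in for fot_wrapper.run_fot: n waypoints on a straight line."""
    position = np.asarray(position, dtype=float)
    goal = np.asarray(goal, dtype=float)
    steps = np.linspace(0.0, 1.0, n)[:, None]
    return position + steps * (goal - position)  # shape (n, 2)

waypoints = fake_run_fot([0.0, 0.0], [4.0, 2.0])
print(waypoints.shape)  # → (5, 2)
```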
---
<!-- .slide: data-background="https://hackmd.io/_uploads/BktKQDlyp.png" -->
## Graph example:
```yaml
nodes:
- id: oasis_agent
custom:
inputs:
waypoints: fot_op/waypoints
outputs:
- speed
- image
source: ../../carla/carla_source_node.py
- id: carla_gps_op
operator:
python: ../../carla/carla_gps_op.py
outputs:
- position
- id: yolov5
operator:
outputs:
- obstacles
inputs:
image: oasis_agent/image
python: ../../operators/yolov5_op.py
- id: fot_op
operator:
python: ../../operators/fot_op.py
outputs:
- waypoints
inputs:
position: carla_gps_op/position
speed: oasis_agent/speed
obstacles: yolov5/obstacles
```
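With the graph declared, the whole pipeline is launched from the dora CLI (the `graph.yaml` filename is illustrative):

```shell
# Start the coordinator and daemon, then launch the declared dataflow.
dora up
dora start graph.yaml

# ...later, tear everything down.
dora stop
dora destroy
```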
---
<!-- .slide: data-background="https://hackmd.io/_uploads/BktKQDlyp.png" -->
#### Node hub:

---
<!-- .slide: data-background="https://hackmd.io/_uploads/BktKQDlyp.png" -->
#### Hot reloading and opentelemetry
<iframe width="560" height="315" src="https://www.youtube.com/embed/lTzPTxCUFSM?si=UWMqgddKmksJBpxs" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen></iframe>
---
### dora-drives: Autonomous Driving Starter Kit
  
See: https://github.com/dora-rs/dora-drives
<!-- .slide: data-background="https://hackmd.io/_uploads/BktKQDlyp.png" -->
---
### Implemented Operators
- Yolov5 Object detection
- YoloP Lane and Drivable area detection
- Midas Depth Estimation
- Strong Sort Object tracking
- Frenet Optimal Trajectory Planning
- PID Controller
- Integration to Carla
- ...
<!-- .slide: data-background="https://hackmd.io/_uploads/BktKQDlyp.png" -->
---
## dora-rover
<div>
<div id="left">
<img src=https://hackmd.io/_uploads/rJk6Hzwyp.jpg width=350/>
</div>
<div id="right">
- Lidar
- GNSS, IMU
- Camera
- NVIDIA Orin
</div>
</div>
<!-- .slide: data-background="https://hackmd.io/_uploads/BktKQDlyp.png" -->
---
## dora-car
<div>
<div id="left">
<img src=https://hackmd.io/_uploads/B1aS9zwJp.jpg width=550/>
</div>
<div id="right">
- Chinese Lidar
- GNSS, IMU
- 6 Cameras
- RK3588 ARM Computers
</div>
</div>
<!-- .slide: data-background="https://hackmd.io/_uploads/BktKQDlyp.png" -->
---
## Hackathon
We are inviting students to take part in an international hackathon on autonomous driving, with a prize pool of 500k RMB.
Getting started guide for the Hackathon: https://docs.carsmos.cn/#/en/
Official Website for Hackathon registration: https://competition.atomgit.com/competitionInfo
<!-- .slide: data-background="https://hackmd.io/_uploads/BktKQDlyp.png" -->
---
## dora-rs
Github: https://github.com/dora-rs/dora
Website: https://dora.carsmos.ai/
Discord: https://discord.gg/XqhQaN8P
<!-- .slide: data-background="https://hackmd.io/_uploads/BktKQDlyp.png" -->