# Routing tasks for shadow running
We will need to handle three different events.
## Order Events
### Event: Created Order -> Import Order Command Handler
- Retrieve Warehouse (CP -> WarehouseID). As a first approach we can hardcode the CPs for the evaluated warehouses.
- Find Dataset in Repo (WarehouseID, Date)
- Only non-**final** datasets can be used, since final datasets are static ones that could have been used by static executions, which would make the results inconsistent.
- [Dataset NOT exists]
- Create Dataset
- The dataset will only contain the order received.
- Create Config.
- We may already have a configuration for this? We probably want to create a fresh new one. In any case we must use `tags`, since we must be sure that this configuration is not shared among different days/warehouses; otherwise we would end up modifying an unexpected execution.
- This must be a default configuration with only one vehicle.
- Create Execution
- With (Dataset, Config, Algorithm). We can use Plotwise by default.
- [Dataset Exists]
- Update Dataset
- Build new order based on the info provided by the cmd.
- Add Order to the dataset.
- *[Maybe]* Update config based on the number of vehicles per number of orders
- We need to identify a non-final configuration by (dataset, warehouseID); again we need tagging as a way to locate the configurations. Then we need to be sure that the config is in dynamic state.
- Update execution.
- Send the new Order to Plotwise.
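The branch logic above can be sketched roughly like this. All names here — the `ImportOrderCommandHandler` class shape, the repo dicts, the CP mapping, the tag format — are hypothetical placeholders for the real service types, not the actual API:

```python
from dataclasses import dataclass, field

# Hardcoded CP -> WarehouseID mapping for the evaluated warehouses (assumption).
CP_TO_WAREHOUSE = {"28001": "wh-madrid", "08001": "wh-barcelona"}

@dataclass
class Dataset:
    warehouse_id: str
    date: str
    orders: list = field(default_factory=list)
    final: bool = False  # final datasets are static and must not be reused

class ImportOrderCommandHandler:
    def __init__(self, datasets, configs, executions):
        self.datasets = datasets      # keyed by (warehouse_id, date)
        self.configs = configs        # keyed by tag, so configs are never shared
        self.executions = executions

    def handle(self, cmd):
        warehouse_id = CP_TO_WAREHOUSE[cmd["cp"]]
        key = (warehouse_id, cmd["date"])
        dataset = self.datasets.get(key)
        if dataset is None or dataset.final:
            # [Dataset NOT exists]: create dataset, tagged config, execution.
            dataset = Dataset(warehouse_id, cmd["date"], [cmd["order"]])
            self.datasets[key] = dataset
            tag = f"dynamic:{warehouse_id}:{cmd['date']}"
            self.configs[tag] = {"vehicles": 1, "tags": [tag]}  # default config, one vehicle
            self.executions[key] = {"dataset": key, "config": tag, "algorithm": "plotwise"}
        else:
            # [Dataset Exists]: add the order; here we would also send it to Plotwise.
            dataset.orders.append(cmd["order"])
        return dataset
```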
### Event: Updated Order -> Update Order Command Handler
- Find Dataset using OrderID. **Possible performance issue, since we are storing all the orders as a list of JSON documents**.
- Check if the updated order still belongs to the same dataset
- [Belongs to the same dataset]
- Update dataset.
- Send order update to the algorithm.
- [Plotwise] Updating an order for Plotwise requires deleting the old one and creating a new one.
- [Belongs to other dataset]
- Remove order from old dataset
- Update algorithm executions removing the deleted order.
- Add order to new dataset/execution:
- Find Dataset. We will repeat the process for a created Order, so it may make sense to call the **ImportOrderCommandHandler**. The main problem with this approach is that we need to share the transaction among the command handlers.
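A rough sketch of the two branches, including the linear scan over orders that motivates the performance concern above (function and field names are illustrative, not the real API):

```python
def find_dataset_by_order(datasets, order_id):
    # Linear scan over every order in every dataset: this is the possible
    # performance issue, since orders are stored as a plain list of JSONs.
    for key, dataset in datasets.items():
        if any(o["id"] == order_id for o in dataset["orders"]):
            return key, dataset
    return None, None

def handle_updated_order(datasets, updated):
    old_key, old_ds = find_dataset_by_order(datasets, updated["id"])
    new_key = (updated["warehouse_id"], updated["date"])
    if old_key == new_key:
        # [Belongs to the same dataset]: replace the order in place.
        # For Plotwise this still means delete-old + create-new on the execution.
        old_ds["orders"] = [updated if o["id"] == updated["id"] else o
                            for o in old_ds["orders"]]
    else:
        # [Belongs to other dataset]: remove from the old one, then re-import
        # into the new one (the step shared with the created-order flow).
        old_ds["orders"] = [o for o in old_ds["orders"] if o["id"] != updated["id"]]
        datasets.setdefault(new_key, {"orders": []})["orders"].append(updated)
    return datasets
```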
### Event: Deleted Order
- Find Dataset using OrderID
- Update Dataset -> Remove order from list of orders
- Update Algorithm execution -> Remove order from the continuous execution
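A minimal sketch of the deletion flow; the `pending_removals` queue is a hypothetical stand-in for whatever call actually removes the order from the continuous execution:

```python
def handle_deleted_order(datasets, executions, order_id):
    # Find the dataset holding the order (same linear-scan caveat as updates).
    for key, dataset in datasets.items():
        if any(o["id"] == order_id for o in dataset["orders"]):
            # Update Dataset: drop the order from the list of orders.
            dataset["orders"] = [o for o in dataset["orders"] if o["id"] != order_id]
            # Update Algorithm execution: remove it from the continuous execution.
            executions[key]["pending_removals"].append(order_id)
            return key
    return None
```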
## Parcel Events
### Parcel Created
- Find dataset by OrderID
- Update Dataset.
- Update Algorithm Execution. Again we are going to need to remove the old order/event and add a new one with the package.
### Parcel Updated
- Find dataset by OrderID
- Update Dataset.
- Update Algorithm Execution. Again we are going to need to remove the old order/event and add a new one with the package.
### Parcel Deleted
- Find dataset by OrderID
- Update Dataset.
- Update Algorithm Execution. Again we are going to need to remove the old order/event and add a new one with the package.
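All three parcel events share the same shape, so one helper can sketch them; the delete-old/create-new event pair models the Plotwise constraint described above. Names and the event-tuple representation are illustrative assumptions:

```python
def apply_parcel_event(datasets, executions, order_id, parcel, deleted=False):
    for key, dataset in datasets.items():
        for order in dataset["orders"]:
            if order["id"] == order_id:
                if deleted:
                    # Parcel Deleted: drop the parcel from the order.
                    order["parcels"] = [p for p in order.get("parcels", [])
                                        if p["id"] != parcel["id"]]
                else:
                    # Parcel Created/Updated: upsert the parcel on the order.
                    rest = [p for p in order.get("parcels", []) if p["id"] != parcel["id"]]
                    order["parcels"] = rest + [parcel]
                # Update Algorithm Execution: remove the old order/event and
                # add a new one carrying the updated package information.
                executions[key]["events"].append(("delete", order_id))
                executions[key]["events"].append(("create", order_id))
                return True
    return False
```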
## Database changes
- Do migrations on dataset to allow `dynamic` flag
- Do migrations on config to allow `dynamic` flag
- We need to add tags to configs so that configurations are not shared among different continuous executions for different days.
- Supplementary table for Plotwise to know the event associated with an order, in case we have to update/modify the order.
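The migrations could look roughly like this, shown here against an in-memory SQLite database; the table and column names are assumptions about the real schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dataset (id INTEGER PRIMARY KEY, warehouse_id TEXT, date TEXT);
CREATE TABLE config  (id INTEGER PRIMARY KEY, name TEXT);
""")

# Migration: add the `dynamic` flag to dataset and config.
conn.execute("ALTER TABLE dataset ADD COLUMN dynamic INTEGER NOT NULL DEFAULT 0")
conn.execute("ALTER TABLE config  ADD COLUMN dynamic INTEGER NOT NULL DEFAULT 0")

# Migration: tags on configs, unique so continuous executions for
# different days can never share a configuration.
conn.execute("ALTER TABLE config ADD COLUMN tag TEXT")
conn.execute("CREATE UNIQUE INDEX config_tag_idx ON config(tag)")

# Supplementary table mapping an order to its Plotwise event, so the
# event can be located when the order is updated or deleted.
conn.execute("""
CREATE TABLE plotwise_order_event (
    order_id TEXT PRIMARY KEY,
    event_id TEXT NOT NULL
)""")
```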
# Using default configuration: Routing tasks for shadow running
## Simplifications
- Using a hardcoded configuration
We will need to handle three different events.
## Order Events
### Event: Created Order -> Import Order Command Handler
- Retrieve Warehouse (CP -> WarehouseID). As a first approach we can have the CPs **hardcoded** for the evaluated warehouses.
- Find Dataset in Repo (WarehouseID, Date)
- Only non-**final** datasets can be used, since final datasets are static ones that could have been used by static executions, which would make the results inconsistent.
- [Dataset NOT exists]
- Create Dataset
- The dataset will only contain the order received.
- Create Config.
- Create Execution
- With (Dataset, Hardcoded Config, Plotwise Algorithm).
- [Dataset Exists]
- Update Dataset
- Build new order based on the info provided by the cmd.
- Add Order to the dataset.
- Update execution.
- Send the new Order to Plotwise.
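In this variant the config step collapses to a single hardcoded default shared by every new execution; a minimal sketch (the config values and function names are placeholders):

```python
# Simplification: every new execution reuses one hardcoded default
# configuration instead of creating a fresh per-day config.
HARDCODED_CONFIG = {"vehicles": 1, "algorithm": "plotwise"}

def create_execution_for_new_dataset(executions, warehouse_id, date, order):
    # [Dataset NOT exists]: dataset starts with only the received order.
    dataset = {"warehouse_id": warehouse_id, "date": date, "orders": [order]}
    executions[(warehouse_id, date)] = {
        "dataset": dataset,
        "config": HARDCODED_CONFIG,  # (Dataset, Hardcoded Config, Plotwise Algorithm)
        "algorithm": "plotwise",
    }
    return executions[(warehouse_id, date)]
```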
### Event: Updated Order -> Update Order Command Handler
- Find Dataset using OrderID. **Possible performance issue, since we are storing all the orders as a list of JSON documents**.
- Check if the updated order still belongs to the same dataset
- [Belongs to the same dataset]
- Update dataset.
- Send order update to the algorithm.
- [Plotwise] Updating an order for Plotwise requires deleting the old one and creating a new one.
- [Belongs to other dataset]
- Remove order from old dataset
- Update algorithm executions removing the deleted order.
- Add order to new dataset/execution:
- Find Dataset. We will repeat the process for a created Order.
### Event: Deleted Order
- Find Dataset using OrderID
- Update Dataset -> Remove order from list of orders
- Update Algorithm execution -> Remove order from the continuous execution
## Parcel Events
### Parcel Created
- Find dataset by OrderID
- Update Dataset.
- Update Algorithm Execution. Again we are going to need to remove the old order/event and add a new one with the package.
### Parcel Updated
- Find dataset by OrderID
- Update Dataset.
- Update Algorithm Execution. Again we are going to need to remove the old order/event and add a new one with the package.
### Parcel Deleted
- Find dataset by OrderID
- Update Dataset.
- Update Algorithm Execution. Again we are going to need to remove the old order/event and add a new one with the package.
## Database changes
- Do migrations on dataset to allow `dynamic` flag
- Do migrations on config to allow `dynamic` flag
- We need to add tags to configs so that configurations are not shared among different continuous executions for different days.
- Supplementary table for Plotwise to know the event associated with an order, in case we have to update/modify the order.
# Using executions as entrypoint
A new order arrives.
1. Look for WarehouseID
2. Look for executions(WhID, date)
3. [Not Found Executions]
    - Create new dataset.
    - Create new config -> Hash() (date, warehouse)
    - Create new execution.
4. [Found Executions]
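The flow above can be sketched as follows, assuming the create steps belong to the not-found branch and modelling `Hash()` with a short SHA-256 digest of (date, warehouse); all other names are placeholders:

```python
import hashlib

def config_hash(date, warehouse_id):
    # Stand-in for Hash() (date, warehouse): a short, stable config key.
    return hashlib.sha256(f"{date}:{warehouse_id}".encode()).hexdigest()[:12]

def handle_new_order(executions, order, cp_to_warehouse):
    warehouse_id = cp_to_warehouse[order["cp"]]   # 1. look for WarehouseID
    key = (warehouse_id, order["date"])           # 2. look for executions(WhID, date)
    execution = executions.get(key)
    if execution is None:
        # [Not Found Executions]: create new dataset, config, and execution.
        execution = {
            "dataset": {"orders": []},
            "config": config_hash(order["date"], warehouse_id),
        }
        executions[key] = execution
    # [Found Executions]: the existing dataset simply receives the order.
    execution["dataset"]["orders"].append(order)
    return execution
```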