# 5. Advanced Introduction - Event
<text style="font-size:17pt">**Use Case: Lakeside Oasis Café**</text>
## Data Process Pre-Planning
A fast food restaurant, *Lakeside Oasis Café*, has just opened and the manager of the café wants to use LOC CLI to monitor the orders between the counter and the kitchen.
:::success
- You can click **Details** to see the full code.

:::
### Business Process Analysis
There are two business processes:
1. Send Order - counter staff key in the order data.
2. Deliver Order - kitchen staff are assigned to the food/drink items they are responsible for.

### Data Process Analysis
Each data process has two generic logics and one aggregator logic:
| Data Process | (Generic) Logic #1 | (Generic) Logic #2 | Aggregator Logic |
| ---------------------- | ----------------------- | ------------------- | ------------------ |
| ++Send Order++ (#A) | ```setOrder``` | ```orderEvent``` | (return status ok) |
| ++Deliver Order++ (#B) | ```getOrderEvent``` | ```deliverOrder``` | (return status ok) |
In each data process, one generic logic creates the events:
| Label Name (from which generic logic) | SourceDID | TargetDID | Meta |
| -------------------------------------- | ------------------------- | ----------------------- | ------------- |
| ```setOrder``` (Logic #2 from #A) | Item: ```[item name]``` | Order: ```[order id]``` | order details |
| ```deliverOrder``` (Logic #2 from #B) | Staff: ```[staff name]``` | Item: ```[item name]``` | order details |
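Both rows in the table describe events with the same object shape, which will reappear in the logic code below. A minimal sketch in plain JavaScript (outside the LOC runtime; the helper name `makeEvent` is ours, not part of the LOC SDK):

```javascript
// helper to build an event object in the shape used by this workshop's logics
// (sourceDID / targetDID / labelName / meta / type)
function makeEvent(labelName, sourceDID, targetDID, metaObject) {
  return {
    sourceDID,                        // who/what the event comes from
    targetDID,                        // who/what the event points to
    labelName,                        // e.g. "setOrder" or "deliverOrder"
    meta: JSON.stringify(metaObject), // order details, serialised as a JSON string
    type: "default",                  // event group
  };
}

const e = makeEvent("setOrder", "Item: French Fries", "Order: 200",
  { OrderId: "200", Name: "French Fries", Quantity: 1 });
console.log(e.labelName, e.targetDID); // → setOrder Order: 200
```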
### Data Process and Logic Flow Chart

<!--
The 1^st^ event is essentially the feed to the 2^nd^ data process.

-->
## LOC CLI Preparation
### Order Data
Here we provide test data comprising 5 orders.
:::spoiler Order Data
```json=
[
  {
    "OrderId": "200",
    "OrderItems": {
      "French Fries": 1,
      "Hamburger": 1,
      "Fried Chicken": 1,
      "Chicken Nugget": 4,
      "Coke": 0,
      "Sprite": 1,
      "Coffee": 0,
      "Salad": 1,
      "Ice Cream": 1
    }
  },
  {
    "OrderId": "201",
    "OrderItems": {
      "French Fries": 1,
      "Hamburger": 1,
      "Fried Chicken": 1,
      "Chicken Nugget": 0,
      "Coke": 1,
      "Sprite": 0,
      "Coffee": 0,
      "Salad": 1,
      "Ice Cream": 0
    }
  },
  {
    "OrderId": "202",
    "OrderItems": {
      "French Fries": 0,
      "Hamburger": 1,
      "Fried Chicken": 1,
      "Chicken Nugget": 6,
      "Coke": 0,
      "Sprite": 0,
      "Coffee": 0,
      "Salad": 1,
      "Ice Cream": 1
    }
  },
  {
    "OrderId": "203",
    "OrderItems": {
      "French Fries": 1,
      "Hamburger": 0,
      "Fried Chicken": 2,
      "Chicken Nugget": 10,
      "Coke": 1,
      "Sprite": 0,
      "Coffee": 0,
      "Salad": 0,
      "Ice Cream": 1
    }
  },
  {
    "OrderId": "204",
    "OrderItems": {
      "French Fries": 1,
      "Hamburger": 1,
      "Fried Chicken": 0,
      "Chicken Nugget": 0,
      "Coke": 1,
      "Sprite": 0,
      "Coffee": 1,
      "Salad": 0,
      "Ice Cream": 0
    }
  }
]
```
:::
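Before wiring up the data process, you can dry-run the transformation that Logic #1 will later perform on this test data using plain Node.js. This is only a sketch: the function name `flattenOrders` and the abridged inline sample are ours, not part of the LOC template.

```javascript
// flatten orders into one record per item, skipping zero or invalid quantities
// (the same rule Generic Logic #1 of Data Process #A applies)
function flattenOrders(orders) {
  const items = [];
  for (const order of orders) {
    for (const [name, qty] of Object.entries(order.OrderItems ?? {})) {
      if (!Number.isInteger(qty) || qty <= 0) continue; // skip bad/zero quantities
      items.push({ OrderId: order.OrderId, Name: name, Quantity: qty });
    }
  }
  return items;
}

// quick check with an abridged version of the first test order
const sample = [{ OrderId: "200", OrderItems: { "French Fries": 1, "Coke": 0 } }];
console.log(flattenOrders(sample));
// → [ { OrderId: '200', Name: 'French Fries', Quantity: 1 } ]
```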
## Step 1-1. Initiate 1^st^ Data Process Template File
First, log in to LOC:
```
loc login
```
Then create a new data process template:
```
loc new [template name]
```
:::info
We suggest using your username in the template name, for instance `loc new John_Cafe_DPA`.
:::
## Step 1-2. Edit YAMLs for API Route and Configuration
**++Recap of Data Process and Logics Flow Chart++**

- API Route: you need to
  1. change the method (`GET` → `POST`), since the 1^st^ logic parses a value from the request body;
  2. rename the API route, such as **John_cafe_0621**;
  3. create your own path, such as **/john/cafe_0621**;
  4. add another `pid` under `dataProcessPids`, as there will be 2 data processes.
```yaml=
method: POST
mode: Sync
encapsulation: true
name: John_cafe_0621
path: /john/cafe_0621
dataProcessPids:
  - pid: 00000000-0000-0000-0000-000000000000
    revision: latest
  - pid: 00000000-0000-0000-0000-000000000000
    revision: latest
```
:::info
**Additional**
- Configuration: you can also edit your `config.yaml` to
  1. rename the configuration to a name you favour, such as **John_cafe_DPA**;
  2. rename the logics to names you favour, such as **aggregator_0621**, **generic-1_0621**, **generic-2_0621**.
```yaml=
version: 0.1.0
name: John_cafe_DPA
description: description
timeoutSeconds: 180
aggregatorLogic:
  name: aggregator_0621
  file: aggregator-logic.js
genericLogics:
  - name: generic-1_0621
    file: 1.js
  - name: generic-2_0621
    file: 2.js
```
:::
## Step 1-3. Create Data Process #A - Set Order
The first data process receives the [order data](#Order-Data) (as a JSON payload) and makes it available for subscription via events.
**++Recap of Data Process and Logic Flow Chart++**

### Generic Logic #1 of Data Process #A
The first logic of #A reads and parses the JSON data from the POST request body, then writes all the food/drink items that need to be prepared into the session storage for the next logic to turn into events.
:::spoiler Generic Logic 1 of Data Process A
```javascript=
/**
 *
 * Data Process #A - Generic Logic #1
 *
 * The code in 'run' is executed when no error occurs in Generic Logic.
 *
 */
export async function run(ctx) {
  // a function that transforms a byte array to a string
  const UTF8ArrToStr = (aBytes) => {
    let utf8decoder = new TextDecoder();
    return utf8decoder.decode(new Uint8Array(aBytes));
  };
  // read and parse JSON data from the request body
  const payload = JSON.parse(UTF8ArrToStr(ctx.payload.http.body));
  // extract all order items and re-package them into customised objects
  // for the next logic to process
  let orderAsItems = [];
  payload.forEach(order => {
    const orderItems = order?.OrderItems;
    // iterate through property names of an order object (JSON field names)
    for (let itemName in orderItems) {
      // skip if the item has an incorrect or zero quantity
      if (!Number.isInteger(orderItems[itemName]) || orderItems[itemName] <= 0) {
        ctx.agents.logging.error(
          `Incorrect quantity for item ${itemName} in order ${order.OrderId}`
        );
        continue;
      }
      // prepare a new order item object
      let newOrderItem = {
        OrderId: order.OrderId, // order id
        Name: itemName, // item name
        Quantity: orderItems[itemName] // item quantity
      };
      // push the order item into the array
      orderAsItems.push(newOrderItem);
    }
  });
  // write the order item array into session storage
  await ctx.agents.sessionStorage.putJson("orderAsItems", orderAsItems);
}

/**
 *
 * The code in 'handleError' is executed when an error occurs
 * in Aggregator Logic, or the current running Logic gets an error.
 *
 */
export async function handleError(ctx, error) {
  ctx.agents.logging.error(error.message); // log error
}
```
:::
For every order item with a non-zero quantity, a JSON object like the following will be written to the session storage:
```json
{
  "OrderId": "200",
  "Name": "French Fries",
  "Quantity": 1
}
```
The next logic will retrieve this data from the session storage.
### Generic Logic #2 of Data Process #A
This logic will get the order items from the session storage and make them into events.
:::spoiler Generic Logic 2 of Data Process A
```javascript=
/**
 *
 * Data Process #A - Generic Logic #2
 *
 * The code in 'run' is executed when no error occurs in Generic Logic.
 *
 */
export async function run(ctx) {
  // load order item objects from session storage (prepared by the previous logic)
  const orderAsItems = await ctx.agents.sessionStorage.get("orderAsItems");
  let events = [];
  orderAsItems.forEach(item => {
    // create a new event object and add it to the array;
    // the item name is the source, the order id is the target
    let newEvent = {
      sourceDID: `Item: ${item.Name}`, // event source
      targetDID: `Order: ${item.OrderId}`, // event target
      labelName: "setOrder", // event label name
      meta: JSON.stringify(item), // convert item object to JSON string
      type: 'default', // event group
    };
    events.push(newEvent);
  });
  // send events to the event store for all order items
  await ctx.agents.eventStore.emit(events);
}

/**
 *
 * The code in 'handleError' is executed when an error occurs
 * in Aggregator Logic, or the current running Logic gets an error.
 *
 */
export async function handleError(ctx, error) {
  ctx.agents.logging.error(error.message); // log error
}
```
:::
For every order item, the emitted event will look like this:
```
sourceDID: "Item: French Fries"
targetDID: "Order: 200"
labelName: "setOrder"
meta: "{\"OrderId\":\"200\",\"Name\":\"French Fries\",\"Quantity\":1}"
type: "default"
```
Essentially, the order item data is embedded in the ```meta``` field.
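Because `meta` must be a string, the item object only survives the trip through the event store via `JSON.stringify` on emit and `JSON.parse` on read. The round trip can be checked in isolation (plain JavaScript, not LOC code):

```javascript
const item = { OrderId: "200", Name: "French Fries", Quantity: 1 };

// what Logic #2 of #A stores in the event's meta field
const meta = JSON.stringify(item);

// what Logic #1 of #B will later recover from the event
const recovered = JSON.parse(meta);

console.log(meta);           // → {"OrderId":"200","Name":"French Fries","Quantity":1}
console.log(recovered.Name); // → French Fries
```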
### Aggregator Logic of Data Process #A
The aggregator logic returns an execution response.
:::spoiler Aggregator Logic of Data Process A
```javascript=
/**
 *
 * Data Process #A - Aggregator Logic
 *
 * The code in 'run' is executed when no error occurs in Generic Logic.
 *
 */
export async function run(ctx) {
  // signal that this data process executed properly
  ctx.agents.result.finalize({
    DPA_status: "ok",
    DPA_taskId: ctx.task.taskId,
  });
}

/**
 *
 * The code in 'handleError' is executed when an error occurs
 * in Aggregator Logic, or the current running Logic gets an error.
 *
 */
export async function handleError(ctx, error) {
  ctx.agents.logging.error(error.message); // log error
}
```
:::
## Step 2-1. Initiate 2^nd^ Data Process Template File
```
loc new [template name]
```
:::info
We suggest using your username in the template name, for instance `loc new John_Cafe_DPB`.

:::
## Step 2-2. Create Data Process #B - Deliver Order
The kitchen will process the order items and "assign" them to kitchen staff.
**++Recap of Data Process and Logic Flow Chart++**

### Generic Logic #1 of Data Process #B
The first logic of #B reads the order items from the event store and stores them in the session storage.
:::spoiler Generic Logic 1 of Data Process B
```javascript=
/**
 *
 * Data Process #B - Generic Logic #1
 *
 * The code in 'run' is executed when no error occurs in Generic Logic.
 *
 */
export async function run(ctx) {
  // event search parameters
  const searchReq = {
    queries: [
      {
        Match: {
          field: 'label_name',
          value: 'setOrder',
        },
      },
    ],
    excludes: [],
    filters: [],
    from: 0,
    size: 1000,
    sorts: [],
  };
  // search for matching events and extract the event objects
  const search = await ctx.agents.eventStore.search(searchReq);
  const events = search?.events;
  // read metadata (order items) from the events
  let orderAsItems = [];
  events.forEach(orderEvent => {
    // convert string to JSON object
    const orderItem = JSON.parse(orderEvent.meta);
    // save the object in the array
    orderAsItems.push(orderItem);
  });
  // write the order items into session storage
  await ctx.agents.sessionStorage.putJson("orderAsItems", orderAsItems);
}

/**
 *
 * The code in 'handleError' is executed when an error occurs
 * in Aggregator Logic, or the current running Logic gets an error.
 *
 */
export async function handleError(ctx, error) {
  ctx.agents.logging.error(error.message); // log error
}
```
:::
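The `forEach` loop in this logic is equivalent to a single `map` over the search result. With a mocked result it behaves like this (the mock below is illustrative, not a real event store response):

```javascript
// a mocked, minimal version of what the event search might return
const search = {
  events: [
    { meta: '{"OrderId":"200","Name":"French Fries","Quantity":1}' },
    { meta: '{"OrderId":"200","Name":"Hamburger","Quantity":1}' },
  ],
};

// recover the order items: one JSON.parse per event
const orderAsItems = (search?.events ?? []).map(e => JSON.parse(e.meta));
console.log(orderAsItems.length); // → 2
```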
### Generic Logic #2 of Data Process #B
Now we assign kitchen staff to prepare the order items, and then emit another set of events (```deliverOrder```).
:::spoiler Generic Logic 2 of Data Process B
```javascript=
/**
 *
 * Data Process #B - Generic Logic #2
 *
 * The code in 'run' is executed when no error occurs in Generic Logic.
 *
 */
export async function run(ctx) {
  // employees and the order items they are responsible for
  const kitchen_staffs = {
    John: ["French Fries", "Fried Chicken", "Chicken Nugget"],
    Ann: ["Hamburger", "Ice Cream"],
    Emily: ["Coke", "Black Tea", "Coffee"]
  };
  // function to find which kitchen staff member is responsible for an order item
  const searchStaff = (itemName) => {
    for (let staff in kitchen_staffs) {
      for (let responsibility of kitchen_staffs[staff]) {
        if (responsibility === itemName) return staff;
      }
    }
    return null;
  };
  // load order items from session storage
  const orderAsItems = await ctx.agents.sessionStorage.get("orderAsItems");
  // iterate through items
  let events = [];
  orderAsItems.forEach(item => {
    // get the staff member who is responsible for this item
    const staff = searchStaff(item.Name);
    if (staff) { // if a staff member is found, create an event
      let newEvent = {
        sourceDID: `Staff: ${staff}`, // event source
        targetDID: `Item: ${item.Name}`, // event target
        labelName: "deliverOrder", // event label name
        meta: JSON.stringify(item), // convert item object to JSON string
        type: 'default', // event group
      };
      events.push(newEvent);
    } else {
      // no staff member found, output an error message
      ctx.agents.logging.error(
        `No staff found for item ${item.Name} from order ${item.OrderId}`
      );
    }
  });
  // send events to the event store for all order items
  await ctx.agents.eventStore.emit(events);
}

/**
 *
 * The code in 'handleError' is executed when an error occurs
 * in Aggregator Logic, or the current running Logic gets an error.
 *
 */
export async function handleError(ctx, error) {
  ctx.agents.logging.error(error.message); // log error
}
```
:::
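The nested loops in `searchStaff` scan every staff member's item list for every order item. For a larger menu, building a reverse index once is a common alternative; a sketch (not part of the workshop template):

```javascript
const kitchen_staffs = {
  John: ["French Fries", "Fried Chicken", "Chicken Nugget"],
  Ann: ["Hamburger", "Ice Cream"],
  Emily: ["Coke", "Black Tea", "Coffee"]
};

// build an item -> staff lookup table once
const itemToStaff = {};
for (const [staff, items] of Object.entries(kitchen_staffs)) {
  for (const item of items) itemToStaff[item] = staff;
}

// O(1) lookup per order item; undefined means nobody is assigned
console.log(itemToStaff["Hamburger"]); // → Ann
console.log(itemToStaff["Salad"]);    // → undefined
```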
### Aggregator Logic of Data Process #B
:::spoiler Aggregator Logic of Data Process B
```javascript=
/**
 *
 * Data Process #B - Aggregator Logic
 *
 * The code in 'run' is executed when no error occurs in Generic Logic.
 *
 */
export async function run(ctx) {
  // signal that this data process executed properly
  ctx.agents.result.finalize({
    DPB_status: "ok",
    DPB_taskId: ctx.task.taskId,
  });
}

/**
 *
 * The code in 'handleError' is executed when an error occurs
 * in Aggregator Logic, or the current running Logic gets an error.
 *
 */
export async function handleError(ctx, error) {
  ctx.agents.logging.error(error.message); // log error
}
```
:::
## Step 3. Deploy Data Process
Once the setup above is complete, you can deploy both data processes with the command `loc deploy [template name]`. Here we use `loc deploy John_Cafe_DPA` and `loc deploy John_Cafe_DPB`.
**++John_Cafe_DPA++**:

**++John_Cafe_DPB++**:

## Step 4. Configure API Route
**++Recap of Data Process and Logic Flow Chart++**

After the deployment of the 2 data processes is done, you will get a permanent ID (PID) for each data process (circled in red). Next, you must put these PIDs into the `api-route-config.yaml` file.

Now you can configure the API route with this command:
`loc ar deploy -f [template name]/api-route-config.yaml`.
Here we use
`loc ar deploy -f John_Cafe_DPB/api-route-config.yaml`.

## Step 5. Trigger API Route with Payload
You can use an API client tool such as **[Postman](https://www.postman.com/)** or **[Insomnia](https://insomnia.rest/)** to send a request to the URL.
Here we use ++Postman++ to demonstrate.
Select ```POST``` and use the endpoint we set in ```api-route-config.yaml```, prefixed with the domain URL.
Then send the JSON payload below:
++**payload**++
:::spoiler Details
```json=
[
  {
    "OrderId": "200",
    "OrderItems": {
      "French Fries": 1,
      "Hamburger": 1,
      "Fried Chicken": 1,
      "Chicken Nugget": 4,
      "Coke": 0,
      "Sprite": 1,
      "Coffee": 0,
      "Salad": 1,
      "Ice Cream": 1
    }
  },
  {
    "OrderId": "201",
    "OrderItems": {
      "French Fries": 1,
      "Hamburger": 1,
      "Fried Chicken": 1,
      "Chicken Nugget": 0,
      "Coke": 1,
      "Sprite": 0,
      "Coffee": 0,
      "Salad": 1,
      "Ice Cream": 0
    }
  },
  {
    "OrderId": "202",
    "OrderItems": {
      "French Fries": 0,
      "Hamburger": 1,
      "Fried Chicken": 1,
      "Chicken Nugget": 6,
      "Coke": 0,
      "Sprite": 0,
      "Coffee": 0,
      "Salad": 1,
      "Ice Cream": 1
    }
  },
  {
    "OrderId": "203",
    "OrderItems": {
      "French Fries": 1,
      "Hamburger": 0,
      "Fried Chicken": 2,
      "Chicken Nugget": 10,
      "Coke": 1,
      "Sprite": 0,
      "Coffee": 0,
      "Salad": 0,
      "Ice Cream": 1
    }
  },
  {
    "OrderId": "204",
    "OrderItems": {
      "French Fries": 1,
      "Hamburger": 1,
      "Fried Chicken": 0,
      "Chicken Nugget": 0,
      "Coke": 1,
      "Sprite": 0,
      "Coffee": 1,
      "Salad": 0,
      "Ice Cream": 0
    }
  }
]
```
:::
Click ```Send``` and, if all goes well, here is the response you can expect.

Once the execution above is done, the events can be visualised as below.

Each event carries a source DID, a target DID, and a label name, which can be used to conduct analyses. In the section below, we use Neo4j and Kibana to demonstrate event analysis for this workshop; we recommend using any licensed or commonly used BI tool in your company.
## Demonstration of Events Visualisation
### Events Lineage and Relationships
After logging into Neo4j, enter the query below in the search bar (circled in red) to search for your emitted events. DO NOT forget to fill in the execution ID from Postman's response.

`MATCH (s:DigitalIdentity_system_search)-[r:LABEL_system_search]->(t:DigitalIdentity_system_search) WHERE r.executionId='YrF36sTy8KTuHXayZt2GJA' RETURN s,r,t`

This graph showcases the relationship between each event.
### Events Discovery

After logging into Kibana and selecting a proper time frame, you will see all the events emitted during this period.

### Events Analysis

After switching to **Dashboard** and choosing **Lakeside_Workshop**, you will be able to put your execution ID into the search bar to visualise the event analyses.

The left graph analyses the proportion of each order item sold, while the right graph analyses the workload of each kitchen staff member.
:::success
What you see here is a demonstration of events. We suggest choosing more suitable tools for visualisation, such as your company's licensed BI tools.
:::
---
###### tags: `Workshop`