We recommend finishing our Quick Start before continuing with this section.
In this section, we will introduce some more advanced use cases that apply financial domain know-how. Please note that, even though these use cases are based on real-life scenarios, they have been modified and simplified; you may still need BA/SA to refine the actual analyses.
Both DocGen and APAR are, in fact, part of a larger policy process comprising 8 data processes, but these two are more than enough to demonstrate how LOC can help our financial partners govern their data flows in a clearer and better way.
You will see the complete event result at the end of this article.
DocGen
In DocGen, an insurance policy is expected to be recorded in the insurance system, with some of its data and its format transformed, and a payment barcode (QR code) generated from the insurance policy info for the corresponding insurer. Moreover, the transaction will be recorded in the accounting system, so you can expect this B/D process to generate Accounts Receivable (AR) JSON data at the end.
This business process can be split into 3 generic logics and 1 aggregator logic:
Generic Logics:

- Logic #1: read and parse the insurance policy JSON data from the POST request body.
- Logic #2: get the policy info from the session storage and transform it.
- Logic #3: turn the transformed data into events.

Aggregator Logic: to aggregate the transformed policy data and generate an AR JSON result.
Data Process | (Generic) Logic #1 | (Generic) Logic #2 | (Generic) Logic #3 | Aggregator Logic
---|---|---|---|---
DocGen | importData | transData | emitEvents | aggregateResults
In the DocGen data process, one of the generic logics is used to create 2 types of events:
Event Name (from which generic logic) | SourceDID | TargetDID | Meta
---|---|---|---
setPolicy: [policy No.] (Logic #3) | Policy No.: [policy No.] | Customer ID: [ID] | transformed policy details
setWriteOff: [write-off No.] (Logic #3) | Write-off No.: [write-off No.] | Policy No.: [policy No.] | transformed policy details
The 1st event establishes the connection between the policy and the policyholder. The 2nd event is essentially the input feed for our APAR B/D data process, which we will discuss in the second half of this use case.
As in the Quick Start - Create a “Greeting” Data Process, you need to create a project and a scenario before creating new data processes.
Detailed instructions can be found HERE.
Here we provide test data comprised of the insurance policy information, which would be filled out by the potential customer. We will use this to invoke our data process later.
In practice, both InsuNo (insurance policy number) and AcuNo (accounting number) should probably be generated inside the data process instead of being provided by the user, but the gist is there.
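For reference, a minimal test payload might look like the sketch below. Apart from InsuNo and AcuNo, every field name and value is hypothetical:

```typescript
// A hypothetical test payload. Only InsuNo and AcuNo come from the text above;
// every other field name and value is illustrative.
const policyTestData = {
  Policies: [
    {
      InsuNo: "2023-01-0001",   // insurance policy number
      AcuNo: "AR-2023-0001",    // accounting number
      CustomerId: "C-0001",     // policyholder ID (assumed field)
      Premium: 4239,            // total premium across two installments ($4,000 + $239)
      Installments: 2,          // assumed field
    },
  ],
};
```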
This DocGen data process is simplified from real-life examples. It will generate a payment barcode and an AR JSON file at the end.
Since we have completed setting up the project, scenario, and a data process, what you need to do next is to add code to the corresponding logic blocks in the data process.
The data processes and the logics can be named whatever you like, but we will use the names listed above.
The first logic is to read and parse the JSON data from the POST request body.
After the insurance policy info from Insurance Policy Data has been parsed, you can expect jsonInput to contain a JSON dataset in the session storage.
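A minimal sketch of what importData could look like is shown below. The context type and its requestBody/sessionStorage accessors are names assumed for illustration; consult the LOC SDK reference for the real API:

```typescript
// Hypothetical logic context: the real LOC SDK exposes equivalent agents,
// but the names below are assumptions made for these sketches.
interface LogicContext {
  requestBody(): Promise<string>; // assumed: raw body of the triggering POST request
  sessionStorage: {
    get(key: string): Promise<any>;
    put(key: string, value: any): Promise<void>;
  };
}

// importData (sketch): parse the POST body and write it into the session
// storage under the key "jsonInput", which the next logic will read.
async function importData(ctx: LogicContext): Promise<void> {
  const raw = await ctx.requestBody();
  const jsonInput = JSON.parse(raw); // the insurance policy payload shown earlier
  await ctx.sessionStorage.put("jsonInput", jsonInput);
}
```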
The next logic will take over this data by querying the session storage.
This logic will get the policy info from the session storage and transform it.
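A sketch of transData, reusing the same assumed LogicContext as above; the transformation itself is only an example:

```typescript
// transData (sketch): normalise each policy record. The output field names,
// the derived write-off number, and the barcode stand-in are all illustrative.
async function transData(ctx: LogicContext): Promise<void> {
  const jsonInput = await ctx.sessionStorage.get("jsonInput");
  const transformed = jsonInput.Policies.map((p: any) => ({
    policyNo: p.InsuNo,
    accountingNo: p.AcuNo,
    customerId: p.CustomerId,
    writeOffNo: `WO-${p.AcuNo}`, // assumed: a write-off number for the 2nd event
    barcode: `QR|${p.InsuNo}`,   // stand-in for the real payment barcode/QR generation
  }));
  await ctx.sessionStorage.put("transformedPolicies", transformed);
}
```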
This logic will get the transformed data from the session storage and make them into events.
For every policy info, the 1st event is supposed to look like this:
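A sketch of its shape, matching the event table above (the key casing and values are assumed):

```typescript
// setPolicy event (sketch): establishes the policy-to-policyholder link.
// Values echo the hypothetical test data from earlier.
const setPolicyEvent = {
  labelName: "setPolicy: 2023-01-0001",  // Event Name with the policy number
  sourceDID: "Policy No.: 2023-01-0001",
  targetDID: "Customer ID: C-0001",
  meta: JSON.stringify({
    policyNo: "2023-01-0001",
    accountingNo: "AR-2023-0001",
    customerId: "C-0001",
  }),
};
```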
And the 2nd event would look like this:
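Again a sketch with illustrative values:

```typescript
// setWriteOff event (sketch): the input feed for the APAR data process.
const setWriteOffEvent = {
  labelName: "setWriteOff: WO-AR-2023-0001",
  sourceDID: "Write-off No.: WO-AR-2023-0001",
  targetDID: "Policy No.: 2023-01-0001",
  meta: JSON.stringify({
    policyNo: "2023-01-0001",
    accountingNo: "AR-2023-0001",
  }),
};
```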
Essentially, the transformed insurance policy data is embedded in the meta field.
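Putting the two shapes together, emitEvents could look roughly like the sketch below; the event-store agent and its emit() call are assumed names:

```typescript
// emitEvents (sketch): turn every transformed policy into the two events above.
interface EventStoreContext extends LogicContext {
  eventStore: { emit(events: object[]): Promise<void> };
}

async function emitEvents(ctx: EventStoreContext): Promise<void> {
  const policies = await ctx.sessionStorage.get("transformedPolicies");
  const events = policies.flatMap((p: any) => [
    {
      labelName: `setPolicy: ${p.policyNo}`,
      sourceDID: `Policy No.: ${p.policyNo}`,
      targetDID: `Customer ID: ${p.customerId}`,
      meta: JSON.stringify(p), // transformed policy details
    },
    {
      labelName: `setWriteOff: ${p.writeOffNo}`,
      sourceDID: `Write-off No.: ${p.writeOffNo}`,
      targetDID: `Policy No.: ${p.policyNo}`,
      meta: JSON.stringify(p), // transformed policy details
    },
  ]);
  await ctx.eventStore.emit(events);
}
```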
Our aggregator logic doesn't do much here, except to return an execution status and the data required for an AR file.
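As a sketch (the finalize() call is an assumed name for however the aggregator hands its result back to the caller):

```typescript
// aggregateResults (sketch): report the execution status and return the data
// needed for the AR file.
async function aggregateResults(
  ctx: LogicContext & { finalize(result: object): void }
): Promise<void> {
  const policies = await ctx.sessionStorage.get("transformedPolicies");
  ctx.finalize({
    status: "OK",                 // execution status for the caller
    accountsReceivable: policies, // the data required for the AR file
  });
}
```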
Right-click the data process and select Deploy Data Process. You should see the message from LOC Studio that your processes are successfully deployed.
Now deploy an API route as well so that we can invoke the data process through it. Here we choose /APAR/DocGen as the route and remember to set it for POST requests.
Then add the data process you just created and click Create when you are done.
Now we can trigger this DocGen data process through the API route (in our example it's https://api.loc-xxx/APAR/DocGen) with the test JSON payload. Here's a screenshot of Postman:
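If you'd rather call the route from code than from Postman, the equivalent request is a plain POST; a sketch (the host name belongs to your own LOC instance, and policyTestData is the hypothetical payload from earlier):

```typescript
// POST the test payload to the DocGen API route (sketch).
const res = await fetch("https://api.loc-xxx/APAR/DocGen", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify(policyTestData),
});
console.log(res.status);       // expect 200 (OK)
console.log(await res.json()); // the response body includes an executionId
```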
LOC returns status 200 (OK) as well as an executionId, which we can use to look up the events generated during this execution.
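The response body should look something like the sketch below; apart from the presence of executionId, the field names and the ID value are illustrative:

```typescript
// Illustrative response shape: only executionId is implied by the text above;
// the other fields and the ID value are made up for this sketch.
const exampleResponse = {
  status: "OK",
  executionId: "Y0gDEbcIRJaNOHf0oBxyZw", // use this to filter events in Data Discovery
  data: {
    accountsReceivable: [], // the AR data returned by aggregateResults
  },
};
```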
Please note that if there is an error message in the response, the reasons could vary; here are some of the common error messages you might encounter, for your reference.
Now go back to LOC Studio's Data Discovery window. You should see new events appear in the list:
Remember the executionId field we've got from the request response? Filter events by Execution ID and enter the value:
Please note that events can be searched here only if the logics actually emitted them during execution.
Now if you click the Event Lineage Graph, you can view all the related events of this particular execution (you may have to drag the nodes around yourself):
To sum up, with this DocGen data process, the insurance policy info can be transformed, a payment barcode will be generated, and an AR file containing the data will be produced, just like what we have in the Postman response.
APAR
Once the DocGen (i.e., Document Generator) data process is completed and the insurance premium (the payment amount for the insurance policy) has been paid, we are supposed to write off the Accounts Receivable (AR) in the accounting system. Please note that this is a convenient simplification, which cannot directly represent more complicated real-life scenarios.
After executing the DocGen, there is a new Accounts Receivable (AR) record in the accounting system. In the APAR process, we need to write this AR off after the insurance premium is paid. Therefore, the APAR will generate a write-off JSON data for the accounting system.
This business process can also be split into 3 generic logics and 1 aggregator logic:
Generic Logics:

- Logic #1: read and parse the JSON data of the payment info from the POST request body.
- Logic #2: query the insurance policy events (generated by DocGen) from the event store.
- Logic #3: map the payment and policy data and turn them into new events.

Aggregator Logic: to aggregate the mapping and generate a write-off file.
Data Process | (Generic) Logic #1 | (Generic) Logic #2 | (Generic) Logic #3 | Aggregator Logic
---|---|---|---|---
APAR | inputPayment | getEvents | emitEvents | aggregateAPAR
In the data process, Logic #3 is used to create 2 types of events:
Label Name | SourceDID | TargetDID | Meta
---|---|---|---
setBilling: [paid amount] | Billing: [payment method] | Write-off No.: [write-off No.] | write-off details
Authorisation No.: [authorisation No.] | Issuing Bank: [issuing bank] | Billing: [payment method] | write-off details
Here we provide another set of test data comprised of the insurance payment information. We will use this to invoke our APAR data process later.
Note that we assume the policyholder has to pay two installments for this policy. Here the customer paid $4,000 by credit card for the 1st installment and $239 in cash for the 2nd installment. We also assume that the customer might pay installments separately, so the APAR process may receive only one of the payments at a time.
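A hypothetical payment payload matching this scenario might look like the sketch below (all field names assumed):

```typescript
// Hypothetical payment test data for APAR: two installments that may arrive
// in separate requests. Field names are illustrative.
const paymentTestData = {
  InsuNo: "2023-01-0001", // the policy being paid for
  Payments: [
    { installment: 1, amount: 4000, method: "credit card" },
    { installment: 2, amount: 239, method: "cash" },
  ],
};
```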
Like DocGen, our APAR data process is also simplified from a real-life example. Here it will generate a write-off JSON file for the accounting system at the end.
All the data processes and the logics can be named whatever you like, but we will use the names listed above.
The 1st logic is to read and parse the JSON data of the payment info from the POST request body.
After the payment info from Insurance Payment Data has been parsed, you can expect nextPaymentInfo to contain a JSON dataset in the session storage.
The 2nd logic is to query and read the insurance policy info (generated by DocGen) from the event store and store them in the session storage as well.
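A sketch of getEvents; the context shape and the search() query fields are assumptions made for illustration:

```typescript
// getEvents (sketch): look up the setWriteOff events that DocGen emitted for
// this policy and stash them in session storage for the mapping logic.
interface AparContext {
  sessionStorage: {
    get(key: string): Promise<any>;
    put(key: string, value: any): Promise<void>;
  };
  eventStore: { search(query: object): Promise<object[]> };
}

async function getEvents(ctx: AparContext): Promise<void> {
  const payment = await ctx.sessionStorage.get("nextPaymentInfo");
  const policyEvents = await ctx.eventStore.search({
    labelName: "setWriteOff",                   // assumed query field
    targetDID: `Policy No.: ${payment.InsuNo}`, // link back to the DocGen events
  });
  await ctx.sessionStorage.put("policyEvents", policyEvents);
}
```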
The next logic will take over this data by querying the session storage.
After querying the session storage, we can now map the data from the generic logic #1 and #2 and make them into new events. More specifically, this logic will send two write-off events for each payment.
For every billing info, the event is supposed to look like this:
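A sketch of its shape, following the table above (key casing and values are illustrative):

```typescript
// setBilling event (sketch): writes one payment off against the AR record.
const setBillingEvent = {
  labelName: "setBilling: 4000",
  sourceDID: "Billing: credit card",
  targetDID: "Write-off No.: WO-AR-2023-0001", // from the matching DocGen event
  meta: JSON.stringify({
    firstPayment: { installment: 1, amount: 4000, method: "credit card" },
  }),
};
```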
Essentially, the payment data (firstPayment and/or secondPayment) is embedded in the meta field.
Our aggregator logic doesn't do much here, except to return an execution status and the data required for a write-off file.
Right-click the data process and select Deploy Data Process. You should see the message from LOC Studio that your processes are successfully deployed.
Now deploy an API route as well so that we can invoke the data process through it. Here we choose /APAR/APAR as the route and remember to set it for POST requests.
Then add the data process you just created and click Create when you are done.
Now we can trigger this APAR data process through the API route (in our example it's https://api.loc-xxx/APAR/APAR) with the test JSON payload. Here's a screenshot of Postman:
LOC returns status 200 (OK) as well as an executionId, which we can use to look up the events generated during this execution.
Please note that if there is an error message in the response, the reasons could vary; here are some of the common error messages you might encounter, for your reference.
Now go back to LOC Studio's Data Discovery window.
Remember the executionId field we've got from the request response? Select Execution ID as the filter and enter the value:
Please note that events can be searched here only if the logics actually emitted them during execution.
Now if you click the Event Lineage Graph, you can view all the related events of this particular execution (you may have to drag the nodes around yourself):
To sum up, with this APAR data process, the payment info can be mapped to the corresponding AR records, and a write-off file containing the data will be generated, just like what we have in the Postman response.
Putting what we have in DocGen and APAR together, you will have the complete event lineage graph in Data Discovery.
With these 2 use cases, we aim to demonstrate LOC's capabilities in the insurance industry, as well as how LOC Studio helps our customers monitor their business activities and event sourcing.