

Use Case: DocGen & APAR

We recommend finishing our Quick Start before continuing with this section.

In this section, we will introduce some more advanced use cases applied with financial domain know-how. Please note that, even though these use cases are based on real-life scenarios, they have been modified and simplified. Therefore, you might still need BA/SA to refine the actual analyses.

  • DocGen (Document Generator)
    • Triggered when a customer (potential policyholder) submits their application data and a new policy document needs to be generated. Transformed JSON data, including a barcode link, will be returned to the account system.
    • DocGen emits events to connect the policy number with the customer, and the Accounts Receivable (AR) accounting number with the policy number.
  • APAR (Accounts Payable/Accounts Receivable)
    • Triggered when the policyholder makes one or two payments for their insurance policy and the write-off needs to be processed. The payment JSON data will be returned to the accounting system.
    • APAR emits events to connect payments made with different methods (credit card, cash, etc.) with the write-off accounting number.

Both DocGen and APAR are, in fact, part of a larger policy process comprising 8 data processes, but they are more than enough to demonstrate how LOC can help our financial partners govern their data flow in a clearer and better way.

You will see the complete event result at the end of this article.

Data Process: DocGen

Design the Process

In DocGen, an insurance policy is expected to be recorded into the insurance system with some of its data and format transformed, and a payment barcode (QR code) generated from the insurance policy info for the corresponding insurer. Moreover, the transaction will be recorded into the accounting system, so you can expect this B/D process to generate Accounts Receivable (AR) JSON data at the end.


Business Analysis

This business process can be split into 3 generic logics and 1 aggregator logic:

Generic Logics:

  1. Import Data - to read the insurance policy info (we will use simulated data) keyed into the insurance core system.
  2. Transform Data - to deal with data transformation (e.g., hash or mask data) and output it in a specific format (e.g., XML to JSON).
  3. Emit Events - to put the insurance policy info into events for future use.

Aggregator Logic: to aggregate the transformed policy data and generate an AR JSON result.
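The four logics above pass data to one another through LOC's session storage. As a minimal sketch of that hand-off pattern (using a plain JavaScript Map as a stand-in for ctx.agents.sessionStorage — this is not the actual LOC API):

```javascript
// Simplified stand-in for LOC's session storage; the real logics would use
// ctx.agents.sessionStorage instead of a Map.
const sessionStorage = new Map();

function importData(rawBody) {
    // Logic #1: parse the request body and store it
    sessionStorage.set("jsonInput", JSON.parse(rawBody));
}

function transData() {
    // Logic #2: read the input and store a (heavily trimmed) transformed copy
    const input = sessionStorage.get("jsonInput");
    sessionStorage.set("jsonOutput", {
        OutputJSON: { InsuNo: input.RequestJSON.Meta.InsuNo },
    });
}

function aggregateResults() {
    // Aggregator: return the transformed data as the AR JSON result
    return { AR_Data: sessionStorage.get("jsonOutput") };
}

importData('{"RequestJSON":{"Meta":{"InsuNo":"Insu-019"}}}');
transData();
console.log(aggregateResults().AR_Data.OutputJSON.InsuNo); // → Insu-019
```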

System Analysis

| Data Process | (Generic) Logic #1 | (Generic) Logic #2 | (Generic) Logic #3 | Aggregator Logic |
| --- | --- | --- | --- | --- |
| DocGen | importData | transData | emitEvents | aggregateResults |

In the DocGen data process, one of the generic logics is used to create 2 types of events:

| Event Name (from which generic logic) | SourceDID | TargetDID | Meta |
| --- | --- | --- | --- |
| setPolicy: [policy No.] (Logic #3) | Policy No.: [policy No.] | Customer ID: [ID] | transformed policy details |
| setWriteOff: [write-off No.] (Logic #3) | Write-off No.: [write-off No.] | Policy No.: [policy No.] | transformed policy details |

The 1st event establishes the connection between the policy and the policyholder. The 2nd event is essentially the input feed for our APAR B/D data process, which we will discuss in the second half of this use case.
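Since the 2nd event's target DID matches the 1st event's source DID, the two events chain the write-off number to the policy and the policy to the customer. A sketch with plain objects (the actual emission happens through ctx.agents.eventStore in Logic #3):

```javascript
// The two DocGen events form the chain:
// Write-off No. -> Policy No. -> Customer ID
const events = [
    {
        sourceDID: "Policy No.: Insu-019",
        targetDID: "Customer ID: A12****789",
        labelName: "setPolicy: Insu-019",
    },
    {
        sourceDID: "Writeoff No.: Acu-048",
        targetDID: "Policy No.: Insu-019",
        labelName: "setWriteOff: $4239",
    },
];

// the chain holds because the 2nd event's target is the 1st event's source
console.log(events[1].targetDID === events[0].sourceDID); // → true
```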

LOC Studio Preparation

Create Project/Scenario/Data Process

As in the Quick Start - Create a “Greeting” Data Process, you need to create a project and a scenario before creating new data processes.

Detailed instructions can be found HERE.

Insurance Policy Data

Here we provide test data comprised of the insurance policy information, which would be filled out by the potential customer. We will use this to invoke our data process later.

{
    "RequestJSON":{
        "Meta":{
            "InsuNo":"Insu-019",
            "ReqId":"Request-102",
            "AcuNo":"Acu-048",
            "Amount":"4239"
        },
        "CustomerInfo":{
            "email":"global-demo@fst.network",
            "Address":{
                "address":"No.97, Songren Rd.",
                "city":"Taipei City",
                "country":"Taiwan",
                "postalCode":"110"
            },
            "phoneNumber":"886223456789",
            "firstName":"Jack",
            "lastName":"Lin",
            "identity":"A123456789",
            "birthday":"1980-01-01",
            "gender":"male",
            "citizenship":"Taiwan"
        },
        "AgencyInfo":{
            "country":"Taiwan",
            "agencyCode":"ABCD",
            "name":"AgencyName",
            "effectiveDate":"2022-04-01",
            "Address":{
                "address":"No.97, Songren Rd.",
                "city":"Taipei City",
                "country":"Taiwan",
                "postalCode":"110"
            }
        }
    }
}

In practice, both InsuNo (insurance policy number) and AcuNo (accounting number) would likely be generated inside the data process instead of being provided by the user, but the gist is there.
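If you wanted the data process to mint these numbers itself, a hypothetical generator inside Logic #1 could look like the sketch below (the Insu-/Acu- prefixes and the zero-padded counter are illustrative assumptions, not LOC or insurance-system conventions):

```javascript
// Hypothetical ID generator; a real system would likely use a database
// sequence or UUIDs instead of an in-memory counter.
let counter = 0;

function nextId(prefix) {
    counter += 1;
    return `${prefix}-${String(counter).padStart(3, "0")}`;
}

console.log(nextId("Insu")); // → Insu-001
console.log(nextId("Acu"));  // → Acu-002
```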

Create Data Process

This DocGen data process is simplified from real-life examples. It will generate

  1. the insurance policy info available for subscription via events;
  2. the transformed policy data and barcode available for payment;
  3. an AR JSON result data ready for the accounting system.

Since we have completed setting up the project, scenario, and a data process, what you need to do next is to add code to the corresponding blocks in the data process.

The data processes and the logics can be named whatever you like, but we will use the names listed above.

Generic Logic #1

The first logic is to read and parse the JSON data from the POST request body.

[if OK]
```javascript
/**
 * The codes in 'run' are executed when no error occurs in Generic Logic.
 */
async function run(ctx) {
    const UTF8ArrToStr = (aBytes) => {
        let utf8decoder = new TextDecoder();
        return utf8decoder.decode(new Uint8Array(aBytes));
    };
    // read and parse JSON data from the request body
    const payload = JSON.parse(UTF8ArrToStr(ctx.payload.http.body));
    await ctx.agents.sessionStorage.putJson("jsonInput", payload);
}
```

After the insurance policy info from Insurance Policy Data has been parsed, you can expect jsonInput in the session storage to contain a JSON dataset.

[if Error]
```javascript
/**
 * The codes in 'handleError' are executed when an error occurs
 * in Aggregator Logic, or the CURRENT running Logic just gets an error.
 */
async function handleError(ctx, error) {
    ctx.agents.logging.error(error.message);
}
```

The next logic would take over these data by querying the session storage.

Generic Logic #2

This logic will get the policy info from the session storage and transform it.

[if OK]
```javascript
/**
 * The codes in 'run' are executed when no error occurs in Generic Logic.
 */
async function run(ctx) {
    const jsonInput = await ctx.agents.sessionStorage.get("jsonInput");
    let jsonOutput = {
        OutputJSON: {
            InsuNo: "",
            ReqId: "",
            AcuNo: "",
            Amount: "",
            Name: "",
            EffectiveDate: "",
            Identity: "",
            Barcode: "",
        }
    };

    // data transformation
    // --------------------------------------------------
    // extract fields
    jsonOutput.OutputJSON.InsuNo = jsonInput.RequestJSON.Meta.InsuNo;
    jsonOutput.OutputJSON.ReqId = jsonInput.RequestJSON.Meta.ReqId;
    jsonOutput.OutputJSON.AcuNo = jsonInput.RequestJSON.Meta.AcuNo;
    jsonOutput.OutputJSON.Amount = jsonInput.RequestJSON.Meta.Amount;
    jsonOutput.OutputJSON.Name =
        jsonInput.RequestJSON.CustomerInfo.lastName +
        jsonInput.RequestJSON.CustomerInfo.firstName;

    // convert the effective date to the ROC (Minguo) calendar
    let effectiveDateString = jsonInput.RequestJSON.AgencyInfo.effectiveDate + "";
    if (effectiveDateString) {
        let date = effectiveDateString.split("-");
        jsonOutput.OutputJSON.EffectiveDate = `${date[0] - 1911}-${date[1]}-${date[2]}`;
    }

    // mask part of the policyholder ID with *
    let identitystring = jsonInput.RequestJSON.CustomerInfo.identity;
    if (identitystring.length == 10) {
        jsonOutput.OutputJSON.Identity =
            `${identitystring.substring(0, 3)}****${identitystring.substring(7, 10)}`;
    }

    // generate barcode link
    let barcodestring = `${jsonInput.RequestJSON.Meta.AcuNo}`;
    if (barcodestring) {
        jsonOutput.OutputJSON.Barcode =
            `https://chart.googleapis.com/chart?chs=300x300&cht=qr&chl=${barcodestring}&choe=UTF-8`;
    }
    // --------------------------------------------------

    // write transformed data for the next logic
    await ctx.agents.sessionStorage.putJson("jsonOutput", jsonOutput);
}
```
[if Error]
```javascript
/**
 * The codes in 'handleError' are executed when an error occurs
 * in Aggregator Logic, or the CURRENT running Logic just gets an error.
 */
async function handleError(ctx, error) {
    ctx.agents.logging.error(error.message);
}
```
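To see what the two trickier transformations in Logic #2 actually produce, here they are as standalone helpers (toROCDate and maskIdentity are hypothetical names; the logic above inlines this code):

```javascript
function toROCDate(isoDate) {
    // the ROC (Minguo) calendar used in Taiwan is offset by 1911 years
    const [y, m, d] = isoDate.split("-");
    return `${y - 1911}-${m}-${d}`;
}

function maskIdentity(id) {
    // keep the first 3 and last 3 characters, mask the middle 4 with *
    return id.length === 10
        ? `${id.substring(0, 3)}****${id.substring(7, 10)}`
        : id;
}

console.log(toROCDate("2022-04-01"));    // → 111-04-01
console.log(maskIdentity("A123456789")); // → A12****789
```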

Generic Logic #3

This logic will get the transformed data from the session storage and make them into events.

[if OK]
```javascript
/**
 * The codes in 'run' are executed when no error occurs in Generic Logic.
 */
async function run(ctx) {
    const jsonOutput = await ctx.agents.sessionStorage.get("jsonOutput");

    // emit 1st event to event store: connect policy to policyholder
    await ctx.agents.eventStore.emit([
        {
            sourceDID: `Policy No.: ${jsonOutput.OutputJSON.InsuNo}`, // source DID
            targetDID: `Customer ID: ${jsonOutput.OutputJSON.Identity}`, // target DID
            labelName: `setPolicy: ${jsonOutput.OutputJSON.InsuNo}`, // event label
            meta: JSON.stringify(jsonOutput), // meta
            type: "default", // default group
        },
    ]);

    // emit 2nd event to event store: info for APAR
    await ctx.agents.eventStore.emit([
        {
            sourceDID: `Writeoff No.: ${jsonOutput.OutputJSON.AcuNo}`, // source DID
            targetDID: `Policy No.: ${jsonOutput.OutputJSON.InsuNo}`, // target DID
            labelName: `setWriteOff: $${jsonOutput.OutputJSON.Amount}`, // event label
            meta: JSON.stringify(jsonOutput), // meta
            type: "default", // default group
        },
    ]);
}
```

For every policy info, the 1st event is supposed to look like this:

sourceDID: "Policy No.: Insu-019"
targetDID: "Customer ID: A12****789"
labelName: "setPolicy: Insu-019"
meta: [transformed policy JSON data]
type: "default"

And the 2nd event would look like this:

sourceDID: "Writeoff No.: Acu-048"
targetDID: "Policy No.: Insu-019"
labelName: "setWriteOff: $4239"
meta: [transformed policy JSON data]
type: "default"

Essentially, the transformed insurance policy data is embedded in the meta field.

[if Error]
```javascript
/**
 * The codes in 'handleError' are executed when an error occurs
 * in Aggregator Logic, or the CURRENT running Logic just gets an error.
 */
async function handleError(ctx, error) {
    ctx.agents.logging.error(error.message);
}
```
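Because the event store keeps meta as a string, whatever consumes these events later (such as APAR's Logic #2) has to JSON.parse it back. A minimal round-trip:

```javascript
const jsonOutput = { OutputJSON: { InsuNo: "Insu-019", Amount: "4239" } };

// what the emitting logic stores in the event's meta field
const meta = JSON.stringify(jsonOutput);

// what a consuming logic does after fetching the event back
const restored = JSON.parse(meta);
console.log(restored.OutputJSON.InsuNo); // → Insu-019
```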

Aggregator Logic

Our aggregator logic doesn't do much here, except to return an execution status and the data required for an AR file.

[if OK]
```javascript
/**
 * The codes in 'run' are executed when no error occurs in Aggregator Logic.
 */
async function run(ctx) {
    const bodyResult = await ctx.agents.sessionStorage.get("jsonOutput");
    // output the transformed data as the AR JSON result
    ctx.agents.result.finalize({ AR_Data: bodyResult });
}
```
[if Error]
```javascript
/**
 * The codes in 'handleError' are executed when an error occurs
 * in Aggregator Logic, or the CURRENT running Logic just gets an error.
 */
async function handleError(ctx, error) {
    ctx.agents.logging.error(error.message);
}
```

Deploy Data Process

Right-click the data process and select Deploy Data Process. You should see the message from LOC Studio that your processes are successfully deployed.

Please note that

  • Deploying a data process is equivalent to storing it in the backend of LOC Studio.
  • When a data process is deployed, this data process in the menu will be marked with a green solid circle.

Create API Route

Now deploy an API route as well so that we can invoke the data process through it. Here we choose /APAR/DocGen as the route and remember to set it for POST requests.

Then add the data process you just created and click Create when you are done.

Invoke Data Process

Now we can trigger this DocGen data process with the API route (in our example it's https://api.loc-xxx/APAR/DocGen) with the test JSON payload. Here's a screenshot of Postman:

LOC returns status 200 (OK) as well as an executionId, which we can use to look up the events generated during this execution.

Please note that if there is an error message in the response, the possible reasons include:

  • the request timed out, so you need to re-send it from Postman/Insomnia;
  • the data process execution took longer than the timeout you configured earlier, so please change that setting (see the reminder in Data Process);
  • the code in your (generic or aggregator) logics might be erroneous, so please revise it and re-deploy.

For your reference, here are some of the common error messages you might encounter.

View Result in Data Discovery

Now go back to LOC Studio's Data Discovery window. You should see new events appear in the list:

Filter by Execution ID

Remember the executionId field we've got from the request response? Filter events by this Execution ID and enter the value:

Please note that events can only be searched here if the logics have actually written them.

Now if you click the Event Lineage Graph, you can view all the related events of this particular execution (you may have to drag the nodes around yourself):

To sum up, with this DocGen data process, the insurance policy info is transformed, a payment barcode is generated, and an AR file containing the data is produced, just like what we see in the Postman response.

Data Process: APAR

Once the DocGen (i.e., Document Generator) data process is completed and the insurance premium (the payment amount for the insurance policy) has been paid, we are supposed to write off the Accounts Receivable (AR) in the accounting system. Please note that this is a convenient simplification, which cannot directly represent more complicated real-life scenarios.

Design the Process

After executing the DocGen, there is a new Accounts Receivable (AR) record in the accounting system. In the APAR process, we need to write this AR off after the insurance premium is paid. Therefore, the APAR will generate a write-off JSON data for the accounting system.

Business Analysis

This business process can also be split into 3 generic logics and 1 aggregator logic:

Generic Logics:

  1. Input Payment Data - to simulate the payment data received from other systems.
  2. Get Data - to query the insurance policy info from the event store where we have kept the insurance policy info in DocGen.
  3. Emit Events - to map the insurance policy info with the payment data and create an event with the required data of the write-off file.

Aggregator Logic: to aggregate the mapping and generate a write-off file

System Analysis

There are 3 generic logics and 1 aggregator logic in this data process:

| Data Process | (Generic) Logic #1 | (Generic) Logic #2 | (Generic) Logic #3 | Aggregator Logic |
| --- | --- | --- | --- | --- |
| APAR | inputPayment | getEvents | emitEvents | aggregateAPAR |

In the data process, Logic #3 is used to create 2 types of events:

| Label Name | SourceDID | TargetDID | Meta |
| --- | --- | --- | --- |
| setBilling: [paid amount] | Billing: [payment method] | Write-off No.: [write-off No.] | write-off details |
| Authorisation No.: [authorisation No.] | Issuing Bank: [issuing bank] | Billing: [payment method] | write-off details |

LOC Studio Preparation

Insurance Payment Data

Here we provide another set of test data comprised of the insurance payment information. We will use this to invoke our APAR data process later.

{
    "PaymenInfo": {
        "firstPayment": {
            "InsuNo": "Insu-019",
            "ReqId": "Request-102",
            "AcuNo": "Acu-048",
            "PaidAmount": "4000",
            "PaymentMethod": "Credit Card",
            "IssuingBank": "DBS",
            "TransactionDate": "2022-03-25T05:53:01.652Z",
            "AuthorisationNo": "ABC1234"
        },
        "secondPayment": {
            "InsuNo": "Insu-019",
            "ReqId": "Request-102",
            "AcuNo": "Acu-048",
            "PaidAmount": "239",
            "PaymentMethod": "Cash",
            "IssuingBank": "NA",
            "TransactionDate": "2022-03-26T07:26:01.652Z",
            "AuthorisationNo": "NA"
        }
    }
}

Note that we assume the policyholder has to pay two installments for this policy: here the customer paid $4000 by credit card for the 1st installment and $239 in cash for the 2nd. We also assume the customer might pay the installments separately, so the APAR process may receive only one of the payments at a time.
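Since either installment may be absent, APAR's Logic #2 picks whichever payment is present using optional chaining with a fallback. The pattern in isolation (pickPayment is a hypothetical helper name; the logic inlines this expression):

```javascript
function pickPayment(paymentInfo) {
    // return the 1st installment if present, otherwise the 2nd (or undefined)
    return paymentInfo.PaymenInfo?.firstPayment || paymentInfo.PaymenInfo?.secondPayment;
}

// a request carrying only the 2nd installment
const onlySecond = {
    PaymenInfo: {
        secondPayment: { PaidAmount: "239", PaymentMethod: "Cash" },
    },
};

console.log(pickPayment(onlySecond).PaymentMethod); // → Cash
```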

Create Data Process

Like the DocGen, our APAR data process is also simplified from a real-life example. Here it will generate

  1. the payment info available for subscription via events;
  2. a write-off JSON data ready for the accounting system.

All the data processes and the logics can be named whatever you like, but we will use the names listed above.

Generic Logic #1

The 1st logic is to read and parse the JSON data of the payment info from the POST request body.

[if OK]
```javascript
/**
 * The codes in 'run' are executed when no error occurs in Generic Logic.
 */
async function run(ctx) {
    const UTF8ArrToStr = (aBytes) => {
        let utf8decoder = new TextDecoder();
        return utf8decoder.decode(new Uint8Array(aBytes));
    };
    // read and parse JSON data from the request body
    const thisPaymentInfo = JSON.parse(UTF8ArrToStr(ctx.payload.http.body));
    await ctx.agents.sessionStorage.putJson("nextPaymentInfo", thisPaymentInfo);
}
```

After the payment info from Insurance Payment Data has been parsed, you can expect nextPaymentInfo will contain a JSON dataset in the session storage.

[if Error]

```javascript
/**
 * The codes in 'handleError' are executed when an error occurs
 * in Aggregator Logic, or the CURRENT running Logic just gets an error.
 */
async function handleError(ctx, error) {
    ctx.agents.logging.error(error.message);
}
```

Generic Logic #2

The 2nd logic is to query and read the insurance policy info (generated by DocGen) from the event store and store them in the session storage as well.

[if OK]

```javascript
/**
 * The codes in 'run' are executed when no error occurs in Generic Logic.
 */
async function run(ctx) {
    const paymentInfo = await ctx.agents.sessionStorage.get("nextPaymentInfo");
    const paymentData =
        paymentInfo.PaymenInfo?.firstPayment || paymentInfo.PaymenInfo?.secondPayment;

    // if there are payment(s)
    if (paymentData) {
        // search for a setWriteOff event which matches the payment info
        const requests = {
            queries: [
                {
                    Match: {
                        field: "source_digital_identity",
                        value: `Writeoff No.: ${paymentData.AcuNo}`,
                    },
                },
                {
                    Match: {
                        field: "target_digital_identity",
                        value: `Policy No.: ${paymentData.InsuNo}`,
                    },
                },
            ],
            excludes: [],
            filters: [],
            from: 0,
            size: 1,
            sorts: [],
        };
        const query = await ctx.agents.eventStore.search(requests);
        const getEvents = query?.events;

        if (getEvents) {
            // get the first event and store selected fields into session storage
            const getMeta = getEvents[0].meta;
            await ctx.agents.sessionStorage.putJson("thisMetaItem", JSON.parse(getMeta));
            const getSource = getEvents[0].sourceDigitalIdentity;
            await ctx.agents.sessionStorage.putJson("thisSource", getSource);
        }
    }
}
```
[if Error]
```javascript
/**
 * The codes in 'handleError' are executed when an error occurs
 * in Aggregator Logic, or the CURRENT running Logic just gets an error.
 */
async function handleError(ctx, error) {
    ctx.agents.logging.error(error.message);
}
```

The next logic would take over these data by querying the session storage.

Generic Logic #3

After querying the session storage, we can now map the data from the generic logic #1 and #2 and make them into new events. More specifically, this logic will send two write-off events for each payment.

[if OK]
```javascript
/**
 * The codes in 'run' are executed when no error occurs in Generic Logic.
 */
async function run(ctx) {
    const sourceFromPolicy = await ctx.agents.sessionStorage.get("thisSource");
    const paymentInfo = await ctx.agents.sessionStorage.get("nextPaymentInfo");
    const payments = ["firstPayment", "secondPayment"];

    for (let payment of payments) {
        // if firstPayment and/or secondPayment exist in paymentInfo
        if (payment in paymentInfo.PaymenInfo) {
            const paymentData = paymentInfo.PaymenInfo[payment];

            // emit 1st event: connect the payment to the write-off number
            await ctx.agents.eventStore.emit([
                {
                    sourceDID: `Billing: ${paymentData.PaymentMethod}`,
                    targetDID: sourceFromPolicy,
                    labelName: `setBilling: $${paymentData.PaidAmount}`,
                    meta: JSON.stringify(paymentData),
                    type: "default",
                },
            ]);

            // emit 2nd event: connect the issuing bank to the payment
            await ctx.agents.eventStore.emit([
                {
                    sourceDID: `Issuing Bank: ${paymentData.IssuingBank}`,
                    targetDID: `Billing: ${paymentData.PaymentMethod}`,
                    labelName: `Authorisation No.: ${paymentData.AuthorisationNo}`,
                    meta: JSON.stringify(paymentData),
                    type: "default",
                },
            ]);
        }
    }
}
```

For every payment, the two events are supposed to look like this (using the 1st payment as an example):

sourceDID: "Billing: Credit Card"
targetDID: "Writeoff No.: Acu-048"
labelName: "setBilling: $4000"
meta: [payment JSON data]
type: "default"
sourceDID: "Issuing Bank: DBS"
targetDID: "Billing: Credit Card"
labelName: "Authorisation No.: ABC1234"
meta: [payment JSON data]
type: "default"

Essentially, the payment data (firstPayment and/or secondPayment) is embedded in the meta field.

[if Error]
```javascript
/**
 * The codes in 'handleError' are executed when an error occurs
 * in Aggregator Logic, or the CURRENT running Logic just gets an error.
 */
async function handleError(ctx, error) {
    ctx.agents.logging.error(error.message);
}
```

Aggregator Logic

Our aggregator logic doesn't do much here, except to return an execution status and the data required for a write-off file.

[if OK]
```javascript
/**
 * The codes in 'run' are executed when no error occurs in Aggregator Logic.
 */
async function run(ctx) {
    const thisMetaItem = await ctx.agents.sessionStorage.get("thisMetaItem");
    const sourceFromPolicy = await ctx.agents.sessionStorage.get("thisSource");
    const paymentInfo = await ctx.agents.sessionStorage.get("nextPaymentInfo");

    // output write-off event meta, event source and payment data
    ctx.agents.result.finalize({
        taskId: ctx.task.taskId,
        allResult: thisMetaItem,
        printResult: sourceFromPolicy,
        anotherResult: paymentInfo,
    });
}
```
[if Error]
```javascript
/**
 * The codes in 'handleError' are executed when an error occurs
 * in Aggregator Logic, or the CURRENT running Logic just gets an error.
 */
async function handleError(ctx, error) {
    ctx.agents.logging.error(error.message);
}
```

Deploy Data Process

Right-click the data process and select Deploy Data Process. You should see the message from LOC Studio that your processes are successfully deployed.

Please note that

  • Deploying a data process is equivalent to storing it in the backend of LOC Studio.
  • When a data process is deployed, this data process in the menu will be marked with a green solid circle.

Create API Route

Now deploy an API route as well so that we can invoke the data process through it. Here we choose /APAR/APAR as the route and remember to set it for POST requests.

Then add the data process you just created and click Create when you are done.

Invoke Data Process

Now we can trigger this APAR data process with the API route (in our example it's https://api.loc-xxx/APAR/APAR) with the test JSON payload. Here's a screenshot of Postman:

LOC returns status 200 (OK) as well as an executionId, which we can use to look up the events generated during this execution.

Please note that if there is an error message in the response, the possible reasons include:

  • the request timed out, so you need to re-send it from Postman/Insomnia;
  • the data process execution took longer than the timeout you configured earlier, so please change that setting (see the reminder in Data Process);
  • the code in your (generic or aggregator) logics might be erroneous, so please revise it and re-deploy.

For your reference, here are some of the common error messages you might encounter.

View Result in Data Discovery

Now go back to LOC Studio's Data Discovery window.

Filter by Execution ID

Remember the executionId field we've got from the request response? Filter events by this Execution ID and enter the value:

Please note that events can only be searched here if the logics have actually written them.

Now if you click the Event Lineage Graph, you can view all the related events of this particular execution (you may have to drag the nodes around yourself):

To sum up, with this APAR data process, the payment info is mapped to the insurance policy, and a write-off JSON result is generated for the accounting system, just like what we see in the Postman response.

Summary of DocGen and APAR

Putting what we have in DocGen and APAR together, you will get this complete event lineage graph in Data Discovery.

With these 2 use cases, we aim to demonstrate LOC's capability in the insurance industry, as well as how LOC Studio helps our customers monitor their business activity and apply event sourcing.

tags: LOC Studio