Request for Data Handling in OL Automate

Hi
In somewhat complex workflows, we often need to modify or add values retrieved by the data mapper.
I understand that OL Automate cannot handle metadata, but is it possible to import and edit JSON data or something similar?
We are specifically looking at the area circled in red in the diagram.
Since the output type of the ‘query items’ node does not support JSON output, unlike the output type of the other OL Connect nodes, we have hit a roadblock and are currently at a standstill.

flows.zip (4.0 KB)

With the current version, this is achieved by returning the data record IDs in the ‘query items’ node. These IDs are returned as an array. Next, you can iterate over this array using the ‘split’ node and fetch the full data structure for each record using the ‘data get’ node, which returns the data in JSON format. At this point you have the ability to modify the record. After making your modifications, you can use the ‘join’ node to consolidate the updated records back into an array, which can then be passed to the ‘data update’ node.

In the example below, I utilized the ‘change’ node to perform data updates, though this operation could alternatively be executed within a ‘function’ node.
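For reference, here is a minimal sketch of what such a ‘function’ node could look like when placed between the ‘data get’ and ‘join’ nodes. It assumes the ‘data get’ node puts the record’s field/value object on msg.payload; the field names used below (OrderTotal, Remark) are purely hypothetical.

```javascript
// Node-RED 'function' node body, placed between 'data get' and 'join'.
// Assumption: msg.payload holds the record's field/value object returned by 'data get'.
const record = msg.payload;

// Example edits: normalize an existing value and add a new one.
// 'OrderTotal' and 'Remark' are hypothetical field names.
record.OrderTotal = Number(record.OrderTotal || 0).toFixed(2);
record.Remark = "Modified in Automate on " + new Date().toISOString();

msg.payload = record;
return msg;
```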

We are exploring an enhancement in which the ‘data get’ node accepts the array of record IDs and returns the corresponding data records directly, rather than one record at a time. This would streamline the flow.

Hope this helps,

Erik

Hi Erik,
Thank you for your reply.
The method you provided was exactly what I was looking for.
Although it’s a simple task, I was able to add the flow variable’s value to the data.
I’m not yet as comfortable handling JSON data as I am working with metadata, so I think it will take some time, but thanks to you I feel I can move forward.
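In case it helps anyone else, the edit boiled down to a one-liner in a ‘function’ node. The sketch below assumes the value was stored in flow context under a hypothetical key batchLabel and that msg.payload holds the record’s fields:

```javascript
// Copy a flow-context value into a (hypothetical) 'Batch' field of the record.
msg.payload.Batch = flow.get("batchLabel") || "";
return msg;
```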

After reviewing your fields, you might want to consider using runtime parameters. These can be supplied to the data mapping configuration and incorporated into the records during the data mapping process. I’ll compose a short sample.

Alternative solution using runtime parameters in a data mapping config.

Hope this helps,

Erik

  1. Define a parameter in the Data Mapping configuration

(screenshot: parameter definition in the Data Mapping configuration)

  2. Add a field via an Extraction Step and retrieve the value from the parameter by setting ‘Based on’ to ‘Runtime parameter’ and setting the ‘Parameter’ field to the name of the parameter.

  3. In my flow I’ve defined the batchNr property in the ‘inject’ node (this could also be a ‘function’ or ‘change’ node; a plain-JavaScript sketch follows this list). In my example I used the $moment function in a JSONata expression field to add a date/time stamp in a custom format.

  4. In the Properties panel of the ‘data mapping’ node I’ve added the ‘batch’ parameter and set its value to msg.batchNr.
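For completeness, here is a minimal sketch of step 3 done in a ‘function’ node instead of the ‘inject’ node’s $moment/JSONata expression. The format used (yyyyMMdd-HHmm) is just an example of a custom format:

```javascript
// Build a timestamp-based batch number and put it on msg.batchNr;
// the 'batch' runtime parameter of the 'data mapping' node reads it (step 4).
const pad = (n) => String(n).padStart(2, "0");
const now = new Date();
msg.batchNr =
    now.getFullYear() +
    pad(now.getMonth() + 1) +
    pad(now.getDate()) +
    "-" + pad(now.getHours()) + pad(now.getMinutes());
return msg;
```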

When processing the data file, the batchNr is added to the ‘batch’ field of each data record. See the screendump below.

(screenshot: processed data records with the ‘batch’ field populated)