I want to automate the task of mapping fields in the OL Connect PlanetPress DataMapper.
I have a sample .txt file that I can load into the application, and then I have to select the fields I want to map based on the position and length of each field.
I have another (Excel) file that gives me the following coordinates for the .txt file:
Position | Length | Field Name
So I want to map each field automatically using the position, length, and field name parameters. I was thinking of using AutoHotkey for this, but I don’t care how it is scripted as long as it works. Does anyone have any ideas on how to accomplish this? With AutoHotkey, I don’t know how I would tell it to start at a certain position and then select the following length, given how you have to map the data in OL Connect.
You’d need a program/script to loop through the Excel file to get each field’s parameters, then an API call to use this information to map a field. But no such API exists: the REST API doesn’t contain methods to programmatically build a data map. You could attempt to build the dataminingconfig.xml manually and then copy it into a pre-existing “empty” data map file/zip archive, but that XML spec is very likely to change from version to version, so this would be an ongoing project for sure.
A data model needs to have a fixed structure. Therefore, assigning field names dynamically is not possible.
What you could do, however, is create a detail table that contains two fields: name and value.
Then you could pass a JSON array to the DataMapper as a runtime parameter. That JSON array would contain the field names/positions that are stored in your Excel sheet. For instance:
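Something along these lines (the field names, positions, and lengths below are made up for illustration; yours would come straight from the Excel sheet):

```json
[
  { "name": "CustomerNumber", "position": 1, "length": 8 },
  { "name": "InvoiceDate",    "position": 9, "length": 10 }
]
```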
Your DataMapper would then loop through each element in the array and dynamically extract the values at the proper location. You would then store the field name and its value in a new record in the detail table.
Using the same technique, you could bypass the detail table entirely and instead, opt to have the DataMapper build a single JSON record containing all the field names and values you have extracted, and store that inside a single JSON field in the data model. That would probably make using those values even easier in a Template, especially when using Handlebars.
Using Phil’s ingenious method, it doesn’t matter what your source file is. You would create a program or script to parse it, whatever it is, and emit the JSON file. The Data Map and/or Template would use the JSON file, not the original “parameter & position” file. And by “use the JSON file”, that means authoring some custom JavaScript in the Data Map that loops through the JSON file and adds detail records.
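To make the loop concrete, here is a rough sketch in plain JavaScript. It is not a drop-in DataMapper script: the `locations` array stands in for the runtime parameter built from the Excel sheet, and the `substring()` call stands in for the DataMapper’s `data.extract()`; the row objects represent what would go into the name/value detail table.

```javascript
// Hypothetical field coordinates (1-based position, length, name),
// standing in for the JSON runtime parameter:
var locations = [
  { name: "CustomerNumber", position: 1, length: 8 },
  { name: "InvoiceDate",    position: 9, length: 10 }
];

// One fixed-width line from the sample .txt file:
var line = "100456782024-01-15";

// Loop through the coordinates and build one { name, value } row per
// field, the way each detail record would be filled in the Data Map.
// In the real configuration, substring() would be a data.extract() call.
var detailRows = locations.map(function (loc) {
  return {
    name: loc.name,
    value: line.substring(loc.position - 1, loc.position - 1 + loc.length)
  };
});
```

After the loop, `detailRows` holds one name/value pair per coordinate entry, e.g. `{ name: "CustomerNumber", value: "10045678" }`.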
Attached is a sample DataMapper configuration ( Dynamic Extract.OL-datamapper ) that uses the technique I described earlier. It expects a runtime parameter named Locations and dynamically extracts all values with a simple script into a single field named FullRecord.
If the data file was an Excel sheet, or a PDF, then you’d just have to modify the script’s data.extract() statement accordingly.
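I don’t have the attached configuration, but the single-JSON-field variant of the technique can be sketched like this in plain JavaScript. Again, `substring()` is only a stand-in for `data.extract()`, and the coordinates here are invented; in the real Data Map they would come from the Locations runtime parameter.

```javascript
// Hypothetical coordinates, standing in for the Locations runtime parameter:
var locations = [
  { name: "CustomerNumber", position: 1, length: 8 },
  { name: "InvoiceDate",    position: 9, length: 10 }
];

// One fixed-width line from the sample .txt file:
var line = "100456782024-01-15";

// Collect every extracted value into one object keyed by field name.
// In the actual Data Map, only the extraction call would change if the
// source were an Excel sheet or a PDF instead of plain text.
var extracted = {};
locations.forEach(function (loc) {
  extracted[loc.name] = line.substring(loc.position - 1, loc.position - 1 + loc.length);
});

// This stringified object is what would be stored in the single
// FullRecord field of the data model:
var FullRecord = JSON.stringify(extracted);
```

The Template side then only ever deals with one field, `FullRecord`, regardless of how many coordinates the Excel sheet defines.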
I see how it is setting the fields in the data mapper, but how would you call on those fields in the designer? For instance, if I wanted to place the CustomerNumber field, how would I input that into the sheet? Using @CustomerNumber@ did not work.
If you have a 2022.2 version of the Designer, dragging and dropping the field from the Data Model pane onto the template will automatically generate a Handlebars expression in the form of {{FullRecord.CustomerNumber}}.
No script is required; this will be recognized immediately by the content creation engine.
With an older version, you will have to use a script that parses the JSON in the field and that returns the element you are looking for.
// Parse the JSON stored in the FullRecord field, then output one element:
var a = JSON.parse(record.fields.FullRecord);
results.html(a.CustomerNumber);