Wondering if anyone in the community has had success monitoring PlanetPress with Elasticsearch?
Elasticsearch is a popular enterprise-grade search and analytics engine; together with the rest of the Elastic Stack, it can ship log files to a central server and alert on rules.
Typically, log files are ingested into Elasticsearch in one of two ways: either the source application pushes them directly via the REST API, or a Filebeat agent ships them in. Filebeat exists to support products that were never coded to talk to Elasticsearch directly.
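For context, shipping a log directory with Filebeat only takes a minimal config along these lines. The path and host below are placeholders I've invented, not actual PlanetPress locations:

```yaml
# filebeat.yml (minimal sketch): the path and host below are
# placeholders, not actual PlanetPress install locations.
filebeat.inputs:
  - type: filestream
    id: planetpress-workflow-logs
    paths:
      - 'C:\PlanetPress\Log\*.log'   # placeholder, point at the real Workflow log folder
output.elasticsearch:
  hosts: ["https://elastic.example.com:9200"]
```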
As we’re likely to use Filebeat, I’m specifically wondering whether anyone in this community has created Grok patterns for ingesting PlanetPress Connect log files that they would be willing to share? These can be cumbersome and time-consuming to create by hand.
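To illustrate what I mean, here is the shape of a Grok pattern against a made-up log line. The layout and field names are invented for illustration, not the actual Connect format, and the real thing would likely need a pattern per module:

```
# Hypothetical log line (layout invented, not the actual Connect format):
#   2024-03-12 10:15:32,481 INFO  [DataMapper] Job 42 started
%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:level}\s+\[%{DATA:engine}\] %{GREEDYDATA:message}
```

(Grok itself runs in Logstash or in an Elasticsearch ingest pipeline; Filebeat just ships the raw lines to whichever of those you use.)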
Are there any APIs for retrieving logs from PlanetPress?
May I please recommend a feature request to provide the logs in JSON Lines format, for easy interpretation by third-party logging utilities?
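Something along these lines, one JSON object per line; the field names here are invented for illustration:

```json
{"ts":"2024-03-12T10:15:32.481Z","level":"INFO","engine":"DataMapper","jobId":"42","message":"Job 42 started"}
```

Filebeat can decode that natively (the filestream input has an ndjson parser), so no Grok patterns would be needed at all.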
Following this thread closely. We use Grafana and its stack, but the concepts of log ingestion and parsing patterns apply all the same.
This would be a huge benefit.
I’ve played around with this a bit. The log produced by Workflow is configurable in the server settings, mostly by adding/removing/rearranging predefined tokens, and the docs provide a regex to parse the default format.
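One thing that may help here: Grok accepts raw regexes with named captures, so a documented regex could be reused almost verbatim by wrapping the groups you care about. A sketch, with a pattern invented for illustration rather than the one from the docs:

```
(?<timestamp>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2},\d{3}) (?<level>\w+)\s+(?<message>.*)
```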
The main issue I’m facing is the (to me) unclear structure of the logging. Each Connect module produces its own log (DataMapper, Weaver, the Merge engines), but I don’t understand how one can follow a job across the different engines, since each one logs under a new identifier unrelated to the previous engine’s.
Overall, a good guide and documentation on Connect and Workflow logging would be beneficial, at the very least to help navigate the logs.