Monitoring platform events
Requirements
- an active Router2 application with EximeeEvents; the application is provided with the Eximee platform since version 2.3167.0
- an active WebformsRest application; the application is provided with the Eximee platform
- an active MonitoringLogProcessor application; the application is provided with the Eximee platform since version 2.3019.0
- an active database for storing events; the application requires PostgreSQL version >= 9.5 to run
- an active Kafka instance; the Eximee platform provides Kafka in the appropriate version. If you want to use a Kafka instance already existing in the bank, we recommend version >= 3.4
- an active Grafana instance, version 8.X.X or newer; a sample monitoring dashboard is provided with the platform in the grafana.json file
The functionality is enabled in the platform by adding the value "monitoring" to the COMPOSE_PROFILES variable.
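For a Docker Compose based deployment, this could look roughly as follows (the variable name comes from the text above; the file layout is illustrative):

```
# .env file read by docker compose; enables the "monitoring" profile
COMPOSE_PROFILES=monitoring
```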
Operating assumptions
The Webforms application generates events related to the status of forms.
Multiple implementations of event logging can be active at the same time, depending on configuration and deployment. EximeeEventsConnector (one of the possible implementations) collects events and sends them to EximeeEvents.
EximeeEvents writes them to the appropriate Kafka topic.
MonitoringLogProcessor reads events from Kafka, verifies them, and writes them to the database (PostgreSQL) in a format readable by Grafana.
Events in the database can be read by external applications for visualization or analysis. A Grafana configuration for visualizing events will be provided with the platform.

Router2 application with EximeeEvents
Purpose
EximeeEvents is a plugin for the Router2 application. It provides an endpoint that accepts events, which are then written to the appropriate Kafka topic.
The topic and message group sent to Kafka must be consistent with the MonitoringLogProcessor configuration.
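As a rough illustration, submitting an event to such an endpoint could look like the sketch below. The endpoint path and the payload field names are assumptions for illustration, not the documented EximeeEvents contract; the port matches the `eximeeEventsProcessor` URL shown later in this page.

```python
import json
from datetime import datetime, timezone
from typing import Optional

# Hypothetical helper: field names ("origin", "type", "timestamp",
# "payload") are assumptions, not the actual EximeeEvents contract.
def build_event(event_type: str, payload: Optional[dict] = None) -> dict:
    event = {
        "origin": "SYSTEM",
        "type": event_type,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
    if payload is not None:
        event["payload"] = payload
    return event

body = json.dumps(build_event("FORM_NEW_STARTED", {"formId": "demo-form"}))

# Sending the event would then look roughly like this
# (requires the third-party `requests` package; "/events" is a guess):
# import requests
# requests.post("http://localhost:9002/events", data=body,
#               headers={"Content-Type": "application/json"})
```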
Configuration
For configuration and more information, see [Process event registration].
MonitoringLogProcessor application
Purpose
The application's task is to populate the database (currently only PostgreSQL) with platform events.
The application reads events from the source (Kafka) and verifies whether each is a system event (originating from the platform). System events are then adapted for Grafana and written to the database.
Configuration
Database - the target place for storing events

| Variable | Required | Example value | Description |
| --- | --- | --- | --- |
| MONITORING_LOG_PROCESSOR_DATABASE_DIALECT | YES | org.hibernate.dialect.PostgreSQL95Dialect | Dialect of the database intended for storing events. |
| MONITORING_LOG_PROCESSOR_DATABASE_URL | YES | | Connection URL of the database. |
| MONITORING_LOG_PROCESSOR_DATABASE_USERNAME | YES | db_user | User with permissions to write events. |
| MONITORING_LOG_PROCESSOR_DATABASE_PASSWORD | YES | db_pass | Password of the database access user. |
Kafka

| Variable | Required | Example value | Description |
| --- | --- | --- | --- |
| MONITORING_LOG_PROCESSOR_KAFKA_TOPIC | YES | all | Topic to which event messages are sent. |
| MONITORING_LOG_PROCESSOR_KAFKA_GROUP | YES | eximee-events | Eximee event consumer group. |
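Taken together, a sketch of the environment configuration might look like this. All values are illustrative; in particular, the JDBC-style URL is an assumption based on the PostgreSQL requirement, not a documented default.

```
MONITORING_LOG_PROCESSOR_DATABASE_DIALECT=org.hibernate.dialect.PostgreSQL95Dialect
MONITORING_LOG_PROCESSOR_DATABASE_URL=jdbc:postgresql://db-host:5432/events_db
MONITORING_LOG_PROCESSOR_DATABASE_USERNAME=db_user
MONITORING_LOG_PROCESSOR_DATABASE_PASSWORD=db_pass
MONITORING_LOG_PROCESSOR_KAFKA_TOPIC=all
MONITORING_LOG_PROCESSOR_KAFKA_GROUP=eximee-events
```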
Recorded events
Events must be of system origin (origin = SYSTEM).
Event types are mapped to a business type. A type that has no business counterpart receives an empty value. The mapping of event types is shown below:
| System event type | Business event type | Description |
| --- | --- | --- |
| FORM_NEW_STARTED | FORM_ENTRY | Entry into the form |
| FORM_NEXT_PAGE, FORM_PREVIOUS_PAGE | PAGE_SAVE | Saving the state of the current page when moving to the next or previous page |
| FORM_CLIENT_PAGE_CHANGED | PAGE_ENTRY | Page change: entering a different page |
| FORM_SAVE | FORM_SAVE | Form submission |
| FORM_DRAFT_SAVE | FORM_DRAFT_SAVE | Saving a draft version of the form |
| FORM_DRAFT_RESTORE | FORM_DRAFT_RESTORE | Restoring a draft version of the form |
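The mapping rule above can be sketched in a few lines. This is an illustration of the described behavior, not the processor's actual implementation:

```python
# System-to-business event type mapping as described in the table above.
EVENT_TYPE_MAPPING = {
    "FORM_NEW_STARTED": "FORM_ENTRY",
    "FORM_NEXT_PAGE": "PAGE_SAVE",
    "FORM_PREVIOUS_PAGE": "PAGE_SAVE",
    "FORM_CLIENT_PAGE_CHANGED": "PAGE_ENTRY",
    "FORM_SAVE": "FORM_SAVE",
    "FORM_DRAFT_SAVE": "FORM_DRAFT_SAVE",
    "FORM_DRAFT_RESTORE": "FORM_DRAFT_RESTORE",
}

def to_business_type(system_type: str) -> str:
    # A type with no business counterpart receives an empty value.
    return EVENT_TYPE_MAPPING.get(system_type, "")
```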
Database structure (PostgreSQL)
The database structure holding business events is defined as follows.
Table: events

| Field name | Type | Nullable | Description |
| --- | --- | --- | --- |
| id | bigserial | FALSE | Sequential event identifier |
| origin | text | FALSE | Event origin (SYSTEM / BUSINESS) |
| type | text | FALSE | Business event type (see the mapping table above) |
| timestamp | timestamp | FALSE | Time of the event invocation (date and time) |
| payload | jsonb | TRUE | Event data in JSON form, if the event has additional data |
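Based on the table above, the schema could be sketched roughly as below. This is a reconstruction for illustration only (the primary key is an assumption); the authoritative migration scripts are those delivered with the platform.

```sql
-- Illustrative sketch reconstructed from the table above,
-- not the actual migration script.
CREATE TABLE events (
    id          bigserial NOT NULL PRIMARY KEY,  -- primary key is assumed
    origin      text      NOT NULL,              -- SYSTEM / BUSINESS
    type        text      NOT NULL,              -- business event type
    "timestamp" timestamp NOT NULL,              -- time of the event
    payload     jsonb                            -- optional JSON event data
);
```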
Migration scripts
Migration scripts that create and modify the table are delivered with the platform in an archive.
Webforms application with event logging
Purpose
The event logging mechanism is designed to collect events related to changes in the status of forms across the platform. Events are sent in batches to Kafka using EximeeEvents.
Configuration
To enable event collection, set the corresponding configuration in the /etc/eximee/webforms.xml file. Below is the fragment of the configuration responsible for the active event collection flow.
```xml
<?xml version="1.0" encoding="UTF-8" ?>
...
<features>
    ...
    <eventLog>
        <enabled>true</enabled>
        <coreExecutorPoolSize>5</coreExecutorPoolSize>
        <maxExecutorPoolSize>10</maxExecutorPoolSize>
        <eximeeEventsProcessor>
            <enabled>true</enabled>
            <url>http://localhost:9002</url>
        </eximeeEventsProcessor>
    </eventLog>
</features>
...
```

The table below presents the installation configuration.
| Parameter | Example value | Description |
| --- | --- | --- |
| server.features.eventLog.enabled | true | Activity of the event collection mechanism |
| server.features.eventLog.coreExecutorPoolSize | 5 | Number of threads for event processing |
| server.features.eventLog.maxExecutorPoolSize | 10 | Maximum number of threads for event processing |
| server.features.eventLog.eximeeEventsProcessor.enabled | true | Activity of the EximeeEvents mechanism |
| server.features.eventLog.eximeeEventsProcessor.url | http://localhost:9002 | Pointer to the EximeeEvents application for writing events to Kafka |