Data Gateways configuration files

Configuration files are created from the configured Data Gateways settings and outputs. They are composed of three main sections, Input, Filter, and Output, as outlined below.

Input

This section is automatically generated by Blue Prism based on the Blue Prism database settings. It determines how events are pulled into the Data Gateways engine for processing. In the example below, they are retrieved from the BPADataPipelineInput table in the Blue Prism database.

```
input {
  jdbc {
    jdbc_driver_library => "..\sqljdbc_4.2\enu\jre8\sqljdbc42.jar"
    jdbc_connection_string => "jdbc:sqlserver://SQL_SERVER_INSTANCE:1433;databaseName=BP_DATABASE;"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_user => "<%Data Gateways BP Database SQL User.username%>"
    jdbc_password => "<%Data Gateways BP Database SQL User.password%>"
    statement => "delete top(3000)from BPADataPipelineInput with (rowlock, readpast) output deleted.eventdata"
    schedule => "*/3 * * * * *"
  }
}
```

If required, the following parts of the input can be edited to suit your preferences.

`schedule => "*/3 * * * * *"`

This determines how often the SQL query that requests data from the BPADataPipelineInput table is executed. The default value of every three seconds can be updated by replacing 3 with the required value.
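The schedule is a six-field cron expression in which the first field represents seconds. For example, to poll every ten seconds instead (an illustrative value, not a recommendation), only the first field changes:

```
schedule => "*/10 * * * * *"
```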

 

`statement => "delete top(3000)from BPADataPipelineInput with (rowlock, readpast) output deleted.eventdata"`

This is the SQL statement that is executed against the Blue Prism database to pull events out of the BPADataPipelineInput table. The top(3000) value controls the maximum number of rows pulled from the BPADataPipelineInput table at each interval. The default value of 3000 can be changed if required.
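For example, to reduce the batch size to 1000 rows per interval (an illustrative value), only the top() value changes:

```
statement => "delete top(1000)from BPADataPipelineInput with (rowlock, readpast) output deleted.eventdata"
```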

Filter

Filters can be used to perform intermediary processing on an event. These could be actions such as adding, removing, or modifying certain fields of an event before they are sent to the outputs, for example, removing the AttributeXML field of a session log.

All available filter plugins are listed here: https://www.elastic.co/guide/en/logstash/current/filter-plugins.html

By default, the configuration generated by Blue Prism contains a single filter:

```
filter {
  json {
    source => "eventdata"
    target => "event"
  }
}
```

This default JSON filter parses and expands the JSON string that contains the configured data types (session logs, dashboards, and so on) so that their contents are accessible in the configuration file.

This section can be edited to add and remove filters, but the default JSON filter should not be removed or amended.
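As a sketch of the kind of filter that could be added alongside the default, the example below uses the standard Logstash mutate plugin to drop the AttributeXML field from session log events before they reach the outputs (the choice of field to remove is illustrative):

```
filter {
  # Default JSON filter generated by Blue Prism - do not remove or amend
  json {
    source => "eventdata"
    target => "event"
  }
  # Illustrative addition: remove the AttributeXML field from events
  mutate {
    remove_field => ["[event][EventData][AttributeXML]"]
  }
}
```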

Output

Outputs determine where events are sent. If outputs have been configured using the Data Gateways wizard, they will be included here. Every event processed will be sent to every output listed in the configuration.

```
output {
  file {
    path => "C:\data.txt"
  }
  csv {
    path => "C:\data.csv"
  }
}
```

In the example above, a text file output and a CSV file output are specified. Every event processed by the Data Gateways engine is written to a text file at C:\data.txt and also to a CSV file at C:\data.csv.

For a list of available output plugins, see: https://www.elastic.co/guide/en/logstash/current/output-plugins.html

Event structure

This section details the structure of events in Logstash after they are received from Blue Prism. This information can be used to construct conditional statements in the Logstash configuration to divert events to outputs based on their content, or for creating custom message formats for your outputs.

The event (either session log or published dashboard) is stored in the Blue Prism database as a JSON string. To turn this JSON string into a set of fields that can be used in Logstash, the JSON filter is added to the configuration:

```
filter {
  json {
    source => "eventdata"
    target => "event"
  }
}
```

This adds the session log or published dashboard as fields nested under the "event" field.

For example:

[event][EventType] contains the type of event (session log, published dashboard, or custom object data).

[event][EventData] contains the data for the event as nested fields.

[event][EventData][SessionNumber] contains the Session number if this is a session log event.

To send only session logs from a process named "ProcessA" to a text file, a conditional statement can be placed around the output:

```
output {
  if [event][EventType] == 1 and [event][EventData][ProcessName] == "ProcessA" {
    file {
      path => "C:\log.txt"
    }
  }
}
```

For a full listing of the available fields see the following tables.

General

| Event | Description |
| --- | --- |
| [event][EventType] | The number that represents the event type: 1 = Session Log, 2 = Published Dashboard, 3 = Custom, 4 = Work Queue Analysis |
| [event][EventData] | The data for the event. The structure of this data will differ depending on the event type. |

Event type – session logs

| Event | Description |
| --- | --- |
| [event][EventData][StartDate] | The start date of the process stage formatted in ISO 8601 notation. For example: "2019-02-11T07:59:54.829674+00:00" |
| [event][EventData][SessionNumber] | The session number for the session this session log belongs to. |
| [event][EventData][ResultType] | The result type of the process stage. |
| [event][EventData][Result] | The result of the process stage. |
| [event][EventData][AttributeXML] | The input and output parameters of the stage serialized to XML. |
| [event][EventData][ProcessName] | The name of the process this stage belongs to. This will be empty if the session log is logged from a business object. |
| [event][EventData][ObjectName] | The name of the business object this stage belongs to. This will be empty if the session log is logged from a process. |
| [event][EventData][ActionName] | If this log is from an Action stage, this is the name of the Action. Otherwise it will be empty. |
| [event][EventData][PageName] | The name of the page that the stage which created this session log belongs to. |
| [event][EventData][StageType] | The type of stage which created this session log. |
| [event][EventData][StageId] | The ID of the stage which created this session log. |
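As an illustration of how these session log fields can be combined in a conditional, the sketch below (the file path is an assumption) writes only stages logged from business objects to a separate file, relying on ObjectName being empty for process stages:

```
output {
  # Illustrative: ObjectName is populated only for business object stages
  if [event][EventType] == 1 and [event][EventData][ObjectName] != "" {
    file {
      path => "C:\objectlogs.txt"
    }
  }
}
```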

Event type – published dashboards

| Event | Description |
| --- | --- |
| [event][EventData][Source] | The name of the published dashboard. |
| [event][EventData][Subject] | The name of the dashboard tile which generated the data. |
| [event][EventData][Values] | The data from the dashboard tile. |

Event type – custom object data

| Event | Description |
| --- | --- |
| [event][EventData][CustomDataCollection] | The custom data from the process that will be sent. |
| [event][EventData][SessionNumber] | The session number of the process the data is coming from. |
| [event][EventData][StageID] | The ID of the stage that this action is called from. |
| [event][EventData][StageName] | The name of the Send Custom Data action stage. |
| [event][EventData][StageType] | The type of the Send Custom Data action stage. |
| [event][EventData][StartDate] | The start date of the session that the Send Custom Data action is running on. |
| [event][EventData][ProcessName] | The name of the process that the action is being called from. |
| [event][EventData][PageName] | The name of the page of the process the custom data action is on. |
| [event][EventData][ObjectName] | The object that the data is coming from – will always be "Data gateways". |
| [event][EventData][actionName] | The action that the data is coming from – will always be "Send Custom Data". |
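To illustrate how these fields can be used, the sketch below routes only custom object data events (event type 3) from a hypothetical stage named "Send stats" to their own file (the stage name and file path are assumptions):

```
output {
  # Illustrative: match custom object data from one specific stage
  if [event][EventType] == 3 and [event][EventData][StageName] == "Send stats" {
    file {
      path => "C:\customdata.txt"
    }
  }
}
```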

Directing data to outputs based on content

Conditional statements can be applied to outputs so that events are only passed to an output if they meet one or more conditions. For example, session logs and dashboard data can be sent to separate text files. This allows outputs, customized in the advanced editor or created in an external text editor, to support Logstash functionality not provided in the Data Gateways configuration wizard, such as sending data only for specified processes or dashboard tiles.

In this example, the conditional statements around the file outputs check for a certain EventType value. Session logs have an event type of 1, and dashboards have an event type of 2.

```
input {
  jdbc {
    jdbc_driver_library => "..\sqljdbc_4.2\enu\jre8\sqljdbc42.jar"
    jdbc_connection_string => "jdbc:sqlserver://localhost\sqlexpress:1433;databaseName=a;"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_user => "<%Data Gateways BP Database SQL User.username%>"
    jdbc_password => "<%Data Gateways BP Database SQL User.password%>"
    statement => "delete top(3000)from BPADataPipelineInput with (rowlock, readpast) output deleted.eventdata"
    schedule => "*/3 * * * * *"
  }
}

filter {
  json {
    source => "eventdata"
    target => "event"
  }
}

output {
  if [event][EventType] == 2 and [event][EventData][Source] == "Dashboard 1" {
    file {
      path => "C:\dashboardlogs.txt"
      codec => line { format => "%{event}"}
    }
  }
  if [event][EventType] == 1 {
    file {
      path => "C:\sessionlogs.txt"
      codec => line { format => "%{event}"}
    }
  }
}
```

For information about the structure of events pulled from the Blue Prism database see Event structure.

Advanced configuration for database outputs

Database outputs configured in the wizard must adhere to an expected format:

  • There must be an eventType column of type integer – this stores the type of the event.
  • There must be an eventData column of type nvarchar(max) – this stores the events serialized to a JSON string.

In advanced configurations, the columns of the table and the data inserted into the table can be customized.

In this example, certain fields from the session log events are sent to the tableabc table in a database.

The jdbc database output inserts the session number, process name, and attributexml fields from the session log into the appropriate columns of the tableabc table.

```
input {
  jdbc {
    jdbc_driver_library => "..\sqljdbc_4.2\enu\jre8\sqljdbc42.jar"
    jdbc_connection_string => "jdbc:sqlserver://localhost\sqlexpress:1433;databaseName=a;"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_user => "<%Data Gateways BP Database SQL User.username%>"
    jdbc_password => "<%Data Gateways BP Database SQL User.password%>"
    statement => "delete top(3000)from BPADataPipelineInput with (rowlock, readpast) output deleted.eventdata"
    schedule => "*/3 * * * * *"
  }
}

filter {
  json {
    source => "eventdata"
    target => "event"
  }
}

output {
  if [event][EventType] == 1 {
    bpjdbc {
      connection_string => "jdbc:sqlserver://TheServer;databaseName=MyDB;integratedSecurity=true;"
      driver_jar_path => "..\sqljdbc_4.2\enu\jre8\sqljdbc42.jar"
      driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
      statement => ["insert into tableabc(sessionnumber, processname, attributexml) values(?, ?, ?)", "[event][EventData][SessionNumber]", "[event][EventData][ProcessName]", "[event][EventData][AttributeXML]"]
    }
  }
}
```

Custom configuration examples

For a full list of all events and more information about the event structure, see Event structure.

Filter events and divert outputs using conditional statements

In this example, the [event][EventType] field is used to send events to separate files depending on whether the event is a session log (EventType == 1) or a published dashboard (EventType == 2).

The event type for custom object data (EventType == 3) is not specified, so any data of this type in the Data Gateways engine is discarded.

```
input {
  jdbc {
    jdbc_driver_library => "..\sqljdbc_4.2\enu\jre8\sqljdbc42.jar"
    jdbc_connection_string => "jdbc:sqlserver://localhost\sqlexpress:1433;databaseName=a;"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_user => "<%Data Gateways BP Database SQL User.username%>"
    jdbc_password => "<%Data Gateways BP Database SQL User.password%>"
    statement => "delete top(3000)from BPADataPipelineInput with (rowlock, readpast) output deleted.eventdata"
    schedule => "*/3 * * * * *"
  }
}

filter {
  json {
    source => "eventdata"
    target => "event"
  }
}

output {
  if [event][EventType] == 2 and [event][EventData][Source] == "Dashboard 1" {
    file {
      path => "C:\dashboardlogs.txt"
      codec => line { format => "%{event}"}
    }
  }
  if [event][EventType] == 1 {
    file {
      path => "C:\sessionlogs.txt"
      codec => line { format => "%{event}"}
    }
  }
}
```

Send events based on session log process names

In this example, events are sent to a particular output based on the process name from a session log. There are two outputs:

  • All session log events are sent to the C:\allevents.txt text file.
  • Session log events from the Process123 process are additionally sent to the specified HTTP endpoint.

```
input {
  jdbc {
    jdbc_driver_library => "..\sqljdbc_4.2\enu\jre8\sqljdbc42.jar"
    jdbc_connection_string => "jdbc:sqlserver://localhost:1433;databaseName=ExampleDB;"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_user => "<%SQL Serv.username%>"
    jdbc_password => "<%SQL Serv.password%>"
    statement => "delete top(3000)from BPADataPipelineInput with (rowlock, readpast) output deleted.eventdata"
    schedule => "*/3 * * * * *"
  }
}

filter {
  json {
    source => "eventdata"
    target => "event"
  }
}

output {
  if [event][EventType] == 1 {
    file {
      path => "c:\allevents.txt"
      codec => line { format => "%{event}"}
    }
  }
  if [event][EventType] == 1 and [event][EventData][ProcessName] == "Process123" {
    bphttp {
      url => "localhost:8080/api/post"
      http_method => "post"
      headers => {"Authorization" => "Basic <base64><%SQL Serv.username%>:<%SQL Serv.password%></base64>"}
    }
  }
}
```

Credentials in custom configurations

When credentials or other sensitive data are required in the configuration, they should be added to a Blue Prism credential and then referenced in the configuration by the credential name.

When creating a Blue Prism credential for use in Data Gateways configurations, the credential type must be Data Gateway Credential. These credentials are accessible only by the Data Gateway system and are not accessible to Blue Prism processes.

Credentials can be referenced in the configuration using the syntax <%{credentialname}.{property}%>, where {credentialname} is the name of the credential and {property} is the name of the property in the credential.

For example, to use the username and password of a credential named cred1, the configuration would reference <%cred1.username%> and <%cred1.password%> respectively.

Credential custom properties can be accessed using the property name.
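As a sketch of how a credential could be used in an output, the example below references a custom property named apikey on a hypothetical credential named cred1 in a bphttp output header (the URL and property name are assumptions for illustration):

```
output {
  bphttp {
    # Illustrative endpoint - replace with your own
    url => "https://example.com/api/events"
    http_method => "post"
    # apikey is a hypothetical custom property on the cred1 credential
    headers => { "Authorization" => "Bearer <%cred1.apikey%>" }
  }
}
```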