Manual Configuration
Configuration of the agent using the YAML file
After learning about the different types of scenarios, the available Expect Types and how to format your CSV data, it's time to jump into how to build a scenario using YAML.
The Agent YAML file can be found at:

/etc/ardexa/ardexa.yaml
All properties are key/value pairs. Keys should be written in snake_case using all lowercase letters; Values, however, can use almost any combination of characters from the Unicode (UTF-8) character set.
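For illustration, a few hypothetical key/value pairs that follow these rules (the key names below are invented, not actual agent properties):

# illustrative only - hypothetical keys, not real agent properties
site_name: "Planta Solar São Paulo"
retry_count: 3
log_level: debug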

YAML overview

For the purposes of this guide, we will be using only basic types:
  • Scalar values: Strings and Numbers
  • Lists: sequential list of items
  • Maps: group of key/value pairs
YAML uses indentation to indicate nesting.

Strings

Strings can use single or double quotes, or no quotes at all.

example 1
'example 2'
"example 3"

Lists

Each list item is denoted by a hyphen

- one
- 2
- "three"

Maps

Maps are a set of key/value pairs, starting with a key, followed by a colon and a space:

key: value
one: 1
two: 2

Nesting

These elements all combine when building scenarios:

run:
  - table: example
    source: one
    expect: keyword
    command: "echo one"
    frequency: 30
  - table: example
    source: two
    expect: keyword
    command: "echo two"
    frequency: 30

General information

Scenarios are a List of Maps such as the Nesting example above. In this article, we will discuss the simplest Scenario types: RUN (execute a command and process the output) and CAPTURE (monitor a file and treat each new line as a separate record). First we will discuss the structure of a scenario and then discuss the specifics of each Scenario type.
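As a rough sketch of the overall shape (all values below are placeholders; each property is explained in the sections that follow), a configuration can contain both scenario types side by side, each defined as a List of Maps under its own top-level key:

run:
  - table: example        # placeholder values throughout
    source: cmd-01
    expect: keyword
    command: "echo hello"
    frequency: 60

# CAPTURE scenarios live under the "tail" key (explained below)
tail:
  - table: example
    source: log-01
    expect: keyword
    file: /tmp/example.log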

Mandatory items

Table and Source

Table is used to specify which table in the cloud you would like the information to be stored in.
Source is used to identify the "thing" that generated the information stream, e.g. a solar inverter.
Table and Source combine to uniquely identify each scenario, usually presented as table.source. Therefore, a full stop (.) cannot be used in Table or Source names.
An Agent configuration can have many sources logging to the same Table, or the same Source logging to different tables.

table: solar
source: inverter-01
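For instance, two inverters could log to the same solar table, each as its own scenario, uniquely identified as solar.inverter-01 and solar.inverter-02 (the source names and commands below are placeholders for illustration):

run:
  - table: solar
    source: inverter-01   # hypothetical source names
    expect: keyword
    command: "echo inverter one"
    frequency: 30
  - table: solar
    source: inverter-02
    expect: keyword
    command: "echo inverter two"
    frequency: 30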

Expect

Expect identifies the type of information you expect to arrive. The different types are listed here. Generally speaking, we strongly recommend the use of CSV as the primary expect type as it gives users the greatest amount of flexibility and helps to simplify data collection.

expect: csv

Fields (CSV)

The CSV expect type adds a new mandatory property, Fields. Fields is a List of Maps that describes what information will appear in each column. Each entry requires a Name and Expect property, along with Units for Number-based Expect types.

fields:
  - name: one
    expect: date
  - name: two
    expect: keyword

Units (Numbers)

For Integer and Decimal types, Units are considered mandatory. If you do not want or need units, you still need to provide an entry, so simply use a hash or full stop to indicate that the value should be ignored. Units are nested inside the Meta property.

table: example
source: one
file: /tmp/power.txt
expect: integer
meta:
  units: "kW"
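Within a CSV scenario, the same Meta/Units structure nests inside each Number-based field. A hypothetical sketch (field names invented for illustration), using a hash for a field whose units should be ignored:

fields:
  - name: timestamp        # hypothetical field names
    expect: date
  - name: power
    expect: decimal
    meta:
      units: "kW"
  - name: status_code
    expect: integer
    meta:
      units: "#"           # no meaningful units, so marked to be ignored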

RUN specific properties

Command

This is the command to run. It is passed to a shell for execution, so pipes and redirects are supported.

command: 'echo 1,2 >> /tmp/log.csv'

Frequency

An integer representing how often, in seconds, the target command should be executed.

frequency: 30

CAPTURE specific properties

Capture is listed as "tail" in the configuration file, in reference to the ubiquitous Unix tool of the same name. Like the tail command, a Capture scenario monitors a given file for new lines being added.

File

The absolute path to the file to be monitored. If the file doesn't exist, the agent will wait for it to become available and then start monitoring. If the file is moved, removed, renamed or truncated, the agent will restart its monitor, ensuring it is always targeting the correct file.

file: /path/to/log.csv

Example

Here is a complete example of a RUN scenario and a CAPTURE scenario working together:

# Broker configuration
amqp:
  ...

# Run scenario
run:
  - table: test
    source: run5
    expect: keyword
    command: 'echo 1,2 >> /tmp/run.csv && echo success'
    frequency: 5
  - table: test
    source: run30
    expect: keyword
    command: 'echo 3,4 >> /tmp/run.csv && echo success'
    frequency: 30

# Capture scenario
tail:
  - table: test
    source: tail
    expect: csv
    file: /tmp/run.csv
    fields:
      - name: one
        expect: integer
        meta:
          units: '!'
      - name: two
        expect: integer
        meta:
          units: '@'