You can find the Tests screen under 'Support' in the configuration menu.
Click on 'Create Test', and the following screen pops up:
Give the test an easily identifiable name, for example the name of the process it tests.
The drop-down menu shows all available processes on the system. Choose the one you are creating this test for.
Click 'Confirm'. The new test is now added to the overview screen, with 'Not Run' as its status.
A test run of a process is successful when the outcome matches your expectation.
To configure a test, select it from the list. This gets you to the following screen:
The name of the test.
The process this test will run on.
Using YAML, describe your starting assumption.
Using YAML, describe the outcome you expect from the process.
The actual output of the process, once the test has run.
YAML is the language used to describe the starting assumption and outcome assertion for tests.
Unlike many other formatting languages, YAML is extremely straightforward. It does not require brackets (curly, square, or otherwise), closing tags, or many of the other features that make a computer language seem daunting to the uninitiated.
YAML is a clean way to write data in a natural, concise, and easy-to-read manner. The structure of what you write is determined by indentation, which makes it very intuitive.
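As a generic illustration (the names here are invented and not part of any process), a few indented lines are all it takes to describe structured data in YAML:

```yaml
person:
   name: Jane Doe
   languages:
      - Dutch
      - English
```

Each level of indentation marks a line as a child of the line above it.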
How you configure a test using YAML is best shown through an example.
You have just finished configuring a process that will update a risk profile. Before putting it live, you want to test whether the process gives you the outcome you want.
First, create the test as explained above. It will show up in the Tests overview with a ‘Not Run’ status.
Next, click on the test to get to the configuration screen.
In the 'Situation' field, start by writing the key of the ontology your instance belongs to. On the next line, specify one or more values for this instance. You do this by choosing from the information keys you have added to the process you're testing, for example 'reference_currency'.
Before you write the information key, begin the line with three spaces. This creates the characteristic YAML indentation, which tells the system the information key is a hierarchical part of the instance.
Here's what that looks like:
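A hypothetical Situation could look like this (only 'reference_currency' appears in the text above; the other five information keys and all values are invented for illustration):

```yaml
ContractualRelationship:
   reference_currency: EUR
   contract_value: 2500000
   counterparty_name: Acme Holdings
   counterparty_type: ['Corporate']
   country_of_risk: ['Netherlands']
   collateral_type: ['Cash']
```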
The first line describes an instance of the ontology 'ContractualRelationship'.
Under that, we list six information keys that are part of the 'Risk profile update' process. Note that each is written on a new line starting with three spaces. This YAML indentation clearly shows that the six items are children of the parent instance.
Following each information key, we have provided hypothetical values. How you write those values depends on the information type that goes with them.
If the value is chosen from a taxonomy, wrap it in square brackets and single quotation marks.
If the information type of the value is a string or a number, write it without any special characters.
Your assertion of what the outcome of the process's test run should be can contain any or all of these four parts:
Instances: on the following line, write the key of the ontology, for example 'Person'. Use indentation. On the next line, specify one or more values using information keys, as discussed under 'Situation' above. Again, use indentation to mark that the values are child parts of the parent ontology. Adding values to the ontology is optional.
Documents: list the id numbers of the documents you expect to be added to the process.
Sections: list the id numbers of the sections you expect to be added to the process.
Issues: list the id numbers of the rules you expect to be attached to the process.
All id numbers can be found in the configuration. For example, click 'Documents' in the configuration menu and take the id (only the numbers, not the prefix DOC-) from the first column of the Documents overview screen.
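Putting the four parts together, a sketch of a full assertion might look like this (all names, id numbers, and the exact list notation are assumptions, not taken from a real configuration):

```yaml
Instances:
   Person:
      full_name: Jane Doe
      role: ['Signatory']
Documents:
   - 1633
Sections:
   - 210
Issues:
   - 839
```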
Here's what an assertion could look like:
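For the risk-profile example above, an assertion with only Documents and Issues could be written as follows (the dash-list notation for the id numbers is an assumption):

```yaml
Documents:
   - 1633
   - 1634
   - 1643
Issues:
   - 839
   - 840
```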
In this example, based on the described starting situation, we expect the outcome of our process to be that Documents 1633, 1634, and 1643 are added. We also assert that Rules 839 and 840 will be applied.
If these assertions match the actual output of the test run, the test status will change to 'Passed'. If not, the status will be marked 'Failed'.
If we have made a language error in describing the Situation or the Assertion, for example by forgetting the colon after a key such as 'Documents', the status will show as 'Exception'.