Manual Testing with ProR

Automated tests are great (we use them – the RMF project currently runs over 200 automated tests, all passing).  But there are situations where automating a test is simply not worth the effort – GUI tests are one example.  So manual tests and their results have to be recorded somewhere.  An obvious, quick approach would be to use a spreadsheet.  But there are advantages to using a more sophisticated tool like ProR.  We’ll show here how this could be realized.

You can download the .reqif file used in this example (but please note that for production use, you would probably want to straighten it out considerably). Download Sample ReqIF File >>

Recording Tests

A simple first approach would be to create a list of test descriptions, with one column for each test run.  The following screenshot shows how this can be realized in ProR:

The top pane shows a table with three columns, “Description”, “Expected” and “0.4.0”.  The first briefly describes the test, the second describes the expected result.  The “0.4.0” column shows the results of the test run for version 0.4.0 (the last row shows how the value is selected from a drop-down rather than typed in).

If you look at the Properties View (lower pane), you see that the entry has many more attributes that are not shown in the table view.  These include the past test runs (0.1.0 to 0.3.0).  They are kept for audits, but would only clutter up the table view.  When a new version is prepared for release, a new attribute (0.5.0) would be created and shown in the table view.  0.4.0 would be removed from the table view, but of course it could always be accessed via the Properties View.  The Properties View also contains a Note field for additional information.

Traces between Requirements and Tests

All this could also be realized in a spreadsheet somehow.  But the real power of a tool like ProR unfolds when traceability comes into play: requirements or specification elements can be traced to their tests, as shown here:

You can see that there is an outgoing link from row 1.1.  The link target (the requirement) has been selected in the right column, so that the full requirement text is visible in the Properties View.  In addition, the link has an annotation in the Description column: “On Linux, hold down Ctrl-Shift”.

More Possibilities

What has been shown here can be realized with the standard ProR tool, and it only scratches the surface of what is possible.  With a little effort, this approach could be extended significantly, for example:

  • More elaborate testing instructions could be linked to the individual test entries.
  • The “suspect link” tool from ProR Essentials could be used to double-check test descriptions when a requirement changes.
  • With a little scripting, unit test results could be added automatically to the test report (a sketch at the end of this post illustrates the idea).
  • As ProR writes XML, graphical reports could be generated from the model with a standard XML processing tool (a sketch follows this list).  It would also be possible to build an Eclipse plug-in for generating these reports inside ProR.
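
To make the last point a little more concrete, here is a minimal Python sketch that pulls the test columns out of a .reqif file and prints a plain-text report.  It is not a ready-made solution: the file name manual-tests.reqif and the column names are assumptions, only string-valued attributes are handled (the drop-down verdicts are stored as enumerations and would need an extra lookup), and the element names should be checked against your actual ReqIF export.

    import xml.etree.ElementTree as ET

    # ReqIF namespace as defined by the OMG standard.
    REQIF = "http://www.omg.org/spec/ReqIF/20110401/reqif.xsd"
    NS = {"reqif": REQIF}

    def load_report(path, columns=("Description", "Expected", "0.4.0")):
        """Collect the requested columns for every SpecObject in the file."""
        root = ET.parse(path).getroot()

        # Map attribute-definition identifiers to their human-readable names.
        names = {d.get("IDENTIFIER"): d.get("LONG-NAME")
                 for d in root.iter(f"{{{REQIF}}}ATTRIBUTE-DEFINITION-STRING")}

        rows = []
        for spec_object in root.iter(f"{{{REQIF}}}SPEC-OBJECT"):
            row = {}
            # Only string attributes are handled here; enumeration values
            # (the drop-down results) would need an extra lookup.
            for value in spec_object.iter(f"{{{REQIF}}}ATTRIBUTE-VALUE-STRING"):
                ref = value.find(
                    "reqif:DEFINITION/reqif:ATTRIBUTE-DEFINITION-STRING-REF", NS)
                name = names.get(ref.text.strip()) if ref is not None and ref.text else None
                if name in columns:
                    row[name] = value.get("THE-VALUE", "")
            if row:
                rows.append(row)
        return rows

    if __name__ == "__main__":
        # "manual-tests.reqif" is a placeholder for your exported file.
        for row in load_report("manual-tests.reqif"):
            print(" | ".join(row.get(col, "") for col in ("Description", "Expected", "0.4.0")))

The same data could of course be fed into an XSLT stylesheet or a charting library to produce the graphical reports mentioned above.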

There are many more possibilities, leveraging the power of the existing ProR features, the underlying XML data model, and Eclipse as an integration platform.
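
To illustrate the scripting idea from the list above, the following sketch reads a JUnit-style XML report and prints the pass/fail verdicts that would go into the current test-run column.  The file name and the mapping from test names to ProR entries are again just assumptions for illustration.

    import xml.etree.ElementTree as ET

    def collect_results(junit_xml):
        """Return a {test name: 'P' or 'F'} mapping from a JUnit-style report."""
        results = {}
        root = ET.parse(junit_xml).getroot()
        for case in root.iter("testcase"):
            name = f"{case.get('classname', '')}.{case.get('name', '')}"
            failed = (case.find("failure") is not None
                      or case.find("error") is not None)
            results[name] = "F" if failed else "P"
        return results

    if __name__ == "__main__":
        # "TEST-results.xml" is a placeholder; point this at your build's report.
        for name, verdict in sorted(collect_results("TEST-results.xml").items()):
            print(verdict, name)

Matching these names against the Description column and writing the verdicts back into the ReqIF file would then be a small additional step – either with a script like this one, or from inside Eclipse as a small plug-in.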