Testing PLplot

We encourage those who build PLplot to test both their build-tree version and their installed-examples-tree version and report any problems back to either the plplot-general or plplot-devel mailing lists.

Testing Prerequisites

The legacy test for the build tree requires ctest (available as part of CMake) and bash on Unix systems or win-bash on Windows. The legacy test for the installed examples tree works only on Unix (Linux, Mac OS X, and presumably legacy Unix) and requires GNU-bash, make, and pkg-config.

In contrast to our legacy testing system, the new testing framework is essentially identical for both the build tree and installed examples tree. It just requires CMake (using any generator that is suitable for the Unix or Windows platform that is being used for the test), and bash on Unix or win-bash on Windows.

Build-tree tests

Build-tree tests can only be performed if cmake is invoked with the -DBUILD_TEST=ON option. Such tests are done from the top-level directory of the build tree (the directory where you invoke the cmake command that configures the build of PLplot). The methods for invoking these tests are given below for both the legacy and the new testing methods.

These tests include executing all our 31 standard examples for all language interfaces and non-interactive device drivers that we currently support. This is a comprehensive test of the PLplot build; for example, our standard examples exercise virtually all of the PLplot API. Furthermore, this series of tests generates more than 2GB of plot files in various formats. The tests also include a comparison of the PostScript (-dev psc) results and stdout from each language interface to PLplot with the corresponding results for C. In general, these results are identical, which is a stringent test of our language bindings. (Note it is the user's responsibility to ensure the locales are consistent for all languages, since inconsistent locales can produce inconsistent stdout results that have nothing to do with PLplot binding or example issues.)
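
For example, a typical out-of-source configuration with build-tree testing enabled might look like the following (the directory names here are purely illustrative; adapt them to your own layout):

mkdir build_dir
cd build_dir
cmake -DBUILD_TEST=ON ../plplot_source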

You should search the test results for obvious errors such as segfaults. In addition, you can check for rendering errors by viewing the resulting plot files with an appropriate viewer.

Results for -dev psc are a special case. To illustrate this, here are typical -dev psc results for example 1.

x01a.psc, x01c.psc, x01cxx.psc, x01d.psc, x01f.psc, x01f95.psc, x01j.psc, x01lua.psc, x01o.psc, x01ocaml.psc, x01p.psc, x01pdl.psc, and x01t.psc.

These correspond to the Ada, C, C++, D, Fortran 77, Fortran 95, Java, Lua, Octave, OCaml, Python, Perl/PDL, and Tcl -dev psc (colour PostScript) results for our standard example 1. The test referred to above compares every file in this list except x01c.psc against that file, so rendering errors only need to be looked for in x01c.psc (with your favorite PostScript viewing application) and in any of these files that the report shows differ from x01c.psc. The same holds for the -dev psc results for the rest of our 31 standard examples.
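
If you want to compare one of these files against x01c.psc by hand, here is a minimal sketch of such a comparison; it assumes the only expected differences between matching files are PostScript header comments such as %%Title and %%CreationDate (PLplot's own test scripts may filter the files differently):

diff <(grep -v '^%%' x01c.psc) <(grep -v '^%%' x01j.psc)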

Here are typical plot file results from our standard example 1 for devices other than -dev psc.

x01c.pdfcairo, x01c.ps, x01c.psttf, x01c.psttfc, x01c01.bmpqt, x01c01.epsqt, x01c01.jpgqt, x01c01.pdfqt, x01c01.pngcairo, x01c01.pngqt, x01c01.ppmqt, x01c01.svg, x01c01.svgcairo, x01c01.svgqt, x01c01.tiffqt, and x01c01.xfig.

Since a different device is involved in each case, these should be looked at individually for rendering errors (and similarly for the rest of our 31 standard examples). Note such visual inspection is a huge job, so we certainly don't expect it of our testers very often, but it is worth doing once every year or so, especially for newer examples that haven't been checked before. On Unix platforms a general all-purpose viewer for all these file formats is the ImageMagick display application.
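
For example, to look at one of the files listed above with that application:

display x01c01.pngcairo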

Invocation of legacy tests in the build tree

After running "make all" from the top-level of the build tree, then run

ctest --verbose >& ctest.out

This creates test plot file results in the plplot-test subdirectory of the build tree, and ctest.out should contain a table comparing the PostScript results from each of our standard examples and each of our language bindings against the corresponding C results.
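
A quick way to scan the captured output for obvious problems is something like the following (the search patterns are only suggestions):

grep -i -e error -e fail -e segfault ctest.out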

Invocation of tests in the build tree using the new testing framework

One advantage of the new testing framework is that it has full dependencies implemented (unlike ctest, which requires "make all" to be run first).

A second advantage of the new testing framework is that, for generators with parallel execution ability, you can save a lot of time on hardware with more than one CPU by using the parallel execution options (e.g., the -j option for GNU make).

A final advantage of the new testing framework is that the tests are divided up in a much finer-grained manner. To see all the test targets that are available, run

make help | grep test

(On platforms with CMake generators other than make, you will have to do something equivalent to this search to find all test targets.)

Three important test targets are test_diff_psc, test_noninteractive and test_interactive.

test_diff_psc generates all -dev psc results and compares them against the corresponding C results, producing the same report that is obtained from ctest. Note this target generates only -dev psc results.

test_noninteractive runs test_diff_psc as well as every other PLplot example that produces a file. (Note that test_noninteractive is somewhat more comprehensive than legacy ctest and considerably more comprehensive than the test_diff_psc target.)

test_interactive runs all interactive devices for the standard C examples as well as all special interactive examples. Very little user intervention is required because, where possible, the PLplot -np (no-pause) command-line option is used for these tests.
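
For example, with the GNU make generator these targets can be invoked directly from the top-level directory of the build tree (the -j value and the output file name below are only illustrations):

make -j4 test_diff_psc
make -j4 test_noninteractive >& test_noninteractive.out
make test_interactive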

Install-tree tests

After PLplot has been configured (with "cmake"), built (with "make"), and installed (with "make install"), you can test the PLplot install on Unix systems by running the following commands:

cp -a $prefix/share/plplot$plplot_version/examples /tmp
cd /tmp/examples
make test_noninteractive >& make_test.out
make test_interactive

where "$prefix" is the installation prefix chosen at the configuration stage, and $plplot_version is the PLplot version (currently 5.9.4). The effect of the above "cp" and "cd" commands is to copy the examples subtree of the install tree to /tmp and build and test the examples in the copied subtree to keep a clean install tree. However, an alternative is to replace those two commands with

cd $prefix/share/plplot$plplot_version/examples

and build and test the install-tree examples right in the examples subtree of the install tree with the above "make" commands.
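
To make the path above concrete, with the CMake default installation prefix of /usr/local (your prefix may well differ) the examples subtree of the install tree would be

/usr/local/share/plplot5.9.4/examples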

Regardless of whether you build and test the examples in a copy of the examples subtree of the install tree or directly in that subtree, check all the *.out files for any errors.

N.B. the above "make test_noninteractive" command does the same tests for the installed PLplot version as ctest does for the build-tree version of PLplot so the same remarks hold about the comprehensive nature of the test, the size (2GB) of the files produced, the comparison of results for various language interfaces with the C result for -dev psc, locale issues, and testing non-psc results using the viewer appropriate for the format. Although the tests are the same, the Makefile implementation is different from the ctest implementation in the build tree so that in particular you can do these install tree tests in parallel using the (GNU-)make -j option to greatly speed up these tests on a multi-processor computer.

N.B. the above "make test_interactive" command executes our interactive examples. There is currently no counterpart for this test in the build tree.