Test Class Sample

Introduction

These examples cover the use of Test Classes to create logical groupings of test cases. For unit testing, one test class is typically created for each class under test, although more complicated scenarios often justify other arrangements. Test Classes require a C++ compiler.

NOTE: each of the example test classes is grouped in a namespace corresponding to its category (e.g. Basic or Runtime Services). This is done for organizational purposes only -- it is not a general requirement that test classes be placed into namespaces.

Tests Description

Basic

These examples cover the simplest way to code a STRIDE test class. They use simple POD return types to indicate status.

Basic::Simple

This example demonstrates passing and failing tests using the four primary integer types (int, bool, short, and char) from which status can be inferred.
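
For illustration, a test class of this kind might look like the sketch below. The scl_test_class publication pragma and the exact mapping from return values to pass/fail status are assumptions here; consult the installed sample source for the authoritative form.

<pre>
// Illustrative sketch only -- pragma name and status mapping are assumptions.
namespace Basic
{
    class Simple
    {
    public:
        bool boolCase()  { return true;  }   // assumed to be reported as a pass
        bool failsCase() { return false; }   // assumed to be reported as a failure
        int  intCase()   { return 0;     }   // int, short and char returns are inferred the same way
    };
}
#pragma scl_test_class(Basic::Simple)        // assumed publication pragma
</pre>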

Basic::Fixtures

This example demonstrates how to use setup and teardown fixtures. The setup and teardown methods are called immediately before and after the execution of each test method, respectively.
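
A sketch of the shape of a fixtured class follows; the method names and the way they are registered as fixtures (via the fixturing pragmas linked above) are assumptions for illustration.

<pre>
// Illustrative sketch -- fixture method names and registration are assumptions.
namespace Basic
{
    class Fixtures
    {
    public:
        void setUp()    { counter_ = 0; }   // assumed setup: runs before each test method
        void tearDown() { counter_ = 0; }   // assumed teardown: runs after each test method

        bool firstTest()  { return ++counter_ == 1; }  // sees a fresh fixture each time
        bool secondTest() { return ++counter_ == 1; }

    private:
        int counter_;
    };
}
</pre>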

Basic::Exceptions

This example demonstrates that exceptions thrown from a test method are caught by the intercept module and noted in the results. Any exception caught by the harness is assumed to indicate failure. If you want to write tests for expected exceptions, consider using our exception macros.
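
The sketch below (plain C++, hypothetical names) shows the behavior being demonstrated: the throwing method is caught by the harness and reported as a failure rather than terminating the run.

<pre>
#include <stdexcept>

// Illustrative sketch -- class and method names are hypothetical.
namespace Basic
{
    class Exceptions
    {
    public:
        bool throws()  { throw std::runtime_error("unexpected"); }  // caught by the harness, reported as a failure
        bool noThrow() { return true; }                              // unaffected by the exception above
    };
}
</pre>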

Basic::Parameterized

This example demonstrates how to pass arguments to the constructor of your test unit. This is useful when you want to run the same test scenario with different sets of input data, as described by this pattern.
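
As a rough sketch (hypothetical names), the constructor arguments become the data the test methods run against; the actual argument values are supplied at execution time by the STRIDE infrastructure.

<pre>
// Illustrative sketch -- names are hypothetical.
namespace Basic
{
    class Parameterized
    {
    public:
        Parameterized(int input, int expected)
            : input_(input), expected_(expected) {}

        bool producesExpectedValue() { return input_ * 2 == expected_; }  // stand-in for code under test

    private:
        int input_;
        int expected_;
    };
}
</pre>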

Runtime Services

These examples cover basic usage of the Runtime Test Services API (as declared in srtest.h).

RuntimeServices::Simple

This example demonstrates how to use srTestCaseSetStatus to set status and srTestCaseAddAnnotation to add a comment.
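
A minimal sketch of the idea follows; the parameter lists and the status constant are assumptions, so check srtest.h for the real declarations.

<pre>
#include "srtest.h"

// Illustrative sketch -- argument lists and srTEST_PASS are assumptions.
namespace RuntimeServices
{
    class Simple
    {
    public:
        int annotatedCase()
        {
            srTestCaseAddAnnotation("verified the boundary condition");  // attaches a comment to this case
            srTestCaseSetStatus(srTEST_PASS);                            // sets the case status explicitly
            return 0;
        }
    };
}
</pre>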

RuntimeServices::Dynamic

This example demonstrates how to use srTestSuiteAddCase, srTestSuiteAddAnnotation, and srTestAnnotationAddComment for dynamic case and annotation creation in the context of a single test method.
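
The general shape is sketched below; the handle types and argument lists are assumptions, so treat this as pseudocode against the real srtest.h declarations.

<pre>
#include "srtest.h"

// Illustrative sketch -- return/handle types and argument lists are assumptions.
namespace RuntimeServices
{
    class Dynamic
    {
    public:
        void generatesCases()
        {
            for (int i = 0; i < 3; ++i)
            {
                srTestSuiteAddCase("dynamically created case");          // adds a new case to the suite
                auto note = srTestSuiteAddAnnotation("run details");     // adds an annotation to the suite
                srTestAnnotationAddComment(note, "created at runtime");  // comments on that annotation
            }
        }
    };
}
</pre>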

RuntimeServices::Override

This example demonstrates how to use srTestCaseSetStatus to override the status that would otherwise be inferred from the return value.
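
For example (status constant assumed), the explicit call takes precedence over the value the method returns:

<pre>
#include "srtest.h"

// Illustrative sketch -- srTEST_FAIL is a stand-in for the real constant.
namespace RuntimeServices
{
    class Override
    {
    public:
        int overridesReturnValue()
        {
            srTestCaseSetStatus(srTEST_FAIL);  // the explicit status wins
            return 0;                          // this return no longer determines the result
        }
    };
}
</pre>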

RuntimeServices::VarComment

This example demonstrates the use of printf style format strings with srTestCaseAddAnnotation.
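
A brief sketch, assuming the annotation call accepts a printf-style format string followed by its arguments:

<pre>
#include "srtest.h"

// Illustrative sketch -- the variadic overload is an assumption.
namespace RuntimeServices
{
    class VarComment
    {
    public:
        int formattedComment()
        {
            int expected = 42, actual = 41;
            srTestCaseAddAnnotation("expected %d but got %d", expected, actual);
            return 0;
        }
    };
}
</pre>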

srTest

These examples show how to use the stride::srTest base class for your test classes. When you publicly inherit from srTest, you get access to default testCase and testSuite members and their associated methods.

srTest::Simple

This example demonstrates the use of testCase.setStatus to set the status for test cases.
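
A sketch of a class that derives from stride::srTest and sets status through the inherited testCase member; the status enumerator passed to setStatus is an assumption.

<pre>
#include "srtest.h"

// Illustrative sketch -- the enumerator name is an assumption.
namespace srTest
{
    class Simple : public stride::srTest
    {
    public:
        void explicitStatus()
        {
            testCase.setStatus(srTEST_PASS);  // testCase is inherited from stride::srTest
        }
    };
}
</pre>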

srTest::Dynamic

This example demonstrates how to use AddCase, testCase.AddAnnotation, and testAnnotation.AddComment for dynamic test case and annotation creation within the context of one test method.
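
A sketch of the same idea through the inherited members; the argument lists and the exact type of the returned annotation object come from srtest.h and are assumed here.

<pre>
#include "srtest.h"

// Illustrative sketch -- argument lists and returned annotation type are assumptions.
namespace srTest
{
    class Dynamic : public stride::srTest
    {
    public:
        void generatesCases()
        {
            testSuite.AddCase("dynamically created case");      // new case in the suite
            auto note = testCase.AddAnnotation("run details");  // annotation on the current case
            note.AddComment("created at runtime");              // comment on that annotation
        }
    };
}
</pre>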

Run Tests

Now launch the test app (if you have not already) and execute the runner with the following command:

Test Class tests:

stride --device="TCP:localhost:8000" --database="../out/TestApp.sidb" --run="s2_testclass::Basic::Exceptions; s2_testclass::Basic::Fixtures;  s2_testclass::Basic::Parameterized; s2_testclass::Basic::Simple" --output=TestClass.xml 

Note that the command will produce result files for the run (per the --output option above). Open each result file in your browser to review the results.

Observations

This sample shows the techniques available for packaging and writing test units using classes. If you have a C++-capable compiler, we recommend that you use test classes to package your unit tests, even if your APIs under test are C only. Review the source code in the directory and follow the sample description.

The following are some test observations:

  • all of these example classes have been put into one or more namespaces. This is just for organizational purposes, mainly to avoid name collisions when built along with lots of other test classes. Your test classes are not required to be in namespaces, but it can be helpful in avoiding collisions as the number of tests in your system grows.
  • we've documented our test classes and methods using doxygen-style comments. This documentation is automatically extracted by our tools and added to the results report; more information about this feature is available here.
  • you can optionally write test classes that inherit from a base class that we've defined (stride::srTest). We recommend you start by writing your classes this way so that your classes will inherit some methods and members that make some custom reporting tasks simpler.
  • exceptions are generally handled by the STRIDE unit test harness, but can be disabled if your compiler does not support them (see s2_testclass_basic_exceptions_tests.h/cpp).
  • parameterized tests are supported by test classes as well. In these tests, simple constructor arguments can be passed during execution and are available at runtime to the test unit. The STRIDE infrastructure handles the passing of the arguments to the device and the construction of the test class with these arguments. Parameterization of test classes can be a powerful way to expand your test coverage with data-driven test scenarios (varying the input to a single test class).