Test Macros Sample

From STRIDE Wiki
==Introduction==
The Test Macros Samples are part of the [[Test_Unit_Samples|STRIDE Test Unit Samples]]. The following content relates to the sample files and workspaces installed in ''%STRIDE_DIR%\Samples\TestUnits\TestClassMacros''.  This sample consists of a [http://en.wikipedia.org/wiki/Microsoft_Visual_Studio Visual Studio] workspace for building a [[Windows_Off-Target_Apps|Windows Off-Target App]], sample [[Test Units|test class]] source code,  and a STRIDE workspace for doing more advanced test class execution.


These examples cover simple uses of each of the [[Test Macros]]. The name of each test method contains either ''Pass'' or ''Fail'': methods containing ''Fail'' illustrate uses of test macros that result in failures, while methods containing ''Pass'' illustrate passing uses. Although these examples use [[Test_Units#Test__Unit_Packaging|Test Class packaging]], the macros work the same way within C compilation units.


==Tests Description==


===Bool===


This example demonstrates uses of the [[Test Macros#Boolean_Macros|''srEXPECT_TRUE()'' and ''srEXPECT_FALSE()'']] test class macros.


===Comparison===


This example demonstrates uses of the [[Test Macros#Comparison_Macros|''srEXPECT_EQ()'', ''srEXPECT_NE()'', ''srEXPECT_GT()'', ''srEXPECT_GE()'', ''srEXPECT_LT()'' and ''srEXPECT_LE()'']] macros.


===CString===


This example demonstrates use of the C-string (zero-terminated character sequence) comparison macros [[Test Macros#C_String_Comparison_Macros|''srEXPECT_STREQ()'', ''srEXPECT_STRNE()'', ''srEXPECT_STRCASEEQ()'' and ''srEXPECT_STRCASENE()'']].


===FloatingPointComparison===


This example demonstrates use of the [[Test Macros#Floating_Point_Comparison_Macros|''srEXPECT_NEAR()'']] macro, which tests whether two floating-point values are equal to within a given tolerance.


===Predicates===


This example demonstrates use of the predicate-based macros [[Test Macros#Predicate_Macros|''srEXPECT_PRED<n>()'']].


===Note===
TBD


===Assert===


This example illustrates the use of the [[Test Macros#General_Guidelines_for_Test_Macros|assertion macros]] (''srASSERT_xx'') which, in contrast to the [[Test Macros#General_Guidelines_for_Test_Macros|expectation macros]] (''srEXPECT_xx''), cause the rest of the test case code to be bypassed when the checked condition fails.


===ExitUnit===
TBD


===ExitUnitFixture===
TBD


===ExitUnitConstructor===
TBD


===Exceptions===


This example demonstrates use of the exception verification macros [[Test Macros#Exception_Macros|''srEXPECT_THROW()'', ''srEXPECT_THROW_ANY()'' and ''srEXPECT_NO_THROW()'']]. Note that only a C++ implementation exists for this test, since these macros rely on C++ exception handling.




[[Category: Samples]]

Latest revision as of 23:49, 6 June 2011
