Test Macros Sample

From STRIDE Wiki
==Introduction==
The Test Macros Samples are part of the [[Test_Unit_Samples|STRIDE Test Unit Samples]]. The following content relates to the sample files and workspaces installed in ''%STRIDE_DIR%\Samples\TestUnits\TestClassMacros''.  This sample consists of a [http://en.wikipedia.org/wiki/Microsoft_Visual_Studio Visual Studio] workspace for building a [[Windows_Off-Target_Apps|Windows Off-Target App]], sample test class macro source code, and a STRIDE workspace for more advanced test class execution.


==Getting Started==
These examples cover simple uses of each of the [[Test Macros]]. The names of the test methods contain either ''Pass'' or ''Fail'': methods containing ''Fail'' illustrate uses of test macros that result in failures, while methods containing ''Pass'' illustrate passing uses. Although these examples use [[Test_Units#Test__Unit_Packaging|Test Class packaging]], the macros work essentially the same way in C compilation units.


To begin, open the Visual Studio Solution file in the sample directory.  This solution (and corresponding project) were created for Visual Studio 2005.  If you have a later version of Visual Studio installed, you should be able to open this solution and it will be automatically upgraded if necessary.  If you do not currently have any version of Visual Studio, it is recommended that you install the current free version of [http://en.wikipedia.org/wiki/Visual_Studio_Express Visual Studio Express].
Once you have successfully opened the solution, rebuild it.  The build process integrates custom STRIDE build rules and produces a STRIDE database, intercept module source files, and a Windows Off-Target App that incorporates the test class source.


Once the build is complete, perform the following steps to run the test classes in the workspace:

# Launch the Windows Off-Target App, TestClass.exe.  This will run in a standard console window.
# Open a command prompt window and change to this sample's directory.
# At the command prompt, run the command <tt>'''TestUnitRun.pl -v'''</tt>.  This will execute all of the test units in the workspace and open a browser to display the results.
# Quit the TestClass.exe application by typing 'q' in its console window.

==Tests Description==
Now that you have built the Windows Off-Target App and executed the test classes it contains, you can take time to peruse the test class source and the corresponding results that each produces.  This section provides a brief description of each.

''NOTE:'' each of the example test classes is grouped in the ''Basic'' namespace.  This is for organizational purposes only -- it is '''not''' a general requirement that test classes be placed into namespaces.

===Bool===
This example demonstrates uses of the [[Test Macros#Boolean_Macros|''srEXPECT_TRUE()'' and ''srEXPECT_FALSE()'']] test class macros.

===Comparison===
This example demonstrates uses of the [[Test Macros#Comparison_Macros|''srEXPECT_EQ()'', ''srEXPECT_NE()'', ''srEXPECT_GT()'', ''srEXPECT_GE()'', ''srEXPECT_LT()'' and ''srEXPECT_LE()'']] macros.

===CString===
This example demonstrates use of the C-string (zero-terminated character sequence) macros [[Test Macros#C_String_Comparison_Macros|''srEXPECT_STREQ()'', ''srEXPECT_STRNE()'', ''srEXPECT_STRCASEEQ()'' and ''srEXPECT_STRCASENE()'']].
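The pass/fail rules behind these boolean, comparison, and C-string checks can be sketched with simplified stand-ins. The helper names below are hypothetical; the real ''srEXPECT_*'' macros come from the STRIDE runtime headers and additionally report each result to the host.

```cpp
#include <cstring>
#include <cctype>

// Simplified stand-ins: each returns the pass/fail outcome the
// corresponding srEXPECT_* check would record. (Illustrative only --
// the real macros are provided by the STRIDE runtime.)

bool expect_true(bool cond)  { return cond;  }   // like srEXPECT_TRUE()
bool expect_false(bool cond) { return !cond; }   // like srEXPECT_FALSE()

bool expect_eq(int a, int b) { return a == b; }  // like srEXPECT_EQ()
bool expect_gt(int a, int b) { return a > b;  }  // like srEXPECT_GT()

// Like srEXPECT_STREQ(): compares zero-terminated strings by content,
// not by pointer value.
bool expect_streq(const char* a, const char* b) {
    return std::strcmp(a, b) == 0;
}

// Like srEXPECT_STRCASEEQ(): the same comparison, ignoring case.
bool expect_strcaseeq(const char* a, const char* b) {
    while (*a && *b) {
        if (std::tolower((unsigned char)*a) != std::tolower((unsigned char)*b))
            return false;
        ++a; ++b;
    }
    return *a == *b;  // both strings must end at the same point
}
```

A ''Fail'' method in the sample simply invokes one of these checks with operands that violate the condition, e.g. the equivalent of expect_eq(4, 5).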


===FloatingPointComparison===


This example demonstrates use of [[Test Macros#Floating_Point_Comparison_Macros|''srEXPECT_NEAR()'']], the macro used to test equality (or near equality) of floating-point values.
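The rule behind a near-equality check is simple: the check passes when the two values differ by no more than a given tolerance. A minimal sketch (the function name and argument order are assumptions; the real ''srEXPECT_NEAR()'' macro is provided by the STRIDE runtime):

```cpp
#include <cmath>

// Sketch of the pass/fail rule behind a near-equality check such as
// srEXPECT_NEAR(): pass when |val1 - val2| <= tolerance.
// (Illustrative stand-in; the real macro also reports the result.)
bool expect_near(double val1, double val2, double tolerance) {
    return std::fabs(val1 - val2) <= tolerance;
}
```

This form avoids the pitfalls of comparing floating-point values with ==, which usually fails after any rounding in the computation.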


===Predicates===


This example demonstrates use of the predicate-based macros [[Test Macros#Predicate_Macros|''srEXPECT_PRED<n>()'']].
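A predicate macro takes a user-supplied predicate function plus its ''n'' arguments and passes when the predicate returns true. A sketch of the two-argument case (the predicate and helper names are hypothetical; the real ''srEXPECT_PRED<n>()'' macros are provided by the STRIDE runtime):

```cpp
// Example predicate: true when a and b share no common divisor > 1.
bool mutually_prime(int a, int b) {
    for (int d = 2; d <= a && d <= b; ++d)
        if (a % d == 0 && b % d == 0) return false;
    return true;
}

// Stand-in for the two-argument form (srEXPECT_PRED2-style): apply the
// predicate to the arguments and treat its result as the check outcome.
template <typename Pred, typename A, typename B>
bool expect_pred2(Pred pred, A x, B y) {
    return pred(x, y);
}
```

The advantage over a plain boolean check is that on failure the real macros can report the predicate's argument values, not just "false".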


===Note===
TBD


===Assert===


This example illustrates the use of the [[Test Macros#General_Guidelines_for_Test_Macros|assertion macros]] (''srASSERT_xx'') which, in contrast to the [[Test Macros#General_Guidelines_for_Test_Macros|expectation macros]] (''srEXPECT_xx''), cause the rest of the test case code to be bypassed.
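The control-flow difference can be sketched as follows: a failed expectation records the failure and lets the test case continue, while a failed assertion records the failure and bypasses the rest of the test case. The function below is an illustrative stand-in, not STRIDE code; it logs which checks actually execute.

```cpp
#include <string>
#include <vector>

// Sketch of srEXPECT_xx vs srASSERT_xx control flow inside one test case.
// (Hypothetical stand-in; real test cases use the STRIDE macros directly.)
std::vector<std::string> run_test_case(bool check_passes) {
    std::vector<std::string> log;

    // srEXPECT_xx style: record the outcome, keep going either way.
    log.push_back(check_passes ? "expect: pass" : "expect: fail");

    // srASSERT_xx style: on failure, the rest of the case is skipped.
    if (!check_passes) {
        log.push_back("assert: fail (remaining checks skipped)");
        return log;
    }
    log.push_back("assert: pass");

    // Only reached when the assertion passed.
    log.push_back("final check reached");
    return log;
}
```

In practice this means expectations are preferred for independent checks, while assertions guard preconditions that would make the rest of the case meaningless.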


===ExitUnit===
TBD


===ExitUnitFixture===
TBD


===ExitUnitConstructor===
TBD


===Exceptions===


This example demonstrates use of the exception verification macros [[Test Macros#Exception_Macros|''srEXPECT_THROW()'', ''srEXPECT_THROW_ANY()'' and ''srEXPECT_NO_THROW()'']]. Note that there is only a C++ implementation of this test.
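The pass/fail rules of the three exception checks can be sketched with try/catch stand-ins (illustrative only; the real macros wrap the statement under test and are provided by the STRIDE runtime, and the helper names here are hypothetical):

```cpp
#include <stdexcept>

// Statement under test: throws std::runtime_error on demand.
void might_throw(bool do_throw) {
    if (do_throw) throw std::runtime_error("boom");
}

// srEXPECT_THROW-style: pass only if the statement throws the
// expected exception type.
bool expect_throw_runtime_error(bool do_throw) {
    try { might_throw(do_throw); }
    catch (const std::runtime_error&) { return true;  }
    catch (...)                       { return false; }  // wrong type
    return false;                                        // nothing thrown
}

// srEXPECT_THROW_ANY-style: pass if the statement throws anything at all.
bool expect_throw_any(bool do_throw) {
    try { might_throw(do_throw); }
    catch (...) { return true; }
    return false;
}

// srEXPECT_NO_THROW-style: pass if the statement completes without throwing.
bool expect_no_throw(bool do_throw) {
    try { might_throw(do_throw); }
    catch (...) { return false; }
    return true;
}
```

Since these checks are built on C++ exception handling, it follows that the sample has no C implementation, as noted above.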


==Test Class Macros Execution==
This sample demonstrates two different techniques for executing test classes.
===Command Line Execution===
Command line execution for test classes is done using the [[Test_Runners#TestUnitRun.pl|TestUnitRun utility]].  Here are several examples of specific syntax to execute test classes.  All of these commands can be invoked from a standard [http://en.wikipedia.org/wiki/Command_Prompt_(Windows) command shell] (or other shell of your choosing) and the arguments shown assume that the commands are executed with the sample's directory as the starting directory. You must have your TestClass.exe application running in order for the runner to be able to initiate a connection to the target simulator. In addition, you should verify that your %STRIDE_DIR%\bin\transport.cfg file is using the TCP transport to connect to port 8000 (these are the default settings when the product is installed).
====Simple execution of all test units====
The following command executes all of the test units found in the STRIDE database you have previously generated.  For the purpose of this sample, since there is only one database, the -d parameter is not strictly needed, but it is shown here for completeness.
  TestUnitRun.pl -d TestClassMacros.sidb
This command executes all Test Units found in the database in alphanumeric sort order by name, as the console output below shows.  Any Test Class initialization arguments are given default values (typically zero or NULL).
When you run this command, you should see console output like:
  Attempting connection using [Sockets (S2)] transport ...
  Connected to device.
  Initializing STRIDE database objects...
  Done.
  Running Test Basic::Assert...
  Running Test Basic::CString...
  Running Test Basic::Comparison...
  Running Test Basic::Exceptions...
  Running Test Basic::ExpectBool...
  Running Test Basic::FloatingPt...
  Running Test Basic::Predicates...
  Disconnected from device.
  Test Results saved to C:\STRIDE\Samples\TestUnits\TestClassMacros\TestClassMacros.xml
  Test Report saved to C:\STRIDE\Samples\TestUnits\TestClassMacros\TestClassMacros.html
  ***************************************************************************
  Results Summary
  ***************************************************************************
    Passed:              7
    Failed:              8
    In Progress:         0
    Not Applicable:      0
    ...in 7 suites.
  ***************************************************************************
===Workspace-based Execution===
TestClass.ssw, a workspace in the TestClass directory, demonstrates the use of script execution with Studio to manage test order and hierarchy.  This workspace was created using [[WorkspaceSetup.pl]].  The setup and teardown folders provide basic infrastructure scripts that start/stop the simulator application (TestClass.exe) and manage traceviews used for [[Runtime_Reference#Logging_Services|srPrint]] message collection.  The scripts that drive the testing are in the workspace '''test''' folder.  What follows is a brief description of each.
====RunAll====
This folder contains a script, All.js, that iterates through the entire collection of test units and executes them one at a time.  The order of execution will be ascending alphabetical order (by name) because the [[AutoScript#ascript.TestUnits|ArrangeBy]] collection method was called.
====CallConstructor====
This folder contains a simple script example that shows how to invoke test classes with constructor arguments.  In this example, the Basic::Constructors test class is executed twice with different initialization (constructor) arguments both times.
====Run Individual====
This folder shows how to use individual scripts to execute test classes. Each script has the following form:
  ascript.TestUnits.Item('''TEST_CLASS_NAME''').Run();
'''TEST_CLASS_NAME''' is the name of the scl_test_class test to be run.  The order and hierarchy of each item may be changed via the Studio tree control by moving the item within the scripts and/or folders.  The sample contains individual scripts for a few of the sample scl_test_class tests; you are free to move, add, or delete items as you experiment with the workspace.


[[Category: Samples]]

Latest revision as of 23:49, 6 June 2011