Test Point Sample

From STRIDE Wiki
== Introduction ==
The Test Point samples demonstrate a simple technique to monitor and test activity occurring in another thread. The samples show a common testing scenario of this type: where we want to verify the behavior of a state machine.

== Source under test ==

=== s2_testpoint_source.c / h ===
These files implement a simple state machine that we wish to test. The state machine runs in its own thread, and starts when the thread function '''StateControllerTask''' is executed.


The expected state transitions are as follows:

 eSTART -> eIDLE -> eACTIVE -> eIDLE -> eEND

The states don't do any work; instead they just sleep() so there's some time spent in each one.

Each state transition is managed through a call to SetNewState(), which communicates the state transition to the test thread using the '''srTEST_POINT()''' macro. We set the macro argument to the name of the state we are transitioning to, as this is the 'value' of the testpoint that will be received by the test thread.
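The flow described above can be sketched as a minimal stand-alone model. This is not the STRIDE implementation: record_test_point() and set_new_state() are hypothetical stand-ins for the srTEST_POINT() macro and SetNewState(), with each transition reported to a shared log that a test thread could inspect.

```c
#include <assert.h>
#include <pthread.h>
#include <string.h>

#define MAX_POINTS 16

static const char *g_points[MAX_POINTS]; /* observed test-point values */
static int g_count = 0;
static pthread_mutex_t g_lock = PTHREAD_MUTEX_INITIALIZER;

/* Hypothetical stand-in for srTEST_POINT(): record the new state's name. */
static void record_test_point(const char *name) {
    pthread_mutex_lock(&g_lock);
    if (g_count < MAX_POINTS)
        g_points[g_count++] = name;
    pthread_mutex_unlock(&g_lock);
}

/* SetNewState() analogue: every transition is reported as a test point. */
static void set_new_state(const char *name) {
    record_test_point(name);
    /* a real state would do work (or sleep()) here */
}

/* StateControllerTask analogue: walks the documented transition sequence. */
static void *state_controller_task(void *arg) {
    (void)arg;
    const char *seq[] = { "START", "IDLE", "ACTIVE", "IDLE", "END" };
    for (int i = 0; i < 5; i++)
        set_new_state(seq[i]);
    return NULL;
}

/* Run the state machine to completion in its own thread. */
int run_state_machine(void) {
    pthread_t t;
    if (pthread_create(&t, NULL, state_controller_task, NULL) != 0)
        return -1;
    pthread_join(t, NULL);
    return 0;
}

int observed_count(void) { return g_count; }
const char *observed_at(int i) { return g_points[i]; }
```

A test thread that joins after the machine finishes would see the five transitions in the documented order.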


== Tests Description ==

=== s2_testpoint_basic ===
This example implements three tests of the state machine implemented in s2_testpoint_source.c. These tests demonstrate the use of the '''srTEST_POINT_WAIT()''' and '''srTEST_POINT_CHECK()''' macros to verify activity in another thread.


Each test follows the same pattern in preparing and using the test point feature:
# create an array of type '''srTestPointExpect_t''' which specifies the expected test points
# set the expectation array using '''srTestPointExpect()'''
# start the state machine
# use the '''srTEST_POINT_CHECK()''' or '''srTEST_POINT_WAIT()''' macro to validate the expected test points


As the macro name suggests, we create an "expectation" of activity and then validate the observed activity against the expectation using rules that we specify. If the expectation is met, the test passes; if the expectation is not met, the test fails.


The main difference between the tests is the values of the parameters provided to each test's validation macro.
 
==== TestPoint_SyncExact ====
Here we verify an exact match between the contents of the ''expected'' array and the observed testpoints. The combination of srTEST_POINT_EXPECT_ORDERED and srTEST_POINT_EXPECT_EXCLUSIVE specifies that the test will pass only if:
* only the testpoints in the expected array are seen, and
* the testpoints are seen in the order specified
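Following the four-step pattern above, the strict rules of this test can be sketched as a stand-alone model. expect_t and check_exact() are hypothetical stand-ins for srTestPointExpect_t and the ordered/exclusive matching behavior, not the actual STRIDE API.

```c
#include <assert.h>
#include <string.h>

typedef struct {
    const char *value; /* expected test-point value (state name) */
    int count;         /* times it must be seen consecutively */
} expect_t;

/* Pass only if the observed points match the expectations exactly: in
 * order (ORDERED) and with nothing extra (EXCLUSIVE). */
int check_exact(const expect_t *expected, int n_exp,
                const char **observed, int n_obs) {
    int o = 0;
    for (int i = 0; i < n_exp; i++)
        for (int c = 0; c < expected[i].count; c++) {
            if (o >= n_obs || strcmp(observed[o], expected[i].value) != 0)
                return 0; /* missing or out-of-order test point */
            o++;
        }
    return o == n_obs; /* exclusive: no unexpected trailing points */
}

/* Step 1: build the expectation array. Steps 2-3 (register expectations,
 * start the state machine) are modeled by a canned observation list.
 * Step 4: validate. */
int run_exact_test(void) {
    const expect_t expected[] = {
        { "START", 1 }, { "IDLE", 1 }, { "ACTIVE", 1 }, { "IDLE", 1 }, { "END", 1 }
    };
    const char *observed[] = { "START", "IDLE", "ACTIVE", "IDLE", "END" };
    return check_exact(expected, 5, observed, 5);
}
```

Any extra, missing, or reordered test point makes check_exact() return 0, which is the failure mode this test exercises.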


==== TestPoint_SyncLooseTimed ====
Here we loosen the restrictions of the exact test. By specifying srTEST_POINT_EXPECT_UNORDERED and srTEST_POINT_EXPECT_NONEXCLUSIVE, we now will:
* ignore any testpoints seen that aren't in the expected array, and
* disregard the order in which the testpoints are received
Note that the "IDLE" testpoint is now included in the expected array only once, but with an expected count of 2.


The srTEST_POINT_CHECK() macro will now cause the test to fail only if the expected testpoints are not all seen (the specified number of times) within the timeout period.
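The counted, unordered matching just described can be modeled as follows. loose_expect_t and check_loose() are hypothetical stand-ins for srTestPointExpect_t and the loose matching rules, not the real STRIDE implementation.

```c
#include <assert.h>
#include <string.h>

typedef struct {
    const char *value; /* expected test-point value (state name) */
    int expected;      /* times this value must be seen (e.g. "IDLE": 2) */
    int seen;          /* tally filled in during the check */
} loose_expect_t;

/* Pass if every expected value is seen at least `expected` times; order is
 * ignored and test points not in the array are ignored (UNORDERED +
 * NONEXCLUSIVE). */
int check_loose(loose_expect_t *exp, int n_exp,
                const char **observed, int n_obs) {
    for (int i = 0; i < n_exp; i++)
        exp[i].seen = 0;
    for (int o = 0; o < n_obs; o++)
        for (int i = 0; i < n_exp; i++)
            if (strcmp(observed[o], exp[i].value) == 0)
                exp[i].seen++;
    for (int i = 0; i < n_exp; i++)
        if (exp[i].seen < exp[i].expected)
            return 0; /* an expectation was not met */
    return 1;
}

/* "IDLE" appears once in the array but with an expected count of 2. */
int run_loose_test(void) {
    loose_expect_t expected[] = {
        { "START", 1, 0 }, { "IDLE", 2, 0 }, { "ACTIVE", 1, 0 }, { "END", 1, 0 }
    };
    /* out of order, with an unlisted "DEBUG" point: still passes */
    const char *observed[] = { "IDLE", "START", "DEBUG", "ACTIVE", "END", "IDLE" };
    return check_loose(expected, 4, observed, 6);
}
```

Note how the hypothetical "DEBUG" point and the shuffled order do not fail the check; only a shortfall in a per-value count would.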


==== TestPoint_AsynLooseTimed ====
This test is identical to the SyncLooseTimed test, except that we call srTEST_POINT_WAIT() and pass a timeout value of 400 milliseconds. This results in a test failure, as it takes approximately 600 milliseconds for the testpoint expectations to be satisfied.


[[Category:Samples]]

Revision as of 22:47, 3 June 2009