Training Expectations

Objectives

This Training Module is focused on Test Points and how to validate them using Expectations. It covers defining Expectations over Test Point sequences, validating Test Point data with custom predicates, and using triggers between sequences.


There are two test files -- TestExpect.cpp and TestExpect.h -- that implement three Test Units:

  • TestExpect_Seq
  • TestExpect_Data
  • TestExpect_Misc


The first two Test Units each have two test methods already implemented, plus one method, called Exercise, that you are required to implement. The third Test Unit has four test methods already implemented and likewise has one Exercise method for you to implement. The Exercise methods currently return a NOT IN USE status.
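
For orientation, the declaration of one of these Test Units might look roughly like the sketch below. Only the class name TestExpect_Seq and the Exercise method come from this module; the srtest.h header, the stride::srTest base class, the scl_test_class pragma, and the other method names are assumptions based on general STRIDE SDK conventions, so check the actual TestExpect.h in the sample_src directory for the real declarations.

 // Illustrative sketch only -- not copied from the training sources.
 #include <srtest.h>                      // STRIDE test support (assumed header name)
 
 #pragma scl_test_class(TestExpect_Seq)   // capture the class as a Test Unit (assumed pragma form)
 
 class TestExpect_Seq : public stride::srTest   // assumed base class
 {
 public:
     void TestOne();      // placeholder name for an already-implemented test method
     void TestTwo();      // placeholder name for an already-implemented test method
 
     void Exercise();     // your exercise -- currently stubbed, so it reports NOT IN USE
 };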

Instructions

Build and Run TestApp

  • Build TestApp using the SDK makefile
  • Start up TestApp
  • If you have not created an option file, please refer to setup
  • Execute Test Expectations Test Units only
 > stride --options_file myoptions.txt --run TestExpect_Seq --run TestExpect_Data --run TestExpect_Misc
 Loading database...
 Connecting to device...
 Executing...
 test unit "TestExpect_Seq"
   > 2 passed, 0 failed, 0 in progress, 1 not in use.
 test unit "TestExpect_Data"
   > 2 passed, 0 failed, 0 in progress, 1 not in use.
 test unit "TestExpect_Misc"
   > 3 passed, 1 failed, 0 in progress, 1 not in use.
 -----------------------------------------------------------
 Summary: 7 passed, 1 failed, 0 in progress, 3 not in use.

 Disconnecting from device...
 Saving result file...
  • Review the details of the test results in a browser. Open TestApp.xml, which can be found in the sample_src directory (per the --output option). When the XML file is opened in a web browser, the XSL stylesheet is applied automatically to render it as HTML.

Implement Exercise

  • TestExpect_Seq::Exercise
    • Validate ALL upper case Test Points
    • Use Unordered and Nonstrict sequencing
    • Use sut_DoSequencing(SEQ_1) to generate part of the sequence
    • Use sut_start_thread(SEQ_3) to generate the rest of the sequence
    • Add a NOTE_INFO(..) to log that the test is executing
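
The outline below sketches how these pieces could fit together. Only NOTE_INFO, sut_DoSequencing(SEQ_1), and sut_start_thread(SEQ_3) come from the requirements above (the NOTE_INFO argument shown is illustrative); the expectation setup and wait steps are left as comments because the exact Expectations calls should be copied from the test methods already implemented in TestExpect.cpp.

 // Sketch only -- mirror the Expectations API usage from the implemented test methods.
 void TestExpect_Seq::Exercise()
 {
     // Log that the test is executing.
     NOTE_INFO("TestExpect_Seq::Exercise is executing");
 
     // 1. Set up an expectation covering ALL of the upper-case Test Points,
     //    using Unordered and Nonstrict sequencing.
 
     // 2. Generate part of the sequence on this thread...
     sut_DoSequencing(SEQ_1);
 
     //    ...and the remainder from a separate thread.
     sut_start_thread(SEQ_3);
 
     // 3. Wait for the expectation to be validated so the framework records
     //    a pass/fail status for this test method.
 }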


  • TestExpect_Data::Exercise
    • Validate the following Test Points {D, G, F, H}
    • Check that F occurs 2 times
    • Use Unordered sequencing
    • Write a new custom predicate that validates data for both D and H Test Points
      • Add NOTE_INFO(..) to capture content of the Test Points
      • Add extra check for D that the status is GOOD
      • Confirm that data fields d1 and d2 are as expected
      • Pass the expected data fields (for both Test Points) as part of the user data within the setup
    • Use sut_start_thread(SEQ_3) to generate the sequence
    • Write another custom predicate for validating data for G
      • Add NOTE_INFO(..) to capture content of the Test Point
      • Validate the expected string using user data
    • Add a NOTE_INFO(..) to log that the test is executing
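
As with the previous exercise, the outline below is only a sketch. The two predicates are described in comments rather than written out because their signatures should be copied from the custom predicates already used in TestExpect.cpp; only NOTE_INFO, sut_start_thread(SEQ_3), and the Test Point names {D, G, F, H} come from the requirements above.

 // Sketch only -- mirror the predicate and setup patterns from the implemented test methods.
 
 // Custom predicate for the D and H Test Points:
 //   * NOTE_INFO(..) the content of the Test Point,
 //   * for D, additionally check that the status is GOOD,
 //   * confirm the d1 and d2 data fields match the expected values passed
 //     in as user data during setup.
 
 // Custom predicate for the G Test Point:
 //   * NOTE_INFO(..) the content of the Test Point,
 //   * validate the expected string supplied as user data.
 
 void TestExpect_Data::Exercise()
 {
     // Log that the test is executing.
     NOTE_INFO("TestExpect_Data::Exercise is executing");
 
     // Set up an Unordered expectation over {D, G, F, H}, with F expected
     // twice, attaching the two predicates above and passing the expected
     // data fields (for D and H) and the expected string (for G) as user data.
 
     // Generate the Test Point activity from a separate thread.
     sut_start_thread(SEQ_3);
 
     // Wait for the expectation to be validated.
 }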


  • TestExpect_Misc::Exercise
    • Validate 2 sequences using a Trigger in between
      • Sequence 1 = D, E, and A
      • Trigger = C
      • Sequence 2 = F and F (2 occurrences)
    • Use ANY AT ALL special member with trigger
    • Use Ordered and Strict sequencing
    • Use sut_start_thread(SEQ_2) to generate the expected sequences
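
A matching outline for this exercise is sketched below. Only sut_start_thread(SEQ_2), the Test Point names, and the sequencing requirements come from the list above; the actual expectation and trigger calls should be modeled on the test methods already implemented in this Test Unit.

 // Sketch only -- follow the trigger and expectation usage from the implemented test methods.
 void TestExpect_Misc::Exercise()
 {
     // Set up an Ordered, Strict expectation in three parts:
     //   1. Sequence 1: D, E, and A
     //   2. Trigger:    C, combined with the ANY AT ALL special member
     //   3. Sequence 2: F and F (two occurrences)
 
     // Generate the expected Test Point activity from another thread.
     sut_start_thread(SEQ_2);
 
     // Wait for both sequences (separated by the trigger) to be validated.
 }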


  • Execute Test Expectations Test Units only
 > stride --options_file myoptions.txt --run TestExpect_Seq --run TestExpect_Data --run TestExpect_Misc
 Loading database...
 Connecting to device...
 Executing...
   test unit "TestExpect_Seq"
     > 3 passed, 0 failed, 0 in progress, 0 not in use.
   test unit "TestExpect_Data"
     > 3 passed, 0 failed, 0 in progress, 0 not in use.
   test unit "TestExpect_Misc"
     > 4 passed, 1 failed, 0 in progress, 0 not in use.
   -----------------------------------------------------------
   Summary: 10 passed, 1 failed, 0 in progress, 0 not in use.

 Disconnecting from device...
 Saving result file...

Run and Publish Results

When you have completed the Exercise(s), publish your results to Test Space. To make this easier, we recommend updating your existing option file (myoptions.txt) with the following, if you have not already done so:

 #### Test Space options (partial) #####
 #### Note - make sure to change username, etc. ####
 --testspace https://username:password@yourcompany.stridetestspace.com
 --project Training
 --name YOURNAME
  > stride --options_file myoptions.txt --run TestExpect_Seq --run TestExpect_Data --run TestExpect_Misc --space TestExpect --upload

Note: This space has been set up with a Baseline of expected test results that you can use to validate your results.

Reference

The following reference information is related to Test Points and Expectations.

Wiki

Samples

  • Test Point Sample - Demonstrates a simple technique to monitor and test activity occurring in another thread