Running and Publishing the Expectations Sample

Latest revision as of 20:53, 20 February 2013

Building Instrumented Source Under Test

In this step, we will add pre-instrumented example source files that provide an overview of STRIDE instrumentation techniques. The sources are described in the Expectations Sample article.

To begin, be sure that TestApp is not running, then copy the .c and .h files found in Samples/test_in_script/Expectations to SDK/Windows/sample_src (or SDK/Posix/sample_src for Linux).
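The copy step can be scripted if you repeat it often. Here is a minimal Python sketch; the helper name is hypothetical and the paths in the commented call are illustrative, so adjust them to your installation:

```python
import shutil
from pathlib import Path

def copy_test_sources(sample_dir, sample_src):
    """Copy a sample's .c and .h files into the SDK's sample_src directory."""
    src = Path(sample_dir)
    dst = Path(sample_src)
    dst.mkdir(parents=True, exist_ok=True)
    copied = []
    for pattern in ("*.c", "*.h"):
        for f in src.glob(pattern):
            shutil.copy(f, dst / f.name)
            copied.append(f.name)
    return sorted(copied)

# Example (adjust the roots to your installation):
# copy_test_sources("Samples/test_in_script/Expectations", "SDK/Windows/sample_src")
```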

Once the files have been copied to sample_src, simply build TestApp as described in Building an Off-Target Test App.

Running the Expectations Sample

Here we will run the test module provided in this example.

  1. Run the TestApp built above in a console window.
  2. Invoke stride in a separate console window (one not running TestApp), as shown below, and verify the Summary results.

Here are the command line parameters that we will submit to stride:

--run=%STRIDE_DIR%\Samples\test_in_script\Expectations\s2_expectations_testmodule.pm
--database ./out/TestApp.sidb 
--device TCP:localhost:8000

The command line arguments are long, so we'll create a text file named RunSample.txt (for example) in the SDK\Windows directory (or SDK/Posix for Linux) as an option file to submit to stride.
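Writing the option file can also be automated. The following sketch simply emits one option per line; the helper name is hypothetical, and the option values are the ones used in this walkthrough:

```python
def write_option_file(path, options):
    """Write a stride option file: one command-line option per line."""
    with open(path, "w") as f:
        for opt in options:
            f.write(opt + "\n")

options = [
    r"--run=%STRIDE_DIR%\Samples\test_in_script\Expectations\s2_expectations_testmodule.pm",
    "--database ./out/TestApp.sidb",
    "--device TCP:localhost:8000",
]
# write_option_file("RunSample.txt", options)
```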

If you haven't done so already, start TestApp running in a separate console window.

Now run stride as follows (starting from the SDK\Windows or SDK/Posix directory):

stride -O RunSample.txt

The output should look like this:


Loading database...
Connecting to device...
Executing...
  test module "C:\stride\Samples\test_in_script\Expectations\s2_expectations_testmodule.pm"...
    > 12 passed, 2 failed, 0 in progress, 0 not in use.
  ---------------------------------------------------------------------
  Summary: 12 passed, 2 failed, 0 in progress, 0 not in use.

Disconnecting from device...
Saving result file...
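If you want to verify the Summary line automatically (for example, from a wrapper script rather than by eye), a small parser like this hypothetical one will do; it assumes the Summary format shown in the output above:

```python
import re

def parse_summary(output):
    """Extract counts from stride's Summary line, or return None if absent."""
    m = re.search(
        r"Summary:\s*(\d+) passed, (\d+) failed, (\d+) in progress, (\d+) not in use",
        output,
    )
    if not m:
        return None
    passed, failed, in_progress, not_in_use = map(int, m.groups())
    return {"passed": passed, "failed": failed,
            "in progress": in_progress, "not in use": not_in_use}
```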

Publishing Results to Test Space

To automatically publish test results, add the following options to RunSample.txt:

--upload
--testspace USER:PASS@mycompany.stridetestspace.com
--project Sandbox
--space Samples


Now run stride as follows:

stride -O RunSample.txt

The output should look like this:


Loading database...
Connecting to device...
Executing...
  test module "C:\stride\Samples\test_in_script\Expectations\s2_expectations_testmodule.pm"...
    > 12 passed, 2 failed, 0 in progress, 0 not in use.
  ---------------------------------------------------------------------
  Summary: 12 passed, 2 failed, 0 in progress, 0 not in use.

Disconnecting from device...
Saving result file...
Uploading to test space...


A few things to note:

  • The --run parameter specifies the test module to execute
  • You will have to replace USER:PASS with your S2-assigned TestSpace user name and password
  • You will have to replace mycompany with your S2-assigned subdomain name
  • The project "Sandbox" and TestSpace "Expectations" should already be created within your company STRIDE TestSpace


Viewing Results in Test Space

First navigate to the S2-provided TestSpace with your browser. The URL has the form https://mycompany.stridetestspace.com, where mycompany is your S2-assigned subdomain. On the page that is presented, enter your login credentials.

At the top of the next page, click on the All Projects link to view the status of existing projects. Here you should see the Sandbox project listed, with its contained TestSpace Expectations shown.

Clicking the Expectations link will present you with the Expectations TestSpace page. From the top-line results at the bottom of the page you can drill down into the Sequence_1 results (assuming this is the first time publishing) to see the test details.

Analyzing the Results

At this point, we recommend that you take some time to review the techniques used in the Expectations sample tests and correlate the results shown in Test Space with the various STRIDE constructs in the sample source. The article Expectations Sample describes the tests in detail.