STRIDE Test Space

From STRIDE Wiki
Revision as of 20:44, 5 June 2009 by Mikee (talk | contribs)

What is STRIDE Test Space?

STRIDE Test Space is a hosted web application for storing and analyzing your test results. Test Space accepts data that results from the execution of STRIDE Test Units or Test Scripts. Data is uploaded to Test Space manually (using the web interface) or automatically from one of the STRIDE execution tools (STRIDE Runner or STRIDE Studio). Once data is uploaded, it is retained until it is manually removed or automatically deleted (depending on the space configuration).

STRIDE Test Space organizes results in a hierarchy of projects, test spaces, and result sets. Each time you upload data to a test space, a new result set is created (unless you explicitly add your data to an existing one). Within a given result set, test results are further organized into test suites containing test cases. Any number of these organizational entities can be used to create a fluid hierarchy of test results that adapts to shifting needs during a product lifecycle.
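As a rough sketch, the hierarchy described above can be modeled as nested containers. All names and fields here are illustrative assumptions, not part of the STRIDE API:

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical model of the Test Space hierarchy: projects contain
# test spaces, which contain result sets, which contain suites and cases.

@dataclass
class TestCase:
    name: str
    passed: bool

@dataclass
class TestSuite:
    name: str
    cases: List[TestCase] = field(default_factory=list)
    suites: List["TestSuite"] = field(default_factory=list)  # subsuites

@dataclass
class ResultSet:
    name: str
    suites: List[TestSuite] = field(default_factory=list)

@dataclass
class TestSpace:
    name: str
    result_sets: List[ResultSet] = field(default_factory=list)

@dataclass
class Project:
    name: str
    spaces: List[TestSpace] = field(default_factory=list)

# Each upload creates a new result set in a test space (unless the data
# is explicitly added to an existing one).
space = TestSpace("nightly-regression")
space.result_sets.append(
    ResultSet("build-1042", suites=[
        TestSuite("memory", cases=[TestCase("alloc_free", True)])
    ])
)
project = Project("firmware", spaces=[space])
```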

Test Space is primarily a repository for your test results. Whether you are doing ad-hoc testing with the STRIDE Framework or running fully-automated continuous integration of your STRIDE-enabled code base, Test Space provides a central place to store all the test data that is produced by the tests. Results are uploaded into specific test spaces, which allows the maintainer to control access and notifications for the results.

Test Space also provides easy regression analysis in the form of baseline comparison. Users can create one or more fixed baseline data sets from existing results and further specify that all result sets in a test space should be compared with the fixed set of data. This is sometimes known as "gold standard" comparison and is very useful for detecting regressions in a set of stable tests. Similarly, individual test spaces can be configured to automatically compare each new result set with the previous result. This kind of comparison can also be helpful in detecting regressions in stable code bases. Baseline comparison data can also optionally include timing thresholds so as to enable automatic comparison of test case durations.
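The "gold standard" comparison described above can be sketched as follows. The data shapes, the threshold-as-multiplier convention, and the function name are assumptions for illustration only, not Test Space internals:

```python
# Illustrative sketch of baseline comparison with optional timing
# thresholds; not the actual Test Space implementation.

def compare_to_baseline(baseline, current, timing_threshold=None):
    """baseline and current map test-case name -> (passed, duration_secs).

    Returns regressions (cases that passed in the baseline but fail now)
    and, if a timing_threshold multiplier is given, cases whose duration
    exceeds baseline_duration * timing_threshold.
    """
    regressions, slowdowns = [], []
    for name, (base_pass, base_dur) in baseline.items():
        if name not in current:
            continue  # case missing from this result set
        cur_pass, cur_dur = current[name]
        if base_pass and not cur_pass:
            regressions.append(name)
        if timing_threshold is not None and cur_dur > base_dur * timing_threshold:
            slowdowns.append(name)
    return regressions, slowdowns

baseline = {"boot": (True, 1.0), "alloc": (True, 0.5)}
current  = {"boot": (False, 1.1), "alloc": (True, 1.2)}
regs, slow = compare_to_baseline(baseline, current, timing_threshold=1.5)
# regs == ["boot"], slow == ["alloc"]
```

A per-space "compare with previous result set" policy is the same comparison with the baseline replaced by the most recent prior result set.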

Collaboration and communication are built into Test Space in the form of messaging and notifications. Once users are granted access to a specific test space, they have full access to view and manage test results. The test space properties can also be configured to notify all users of potential problems with new result sets - specifically regressions against baselines, log errors, and timing threshold violations (if your baseline was configured to compare durations).

Test Space enables focused communication about test results by allowing users to create simple message threads. Messages can be associated with a test space, a result set, or even with a specific test suite. The latter can be very useful when users need to discuss specific test failures, while messages at the space or result set level might be used, for example, to discuss general trends and goals.

What can I do with STRIDE Test Space?

Uploading Results

New result sets can be easily uploaded using the automatic upload features in the STRIDE Runner or STRIDE Studio. You can also upload XML results data files manually using the web interface by navigating to the detail view for a specific test space and clicking the upload data button.
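The exact schema of the XML results data is not documented here; the fragment below is purely hypothetical, intended only to illustrate the kind of suite/case data such a file carries and how summary counts could be derived from it:

```python
import xml.etree.ElementTree as ET

# Purely hypothetical results XML - the real STRIDE schema may differ.
RESULTS_XML = """
<resultset name="build-1042">
  <suite name="memory">
    <case name="alloc_free" status="pass"/>
    <case name="double_free" status="fail"/>
  </suite>
</resultset>
"""

root = ET.fromstring(RESULTS_XML)
statuses = [case.get("status") for case in root.iter("case")]
passed = statuses.count("pass")
failed = statuses.count("fail")
# passed == 1, failed == 1
```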

Viewing Results

STRIDE Test Space presents several distinct views of your data.

Overview

The overview shows any recent activity (within the last 10 days) across all of your test spaces. Noteworthy events include the following:

  • a new result set is added - summary stats and any potential problems (errors or setbacks) are reported
  • a new message is added
  • a new comment is added to an existing message

This view is the default view for Test Space. This data is also available as an RSS feed (your browser should detect the feed and allow you to subscribe).

All Projects

The All Projects view shows all test spaces to which you have access, grouped by project. This is typically how users will navigate to specific test spaces, although links from events will also take you to specific test spaces. This view shows the space names, result set counts, and two spark-line graphs of the pass and fail count trends. For stable test-beds, these graphs should appear flat - otherwise they will indicate the trend of pass and fail counts for your current development effort.

Project View

This view is identical to the All Projects view, except it displays test spaces for only one project. You can navigate to this view by clicking on a project name in the All Projects view.

Space View

The Space view provides an overview of a specific test space. There are three segments to this view: events, trend graphs, and the result set list. The events section shows the recent events associated with this test space. These are the same events that appear for the given space in the Overview. The trend graphs show bar and line trend charts for the 15 most recent result sets. Both the events segment and the trend graphs can be hidden from view using the corresponding hide buttons.

The last segment is the result set list. This shows all result sets in descending sequence order (most recently added result sets at top). Each row in this tabular view includes the result set name and description, total duration, pass and fail totals, and any baseline comparison results (if comparisons are activated for the space).

Results View

Clicking on a result set name will take you to the Results View. The Results View shows a list of test suites. The presence of subsuites and test cases within a suite is indicated by a drilldown (arrow) button; clicking it causes the immediate children to be displayed. This drilldown button is shown for any suite that contains children. Other icons indicate the presence of other data for display - e.g. a book icon indicates that annotations are present (used for log messages) and a plus button indicates that comments can be displayed for the item.

Each row in the display also shows the pass/fail status for test cases, or the total number of each for test suites. Baseline comparison data is shown to the right, if a comparison is set up in the test space's properties.
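The suite-level totals described here amount to a recursive roll-up of pass/fail counts over test cases and subsuites. A minimal sketch, assuming a simple nested-dict representation:

```python
# Sketch of rolling suite totals up from test cases; the nested-dict
# shape is an assumption for illustration, not a Test Space format.

def suite_totals(suite):
    """Return (pass_count, fail_count) for a suite, including subsuites."""
    passed = sum(1 for ok in suite.get("cases", []) if ok)
    failed = sum(1 for ok in suite.get("cases", []) if not ok)
    for sub in suite.get("suites", []):
        sub_passed, sub_failed = suite_totals(sub)
        passed += sub_passed
        failed += sub_failed
    return passed, failed

suite = {
    "cases": [True, True, False],          # case pass/fail statuses
    "suites": [{"cases": [True], "suites": []}],  # one subsuite
}
# suite_totals(suite) == (3, 1)
```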

Baseline Comparison

Notifications

Messages

Glossary of Terms

Test Case

This is the unit of measure for test results - a single pass/fail entity. Test cases can be supplemented with additional information in the form of annotations and comments.

Test Suite

TBD

Annotation

TBD

Comments

TBD

Result Set

A result set is a collection of test suites and test cases that represents a complete set of test results for a given test space. In its simplest form, it is the output from a single execution of the STRIDE Runner for a given set of test units.

Test Space

A test space is a logical grouping of test results. Although STRIDE Test Space does not enforce any relationship between result sets, test spaces are only useful for analysis when you use them to hold sequential result set data that represents the same set of tests. That is, meaningful comparison between subsequent result sets can only be done when each result set represents execution of the same set of tests.

Each test space has properties that allow you to control who has access, if and how the results are compared to other results, whether to notify users of potential regressions, and how many result sets to keep in the space.

Project

A project is a logical grouping of test spaces. Every test space must be assigned to one and only one project.

Baseline

TBD

Messages

TBD