STRIDE Test Space
What is STRIDE Test Space?
STRIDE Test Space is a hosted web application for storing and analyzing your test results. Test Space accepts data that results from the execution of STRIDE Test Units or Test Scripts. Data is uploaded to Test Space manually (using the web interface) or automatically from one of the STRIDE execution tools (STRIDE Runner or STRIDE Studio). Once data is uploaded, it is retained until it is manually removed or automatically deleted (depending on the space configuration).
STRIDE Test Space organizes results in a hierarchy of projects, test spaces and result sets. Each time you upload data to a test space, a new result set is created (unless you explicitly add your data to an existing one). Within a given result set, test results are further organized into test suites containing test cases. Any number of these organizational entities can be used to create a fluid hierarchy of test results that adapts to shifting needs during a product lifecycle.
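The hierarchy described above can be sketched as a simple data model. This is illustrative only; the class and field names below are assumptions for the sketch, not Test Space's actual schema.

```python
from dataclasses import dataclass, field
from typing import List

# Illustrative model of the Test Space hierarchy:
# project -> test space -> result set -> suite -> case.
# Names here are assumptions, not Test Space's real schema.

@dataclass
class TestCase:
    name: str
    passed: bool

@dataclass
class TestSuite:
    name: str
    cases: List[TestCase] = field(default_factory=list)
    subsuites: List["TestSuite"] = field(default_factory=list)

@dataclass
class ResultSet:
    name: str
    suites: List[TestSuite] = field(default_factory=list)

@dataclass
class TestSpace:
    name: str
    result_sets: List[ResultSet] = field(default_factory=list)

@dataclass
class Project:
    name: str
    spaces: List[TestSpace] = field(default_factory=list)

# Each upload creates a new result set inside a test space:
space = TestSpace("nightly-regression")
space.result_sets.append(
    ResultSet("run-42", suites=[
        TestSuite("MathTests", cases=[TestCase("add", True),
                                      TestCase("div", False)]),
    ])
)
```

The point of the sketch is simply that a result set is a snapshot: uploading again appends a new ResultSet rather than mutating the old one.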
Test Space is primarily a repository for your test results. Whether you are doing ad-hoc testing with the STRIDE Framework or running fully-automated continuous integration of your STRIDE-enabled code base, Test Space provides a central place to store all the test data that is produced by the tests. Results are uploaded into specific test spaces, which allows the maintainer to control access and notifications for the results.
Test Space also provides easy regression analysis in the form of baseline comparison. Users can create one or more fixed baseline data sets from existing results and further specify that all result sets in a test space should be compared with the fixed set of data. This is sometimes known as "gold standard" comparison and is very useful for detecting regressions in a set of stable tests. Similarly, individual test spaces can be configured to automatically compare each new result set with the previous result. This kind of comparison can also be helpful in detecting regressions in stable code bases. Baseline comparison data can also optionally include timing thresholds so as to enable automatic comparison of test case durations.
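Conceptually, a baseline comparison flags any test case whose status regressed relative to the baseline, or whose duration exceeds a timing threshold derived from the baseline. A minimal sketch of that logic follows; the function name, field names, and tolerance parameter are assumptions for illustration, not part of Test Space's actual implementation.

```python
# Illustrative baseline comparison: flag status regressions and
# timing-threshold violations. All names and the data format here
# are assumptions for this sketch.

def compare_to_baseline(baseline, results, duration_tolerance=0.20):
    """Return a list of (test_name, problem) tuples."""
    problems = []
    for name, base in baseline.items():
        res = results.get(name)
        if res is None:
            problems.append((name, "missing from new results"))
            continue
        if base["passed"] and not res["passed"]:
            problems.append((name, "status regression"))
        # Optional timing threshold: allow durations up to
        # (1 + tolerance) times the baseline duration.
        limit = base["duration"] * (1 + duration_tolerance)
        if res["duration"] > limit:
            problems.append((name, "duration exceeded threshold"))
    return problems

baseline = {"add": {"passed": True, "duration": 1.0},
            "div": {"passed": True, "duration": 2.0}}
new_run  = {"add": {"passed": True, "duration": 1.1},
            "div": {"passed": False, "duration": 2.5}}

# "div" is flagged for both a status regression and a slow run.
print(compare_to_baseline(baseline, new_run))
```

A "gold standard" comparison runs this check against a fixed baseline for every new result set, while sequential comparison uses the previous result set as the baseline instead.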
Collaboration and communication are built into Test Space in the form of messaging and notifications. Once users are granted access to a specific test space, they can view and manage its test results. The test space properties can also be configured to notify all users of potential problems with new result sets - specifically regressions against baselines, log errors, and timing threshold violations (if your baseline was configured to compare durations).
Test Space enables focused communication about test results by allowing users to create simple message threads. Messages can be associated with a test space, a result set, or even with a specific test suite. The latter can be very useful when users need to discuss specific test failures, while messages at the space or result set level might be used, for example, to discuss general trends and goals.
What can I do with STRIDE Test Space?
Uploading Results
New result sets can be easily uploaded using the automatic upload features in the STRIDE Runner or STRIDE Studio. You can also upload XML results data files manually using the web interface by navigating to the detail view for a specific test space and clicking the upload data button.
Viewing Results
STRIDE Test Space presents several distinct views of your data.
Overview
The overview shows any recent activity (within the last 10 days) across all of your test spaces. Noteworthy events include the following:
- a new result set added. Summary stats and any potential problems (errors or setbacks) are reported.
- a new message added
- a new comment added to an existing message
This view is the default view for Test Space. This data is also available as an RSS feed (your browser should detect the feed and allow you to subscribe).
All Projects
The All Projects view shows all test spaces to which you have access, grouped by project. This is typically how users will navigate to specific test spaces, although links from events will also take you to specific test spaces. This view shows the space names, result set counts, and two spark-line graphs of the pass and fail count trends. For stable test-beds, these graphs should appear flat - otherwise they will indicate the trend of pass and fail counts for your current development effort.
Project View
This view is identical to the All Projects view, except it displays test spaces for only one project. You can navigate to this view by clicking on a project name in the All Projects view.
Space View
The Space view provides an overview for a specific test space. There are three segments to this view: events, trend graphs, and result set list. The events section shows the recent events associated with this test space. These are the same events that appear for the given space in the Overview. The trend graphs show bar and line trend charts for the 15 most recent result sets. Both the events segment and trend graphs can be hidden from view using the corresponding hide buttons.
The last segment is the result set list. This shows all result sets in descending sequence order (most recently added result sets at top). Each row in this tabular view includes the result set name and description, total duration, pass and fail totals, and any baseline comparison results (if comparisons are activated for the space).
Results View
Clicking on a result set name will take you to the Results View. The results view shows a list of test suites. The presence of subsuites and test cases within a suite is indicated by a drilldown button; clicking it causes the immediate children to be displayed. This drilldown button is shown for any suite that contains children. Other icons indicate the presence of additional data for display - e.g. an annotation icon indicates that annotations are present (used for log messages) and a comment icon indicates that comments can be displayed for the item.
Each row in the display also shows the pass/fail status for test cases, or the total number of each for test suites. Baseline comparison data is shown to the right, if a comparison is set up in the test space's properties. Baseline fields will generally only be displayed when there are differences with the baseline data - these differences can include status and duration (if timing data is present in the baseline).
Baseline Comparison
TBD
Notifications
A test space can be configured to notify users of errors (as log error messages) in a result set and/or regressions relative to a baseline (whether fixed or sequential). Regression notifications for status (pass/fail) and duration are separately configurable in the test space's properties.
Users can also be notified when a new message or comment is added somewhere within the test space. Each message thread has separate notification properties that control who will get notified as comments are added to the message.
Messages
Messages are light-weight discussion threads with context - they can apply to a specific suite, a result set, or generally to a test space. In the latter case, users can select a title for the message thread since there can be more than one message thread for a given test space. Messages (and comments thereto) support limited formatting via Textile markup. Messages can be added to result sets and suites by selecting the message icon. If a message already exists for a particular item, the icon will appear with a black background.
Glossary of Terms
Test Case
This is the unit of measure for test results - a single pass/fail entity. Test cases can be supplemented with additional information in the form of annotations and comments.
Test Suite
A Test Suite is just a grouping of test cases. Test Suites can have descriptions and annotations associated with them, but primarily they serve to group test cases. In the STRIDE Test Framework, each Test Unit creates a suite with the name of the test unit and the tests are placed in this suite.
Annotation
Annotations provide additional information about test cases or test suites. Each annotation has a level associated with it as well as a name and description. Because log messages carry these same fields, the STRIDE Test Framework maps log messages from srLOG macros to annotations.
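Since a log message already carries a level, a name, and text, the mapping to an annotation is direct. A hypothetical sketch of that mapping (the function and the message fields shown are assumptions for illustration; srLOG's real output format is not shown here):

```python
# Hypothetical mapping of a log message to an annotation record.
# The annotation fields (level, name, description) come from the
# definition above; the function and inputs are assumptions.

def log_to_annotation(level, name, text):
    """Build an annotation dict from the parts of a log message."""
    return {"level": level, "name": name, "description": text}

ann = log_to_annotation("error", "ParserTest", "buffer overflow in parser")
print(ann)
```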
Result Set
A result set is a collection of test suites and test cases that represents a complete set of test results for a given test space. In its simplest form, it is the output from a single execution of the STRIDE Runner for a given set of test units.
Test Space
A test space is a logical grouping of test results. Although STRIDE Test Space does not enforce any relationship between result sets, test spaces are only useful for analysis when you use them to hold sequential result set data that represents the same set of tests. That is, meaningful comparison between subsequent result sets can only be done when each result set represents execution of the same set of tests.
Each test space has properties that allow you to control who has access, if and how the results are compared to other results, whether to notify users of potential regressions, and how many result sets to keep in the space.
Project
A project is a logical grouping of test spaces. Every test space must be assigned to one and only one project.
Baseline
A baseline is a copy of some test data (suites and cases) that is maintained for the purpose of comparison with other results sets. This is the primary mechanism in Test Space for determining if test results have regressed.
Messages
Messages are simple discussion threads that can be attached to test spaces, result sets, or test suites. Any number of messages can be created for a test space, but only one message thread can be created for each result set or test suite. For any message, the initial message can be followed by sequential comments added by other users.