Note: The detailed procedure to produce test documents from Jira is available on the following page: DM Test Documents Procedure.
The data management test approach is based on the following principles:
The documents involved are:
DM is considering the use of Test Management for Jira (ATM by Adaptavist). Its main advantage is providing an easy way to manage test cases and test executions and to relate them to requirements.
The documentation can be extracted from Jira:
The test specification is the document in which all test cases defined for a specific component in the Document Tree are baselined. Obsolete test cases will also be included.
The test cases have to be written in JIRA, in the LVV project, under the corresponding folder.
Sections 1 and 2 of the test specification need to be written directly in LaTeX, with changes submitted to the corresponding GitHub repository. Sections 3 and 4 and appendix A will be generated from JIRA. Additional appendix sections can be added by the author.
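The split between hand-written and generated sections can be pictured with a minimal LaTeX skeleton. The file names below are illustrative assumptions, not the actual repository layout:

```latex
% Hypothetical document skeleton; file names are illustrative only.
\documentclass{article}
\begin{document}
\input{intro}       % Section 1: written by hand, committed to GitHub
\input{scope}       % Section 2: written by hand, committed to GitHub
\input{testcases}   % Sections 3-4: generated from Jira, never edited by hand
\appendix
\input{appendixA}   % Appendix A: generated from Jira, never edited by hand
\input{appendixB}   % Additional appendices: added by the author
\end{document}
```

Keeping the generated files separate from the hand-written ones makes it safe to regenerate Sections 3, 4, and appendix A from Jira at any time.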
A test case needs to be formulated in a general way, so that it can be executed multiple times under different conditions, such as different software versions or different dataset versions.
To fully characterize a Test Case, three sections need to be completed:
The Test Script consists of a sequence of steps to be executed one after the other.
Each Step has three parts:
These are the rules to follow when writing steps in the Test Script tab:
At the moment (August 2018), plain-text test scripts are not included in the test specification.
Input data and parameters are likewise not taken into account when generating the test specification or the test report. Future versions of the document generation script may be able to handle this information.
In the Traceability tab, one or more Verification Elements need to be linked.
Verification Elements are defined in the model in MagicDraw and synchronized to Jira using the Syndeia plugin. Each Verification Element is related to a requirement in the model.
Verification Elements will be included in the test specification for each test case.
Links to confluence pages or other web pages can be added to this section.
The following table gives recommendations for each test case field in the main tab:
Field Name | Old Test Spec Name | How to fill it | Additional Comments |
---|---|---|---|
Name | Test case subsection title | Has to be a short string indicating the purpose of the test. | It is recommended to avoid using the requirement name and ID. LDM-639 test cases have been named after the verification elements, and therefore the requirement names, mainly for lack of time. Note that the Test Design is no longer used, and there is no need to start the name with an identifier following the previous naming convention. Old test cases will keep it only for backward compatibility. |
Objective | Test Items | Describe the scope of the test. The requirement to be verified can be used as a reference; describe what you are going to test. It may be only a part of the requirement. In some cases, milestones defined in the test plan LDM-503 may drive the scope of the test. | This field will appear in the Test Specification as "Test Items". Avoid including the requirement text. |
Precondition | Input Specification | This field should include all inputs required to run the test. | This field will appear in the Test Specification as "Input Specifications". For example, the input data required (do not specify versions). The field is provided by the test management plugin in JIRA and can't be renamed. |
Folder | | A Test Specification document will be produced for each folder, including all test cases defined in it. | |
Status | | When you create a new test case, the status is set to "Draft". When the test case is ready to be submitted for approval, set the status to "Defined". Test cases will not be approved one by one; they will be submitted to the CCB for approval (via RFC) as part of a new version of the test specification. Once the Test Specification is approved, the status of the "Defined" test cases will be set to "Approved" and the new issue of the document uploaded to Docushare. Once a test case is no longer valid, its status has to be set to "Deprecated". When you are going to modify approved test cases: | Do not delete test cases; this may remove important information from the system. The Jira Test Management plugin does not enforce a new version when an approved test case is modified. |
Priority | | Set the priority that you think is most appropriate. By default it is set to "Normal". | In the future, this information can be used to prioritize test cases. At the moment it is not used. |
Component | | DM (Data Management) | This field may be useful when filtering across test cases. |
Owner | | The person in charge of writing and maintaining the test case. | In the future, the test case may be executed by a different person, but the owner will not change. |
Estimated Time | | Fill it in if you have an idea of how much time it may take to run the test case; otherwise, leave it blank. | |
Verification Type | | It can be "Test", "Inspection", "Demonstration" or "Analysis". Use the most appropriate type. In general, this should be "Test". | This information should be derived from the requirement and Verification Element definition. |
Verification Configuration | | Not required. | It can be used in case a test case shall test a specific configuration of a software product, but this goes against the principle that a Test Case should be formulated generally. Specific configurations will be specified during the test run definition. |
Predecessor | Intercase Dependencies | This is the list of test cases that need to be completed (successfully) before the actual test case can be executed. | This does not imply that those test cases are part of the test script. Usually, they are not. |
Critical Event | | This field is mandatory but should not be relevant for DM. Therefore set it to "False". | |
Associated Risks | | This field should not be relevant to DM. Leave it blank. | |
Unit Under Test | | This field should not be relevant to DM. Leave it blank. | In the future, we may use the components defined in the model in MagicDraw to group test cases, instead of using the folder. |
Required Software | Environment Needs - Software | List the software packages that must be installed on the system to run the test. | If the purpose of the test case is to verify the functionality of a specific DM software package, for example science_pipeline, that package shall NOT be listed here, but in the Objective field (Test Items section). |
Test Equipment | Environment Needs - Hardware | List here the required hardware that is needed to be installed in the system to run the test. This usually implies a server with a specific CPU power, RAM and available disk space. | The exact hardware used for the test will be specified in the Test Plan or the Test Run. This information can be different each time a test case is executed. |
Test Personnel | | This field has not been used by DM so far during the definition of the test. Leave it blank, or list external people who may need to be involved during the verification. | These people are neither the owner nor the test engineer who is going to run the test. They may be, for example, stakeholders who need to assess the scientific results to ensure that the test has passed. |
Safety Hazards | | This field should not be relevant to DM. Leave it blank. | |
Required PPE | | This field should not be relevant to DM. Leave it blank. | |
Postconditions | Output Specifications | Specify here what output is expected from the test. | For example, the output data expected. This field has been called "Postcondition" due to the duality with the "Precondition" field provided by default. |
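The status lifecycle described in the table above (Draft, Defined, Approved, Deprecated) can be summarized with a small sketch. The status names come from the table; the transition map and function are illustrative assumptions, not part of the Jira plugin's API:

```python
# Sketch of the test case status lifecycle described in the table above.
# The transition map is an illustrative assumption, not an ATM API.
ALLOWED_TRANSITIONS = {
    "Draft": {"Defined"},      # ready to be submitted for CCB approval
    "Defined": {"Approved"},   # set after the Test Specification is approved
    "Approved": {"Deprecated"},  # set once the test case is no longer valid
    "Deprecated": set(),       # terminal state; never delete test cases
}


def change_status(current: str, new: str) -> str:
    """Return the new status, or raise if the transition is not allowed."""
    if new not in ALLOWED_TRANSITIONS.get(current, set()):
        raise ValueError(f"cannot move a test case from {current!r} to {new!r}")
    return new
```

For example, `change_status("Draft", "Defined")` succeeds, while jumping directly from "Draft" to "Approved" raises an error, reflecting the rule that approval happens only through a new version of the test specification.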
With the introduction of the Jira Test Management approach, two specific objects are used to guide the test activities planned for a given period of time (a test campaign):
Before a test activity can start:
The entire document is generated from Jira, except for the history table and curator information. Appendix sections can be added by the author.
To start defining a test campaign, a corresponding Test Plan (LVV-PXXX) has to be created in Jira. Test Plans are organized in folders in the same way as Test Cases are.
In the Test Plan Details, the main tab of the Test Plan object in Jira, the following information needs to be provided before the test campaign starts.
Field Name | Identification in the old Test Report template | How to fill it | Additional Comments |
---|---|---|---|
Name | | Short identification of the test activity. | This field will be used as the title of the "Test Plan and Report" document. |
Objective | Objective / Scope | Describe the objective and scope of the test campaign. | The first part of the field will be included in the Test Report / Objective subsection. Add a bold "Scope" string followed by the corresponding text. If no scope is provided, the Test Report / Scope subsection will be left empty. |
Folder | | This information is not used to generate documents, but it helps navigate and find information. | |
Status | | | |
Owner | | The person responsible for planning the test campaign. | |
Verification Environment | Test Configuration - Hardware | Describe the environment where the test is going to be executed, including hardware and low-level software such as the operating system. | |
Entry Criteria | | Not relevant for DM. | |
Exit Criteria | | Not relevant for DM. | |
PMCS Activity | | Not relevant for DM. | |
Observing Required | | Not relevant for DM, except in the case of integration tests with the Camera. | |
System Overview | System Overview | Description of the system under test, focused on the actual test campaign (System Overview), and Applicable Documents for this specific test campaign. | The first part of the field will be included in the Test Report / System Overview subsection. Add a bold "Applicable Documents" string followed by the corresponding text. If no Applicable Documents text is provided, the Test Report / Applicable Documents subsection will be left empty. |
Verification Artifact | | Per the SE Test Management Architecture, this field should contain web links to the resulting data products, including the Test Reports, Analyses, etc. | To be completed when the Test Plan and Report document is available. |
Overall Assessment | Overall Assessment | First provide a statement of whether the test campaign was successful. Then add a short text justifying that statement. | To be completed after all test cases have been executed. |
Recommended Improvements | Recommended Improvements | Provide improvements and suggestions as an outcome of the test campaign. | To be completed after all test cases have been executed. |
The Traceability tab in the Test Plan shall link to at least one specific Test Run.
Additional information has to be provided here, if relevant:
This Jira ATM object was called Test Run in the previous Jira version. All references to Test Run on this page shall be considered references to Test Cycles.
To complete the definition of the test campaign, some information needs to be provided in the Test Run(s) to be executed.
Each Test Run can be related only to one Test Plan.
Each Test Case can be included only once in a Test Run (limitation of the Test Management plugin). If a Test Case has to be executed in two different conditions during a test campaign, two Test Runs need to be defined and associated with the same Test Plan.
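The two constraints above (each Test Run belongs to exactly one Test Plan, and a Test Case may appear at most once per Test Run) can be sketched as a small data model. The class and field names are illustrative assumptions, not ATM's actual data model:

```python
# Illustrative model of the stated constraints; not ATM's real schema.
from dataclasses import dataclass, field


@dataclass
class TestRun:
    key: str
    test_plan: str  # exactly one associated Test Plan (e.g. "LVV-PXXX")
    test_cases: list = field(default_factory=list)

    def add_test_case(self, case_key: str) -> None:
        """Add a Test Case, enforcing the once-per-Test-Run limitation."""
        if case_key in self.test_cases:
            # Plugin limitation: to execute the same Test Case under a
            # second condition, create another Test Run on the same plan.
            raise ValueError(f"{case_key} is already in Test Run {self.key}")
        self.test_cases.append(case_key)
```

To run one Test Case under two conditions in a campaign, you would create two `TestRun` objects with the same `test_plan` value and add the Test Case to each.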
The following fields must be filled in each Test Run associated with the Test Plan before the Test Plan is approved:
Field Name | Identification in the old Test Report template | How to fill it | Additional Comments |
---|---|---|---|
Name | | Short identification of the Test Run. | In the case of a single Test Run associated with a test campaign, this can be the same as the Test Plan name. In the case of multiple Test Runs, the names have to identify the conditions that each of them describes, such as test environment, datasets, or configuration. |
Description | | Short description of the Test Run. | If applicable, clarify what differentiates the Test Runs associated with the same Test Plan. |
Folder | | This information is not used to generate documents, but it helps navigate and find information. | |
Status | | | |
Version | | Not relevant for DM. | |
Iteration | | Not relevant for DM. | |
Owner | | The person responsible for the test run. | |
Planned Start Date | | The date when the test execution should start. | Leave it blank if this information is not available. |
Planned End Date | | The date when the test execution should end. | Leave it blank if this information is not available. |
Software Version / Baseline | Test Configuration - Software | List the software required to start a test activity. | Note that if the objective of the Test Plan is to test a specific version of a software product, that version must NOT be listed in this custom field. |
Configuration | Test Configuration | Configuration to be used; input data. | The "Test Configuration" section in the Test Plan and Report will be filled using the content of this field for the subsections on input data and the configuration to be used. The other subsections (Hardware, Software, and Personnel) will be derived from the custom fields described above or from information available in other objects. |
Once the Test Plan has been approved by the corresponding responsible party as described above, the test campaign can start.
For each Test Run, the tester has to run each Test Case Step and add in the corresponding field the obtained result (Actual Result).
Keep in mind that the content of the "Actual Result" field will be included in the test report. Please ensure that its phrasing is consistent, self-explanatory, and properly formatted.
During the execution of a test case, possible Step statuses are:
You can use the time recording function to trace the execution time of each test case.
Issues shall be linked in the corresponding section in the test player and mentioned in the "Actual Result" field.
Once all steps in a test case have been completed, a comment shall be added, summarizing the test result. This comment will be reported in the "Test Plan and Report" summary table and will be the most visible information for each test case executed in the test campaign.
Once all Test Runs associated with a Test Plan have been completed, the following activity will be performed to close the test campaign:
(to be confirmed)
The Jira Test Management plugin currently provides only limited text formatting capabilities. In addition, the information provided in Jira needs to fit into a readable document.
The following rules need to be observed so that the test documentation produced automatically from Jira is as well formatted and readable as possible.
If the formatting is broken, the text needs to be fixed in Jira and NEVER in the LaTeX document on GitHub.
The procedure to generate a document from Jira is in principle the same as the one used for generating requirements documents from MagicDraw.
The approval of the documents follows the usual RFC process, including the RFC specific branch.