
The data management test approach is based on the following principles:


  • The tests shall be easy to write and maintain
  • The tests can be executed multiple times, depending on the scheduled milestones and needs
  • The test conditions shall be well identified in order to: (i) make the test repeatable, and (ii) make the behaviour and the results as close as possible to operations


The documents involved are:

  • The Test Specification: contains the list of test cases written for a specific DM component, plus some descriptive parts on the test activity for that component.
  • The Test Report: contains a descriptive part on the test execution, a summary of the test case runs, and a detailed report on the tests performed.


DM is considering the use of Test Management for Jira (by Adaptavist). The main advantage is an easy way to manage test cases and test reports, and to relate them to requirements.


The documentation can be extracted from Jira:

  • The Test Specification will contain the test cases defined for a specific DM component.
  • The Test Report will be divided into two parts:
    • the first part (the plan) contains the planned test activity information (the initial list of tests to be executed), environments, datasets and personnel.
    • the second part (the report) contains the actual test run results, their assessment and additional information derived from the execution of the tests.
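As an illustration of how this extraction could work, the sketch below queries the test management plugin's REST interface for all test cases in one folder (i.e. one Test Specification). The instance URL, the search endpoint and the query syntax are assumptions modelled on the Adaptavist Test Management server API, not something this page prescribes; authentication is omitted.

```python
# Hypothetical sketch: pull the test cases of one folder out of Jira
# for document generation.
# ASSUMPTIONS: the instance URL, the ATM search endpoint and the TQL
# query syntax; check them against the installed plugin version.
import json
import urllib.parse
import urllib.request

BASE = "https://jira.example.org"                # hypothetical Jira instance
SEARCH = BASE + "/rest/atm/1.0/testcase/search"  # assumed ATM search endpoint

def folder_query(project: str, folder: str) -> str:
    """Build the query string selecting all test cases of one folder."""
    return 'projectKey = "{}" AND folder = "{}"'.format(project, folder)

def fetch_test_cases(project: str, folder: str) -> list:
    """Download the raw test case records for a folder (credential
    handling is omitted in this sketch)."""
    url = SEARCH + "?" + urllib.parse.urlencode(
        {"query": folder_query(project, folder)})
    with urllib.request.urlopen(url) as resp:
        return json.loads(resp.read().decode())
```

From the returned records, the generated sections of the document can then be rendered with any templating tool.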


Test Specification

The test specification will include all test cases defined for a specific component in the Document Tree. Obsolete test cases will also be included.

The test cases have to be written in JIRA, in the LVV project, under the corresponding folder.

Sections 1 and 2 of the test specification need to be written directly in LaTeX, with changes submitted to the corresponding GitHub repository. Sections 3 and 4 and the appendix will be generated from JIRA.

How to write a test case

A test case needs to be formulated in a general way, so that it can be executed multiple times under different conditions, for example with different software versions or different dataset versions.

In order to fully characterize a Test Case, three sections need to be completed:

  • the test details in the main tab of the Test Case
  • the steps in the Test Script tab of the Test Case
  • the traceability to the Verification Elements (requirements) in the Traceability tab of the Test Case

Test Script

The Test Script consists of a sequence of Steps to be executed one after the other.

Each Step has 3 parts:

  • the Description: describes what the step is supposed to do
  • the Test Data: describes which input data is required for the test step
  • the Expected Result: the result expected from the step

These are the rules to use when writing steps in the Test Script tab:

  • Write the step description in the most reproducible way
    • Ideally it should be possible to copy and paste the commands. In case of graphical interfaces, each action shall be clearly described.
    • Avoid references to specific software versions or datasets: the test case has to be as general as possible.
  • Add the input data to use (if any) in the Test Data field
  • Specify the expected result when it is not clear from the step description
    • This field will be included in the test report, but the step description will not. Ensure that the information reported here is clear enough to be understood without the step description.
  • When a different test case is used in a step:
    • use only test cases that have been created for that purpose in the same folder (test specification)
    • avoid recursive use of test cases (only one level of nesting)
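The three-part step structure and the one-level nesting rule can be sketched with a couple of record types. The type and field names below are ours, chosen for illustration, not the plugin's.

```python
# Sketch of the step structure described above; names are illustrative,
# not taken from the Jira test management plugin.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class TestStep:
    description: str                   # what the step is supposed to do
    test_data: str = ""                # input data for the step, if any
    expected_result: str = ""          # must be understandable on its own
    called_test_case: Optional["TestCase"] = None  # other test case used here

@dataclass
class TestCase:
    key: str                           # e.g. "LVV-T123"
    steps: List[TestStep] = field(default_factory=list)

def nesting_ok(case: TestCase) -> bool:
    """True if no called test case itself calls further test cases
    (at most one level of nesting)."""
    for step in case.steps:
        inner = step.called_test_case
        if inner is not None:
            if any(s.called_test_case is not None for s in inner.steps):
                return False
    return True
```

A test case that calls another test case is fine; a test case that calls a test case which in turn calls a third one violates the rule.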

Traceability to Requirements

In the Traceability tab, one or more Verification Elements need to be linked.

Verification Elements are defined in the model in MagicDraw.

Test Details

The following table gives recommendations for each test case field in the main tab. Each entry lists the field name, the old Test Specification name (where one existed), how to fill the field, and additional comments.

Name (old: Test case subsection title)
How to fill it: A short string indicating the purpose of the test.
Comments: It is recommended to avoid using the requirement name and ID. LDM-639 test cases have been named after the verification elements, and therefore the requirement names, mainly for lack of time. Note that Test Designs are not used anymore, so there is no need to start the name with an identifier following the previous naming; old test cases keep it only for backwards compatibility.

Objective (old: Test Items)
How to fill it: Describe the scope of the test. Taking the requirement to be implemented as reference, describe what you are going to test; it may be only a part of the requirement. In some cases, milestones defined in the test plan LDM-503 may drive the scope of the test.
Comments: This field will appear in the Test Specification as "Test Items". Avoid including the requirement text.

Precondition (old: Input Specification)
How to fill it: Include all inputs required to run the test, for example the input data required (do not specify versions).
Comments: The field is provided by the test management plugin in JIRA and cannot be renamed.

Folder
How to fill it: A Test Specification document will be produced for each folder.

Status
How to fill it: When you create a new test case, the status is set to "Draft". Once the Test Specification is approved (via RFC), the draft test cases are moved to status "Approved". Once a test case is no longer valid, its status has to be set to "Deprecated". When you are going to modify an approved test case: create a new version and set its status to "Draft". Do not delete test cases; this may remove important information from the system.
Comments: The plugin does not enforce a new version when an approved test case is modified.

Priority
How to fill it: Set the priority that you think is most appropriate; by default it is set to "Normal".
Comments: In the future this information may be used to prioritize test cases; at the moment it is not used.

Component
How to fill it: DM (Data Management).
Comments: This field may be useful when filtering across test cases.

Owner
How to fill it: The person in charge of writing and maintaining the test case.
Comments: In the future the test case may be executed by a different person, but the owner will not change.

Estimated Time
How to fill it: Fill it in if you have an idea of how much time it can take to run the test case; otherwise leave it blank.

Verification Type
How to fill it: One of "Test", "Inspection", "Demonstration" or "Analysis". Use the most appropriate type; in general this should be "Test".
Comments: This information should be derived from the requirement and verification element definition.

Verification Configuration
How to fill it: This field should not be used in DM.
Comments: It could be used in case a test case shall test a specific configuration of a software product, but this is against the principle that a test case should be generally formulated. Specific configurations can be specified during the test run definition and are related to the environment.

Predecessor (old: Intercase Dependencies)
How to fill it: The list of test cases that need to be completed (successfully) before this test case can be executed.
Comments: This does not imply that those test cases are part of the test script.

Critical Event
How to fill it: This field is mandatory but should not be relevant for DM; set it to "False".

Associated Risks
How to fill it: This field should not be relevant for DM. Leave it blank.

Unit under test
How to fill it: This field should not be relevant for DM. Leave it blank.
Comments: In the future we may use the components defined in the model in MagicDraw to group test cases, instead of using the folder.

Required Software (old: Environment Needs - Software)
How to fill it: List the software packages that need to be installed in the system in order to run the test.
Comments: If the test case is meant to verify the functionality of a specific DM software package, for example science_pipeline, that package shall NOT be listed here but in the Objective field, "Test Items" section.

Test Equipment (old: Environment Needs - Hardware)
How to fill it: List the hardware that needs to be available in order to run the test. This usually implies a server with specific CPU power, RAM and available disk space.
Comments: The exact hardware used for the test will be specified in the Test Plan or in the Test Run; this information can be different each time a test case is executed.

Test Personnel
How to fill it: This field has not been used so far by DM during the definition of the tests. Leave it blank, or list external people that may need to be involved during the verification.
Comments: These people are neither the owner nor the test engineer who is going to run the test. They may be, for example, stakeholders who need to assess the scientific results in order to ensure that the test has passed.

Safety Hazards
How to fill it: This field should not be relevant for DM. Leave it blank.

Required PPE
How to fill it: This field should not be relevant for DM. Leave it blank.

Postconditions (old: Output Specifications)
How to fill it: Specify the output expected from the test, for example the output data expected.
Comments: This field has been called "Postconditions" due to the duality with the default "Precondition" field.
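The Status rules above describe a small workflow. A minimal sketch, assuming our reading that "Deprecated" can be reached from either state; the plugin itself does not enforce these transitions:

```python
# Sketch of the test case status workflow (our reading of the rules;
# not enforced by the Jira test management plugin).
ALLOWED = {
    "Draft": {"Approved", "Deprecated"},
    "Approved": {"Deprecated"},
    "Deprecated": set(),       # a deprecated test case stays deprecated
}

def change_status(current: str, new: str) -> str:
    """Move a test case to a new status, refusing illegal transitions."""
    if new not in ALLOWED.get(current, set()):
        raise ValueError("illegal transition {} -> {}".format(current, new))
    return new

def new_version(status: str) -> str:
    """Modifying an approved test case means creating a new version,
    which starts again in "Draft"."""
    if status != "Approved":
        raise ValueError("only approved test cases need a new version")
    return "Draft"
```

Deletion is deliberately absent from the model, matching the rule that test cases are deprecated, never deleted.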


Test Plan and Report

With the introduction of the Jira Test Management approach, two specific objects are foreseen in order to guide the test activities planned at a specific moment in time (a test campaign):

  • Test Plan: gives information on the planning of the test campaign, following an agreed schedule. The main source of the schedule for DM is LDM-503, but additional test campaigns can be organized ad hoc.
  • Test Run: provides the information on which test cases have to be executed in a specific context. Multiple Test Runs can be associated with a Test Plan.


Before a test activity can start, all relevant information in the Test Plan, and part of the information in the Test Report, has to be provided. Then the first draft of the "Test Plan and Report" document can be generated and approved via RFC.

Once the first issue of the "Test Plan and Report" is approved, the Test Plan status can be set to "Approved" in Jira.


Defining the Test Campaign

In order to start defining a test campaign, a corresponding Test Plan (LVV-PXXX) has to be defined in Jira. Test Plans are organized in folders in the same way as Test Cases are.

Test Plan Details

In the Test Plan Details, the main tab of the Test Plan object in Jira, the following information needs to be provided before the test campaign starts.



Each entry below lists the field name, its identification in the old Test Report template (where one existed), how to fill the field, and additional comments.

Name
How to fill it: Short identification of the test activity.
Comments: This field will be used as the title of the "Test Plan and Report" document.

Objective (old: Objective)
How to fill it: Describe the objective of the test campaign.

Folder
Comments: This information is not used to generate documents, but it is helpful for navigating and finding information.

Status
How to fill it:
  • Draft: when a test campaign is proposed or under definition
  • Approved: when a test campaign is fully defined, including test run information, and has been formally approved via RFC
  • Completed: when all the test runs associated with the test plan have been completed and all information is ready to be exported in a document
  • Deprecated: when an originally planned test campaign is removed from the planning

Owner
How to fill it: The person responsible for the test campaign planning.

Verification Environment (old: Test Configuration - Hardware)
How to fill it: Describe the environment where the tests are going to be executed, including hardware and low-level software such as the operating system.

Entry Criteria
How to fill it: Not relevant for DM.

Exit Criteria
How to fill it: Not relevant for DM.

PMCS Activity
How to fill it: Not relevant for DM.

Observing Required
How to fill it: Not relevant for DM, except in case of integration tests with the Camera.

Verification Artifact
How to fill it: According to the SE Test Management Architecture, this field should contain web links to the resulting data products, including the Test Reports, Analysis, etc.
Comments: The suggestion is to leave this field empty.


Test Plan Traceability

The Traceability tab of the Test Plan shall link to at least one specific Test Run.

Additional information can be provided here, such as:

  • issues related to the test campaign
  • confluence page
  • web links


Test Report Information required for the test campaign definition

In order to complete the definition of the test campaign, some information needs to be provided in the test run(s) to be executed.

Each Test Run has to be related to exactly one Test Plan.

Each test case can be included only once in a Test Run. If a Test Case has to be executed under two different conditions during a test campaign, two Test Runs need to be defined.
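The uniqueness rule lends itself to a small planning check. A sketch (the function names are ours, chosen for illustration):

```python
# Sketch of the rule above: a test case may appear at most once in a
# Test Run, so a campaign needing two conditions needs two runs.
from collections import Counter
from typing import List

def duplicate_cases(run_cases: List[str]) -> List[str]:
    """Return test case keys that appear more than once in a planned run."""
    return [key for key, count in Counter(run_cases).items() if count > 1]

def split_needed(run_cases: List[str]) -> bool:
    """True if the planned list cannot go into a single Test Run."""
    return bool(duplicate_cases(run_cases))
```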


The following fields have to be filled in each Test Run associated with the Test Plan, before the Test Plan is approved:

Each entry below lists the field name, its identification in the old Test Report template (where one existed), how to fill the field, and additional comments.

Name
How to fill it: Short identification of the Test Run.
Comments: In case of a single Test Run associated with a test campaign, this can be the same as the Test Plan name. In case of multiple Test Runs, each name has to identify the conditions that the run covers, for example test environment, datasets, configuration, etc.

Description
How to fill it: Short description of the Test Run.
Comments: If applicable, clarify what differentiates the Test Runs associated with the same Test Plan.

Folder
Comments: This information is not used to generate documents, but it is helpful for navigating and finding information.

Status
How to fill it:
  • Not Executed: when the Test Run is under definition or has not yet been executed
  • In Progress: when the Test Run is being executed
  • Done: when all steps of all Test Cases have been executed

Version
How to fill it: Not relevant for DM.

Iteration
How to fill it: Not relevant for DM.

Owner
How to fill it: The person responsible for the Test Run.

Planned Start Date
How to fill it: The date when the test execution should start. Leave it blank if this information is not available.

Planned End Date
How to fill it: The date when the test execution should end. Leave it blank if this information is not available.

Software Version / Baseline (old: Test Configuration - Software)
How to fill it: List the software required to start the test activity.
Comments: Note that, if the objective of the Test Plan is to test a specific version of a software product, that version is NOT to be listed in this custom field.

Configuration

Executing the Test Campaign

Once the "Test Plan and Report" has been approved, the test campaign can start.

For each Test Run, the person responsible for the test execution has to run each Test Case step and record the obtained result in the corresponding field (Actual Result).

Keep in mind that the content of the field "Actual Result" will be included in the test report. Please ensure that its phrasing is consistent and self-explanatory.

Possible Step status are:

  • NOT EXECUTED: the step has still not been executed
  • IN PROGRESS: the step is being executed
  • PASS: the step has been completed successfully, as can be seen from the "Actual Result" field
  • CONDITIONAL PASS: the step execution has been completed, but some minor problems have been experienced. A corresponding issue describing the problem shall be opened. That issue shall have neither priority "Blocker" nor "Critical"
  • FAIL: the execution revealed a problem. A corresponding issue describing the problem shall be opened. That issue shall have priority "Blocker" or "Critical"
  • BLOCKED: the step cannot be completed due to an existing problem with the preconditions. An issue describing that problem should be available.
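The pass/fail convention above ties the status of a completed step to the priority of the issue opened for it, if any. A sketch of that mapping for completed steps (this mirrors the rules in the list; it is not plugin behaviour):

```python
# Sketch of the step status convention for completed steps: the status
# follows the priority of the issue opened for any problem found.
BLOCKING_PRIORITIES = {"Blocker", "Critical"}

def step_status(issue_priority: str = None) -> str:
    """Status of a completed step, given the priority of the issue opened
    for any problem found (None if the step ran cleanly)."""
    if issue_priority is None:
        return "PASS"
    return "FAIL" if issue_priority in BLOCKING_PRIORITIES else "CONDITIONAL PASS"
```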

You can use the time recording function in order to trace the execution time of each test case.

Issues shall be linked in the corresponding section in the test player.

Once all steps in a test case have been completed, a comment shall be added, summarizing the test result. This comment will be reported in the "Test Plan and Report" summary table and will be the most visible information for each test case executed in the test campaign.

Once all Test Runs associated with a Test Plan have been completed, the second part of the "Test Plan and Report" document can be generated from Jira, reviewed and issued, including all reporting from the concluded test activity. The Test Plan status can therefore be set to "Completed".
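The completion rule can be expressed as a one-line check, assuming the Test Run statuses defined earlier ("Not Executed", "In Progress", "Done"):

```python
# Sketch: a Test Plan can move to "Completed" only when every associated
# Test Run has status "Done".
from typing import Iterable

def plan_completable(run_statuses: Iterable[str]) -> bool:
    """True if all associated Test Runs are "Done" (and there is at least one)."""
    statuses = list(run_statuses)
    return bool(statuses) and all(status == "Done" for status in statuses)
```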



Fields missing

The following fields are under discussion with SE, in order to add them to the Test Plan / Test Run, aiming at a fully automatic generation of the "Test Plan and Report" from Jira.

System Overview (old: System Overview)
How to fill it: Give a description of the system under test. This may correspond to the product description in the model (MagicDraw), but in the majority of cases it will be focused on the specific test campaign.
Comments: This field still needs to be implemented in the Test Plan, in Jira Test Management.

Overall Assessment

Recommended Improvements

