
The data management test approach is based on the following principles:


  • The tests shall be easy to write and maintain
  • The tests can be executed multiple times, depending on the scheduled milestones and needs
  • The test conditions shall be well identified in order to: i) make the test repeatable; ii) make the behaviour and the results as close as possible to operations


The documents involved are:

  • The Test Specification: it contains the list of test cases written for a specific DM component, plus some descriptive parts on the test activity for that component.
  • The Test Report: it contains a descriptive part on the test execution, a summary of the test case runs and a detailed report on the tests performed. With the introduction of the Jira Test Management plugin, it has been proposed to rename this document to "Test Plan and Report".


DM is considering the use of Test Management for Jira (by Adaptavist). The main advantage is an easy way to manage test cases and test reports and to relate them to requirements.


The documentation can be extracted from Jira (see the sketch after this list):

  • The test specification will contain the test cases defined for a specific DM component.
  • The Test Report will be divided into two parts:
    • the first part (the plan) contains the planned test activity information (the initial list of tests to be executed), environments, datasets and personnel.
    • the second part (the report) contains the actual test run results, their assessment and additional information derived from the execution of the tests.
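
As an illustration of the extraction above, here is a minimal sketch of pulling the test cases of one folder out of Jira with a script. It assumes the Test Management plugin exposes a REST endpoint at /rest/atm/1.0/testcase/search with a query parameter; the endpoint, the query syntax and the returned field names are assumptions that may differ between plugin versions, and the Jira URL and folder below are placeholders.

    # Hedged sketch: export the test cases of one folder from the Jira Test
    # Management plugin. The endpoint and query syntax are assumptions about the
    # plugin's REST API and may need adjusting for the installed version.
    import requests

    JIRA_URL = "https://jira.lsstcorp.org"           # assumed Jira instance
    FOLDER = "/Data Management/Example Component"    # hypothetical test case folder

    def fetch_test_cases(session):
        """Return the test cases stored under FOLDER in the LVV project."""
        response = session.get(
            f"{JIRA_URL}/rest/atm/1.0/testcase/search",
            params={"query": f'projectKey = "LVV" AND folder = "{FOLDER}"'},
        )
        response.raise_for_status()
        return response.json()

    if __name__ == "__main__":
        with requests.Session() as session:
            session.auth = ("username", "password")  # placeholder credentials
            for case in fetch_test_cases(session):
                # Each entry carries the fields described later on this page
                # (name, objective, precondition, status, ...).
                print(case["key"], case["name"], case["status"])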


Test Specification

The test specification will include all test cases defined for a specific component in the Document Tree. Obsolete test cases will also be included.

The test cases have to be written in JIRA, in the LVV project, under the corresponding folder.

Sections 1 and 2 of the test specification need to be written directly in LaTeX, with changes submitted to the corresponding GitHub repository. Sections 3 and 4 and the appendix will be generated from JIRA.

How to write a test case

A test case needs to be formulated in a general way, so that it can be executed multiple times under different conditions, for example with different software versions or different dataset versions.

In order to fully characterize a Test Case, three sections need to be completed:

  • the test details in the main tab of the Test Case
  • the steps in the Test Script tab of the Test Case
  • the traceability to the Verification Elements (requirements) in the Traceability tab of the Test Case

Test Script

The Test Script consists of a sequence of Steps to be executed one after the other.

Each Step has three parts:

  • the Description: describes what the step is supposed to do
  • the Test Data: describes which input data are required for the test step
  • the Expected Result: the outcome expected from the step.

These are the rules to follow when writing steps in the Test Script tab:

  • Write the step description in the most reproducible way
    • Ideally it should be possible to copy and paste the commands (see the example after this list). In the case of graphical interfaces, each action shall be clearly described.
    • Avoid references to specific software versions or datasets: the test case has to be as general as possible.
  • In the Test Data field, add the input data to be used (if any)
  • Specify the expected result:
    • this field will be included in the test report, but the step description will not. Ensure that the Expected Result is meaningful and self-explanatory.
    • pay attention to the formatting: the generated document may require some adjustments to the text added in this field
  • When a different test case is used in a step:
    • use only test cases that have been created for that purpose in the same folder (test spec). Test cases meant to be included in other test cases should not need to be traced to parent requirements.
    • avoid recursive use of test cases: only one level of nesting is permitted.
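
As a purely illustrative example of a reproducible step (the output directory, file pattern and dataset mentioned below are placeholders, not an actual DM procedure), a step could look like this:

    Description (commands that can be copied and pasted):

        # Count the catalogues produced by the previous step; the output
        # directory and file pattern are illustrative placeholders.
        import glob
        print(len(glob.glob("output/src-*.fits")))

    Test Data: the input exposures listed in the Precondition field (no version specified).

    Expected Result: the printed number of catalogues equals the number of input exposures.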

Traceability

In the Traceability tab, one or more Verification Elements need to be linked.

Verification Elements are defined in the model in MagicDraw and synchronized to Jira using the Syndeia plugin. Each Verification Element is related to a requirement in the model.

Verification Elements will be included in the test specification for each test case.

Links to confluence pages or to other web pages can be added in this section.

Test Details

The following list gives recommendations for each test case field in the main tab; where the field had a different name in the old Test Specification template, that name is given in parentheses.

Name (old name: test case subsection title)

  Has to be a short string indicating the purpose of the test.

  It is recommended to avoid using the requirement name and ID. The LDM-639 test cases have been named after the verification elements, and therefore after the requirement names, mainly for lack of time.

  Note that Test Designs are not used anymore and there is no need to start the name with an identifier following the previous naming convention. Old test cases will keep it only for backwards compatibility.

Objective (old name: Test Items)

  Describe the scope of the test.

  The requirement can be used as a reference: describe what you are going to test. It may be only a part of the requirement.

  In some cases, milestones defined in the test plan LDM-503 may drive the scope of the test.

  This field will appear in the Test Specification as "Test Items". Avoid including the requirement text.

Precondition (old name: Input Specification)

  This field should include all inputs required to run the test, for example the input data required (do not specify versions).

  This field will appear in the Test Specification as "Input Specifications".

  The field is provided by the test management plugin in JIRA and cannot be renamed.

Folder

  A Test Specification document will be produced for each folder, including all test cases defined in it.

Status

  When you create a new test case the status is set to "Draft".

  Once the Test Specification is approved (via RFC), the draft test cases will be moved to status "Approved".

  Once a test case is not valid anymore, its status has to be set to "Deprecated".

  When you are going to modify an approved test case:

    • create a new version
    • set the status to "Draft"

  Do not delete test cases; this may remove important information from the system.

  The Jira Test Management plugin is not going to enforce a new version when an approved test case is modified.

Priority

  Set the priority that you think is most appropriate. By default it is set to "Normal".

  In the future this information can be used to prioritize test cases. At the moment it is not used.

Component

  DM (Data Management). This field may be useful when filtering across test cases.

Owner

  Who is in charge of writing and maintaining the test case. In the future the test case may be executed by a different person, but the owner will not change.

Estimated Time

  Fill it in if you have an idea of how much time it may take to run the test case; otherwise leave it blank.

Verification Type

  It can be "Test", "Inspection", "Demonstration" or "Analysis". Use the most appropriate type; in general this should be "Test".

  This information should be derived from the requirement and Verification Element definition.

Verification Configuration

  Not required.

  It can be used in case a test case shall test a specific configuration of a software product, but this is against the principle that a test case should be formulated generally. Specific configurations will be specified during the test run definition.

Predecessor (old name: Intercase Dependencies)

  This is the list of test cases that need to be completed (successfully) before the actual test case can be executed. This does not imply that those test cases are part of the test script; usually they are not.

Critical Event

  This field is mandatory but should not be relevant for DM. Therefore set it to "False".

Associated Risks

  This field should not be relevant for DM. Leave it blank.

Unit under test

  This field should not be relevant for DM. Leave it blank. In the future we may use the components defined in the model in MagicDraw to group test cases, instead of using the folder.

Required Software (old name: Environment Needs - Software)

  List here the required software packages that need to be installed in the system in order to run the test. If the test case is meant to verify the functionality of a specific DM software package, for example science_pipeline, that package shall NOT be listed here, but in the Objective field (Test Items section).

Test Equipment (old name: Environment Needs - Hardware)

  List here the required hardware that needs to be installed in the system in order to run the test. This usually implies a server with a specific CPU power, RAM and available disk space.

  The exact hardware used for the test will be specified in the Test Plan or in the Test Run. This information can be different each time a test case is executed.

Test Personnel

  This field has not been used so far by DM during the definition of the tests. Leave it blank, or list here external people that may need to be involved during the verification. These people are neither the owner nor the test engineer who is going to run the test; they may be, for example, stakeholders who need to assess the scientific results in order to ensure that the test has passed.

Safety Hazards

  This field should not be relevant for DM. Leave it blank.

Required PPE

  This field should not be relevant for DM. Leave it blank.

Postconditions (old name: Output Specifications)

  Specify here what output is expected from the test, for example the output data expected.

  This field has been called "Postconditions" due to the duality with the "Precondition" field provided by default.
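
For reference, the hedged sketch below shows how a draft test case carrying the main fields above could be created programmatically instead of through the web interface. The /rest/atm/1.0/testcase endpoint and the payload keys (name, folder, status, objective, precondition, owner, ...) are assumptions about the Test Management plugin's REST API, and all values are placeholders; check them against the installed plugin version before use.

    # Hedged sketch: create a draft test case in the LVV project with the main
    # fields discussed above. Endpoint and payload keys are assumptions about the
    # Test Management plugin REST API.
    import requests

    JIRA_URL = "https://jira.lsstcorp.org"   # assumed Jira instance

    test_case = {
        "projectKey": "LVV",
        "name": "Ingest a night of raw images",          # short purpose, not a requirement ID
        "folder": "/Data Management/Example Component",  # folder mapping to one Test Specification
        "status": "Draft",                               # new test cases always start as Draft
        "priority": "Normal",
        "objective": "Verify that raw images can be ingested into the repository.",
        "precondition": "A raw dataset is available (no specific version).",
        "owner": "jsmith",                               # Jira username of the test case owner
    }

    response = requests.post(
        f"{JIRA_URL}/rest/atm/1.0/testcase",
        json=test_case,
        auth=("username", "password"),                   # placeholder credentials
    )
    response.raise_for_status()
    print("Created", response.json()["key"])             # e.g. LVV-TXXX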


Test Plan and Report

With the introduction of the Jira Test Management approach, two specific objects are foreseen to guide the test activities planned for a specific moment in time (a test campaign):

  • Test Plan: gives information on the planning of the test campaign, following an agreed schedule. The main source of the schedule for DM is LDM-503, but additional test campaigns can be organized ad hoc.
  • Test Run: provides the information on which test cases have to be executed in a specific context. Multiple Test Runs can be associated with a Test Plan.


Before a test activity can start:

  • all relevant information in the Test Plan and Test Report has to be provided in Jira;
  • a first draft of the Test Plan and Report has to be generated from Jira, the corresponding source tex files uploaded to GitHub and published via DMTR-XXX.lsst.io/v
  • the corresponding responsible person (T/CAM, product scientist) has to review the draft document and set the Test Plan status to "Accepted" if everything is fine, or ask the Test Plan and Test Report owners for improvements if needed.


Defining the Test Campaign

In order to start defining a test campaign, a corresponding Test Plan (LVV-PXXX) has to be created in Jira. Test Plans are organized in folders in the same way as Test Cases are.

Test Plan Details

In the Test Plan Details, the main tab of the Test Plan object in Jira, the following information needs to be provided before the test campaign starts.



For each field, the identification in the old Test Report template is given in parentheses where one exists, followed by recommendations on how to fill it.

Name

  Short identification of the test activity. This field will be used as the title of the "Test Plan and Report" document.

Objective (old Test Report template: Objective / Scope)

  Describe the Objective and Scope of the test campaign.

  The first part of the field will be included in the Test Report / Objective subsection.

  Add a bold "Scope" string followed by the corresponding text. If no scope is provided, the Test Report / Scope subsection will be left empty.

Folder

  This information is not used to generate documents, but it is helpful for navigating and finding information.

Status

  • Draft: when a test campaign is proposed or under definition
  • Approved: when a test campaign is fully defined, including test run information, and it has been formally approved via RFC
  • Completed: when all the test runs associated with the test plan have been completed and all information is ready to be exported into a document
  • Deprecated: when an originally planned test campaign is removed from the planning

Owner

  The person responsible for the test campaign planning.

Verification Environment (old Test Report template: Test Configuration - Hardware)

  Describe the environment where the tests are going to be executed, including hardware and low-level software such as the operating system.

Entry Criteria

  Not relevant for DM.

Exit Criteria

  Not relevant for DM.

PMCS Activity

  Not relevant for DM.

Observing Required

  Not relevant for DM, except in the case of integration tests with the Camera.

System Overview (old Test Report template: System Overview)

  Description of the system under test, focused on the actual test campaign (System Overview), and Applicable Documents for this specific test campaign.

  The first part of the field will be included in the Test Report / System Overview subsection.

  Add a bold "Applicable Documents" string followed by the corresponding text. If no Applicable Documents text is provided, the Test Report / Applicable Documents subsection will be left empty.

Verification Artifact

  From the SE Test Management Architecture, this field should contain web links to the resulting data product(s), including the Test Reports, Analysis, etc. To be completed when the Test Plan and Report is available.

Overall Assessment (old Test Report template: Overall Assessment)

  Provide first a statement that the test campaign was successful or not, then add a short text justifying that statement.

  To be completed after all test cases have been executed.

Recommended Improvements (old Test Report template: Recommended Improvements)

  Provide improvements and suggestions as an outcome of the test campaign.

  To be completed after all test cases have been executed.
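
A hedged sketch of creating such a Test Plan programmatically is shown below. The /rest/atm/1.0/testplan endpoint and the payload keys are assumptions about the Test Management plugin's REST API, and all values are placeholders.

    # Hedged sketch: create a draft Test Plan carrying the fields described
    # above. Endpoint and payload keys are assumptions about the Test Management
    # plugin REST API.
    import requests

    JIRA_URL = "https://jira.lsstcorp.org"   # assumed Jira instance

    test_plan = {
        "projectKey": "LVV",
        "name": "Example component acceptance campaign",  # used as the document title
        "folder": "/Data Management",                      # helps navigation only
        "status": "Draft",                                 # campaign still under definition
        "owner": "jsmith",                                 # responsible for the campaign planning
        "objective": "Objective text ... *Scope* scope text ...",  # bold Scope marker as described above
    }

    response = requests.post(
        f"{JIRA_URL}/rest/atm/1.0/testplan",
        json=test_plan,
        auth=("username", "password"),                     # placeholder credentials
    )
    response.raise_for_status()
    print("Created", response.json()["key"])               # e.g. LVV-PXXX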


Test Plan Traceability

The Traceability tab in the Test Plan shall link to at least one specific Test Run.

Additional information has to be provided here, if relevant:

  • issues related to the test campaign
  • confluence page (usually not relevant for DM)
  • web links (usually not relevant for DM)


Test Run Information required for the test campaign definition

In order to complete the definition of the test campaign, some information needs to be provided in the Test Run(s) to be executed.

Each Test Run can be related to only one Test Plan.

Each Test Case can be included only once in a Test Run (a limitation of the Test Management plugin). If a Test Case has to be executed under two different conditions during a test campaign, two Test Runs need to be defined and associated with the same Test Plan.

The following fields are required to be filled in each Test Run associated with the Test Plan, before the Test Plan is approved:

For each field, the identification in the old Test Report template is given in parentheses where one exists, followed by recommendations on how to fill it.

Name

  Short identification of the Test Run. In the case of a single Test Run associated with a test campaign, this can be the same as the Test Plan name. In the case of multiple Test Runs, the names have to identify the conditions that each of them describes, for example test environment, datasets, configuration, etc.

Description

  Short description of the Test Run. If applicable, clarify what differentiates the various Test Runs associated with the same Test Plan.

Folder

  This information is not used to generate documents, but it is helpful for navigating and finding information.

Status

  • Not Executed: when the Test Run is under definition or has not yet been executed
  • In Progress: when the Test Run is being executed
  • Done: all steps of all Test Cases have been executed

Version

  Not relevant for DM.

Iteration

  Not relevant for DM.

Owner

  The person responsible for the Test Run.

Planned Start Date

  The date when the test execution should start. Leave it blank if this information is not available.

Planned End Date

  The date when the test execution should end. Leave it blank if this information is not available.

Software Version / Baseline (old Test Report template: Test Configuration - Software)

  List the software required to start the test activity. Note that, if the objective of the Test Plan is to test a specific version of a software product, that version must NOT be listed in this custom field.

Configuration (old Test Report template: Test Configuration)

  Configuration to be used and input data.

  The "Test Configuration" section in the Test Plan and Report will be filled using the content of this field for the subsections regarding Input Data and Configuration to be used. The other subsections (Hardware, Software and Personnel) will be derived from other custom fields available in the system, already explained above, or from information available in different objects.
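
A hedged sketch of defining a Test Run and attaching it to its Test Plan is shown below. The /rest/atm/1.0/testrun endpoint, the testPlanKey link and the items list are assumptions about the Test Management plugin's REST API, and all keys and values are placeholders.

    # Hedged sketch: create a Test Run tied to a Test Plan and list the test
    # cases to execute. Endpoint and payload keys are assumptions about the
    # Test Management plugin REST API.
    import requests

    JIRA_URL = "https://jira.lsstcorp.org"   # assumed Jira instance

    test_run = {
        "projectKey": "LVV",
        "testPlanKey": "LVV-P1",                     # placeholder Test Plan key
        "name": "Example campaign - dataset A",      # identifies the conditions of this run
        "folder": "/Data Management",
        "owner": "jsmith",
        "plannedStartDate": "2018-06-01T09:00:00Z",  # omit if not yet known
        "plannedEndDate": "2018-06-15T18:00:00Z",
        # A Test Case can appear only once per Test Run; use a second Test Run
        # for a second execution under different conditions.
        "items": [{"testCaseKey": "LVV-T1"}, {"testCaseKey": "LVV-T2"}],
    }

    response = requests.post(
        f"{JIRA_URL}/rest/atm/1.0/testrun",
        json=test_run,
        auth=("username", "password"),               # placeholder credentials
    )
    response.raise_for_status()
    print("Created", response.json()["key"])         # e.g. LVV-RXXX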


Executing the Test Campaign

Once the Test Plan has been approved by the corresponding responsible as described above, the test campaign can start.

For each Test Run, the tester has to run each Test Case Step and record the obtained result in the corresponding field (Actual Result).

Keep in mind that the content of the field "Actual Result" will be included in the test report. Please ensure that its phrasing is consistent, self-explanatory and properly formatted.

During the execution of a test case, possible Step status are:

  • IN PROGRESS: the step is being executed
  • PASS: the step has been executed successfully, as can be seen from the "Actual Result" field
  • CONDITIONAL PASS: the step execution has been completed, but some minor problems have been experienced. A corresponding issue describing the problem shall be opened. That issue shall not have priority "Blocker" or "Critical"
  • FAIL: the execution revealed a problem. A corresponding issue describing the problem shall be opened. That issue shall have priority "Blocker" or "Critical"
  • BLOCKED: the step cannot be completed due to an existing problem with the preconditions. An issue describing that problem should be available.

You can use the time recording function in order to trace the execution time of each test case.

Issues shall be linked in the corresponding section in the test player and mentioned in the "Actual Result" field.

Once all steps in a test case have been completed, a comment shall be added, summarizing the test result. This comment will be reported in the "Test Plan and Report" summary table and will be the most visible information for each test case executed in the test campaign.
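
Results are normally entered through the Test Run Player, but for illustration the hedged sketch below records the outcome of one test case, its per-step "Actual Result" comments and the summary comment via the REST interface. The testresult endpoint and the payload keys are assumptions about the Test Management plugin's REST API, and all keys, issue references and values are placeholders.

    # Hedged sketch: record the result of one test case executed in a Test Run,
    # including per-step actual results and the summary comment that ends up in
    # the "Test Plan and Report" summary table. Endpoint and payload keys are
    # assumptions about the Test Management plugin REST API.
    import requests

    JIRA_URL = "https://jira.lsstcorp.org"   # assumed Jira instance

    result = {
        "status": "Pass",
        "comment": "All steps completed; output catalogues match the expected counts.",
        "executedBy": "jsmith",                         # placeholder tester username
        "scriptResults": [
            {"index": 0, "status": "Pass",
             "comment": "Actual Result: 42 catalogues produced, as expected."},
            {"index": 1, "status": "Conditional Pass",  # linked issue described in the comment
             "comment": "Actual Result: minor formatting issue, see DM-XXXXX."},
        ],
    }

    response = requests.post(
        f"{JIRA_URL}/rest/atm/1.0/testrun/LVV-R1/testcase/LVV-T1/testresult",  # placeholder keys
        json=result,
        auth=("username", "password"),                  # placeholder credentials
    )
    response.raise_for_status()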

Once all Test Runs associated with a Test Plan have been completed, the following activities will be performed to close the test campaign:

  • complete in the test plan the field "Overall Assessment" as follows:
    • Add a first statement saying if the test campaign has been successfully concluded or not.
    • Add a short text (max 20 lines) explaining the reasons of the above statement.
  • add suggestions in the "Recommended Improvements" field, if any, or leave it blank
  • generate from Jira the second part of the "Test Plan and Report" document (sections 4 and 5)
  • the corresponding manager reviews the document, ensuring that all information in it is meaningful and the text correctly formatted
  • upload the document to Docushare
  • Set the Test Plan status to "Completed".


Text Formatting

(to be confirmed)

The Jira Test Management plugin currently provides only limited text formatting capabilities. In addition, we need to fit the information provided in Jira into a readable document.

The following rules need to be followed in order to produce, as far as possible, well formatted and readable test documentation automatically from Jira.

In case the formatting is broken, the text needs to be fixed in Jira and NEVER in the tex document in GitHub.

  1. General multiline fields will be converted into subsections in the corresponding generated document.
    1. The name of the field will become the name of the subsection.
      1. Exceptions are a few fields, where a different name is expected to appear in the document.
    2. The formatting will be preserved as much as possible in the generated document.
    3. Tables or images will not be exported for the moment (see the sketch after this list).
    4. If a field is empty, no subsection will be included in the document,
      1. except for fields that are indicated as mandatory in the corresponding column.
  2. Step multiline fields will be displayed as monospaced pre-formatted text in the document (LaTeX verbatim, size scriptsize).
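
As a purely illustrative sketch of the rules above on non-exported tables/images and on empty fields (not the actual document generator), the helper below shows the intended behaviour when a multiline Jira field is exported: table and image markup are dropped and an empty field produces no subsection.

    # Hedged sketch, not the real export tool: strip Jira wiki markup that is not
    # exported (tables, images) and signal empty fields so that no subsection is
    # produced for them.
    def clean_field(text):
        """Return the field content ready for export, or None if the field is empty."""
        if text is None or not text.strip():
            return None                        # empty field: no subsection generated
        kept_lines = []
        for line in text.splitlines():
            stripped = line.strip()
            if stripped.startswith("|"):       # Jira wiki table row: not exported
                continue
            if stripped.startswith("!") and stripped.endswith("!"):
                continue                       # Jira image markup: not exported
            kept_lines.append(line)
        return "\n".join(kept_lines)

    print(clean_field("Some text\n||a||b||\n!plot.png!\nMore text"))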


Procedures

Test Specification Approval

  • Create / Update Test cases in Jira
    • test case status shall be set to Draft when an Approved test case is updated
  • Test Specification updated via Travis (continuously or at regular intervals)
  • Submit RFC for Test Spec approval
  • Once RFC approved:
    • Test Cases status to be set to Approved,
    • Test Spec to be regenerated with approved test cases and uploaded to Docushare

Test Plan and Report Approval

  • Create in Jira a Test Plan
  • Complete the Test Plan with all relevant information required before the test activity can start (see procedure above)
  • Create a Test Run and trace it to the Test Plan just created; multiple Test Runs can be created for the same Test Plan.
  • Complete the Test Run(s) with all relevant information required before the test activity can start (see procedure above)
  • The document "Test Plan and Report" is generated (draft) from Jira (can be generated continuously)
  • Formal Jira Test Plan approval (Test Readiness Review) before test activity starts:
    • Target: 
      • Ensure that everything needed for the test is available and has been properly documented
      • Request changes to the Jira Test Plan and Test Run(s) if needed
      • Set status of Jira Test Plan to "Approved"
    • Who is involved
      • In the case of a small test campaign regarding a single product, the product leader is in charge of reviewing the document and setting the Jira Test Plan to "Approved". Stakeholders can be involved, but the objective is to keep the process as smooth as possible.
      • In the case of a big integration test campaign that involves many products and may extend outside DM:
        • The draft "Test Plan and Report" should be sent to all those responsible for the involved systems and to stakeholders
        • Feedback is collected by the person responsible for the test campaign (the Jira Test Plan owner), and corrections and updates are made in the Jira Test Plan and Test Run(s)
        • If needed, a meeting should be organized to discuss and sort out all open points
        • Once all problems have been sorted out, the Jira Test Plan owner can set the status to "Approved" and the test activity can start.
        • A draft version of the Test Plan and Report can be uploaded to Docushare (not preferred)
  • Run the test using the Jira Test Run Player
    • Report result for each Test Step
    • Issues opened during the test execution shall be related to the Test Case in the Test Run Player
    • Assess the result of global execution of each Test Case as specified above
  • Once all Test Cases included in the Test Run(s) have been executed, complete the field "Summary Overview" in the Jira Test Plan and set the status to "Completed"
  • The Test Plan and Report generated continuously from Jira will include all execution information
  • It should then be ready for the final review and issue in Docushare:
    • In the case of a small test campaign, the product leader reviews the document, asks for the needed corrections to be made in Jira and, once OK, sends it to Docushare
    • In the case of a big integration test campaign that involves many products and may extend outside DM:
      • The final draft "Test Plan and Report" should be sent to all those responsible for the involved systems and to stakeholders
      • Feedback is to be implemented in Jira and the document regenerated
      • If needed, a meeting should be organized to discuss and sort out all open points
      • Once all problems have been sorted out, the Test Plan and Report will be issued in Docushare
    • Who
      • In the case of a small test campaign, the product leader is in charge of reviewing the document and setting the Jira Test Plan to "Approved". Stakeholders can be involved, but the objective is to keep the process as smooth as possible.
      • In the case of a big integration test campaign that involves many products and may extend outside DM:
        • The final draft "Test Plan and Report" should be sent to all those responsible for the involved systems and to stakeholders
        • Feedback is collected by the person responsible for the test campaign (the Jira Test Plan owner), and corrections and updates are made in the Jira Test Plan and Test Run(s)
        • If needed, a meeting should be organized to discuss and sort out all open points
        • Once all problems have been sorted out, the Jira Test Plan owner can set the status to "Completed" and
        • The final version of the Test Plan and Report can be uploaded to Docushare