Overview
The primary objective of this page is to list the best practices and standards for software testing using Quality Centre, a test management tool. It describes the methodologies that can be followed in HP Quality Centre for better test management and planning. These best practices are based on real-world knowledge and experience rather than theoretical advice. The target audience is testing professionals involved in typical system testing using Quality Centre, so it is assumed that the reader is familiar with basic Quality Centre terminology and operations. This page can also be used by senior managers or business analysts to streamline the testing process alongside other project management processes.
The scope of this page is limited to internal use, to develop a consistent approach across testing practices. It does not cover the 'know how' of Quality Centre itself, but only the best practices and principles to follow on a project.
To understand Quality Centre, the reader can refer to the Mercury Quality Centre User's Guide. Ref:
Glossary
MQC – Mercury Quality Centre
SAP TAO – SAP Test Acceleration and Optimization
CM – Configuration Management
SUT – System Under Test
BPT – Business Process Testing
Dashboard technology – A modular, configurable, and extensible user interface technology.
The following is the typical process of test management in Quality Centre.
Figure 1 - Quality Centre Process
Components of Quality Centre
Following are the basic components of Mercury Quality Centre used to define and manage the testing process. This section gives a brief overview of each component/tab of Quality Centre.
Requirement Module
This is the first module of Mercury Quality Centre, where the project team can define a brief description of each individual requirement, map it to testing tasks, and give coverage to business functionality.
Note: the Requirements tab also offers the facility to feed requirements in automatically where the requirements process itself is automated.
Following are the best practices for the Quality Centre requirement definition tab:
The testing team analyses and gathers project documents from various sources to scope the testing requirements. The QC Requirements tab is the best place to summarise the testing requirements, scope of testing, test objectives, strategy, etc. These requirements can be gathered by the test manager, test analyst, senior tester, or anyone in the team involved in understanding the application. This area is used by the team to give coverage to each requirement assigned to an individual tester. In this tab a tester can clearly determine the scope of testing, the priority assigned to each task, the test coverage of each feature of the application, the criticality of each application function, its relative importance, and the overall objective of the application testing. The information described in this section can also be used by testers to develop regression suites, smoke test suites, etc.
Figure 2 - Requirement specification process
Best Practices for Requirement definition in QC
Requirements are recorded under the Requirements tab by creating a Requirements Tree, which presents the requirements as a hierarchical graphical illustration. The name of a parent requirement folder should be in UPPER CASE and assigned a unique Req. ID, project ID and requirement importance. Requirements can be broken down to the lowest level of detailed testing elements, and any relevant document can be attached to each sub-requirement, such as a use case or Visio diagram. Once the requirements are defined, a formal review process can be applied to validate each of the testing elements for priority level, coverage, etc.
Example: SM01-PBS – HISTORICAL DATA MIGRATION
• Provide a high-level project description under the Description heading
Example: This project is a migration process used to move Portman data for savings, mortgage and insurance products currently held on Portman systems to the Nationwide system (Columbus). We can use the detailed requirements defined here as a basis for the test plan.
• Use Title Case to create sub-folders under the project parent folder (a small naming-helper sketch follows these examples)
Example:
REQ_PBS 001 – Saving Accounts Migration
REQ_PBS 002 – Mortgage Accounts Migration
REQ_PBS 003 – Insurance Migration
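As an illustration only, the short Python sketch below composes folder names in the agreed format: an UPPER CASE parent folder carrying the unique project ID, and Title Case sub-folders with a zero-padded sequence number. The helper functions and their names are hypothetical; they simply encode the naming rule and are not part of Quality Centre.

# Hypothetical helpers that encode the naming convention above.
def parent_requirement_name(project_id, title):
    """Parent requirement folders use UPPER CASE plus the unique project ID."""
    return f"{project_id.upper()} - {title.upper()}"

def child_requirement_name(prefix, seq, title):
    """Sub-requirements use a zero-padded sequence number and Title Case."""
    return f"{prefix} {seq:03d} - {title.title()}"

print(parent_requirement_name("SM01-PBS", "Historical Data Migration"))
# -> SM01-PBS - HISTORICAL DATA MIGRATION
for i, name in enumerate(["Saving Accounts Migration",
                          "Mortgage Accounts Migration",
                          "Insurance Migration"], start=1):
    print(child_requirement_name("REQ_PBS", i, name))
# -> REQ_PBS 001 - Saving Accounts Migration, and so on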
Check List for Requirement Definition in Quality Centre
• Is the business requirement clearly stated in the Requirements tab for the SUT?
• Has the test strategy been clearly defined?
• Is the reason for testing a specific module, and its relative importance, defined?
• Is the test environment of the project consistent with the CM?
• Is it clear what will define a successful outcome of the test requirement?
• Is it clear which option is preferred and available for scheduling the test plan?
• Is it clear why the testing technique selected against the requirement is the preferred one?
• Are the risks faced by the project explicitly stated in requirement tab?
• Are the plans for addressing those risks explicitly stated?
Test Planning in Quality Centre
The project test plan in Quality Centre is the area that describes the objectives, scope, approach and techniques of the software testing effort. This is a very important area for application testing. Preparing a test plan is a constructive way to think through the effort needed to verify and validate the acceptability of a software application. The test plan area in Quality Centre helps people outside the testing group understand the 'why' and 'how' of how the product is being validated. It should be thorough enough to be useful, but not so thorough that no one outside the test group will read it.
Following are the typical characteristics and benefits of the Test Plan module of Quality Centre.
The Test Plan module in Quality Centre helps the tester describe test requirements, individual test strategy, test techniques, test coverage and detailed test steps. As a best practice, the Test Plan module should be used not just for defining test cases, steps and scripts, but also to map back to the requirement definitions in the first tab, i.e. Requirements: how we will be testing this application (for example, applying a top-to-bottom integration testing technique, system testing, or end-to-end business processes), what the test data will be, the entry and exit points, and so on.
Test coverage in the Test Plan module is the process and best practice that links requirements to the test plan and finally to the pass/fail status from the Test Lab. By following this approach the tester can identify the status of testing activities and also confirm coverage of the requirements defined in the Requirements tab. In practice this should be done before importing test steps into the Test Lab.
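To make this roll-up concrete, here is a minimal Python sketch of how a requirement's coverage status could be derived from the statuses of its linked tests, assuming a simple precedence in which a single failure outweighs everything else. It broadly mirrors how coverage is usually read, but it is an illustrative assumption rather than an actual Quality Centre API call.

# Assumed precedence for rolling up linked test statuses into a requirement
# coverage status; illustrative only, not a Quality Centre API call.
PRECEDENCE = ["Failed", "Blocked", "Not Completed", "Not Run", "Passed"]

def coverage_status(test_statuses):
    """Overall status of a requirement, given the statuses of its linked tests."""
    if not test_statuses:
        return "Not Covered"
    for status in PRECEDENCE:
        if status in test_statuses:
            return status
    return "N/A"

# Example: one failed run drags the whole requirement down to Failed.
requirement_to_tests = {
    "REQ_PBS 001 - Saving Accounts Migration": ["Passed", "Passed", "Failed"],
    "REQ_PBS 002 - Mortgage Accounts Migration": ["Passed", "Not Run"],
}
for req, statuses in requirement_to_tests.items():
    print(req, "->", coverage_status(statuses))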
Figure 3 - Test planning and definition on Quality Centre
Best Practices for test specification on QC
Check List for Quality Centre Test Plan
• Is it clear what we are testing and how?
• Are the test priorities and schedule clear?
• Is it clear what test data will be used to execute the specified test plan?
• Is it clear that the test environment is configured as defined in CM?
• Is it clear which testing technique has been selected as the preferred one?
• Are the risks faced by the project explicitly stated in test plan?
• Are the plans for addressing those risks explicitly stated?
• Are the defined test steps logically designed to give sufficient requirement coverage?
• Are the test objectives clear and understandable from the test plan or test scripts?
• Are the entry, exit, suspension and resumption criteria defined clearly?
Test Cases in Quality Centre Test Lab
A test case in Quality Centre is the part of the test plan that describes an input, action, or event and an expected response, to determine whether a feature of an application is working correctly. A test case should contain particulars such as a test case identifier, test case name, objective, test conditions/setup, input data requirements, steps, and expected results. Negative and positive test cases can be defined in sequence with each other. Ensure that each test case meets the best practices and standards defined under section [4]. Quality Centre facilitates control over the execution of tests in logical test sets. The tester can set conditions and schedule the date and time for executing tests, and can also set the sequence in which to execute them; for example, run test3 only after test1 has finished and passed successfully.
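The sketch below shows, in Python, the kind of execution-flow condition just described: a test runs only when the tests it depends on have finished and passed. The test names and dependency table are made up; in Quality Centre this is configured graphically in the test set's execution flow, not in code.

# Hypothetical sketch of an execution-flow condition: a test may run only when
# every test it depends on has finished with status Passed.
def can_run(test, dependencies, results):
    return all(results.get(dep) == "Passed" for dep in dependencies.get(test, []))

dependencies = {"test3": ["test1"]}              # test3 runs only after test1 passes
results = {"test1": "Passed", "test2": "Failed"}

print(can_run("test3", dependencies, results))   # True: test1 has passed
print(can_run("test2", dependencies, results))   # True: test2 has no dependencies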
Figure 4 - Quality Centre Test Lab process
Best Practice for Quality Centre Test Lab.
• Before starting a new manual run of a test script, ensure that the test cases have not been executed earlier; if you want to review an earlier test run, click Continue Manual Run instead.
• As a best practice, do not apply filters in the Test Lab.
• Use standard terminology to record execution details, such as 'Passed, as expected', together with more descriptive detail.
• If you are running a test set previously executed by another team member, ensure you change the name under Tester's Name on the Test Lab tab.
• If a test step is not applicable, set it to N/A; if a test cannot be executed because of data unavailability, ensure you set the test step to Not Completed with appropriate comments.
• If there is a test step dependency because of test data, or a few steps have failed, ensure you set the rest of the test steps to Blocked and stop the test run (see the sketch after this list).
• If a test is not applicable, remove it from the test set tree.
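A minimal Python sketch of the blocking rule from the list above: once a step fails, every remaining step is marked Blocked and the run stops. The statuses and helper are illustrative only.

# Illustrative only: after the first Failed step, every later step becomes Blocked.
def apply_blocking(step_statuses):
    result, failed_seen = [], False
    for status in step_statuses:
        result.append("Blocked" if failed_seen else status)
        failed_seen = failed_seen or status == "Failed"
    return result

print(apply_blocking(["Passed", "Passed", "Failed", "Not Run", "Not Run"]))
# -> ['Passed', 'Passed', 'Failed', 'Blocked', 'Blocked']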
Check List for Quality Centre Test Lab.
• Ensure that you import test cases from the Test Plan module and map them to the Test Lab.
• Each test plan module has been mapped appropriately.
• Test numbers and legend are consistent as per test plan
• Select appropriate test run – e.g. continue manual run, automated run.
• Does the entry criterion meet the needs of the test plan?
• Is the test environment ready for test execution?
• Ensure that test settings are appropriate as per the test configuration document.
• Are the available test data and environment sufficient to meet the test objective?
• Is the defined schedule clear and viable enough to meet test execution needs?
• Do defects raised against test steps have appropriate information, attachments, etc.?
• Ensure that the current status of each test is set to one of "Failed", "Passed" or "N/A", and if Not Completed or Not Run, that an appropriate description is available for the test steps.
Defect Management on Quality Centre
For more details, see Defect management process on QMS
Figure 5 - Quality Centre Defect management process
Best Practices:
• If the defect is related to the application, ensure that the defect has been added from the test steps.
• Indicate the severity of the defect based on the business requirement defined in the Requirements tab.
• If auto email is not configured in QC, send a manual email to the people concerned when a defect is assigned; this can be done through the defect screen in Quality Centre.
• Give a clear step-by-step description of the problem so it can be reproduced, with appropriate comments.
• The subject of the defect should be consistent and the same as listed for the project/application area, with the unique project ID, e.g. SM01 – Account Validation Failed; use sentence case (a simple format check is sketched after this list).
• Attach relevant information, screenshots, etc.
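As a simple aid to the subject convention above, the Python sketch below checks that a defect subject starts with the unique project ID followed by a short summary, as in "SM01 – Account Validation Failed". The regular expression and the list of known project IDs are assumptions made for this example, not a Quality Centre feature.

import re

# Assumed pattern for defect subjects: "<project id> – <short summary>".
KNOWN_PROJECT_IDS = {"SM01"}
SUBJECT_PATTERN = re.compile(r"^(?P<project>[A-Z]{2}\d{2}) [–-] (?P<summary>\S.+)$")

def subject_is_valid(subject):
    match = SUBJECT_PATTERN.match(subject)
    return bool(match) and match.group("project") in KNOWN_PROJECT_IDS

print(subject_is_valid("SM01 – Account Validation Failed"))  # True
print(subject_is_valid("Account validation failed"))          # False: no project ID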
Check List:
• Ensure that the defect description is relevant and well understood, with no spelling mistakes.
• Assigned To and Assigned Against are set clearly.
• Subject is associated with the test lab or test plan.
• Reproducible is set to Yes or No, with details.
• Severity is set appropriately.
• Application version / build ID and description are detailed accurately.
• Before adding to or modifying the defect description, ensure that you read the history of the defect.
• View the associated test case status; it should be Failed.
Test Analysis on Quality Centre
See page named “BPT testing using QC for test analyses”.
Standards for Writing Test Cases/Scenarios on QC
The very first step in software testing is the requirements element: what is being tested?
This should be clearly defined under the Requirements tab and kept aligned and updated as the business requirements document changes.
The requirement can be defined in an agreed format, either as a scenario or as a use case.
For the purposes of this document we will look at standards for manual test cases, using a typical test scenario definition in Quality Centre.
Below is the list of standards that can be followed for better Quality Centre practice.
• The first component of the Scenario/Test Case ID should be the type of testing for which the scenario is being written, e.g. #ST01 -
• The second component of the Scenario/Test Case ID should be the Unit Name/Module Name – Sub-Component/Module Name.
• Purpose of the test plan / objectives.
• Test coverage criteria; this would be based on the requirement coverage.
• All Scenarios/Test Cases should be easily understandable, clear, concise and to the point.
• The pre-requisites for the Scenario/Test Case should be mentioned before the test cases. See the sample scenario below.
• The Scenarios/Test Cases for different modules should be maintained in a single set of test plans, or in a worksheet in an Excel file. The name of the Excel file should be TypeOfTesting_ModuleName.xls, e.g. ST_BackOffice.xls.
• Whenever a new Scenario/Test Case is added in between two existing ones, it should be named after the previous Scenario/Test ID with decimal places; i.e. if we have to add a new scenario in between scenario IDs ST-SA-BKG-0015 and ST-SA-BKG-0016, then the new scenario ID will be ST-SA-BKG-0015.1, and so on (see the sketch after this list).
• The Defect ID of a Scenario/Test Case should match the Defect ID submitted against that defect in the defect tracking system (QC) or Excel file; that way, even in a manual testing process, tracing back to a requirement is easy.
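The Python sketch below shows one way to work out the intermediate ID described above when a new scenario has to be slotted between two existing ones (ST-SA-BKG-0015 becomes ST-SA-BKG-0015.1, then .2, and so on). The helper is an illustrative assumption based on the example IDs.

# Illustrative helper for slotting a new scenario between two existing IDs:
# the new ID reuses the earlier ID and takes the next free decimal suffix.
def inserted_scenario_id(previous_id, existing_ids):
    suffix = 1
    while f"{previous_id}.{suffix}" in existing_ids:
        suffix += 1
    return f"{previous_id}.{suffix}"

existing = {"ST-SA-BKG-0015", "ST-SA-BKG-0015.1", "ST-SA-BKG-0016"}
print(inserted_scenario_id("ST-SA-BKG-0015", existing))  # -> ST-SA-BKG-0015.2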
Sample Test Scenario
Scenario / Test Case #ST01 – Validation of the Account Data with Customer Data
SUMMARY DESCRIPTION AND OBJECTIVES
The objective of this test case is to verify that selected account data can be migrated successfully from a PBS account of the type and number "First Save (404)" to an NBS Smart account, where the account holder is > 7 and < 18 years old on the day of migration and the account has an interest payment mandate set.
TEST COVERAGE
The testing points covered by this test case include:
• All account types where holders are > 7 and < 18 on the day of migration and roles are /= to JAF/JAO will be migrated into the NBS Smart account type.
• Interest calculation and payment will be paid on migration of the Portman account.
• All migrations of a First Save type of account that must not be registered as a customer will be migrated with previous transactions.
TEST DATA REQUIREMENTS
This test case requires data that satisfies the above test coverage:
# The PBS account number 404 First Save, migrated and in the data file (MTF) using PSMS saving
# Account holder > 7 and < 18 years old on the day of migration
Quality Language for Manual Test Cases
Test case quality language means better testability through the language used in the test specification, so that it can be used by the whole team:
• Use the active voice, such as do this, do that, execute this, perform this, click on, select from; the application displays this, does that, should do, etc.
• Keep it as simple as possible and close to conversational language, as people have reservations about technical language.
• Use exact, consistent names of fields, not generic ones, especially in active instructions.
• Windows and web basics such as drop-down lists should be explained from the business point of view rather than the technology; e.g. rather than writing "select from the drop down", it would be better to write "select …".
Naming conventions
• Module names should be in all capitals and bold. E.g. to mention the BackOffice module in a Scenario/Test Case, the usage should be "BACKOFFICE".
• Screen names should be bold and use Camel Notation, i.e. the first letter of the word in capitals and the rest in small letters. If there are multiple words in the screen name, then the first letter of every word should be a capital and the rest of the word in small letters; e.g. the Account Detail screen name will be "Account Details".
• All objects like text boxes, list boxes, check boxes, radio buttons and combo boxes should be named in italics and bold; e.g. the Login text box should be mentioned as Login.
• Link names should be mentioned in italics, bold and underlined; e.g. the Sign out link will be named as Sign Out.
• Database table names should be in caps; e.g. the Emp_Detail table name will be EMP_DETAIL, ACCOUNTS, JOINHOLDER.
Test Case review Check List
All test cases should be reviewed at least once, and the checklist should be filled in by the authoring team member.
General Points to be considered while reviewing the Check List:
• Test cases follow the agreed standard format.
• The test cases are complete with respect to the specifications document on which they are based.
• Test data is mentioned explicitly for each test condition, if applicable.
• The 'Expected result' section of each test case is complete.
• Pre-conditions for executing a test case or a set of test cases are specified.
• It is specified which test cases are to be executed together (or in a specific order).
• Test cases cover the entire functionality of the integrated module, and every available requirement is mapped to the test plan.
• Test cases follow the order of integration as defined in the Requirements module or in a reference document such as the functional design document or test plan.