
HP Quality Center

Student Guide

Table of Contents
COURSE OVERVIEW
COURSE OBJECTIVES
CLASS INTRODUCTION
COURSE OUTLINE
INTRODUCTION TO QUALITY CENTER
LESSON OBJECTIVES
QUALITY CENTER OVERVIEW
THE TEST MANAGEMENT PROCESS
LOGGING ON TO QUALITY CENTER
QUALITY CENTER MODULES
THE QUALITY CENTER TOOLBARS
SHORTCUTS MENU
TOOLS AND HELP
SUMMARY
REVIEW QUESTIONS
LAB EXERCISE
WORKING WITH REQUIREMENTS
LESSON OBJECTIVE
TEST MANAGEMENT PROCESS
REQUIREMENT TYPES
REQUIREMENTS TREE
BUILDING A REQUIREMENTS TREE
CREATING A REQUIREMENT
PMT INTEGRATION
REQUIREMENTS FIELDS
VIEWING MULTIPLE REQUIREMENTS
EDITING MULTIPLE REQUIREMENTS
SUMMARY
REVIEW QUESTIONS
LAB EXERCISE
TEST PLANNING
LESSON OBJECTIVE
TEST MANAGEMENT PROCESS
TEST PLANNING OVERVIEW
THE TEST PLAN TREE OVERVIEW
CREATING A SUBJECT FOLDER
KEY POINTERS FOR DEFINING TESTS
ADDING A TEST
POPULATING THE DETAILS TAB
TEST CASE FIELDS

ADDING TEST STEPS


IMPORTING TEST CASES FROM EXCEL
CONSIDERATION FOR DESIGNING TEST STEPS
CALLING A TEST
TEST PARAMETERS
DEFINING A PARAMETER
CALLING A TEST WITH PARAMETERS
EDITING THE VALUES OF CALLED PARAMETERS
CREATING TEMPLATE TESTS
TEST-REQUIREMENT RELATIONSHIP
LINK TESTS TO A REQUIREMENT
LINKING REQUIREMENTS TO A TEST
SUMMARY
REVIEW QUESTIONS
LAB EXERCISE
TEST EXECUTION
LESSON OBJECTIVE
TEST MANAGEMENT PROCESS
USING THE TEST LAB MODULE
TEST EXECUTION OVERVIEW
TEST SET
TEST SETS TREE
CREATING A TEST SET FOLDER
CREATING A TEST SET
BUILDING AND EXECUTING TEST SETS
ADDING TEST SET DETAILS
ADDING TEST INSTANCES
SETTING A TEST SET NOTIFICATION
RUNNING TESTS MANUALLY
TOOLS FOR RECORDING RESULTS OF MANUAL TESTS
VIEWING TEST RUN RESULTS
ATLAS INTEGRATION
BUGZILLA INTEGRATION
SUMMARY
REVIEW QUESTIONS
LAB EXERCISE
REPORTING AND ANALYSIS
LESSON OBJECTIVE
TEST MANAGEMENT PROCESS
REPORTING OPTIONS OVERVIEW
PRE-DEFINED REPORTS AND GRAPHS
CUSTOM REPORTS AND GRAPHS
QUALITY CENTER GRAPH TYPES
GRAPH WIZARD
CONFIGURING A GRAPH
ELEMENTS OF THE GRAPH WINDOW

DISPLAYING GRAPH DATA


EXCEL REPORTS
MANAGING REPORTS
DASHBOARD VIEW
CREATE A DASHBOARD PAGE
CONFIGURE A DASHBOARD PAGE
VIEW A DASHBOARD PAGE
SUMMARY
REVIEW QUESTIONS
LAB EXERCISE
PROJECT ADMINISTRATION
LESSON OBJECTIVE
NAVIGATING TO THE PROJECT CUSTOMIZATION PAGE
ADDING USERS TO THE PROJECT
ASSIGNING USER TO A USER GROUP
CUSTOMIZING PROJECT LISTS
SUMMARY
REVIEW QUESTIONS

Course Overview
Course Objectives
This hands-on course provides the tools you need to implement and utilize Quality
Center 11. Students will learn how to manage quality information throughout the
development cycle, from constructing and importing the requirements through
designing and executing tests.

Class Introduction
Please let us know:
1. Your name
2. Your role
3. Application or project you are working on
4. Your personal objective for this class

Course Outline
09:00 a.m. - 09:15 a.m.  Introduction
09:15 a.m. - 10:30 a.m.  Unit 1
10:30 a.m. - 10:45 a.m.  Break
10:45 a.m. - 12:00 p.m.  Unit 2
12:00 p.m. - 12:45 p.m.  Lunch
12:45 p.m. - 02:00 p.m.  Unit 3
02:00 p.m. - 02:15 p.m.  Break
02:15 p.m. - 03:30 p.m.  Unit 4
03:30 p.m. - 03:45 p.m.  Break
03:45 p.m. - 05:00 p.m.  Unit 5

Introduction to Quality Center


Lesson Objectives
After completing this lesson, you will be able to:

Understand the role of Quality Center at VMware.

Identify the phases of the test management process.

Log into Quality Center and become familiar with its modules, toolbars, and
menus.

Find help on Quality Center topics.

Quality Center Overview


Quality Center is a web-based test management tool that is used to structure,
organize, and document all phases of the application testing process according to
VMware requirements. Quality Center supports communication and collaboration
among the distributed teams.
Quality Center has the following features:

Provides a web-based repository for all testing assets and provides a clear
foundation for the entire testing process.

Establishes seamless integration and smooth information flow from one stage
of the testing process to the next.

Supports the analysis of test data and coverage statistics to provide a
clear picture of the accuracy and quality of an application at each point in
the lifecycle.

The Test Management Process


The application lifecycle management process with Quality Center involves the
following phases:

Requirements Specification: Application analysis and requirements definition

Test Planning: Creating the test plan based on your requirements

Test Execution: Creating test sets and executing them

Defect Tracking: Integration with Bugzilla

Logging on to Quality Center


1. Navigate to the Quality Center start page by accessing the following URL
from Internet Explorer:
http://quality.eng.vmware.com/qcbin/
2. Click on the Application Lifecycle Management link. Each time Quality Center
is run, it checks the version. If it detects a newer version, it downloads the
necessary files to your machine.
Note: In Windows Vista and 7, if you do not have administrator privileges
on your machine and a Security Warning displays, click Don't Install. You will
be redirected to the Install screen.
If file downloads are prohibited by your browser, you can install these files by
using the HP ALM Client MSI Generator Add-in on the More HP Application
Lifecycle Management Add-ins page. For more information on add-ins, refer to
the HP Application Lifecycle Management Installation Guide.
3. Type your Active Directory user name and password and click on the
Authenticate button.
4. Select Domain TRAINING and project vSphere_Training.
5. Click on the Login button.
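The steps above use the browser login. Quality Center also accepts scripted authentication over HTTP; the sketch below only illustrates how an HTTP Basic authentication header is built from a user name and password. The endpoint path follows common HP ALM REST conventions but is an assumption here, and the credentials are hypothetical examples.

```python
import base64
import urllib.request

def build_auth_request(base_url, user, password):
    """Build (but do not send) an authenticated request to the
    Quality Center authentication endpoint. The endpoint path below
    is an assumption based on HP ALM REST conventions."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    req = urllib.request.Request(base_url + "/authentication-point/authenticate")
    req.add_header("Authorization", "Basic " + token)
    return req

# Hypothetical credentials for illustration only.
req = build_auth_request("http://quality.eng.vmware.com/qcbin", "jdoe", "s3cret")
```

The request is constructed but never sent, so the sketch can be read without access to the training server.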

Quality Center Modules


Quality Center has the following modules:
Dashboard: Enables you to analyze data from one or more projects using
the same metrics
    Analysis View
    Dashboard View
Requirements: Enables you to specify and track the testing requirements
Testing: Enables you to create a test plan
    Test Resources
    Test Plan
    Test Lab

The Quality Center Toolbars


The project window provides the following toolbars:
Common toolbar: Provides navigation buttons and access to common tools,
documentation, and additional resources. This toolbar is located on the
upper-right part of the window.
Module toolbar: Contains buttons for frequently used commands in the
current Quality Center module.

Shortcuts Menu
The Quality Center interface provides context-specific menus that appear when you
right-click certain interface elements. These menus are useful shortcuts to
commands that are specific to the interface elements that are selected.

Tools and Help


The Quality Center common toolbar provides the following menus:
Tools: Enables you to:
Switch to another project.
Customize a project.
Log a new defect (currently not used).
Spell-check your modules and configure spell-check options.
Clear project history.
Generate Word reports.

Help: Enables you to:


Open the Quality Center Documentation Library and other online
resources.

Additional help is available on the wiki:


https://wiki.eng.vmware.com/QEOps/InfTools/HPQC-FAQ

Summary
In this lesson, you have learned to:

Understand the role of Quality Center at VMware.

Identify the phases of the test management process.

Log into Quality Center and become familiar with its modules, toolbars, and
menus.

Find help on Quality Center topics.

Review Questions
Please answer these review questions:
1. What are the modules in Quality Center?

2. Where is the common toolbar located, and what functions does it support?

3. How can you access the Quality Center Documentation Library?

Lab Exercise
Installing Quality Center
1. Navigate to the Quality Center start page by accessing the following URL
from Internet Explorer:
http://quality.eng.vmware.com/qcbin/
2. Click on the Application Lifecycle Management link. Each time Quality Center
is run, it checks the version. If it detects a newer version, it downloads the
necessary files to your machine. Since this is the first time you are running
Quality Center on this machine, it loads the client to the local machine.
Note: In Windows Vista and 7, if you do not have administrator privileges
on your machine and a Security Warning displays, click Don't Install. You will
be redirected to the Install screen.
If file downloads are prohibited by your browser, you can install these files by
using the HP ALM Client MSI Generator Add-in on the More HP Application
Lifecycle Management Add-ins page. For more information on add-ins, refer to
the HP Application Lifecycle Management Installation Guide.
3. Type your Active Directory user name and password and click on the
Authenticate button.
4. Select Domain TRAINING and project vSphere_Training.
5. Click on the Login button.

Working with Requirements


Lesson Objective
After completing this lesson, you will be able to:

Add new requirements.

Create a requirements tree.

Request the PMT synchronization process.

View and edit multiple requirements at the same time.

Test Management Process


A test management process in Quality Center starts with defining requirements.
Requirements describe the application functionality in detail. They are the
foundation on which the entire testing process is based.
Requirements are the yardstick for measuring the progress of a project. When
planning a project, you need to allocate time to clearly define the requirements of
the testing process. This helps you plan and execute the successive stages of the
Test Management process. For example, you use the requirements data during the
Develop Test Plan stage to create tests for testing requirements. In the Execute Test
stage, you execute these tests to check whether or not the requirements have been
met.

Requirement Types
The path to quality software begins with excellent requirements. Slighting the
processes of requirements development and management is a common cause of
software project frustration and failure.
As a rule, requirements come to VMware QE from the PMT system. Custom
requirement types were created for consistency between the two systems.

Requirement Type             Description
Folder                       A folder that organizes the requirements
Release                      A group of requirements related to a specific release
Feature                      PMT Feature equivalent
Sub-Feature                  PMT Sub-Feature equivalent
Test Design Specification    PMT TDS equivalent

Note: Additional requirement types can be created if necessary.

Requirements Tree
Quality Center requirements are organized in a tree structure:

Building a Requirements Tree


Application requirements can be created and maintained within Quality Center,
or they can be imported directly from PMT via a specially developed web service.
You can start building a requirements tree by creating a requirement within the
Requirements folder at the root level. After creating a requirement, you can create
requirements within the same folder; or, depending on the requirement, you may
create a parent or child requirement.

Note: Normally there is only one requirement of type Folder created under
the Requirements folder. The folder name is zDeleted Requirements. This folder is
used like the Recycle Bin in the Windows operating system, to prevent permanent
deletion of entities.
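The parent/child structure described above can be pictured as a simple tree. The sketch below is a minimal, hypothetical model of the hierarchy for illustration, not Quality Center's internal representation:

```python
class Requirement:
    """A node in a requirements tree: each requirement has a type,
    a name, and an ordered list of child requirements."""
    def __init__(self, req_type, name):
        self.req_type = req_type
        self.name = name
        self.children = []

    def add_child(self, req_type, name):
        child = Requirement(req_type, name)
        self.children.append(child)
        return child

    def outline(self, depth=0):
        """Render the subtree as an indented outline, one line per node."""
        lines = ["  " * depth + f"{self.name} ({self.req_type})"]
        for child in self.children:
            lines.extend(child.outline(depth + 1))
        return lines

# Mirrors the hierarchy built in this lesson's lab exercise.
root = Requirement("Folder", "Requirements")
release = root.add_child("Release", "OP")
feature = release.add_child("Feature", "Storage Platform")
feature.add_child("Sub-Feature", "APD Survivability Enablement")
```

Printing `root.outline()` shows the same nesting you see in the Requirements module's tree pane.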

Creating a Requirement
Follow this process to create a new requirement:
1. From the requirement tree, select the target parent requirement and click on
the New Requirement button. The Create New Requirement dialog box
appears.
2. In the Create New Requirement dialog box, from the Requirement Type list,
select the type of requirement you want to create.
3. In the Requirement Name field, type an appropriate name for the new
requirement and click OK. The New Requirement dialog box appears.

Note: A requirement name cannot include any of the following characters: \ ^ *

4. In the Description text box, type a detailed description of the requirement.

5. Click on the Submit button
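The note above lists the characters that are not allowed in a requirement name. A small validation helper (a sketch for illustration, not part of Quality Center) makes the rule concrete:

```python
# Characters Quality Center rejects in requirement names, per the note above.
FORBIDDEN = set("\\^*")

def is_valid_requirement_name(name):
    """Return True if the name is non-empty and contains none of the
    forbidden characters."""
    return bool(name) and not (set(name) & FORBIDDEN)
```

For example, "Storage Platform" is accepted, while a name containing a backslash, caret, or asterisk is rejected.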

PMT Integration
The following diagram demonstrates the synchronization process between PMT and
Quality Center Requirements module:

To request synchronization between the PMT and Quality Center project, you need
to submit a request through Bugzilla.

Requirements Fields
You can view requirement details either by activating the
Requirements Details option from the View menu and selecting the requirement, or
by double-clicking the requirement.

The table below lists all the fields that you can use to describe each requirement. If
your project needs to capture additional data, the Quality Center administrator can
configure custom-defined fields and selection lists.

Field                 Usage
Name                  Assigns a short description of the requirement.
Requirement Type      Indicates the type of the requirement.
QE Owner              Indicates the name of the user who is responsible for the requirement from the QE side.
DEV Owner             Indicates the name of the user who is responsible for the requirement from the development side.
Modified              Indicates the date the requirement was last edited.
Priority              Sets the priority level of the requirement.
PMT ID                Indicates the corresponding PMT ID of the requirement (if any).
PMT Feature Status    Indicates the current PMT Feature status of the requirement.
Product               Indicates the component of the system on which the requirement is based.
Reviewed              Indicates whether the requirement has been reviewed and approved.

Additional Requirements Details tabs display the following information:

Requirement Description in Rich Text format


Attachments
Requirements Traceability
Test Coverage
History

Viewing Multiple Requirements


The Requirements Grid view provides a non-hierarchical view of the requirements
data. Each line in the grid displays a separate requirement. You can use this view to
record and review the requirement details.
To navigate to the Requirements Grid view, from the menu bar in the Requirements
module, select View > Requirements Grid.
The Requirements Grid view also enables you to filter requirements based on
different fields. For example, you can click the text box below the Direct Cover
Status field, and click the browse button that appears within the text box.

You can choose which columns to display on the Requirements Grid by clicking on
the Select Column button.

Editing Multiple Requirements


In addition to viewing and filtering multiple requirements, you can use the
Requirements Grid view to edit multiple requirements. You can replace field values
for a selected requirement and all of its child requirements, or for all requirements in
the requirements tree.

To edit multiple requirements:


1. In the Requirements Grid view, from the menu bar, select Edit > Replace.
The Replace dialog box appears.
2. In the Replace dialog box, in the Find in Field list, select or type the field for
which you want to replace values.
3. In the Value to Find box, select or type the value you want to search for.
4. In the Replace With box, select or type the value to replace the existing
field values.
5. Click Replace All to replace the field values of all the requirements.
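Conceptually, Replace All performs a find-and-replace of one field's value across every requirement in the grid. The sketch below models requirements as dictionaries; the field names and values are illustrative, not Quality Center data:

```python
def replace_all(requirements, field, value_to_find, replace_with):
    """Replace matching field values in place across all requirements
    and return the number of requirements changed."""
    changed = 0
    for req in requirements:
        if req.get(field) == value_to_find:
            req[field] = replace_with
            changed += 1
    return changed

# A small, hypothetical Requirements Grid.
grid = [
    {"Name": "Mount Vmfs Volume", "QE Owner": "jdoe"},
    {"Name": "Unmount Vmfs Volume", "QE Owner": "jdoe"},
    {"Name": "Storage Platform", "QE Owner": "asmith"},
]
```

Calling `replace_all(grid, "QE Owner", "jdoe", "asmith")` updates both matching rows and leaves the third untouched, which is exactly what Replace All does in the grid view.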

Summary
In this lesson, you learned to:

Add new requirements.

Create a requirements tree.

Request the PMT synchronization process.

View and edit multiple requirements at the same time.

Review Questions
1. What types of Quality Center requirements have you learned?

2. How do you request the synchronization process between the PMT and a
Quality Center project?

3. Is it possible to add extra fields to track additional requirements information?

4. Which view enables you to edit multiple requirements at the same time?

Lab Exercise
Part 1: Altering the Requirements View
1. Log into Quality Center using your user name and password. Select
TRAINING domain and vSphere_Training project.
2. On the Quality Center sidebar, select the Requirements module.
3. From the menu, select View > Requirements Tree to switch to tree view.
4. To change the columns displayed in tree view:
a. On the toolbar, click the Select Columns button. The Select
Columns dialog box appears.

b. Clear entries under Visible Columns by clicking on the button with the
double left-pointing arrows. Only required fields remain.
c. Under Available Columns, double-click on the following column names
in this order to make them visible: Req ID, PMT ID, Requirement Type,
QE Owner, Reviewed, Creation Date, Modified
d. If columns are not in the order defined above, use the up and down
arrows above Visible Columns to reorder them.
e. Click OK. The new column order is displayed.

Part 2: Adding Folders and Requirements


1. Create a folder with the same name as your Quality Center user ID under
the Requirements folder:
a. Select the Requirements folder and click on the New Folder button
on the toolbar. The Create New Requirements Folder dialog box
appears.
b. In the Folder Name field, type your Quality Center user ID, and then
click OK.
2. Create Release folder and associated Requirements:
a. Select the folder you created in the previous step by clicking on it.
b. On the toolbar, click on the New Requirement button. The Create
New Requirement dialog box appears.

c. In the Requirement Type field, select VM Release.


d. In the Requirement Name field, type OP, and then click OK. The New
Requirement dialog box appears. Close the New Requirement dialog
box by clicking on the Close button.
e. In the Description field, type: Group of requirements associated
with VMware OP release.
f. Click the Submit button.

g. Create additional requirements detailed in the table below:

Note: You have to follow the hierarchical structure

Type            Name                           Description
VM Feature      Storage Platform               Storage Platform
VM SubFeature   APD Survivability Enablement   APD Survivability Enablement
VM TDS          Mount VmfsVolume               Mount VmfsVolume
VM TDS          Unmount Vmfs Volume            Unmount Vmfs Volume

Test Planning
Lesson Objective
After completing this lesson, you will be able to:

Organize subjects and tests in a test plan tree.

Create tests that define the steps for testing an application.

Link tests to the requirements that need to be tested.

Import test cases from Excel into Quality Center.

Test Management Process


The second stage of the testing process is test planning. After the requirements are
approved, the testing team designs tests that validate whether the application
meets these requirements. The efficiency of the test runs depends on how you plan
and execute these tests.

Test Planning Overview


You need to develop a clear and concise test plan to successfully test an application.
A good test plan enables you to assess the quality of your application at any stage
of the testing process. To develop a test plan:
1. Develop the test plan tree.
a. Create subject folders in the test plan tree.
b. Define specific tests within each subject folder.
2. Add manual steps for each test.
3. Build automation test scripts, as appropriate.
4. Link tests to requirements.

The Test Plan Tree Overview


On the left pane of the Test Plan module, the test plan tree is displayed. The test
plan tree is a graphical representation of the Test Plan displaying the test assets
according to a hierarchical relationship of their respective functions.
The test plan tree:

Organizes tests according to the functional units or subjects of an
application.

Provides a clear picture of the testing building blocks and assists in the
development of actual tests.

Shows the hierarchical relationships and dependencies between tests.

When planning your test plan tree, consider the hierarchical relationships of the
functions in your application. Divide these functions into subjects and build a test
plan tree that represents the function of your application.
The test plan tree must be built according to the following structure:

Creating a Subject Folder


To create the test plan tree in the Test Plan module, you manually create folders and
add tests to these folders. The test plan tree starts with the Subject folder, which is
available by default. From the Subject folder, you create Release folders and add
subfolders within each Release folder.
To add a folder:
1. Navigate to the Test Plan module
2. From the test plan tree, select the Subject folder.
3. On the Quality Center toolbar, click the New Folder button. The New Folder
dialog box appears.
4. In the Folder Name field, type a name for the new test subject.

Note: A folder name cannot include any of the following characters: \ ^ *


5. Click OK to add the folder to the test plan tree.

Key Pointers for Defining Tests


After defining the subjects in the test plan tree, the next step is to add and define
the specific tests to be performed for each subject. You need to write tests that
define the set of inputs, events, actions, and expected outcomes to verify that the
application complies with a specific requirement. When you create tests, you need
to ensure that the tests are:

Accurate: Each test should have a distinct objective, such as verifying a
specific function or system requirement.

Economical: A test must include only the steps and fields that are
needed for its purpose.

Consistent: Each step of the test must be executed consistently.

Appropriate: A test must be appropriate for its testers and its environment.

Traceable: A test must map to a requirement or a process.

To define a test:
1. Add a test to the test plan tree.
2. Specify the details of the test.

Adding a Test
To add a test to the test plan tree, you need to define basic information about the
test, such as its name and type.
To add a test:
1. From the test plan tree, select the subject folder in which you want to add the
new test.
2. On the Quality Center toolbar, click New Test. The Create New Test dialog
box appears.
3. From the QC Internal list, select a type for the test.
4. In the Test Name field, type a name for the test.
5. Enter all required information.
6. Click OK to add the test to the test plan tree.

Populating the Details Tab


You can use the Details tab to define the basic information about a test, such as its
status, creation date, and priority of execution. In addition to the standard fields
provided, you can use the Objective field to provide a brief overview of the test. For
example, you can use the Objective field to record the statement of purpose,
requirements for performing the test, and a high-level outline of test execution.

Note: You can provide additional information about a test by adding attachments in
its Attachments tab. This tab provides the same functionality as the Attachments
tab in the Requirements module.

Test Case Fields


The table below lists all fields available for a test case:

Field Name | Field Type | List Values | Comments
1. Product | Text | N/A | Represents the Product of the test case, e.g., ESX
2. Func Area | Text | N/A | Represents the Functional and/or Testing Area
3. Component | Text | N/A |
Auto Script Path | Text | N/A | Path for the script in Perforce for the automated test case
Automation Level | List | Semi-Automated; Manual; Automated | Required by Process Initiative
Comments | Text | N/A | Any notes/comments about the test case. QC system field
Configs HW | Text | N/A | List of applicable Hardware configurations for the test case; can be uploaded from Excel along with the test case itself
Configs Other | Text | N/A | List of applicable Other devices configurations for the test case
Configs SW | Text | N/A | List of applicable Software configurations for the test case
Configs Test Bed | Text | N/A | List of Test Beds for the test case
Creation Date | Date | N/A | Filled by default. QC system field
Exec Duration (min) | Text | N/A | Expected execution duration for the test case, in minutes
Execution Status | List | | Filled by default. QC system field
Fully Automatable? | List | Y; N | Required by Process Initiative
Keyword | Text | N/A | Free-format text field for tagging test cases
Legacy Test Case ID | Text | N/A | To capture a complete old test case id, e.g., ESX::Network::Stress::MyTest
Objective | Text | N/A | Summary/Objective/High-level description of the test case. QC system field
Partner Facing? | List | Y; N |
PMT #: | Text | N/A | PMT ID of the Feature covered by the test case
Pre-Condition | Text | N/A |
QC Internal | Text | N/A |
Status | List | Review; Retired; Repair; Execution Ready; Draft; Development | Test case status
Subject | List | N/A | Filled by default. QC system field
Test Case Developer | User List | N/A | Whoever created the test case
Test Case Level | List | Unit; System; Integration; Functional; Acceptance | Level of test case. Required by Process Initiative
Test Case Priority | List | P4; P3; P2; P1; P0 |
Test Case Type | List | Use case; Usability; Uptime; Upgrade; Stress; Security; Scalability; Regression; Performance; Limits; HW Enablement; Functional; Feature; Interoperability; Fault Injection; Exploratory Testing; Documentation Testing | Type of test case (multi-select). Required by Process Initiative
Test ID | | N/A | Internal db id assigned by QC. System field
Test Name | | N/A | Test case short name. QC system field

Adding Test Steps


After defining manual tests, you specify the detailed steps to execute each test.
Adding a test step involves specifying the actions to perform on the application, the
input to enter, and the expected output.
To add and define a test step:
1. From the test plan tree, select a test.
2. In the right pane, click the Design Steps tab.
3. On the Design Steps page toolbar, click the New Step button. The Design
Step Editor dialog box appears.
4. In the Step Name field, type a name for the test step.
5. In the Description field, type the instructions for this step.
6. In the Expected Result field, type a description of what the expected
result should be after the step is completed. In addition, specify detailed
instructions that testers can use to verify the result.
7. Click OK. The test steps appear in the Design Steps page. A footprint icon
appears next to the test name in the test plan tree. This icon indicates that
the steps are defined for the test.

Importing Test Cases from Excel


Detailed instructions on how to import test cases into Quality Center using the MS
Excel add-in are posted on the engineering wiki site:
https://wiki.eng.vmware.com/wiki/images/7/71/HowToInstallHPQCExcelAddIn.pdf
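The add-in itself is documented at the wiki link above. To illustrate the general idea of turning spreadsheet rows into test case records, here is a sketch that parses CSV data with Python's standard library; the column names are hypothetical and do not necessarily match the add-in's actual field mapping:

```python
import csv
import io

def parse_test_cases(csv_text):
    """Parse CSV rows into test-case dictionaries keyed by column header."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return list(reader)

# Hypothetical export with three columns; real add-in mappings may differ.
SAMPLE = """Test Name,Subject,Status
Mount Vmfs Volume,Storage,Draft
Unmount Vmfs Volume,Storage,Draft
"""
cases = parse_test_cases(SAMPLE)
```

Each parsed row becomes one test case record, which is conceptually what the Excel add-in uploads into the test plan tree.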

Consideration for Designing Test Steps


When designing test steps, you must define the operations of the application so that
all aspects of the application are tested. To ensure that you clearly and accurately
capture all of the actions required to complete an operation, you must:

Write the test steps in active voice. When you use active voice, the person
executing the test gets clear instructions on how to perform the test steps.

Use one action per step and clearly state whether the action should be
performed by the tester or the application.

Ensure that you do not leave out a step.

Use consistent terminology throughout the test.

Validate that the fields indicated in the test exist and are labeled the same
way as they are labeled in the system being tested.

Specify the pass and fail condition for the test.

Calling a Test
You can build the steps of a test to include calls to other tests. This enables you to
modularize and reuse a standard sequence of steps across multiple tests.
To call another test as a step within a test:
1. Click the Design Steps tab of the calling test.
2. On the Design Steps page toolbar, click the Call to Test button. The Select
a Test dialog box appears.
3. Select the test to call and click OK. This adds a step in the current test and
labels it as Call <TEST_NAME>. If you call a test that has unassigned
parameters, the Parameters of Test dialog box appears. You can now assign
the parameter values.

Test Parameters
A parameter is a variable that can be assigned a value during test execution.
Parameters provide flexibility by enabling each calling test to dynamically change
its values. You use parameters to control test execution by specifying data values at
run time.
You can use parameters in the Description and Expected Result sections of a test
step.

Defining a Parameter
To define a parameter:
1. From the test plan tree, select a test.
2. Click the Design Steps tab.
3. On the Design Steps page toolbar, click the New Step button. The Design
Step Editor dialog box appears.
4. Place the cursor in the Description field or the Expected Result field of the
step in which you want to define the parameter.
5. Click the Insert Parameter button. The Parameters dialog box appears.
6. Click on the New Parameter button. New Test Parameter dialog box appears.
7. In the Parameter Name field, type a name for the parameter and click OK.
The new parameter is added within the step as <<parameter_name>>.
Note: A parameter name cannot include any of the following characters: ~ ? <
8. Click OK to close the Design Step Editor dialog box.

Calling a Test with Parameters

When you call a test that contains parameters, you can set the values that you want
to pass to these parameters.
To call a test and pass values to its parameters:
1. From the test plan tree, select the calling test.
2. Click the Design Steps tab of the calling test.
3. Select the test that you want to call.
4. Click OK. The Called Test Parameters dialog box appears.
5. In the Actual Value text box, type the values that you want to pass to the
parameters in the called test.
6. Click OK to add a new step that contains the call to the selected test and the
values that need to be passed to the test parameters.
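At run time, the <<parameter_name>> placeholders in a called test's steps are filled in with the actual values you supply. The placeholder syntax is from this lesson; the function below is an illustrative sketch, not Quality Center code:

```python
import re

def fill_parameters(step_text, actual_values):
    """Replace each <<name>> placeholder with its actual value;
    placeholders without a supplied value are left untouched."""
    def lookup(match):
        name = match.group(1)
        return str(actual_values.get(name, match.group(0)))
    return re.sub(r"<<(\w+)>>", lookup, step_text)

# A hypothetical design step with two parameters.
step = "Log in as <<user_name>> and open datastore <<datastore>>."
```

Supplying actual values for both parameters yields a fully concrete instruction, while a missing value leaves its placeholder visible, mirroring an unassigned parameter in the Called Test Parameters dialog.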

Editing the Values of Called Parameters


You can edit the values that you assigned to parameters even after you define a
test call.
To edit a value of a called parameter:
1. Right-click the calling step and click Called Test Parameters. The Called
Test Parameters dialog box appears.
2. Click the Actual Value column of a parameter and type a new value.
3. Click OK to update the test call to display the new value.

Creating Template Tests


Any existing manual test can be copied and used as the basis for creating a new
manual test, but marking a manual test as a template test gives it special
designation. While a template test is no different from any other manual test in
composition or function, Quality Center can use the designation in dialog boxes
that have the Show Only Template Tests checkbox.
To configure a test as a template test:
1. From the test plan tree, right-click on a manual test.
2. Select Mark as Template Test.

Test-Requirement Relationship
Before you specify the detailed test steps, you can link tests to requirements to
verify if your test plan is focused on validating your project requirements. You can
define links between requirements and tests from the Test Plan module or the
Requirements module.
You can review the test-requirement relationship at any time during the testing
process to check how changes in requirements impact your test plan.

Link Tests to a Requirement


To link tests to a requirement:
1. Open the Requirements module.
2. If Requirements Details view is not enabled, from the Requirements toolbar,
select View > Requirements Details.
3. From the requirement tree, select a requirement.
4. From the Test Coverage tab, click Select. The Test Plan Tree tab appears on
the right side of the screen.
5. Select a test from the Test Plan Tree tab and click the Add to Coverage
button to add the test to the Test Coverage grid.

Note: You can also drag and drop tests from the test plan tree to the
Test Coverage grid.

Linking Requirements to a Test


To link requirements to a test:
1. Open the Test Plan module.
2. From the test plan tree, select a test.
3. Click the Req Coverage tab.
4. Click the Select Req button. The Requirements tree appears on the right
side of the screen.
5. From the requirements tree, select a requirement.
6. Click the down arrow of the Add to Coverage button to open a list
containing two options for linking requirements.

Note: The links that you define from the Requirements module are
automatically reflected in the Test Plan module. Similarly, the links that you
define from the Test Plan module are automatically reflected in the
Requirements module.
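As the note above says, the link is symmetric: a single many-to-many mapping underlies both the Test Coverage and Req Coverage views. Here is a sketch of that idea; the class and method names are illustrative, not Quality Center's API:

```python
class CoverageLinks:
    """A symmetric many-to-many mapping between tests and requirements."""
    def __init__(self):
        self.pairs = set()  # each element is a (test, requirement) link

    def link(self, test, requirement):
        self.pairs.add((test, requirement))

    def tests_covering(self, requirement):
        """The Test Coverage view for one requirement."""
        return sorted(t for t, r in self.pairs if r == requirement)

    def requirements_covered_by(self, test):
        """The Req Coverage view for one test."""
        return sorted(r for t, r in self.pairs if t == test)

# Hypothetical links based on this lesson's lab data.
links = CoverageLinks()
links.link("Mount Vmfs Volume test", "Mount VmfsVolume")
links.link("Unmount Vmfs Volume test", "Unmount Vmfs Volume")
links.link("Mount Vmfs Volume test", "Storage Platform")
```

Because both views query the same set of pairs, a link added from either module is immediately visible in the other, which is exactly the behavior the note describes.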

Summary

In this lesson, you learned to

Organize subjects and tests in a test plan tree.

Create tests that define the steps for testing an application.

Link tests to the requirements that need to be tested.

Import test cases from Excel into Quality Center.

Review Questions
1. What is the Quality Center Test Plan module used for?

2. How many design steps can a single test case contain?

3. Why do you need to link test cases to the requirements?

4. Where can you find information about how to import test cases from Excel to
Quality Center?

Lab Exercise
After adding or importing the requirements, you need to create tests to verify
whether requirements have been met.

Part 1: Building Test Plan Tree


1. Log into Quality Center using your user name and password. Select
TRAINING domain and vSphere_Training project.
2. On the Quality Center sidebar, select the Testing > Test Plan module.
3. From the menu, select View > Test Plan Tree to switch to tree view.
4. Within the Test Plan Tree, select the root Subject folder by clicking on it.
5. On the Test Plan toolbar, click on the New Folder button. The New Test
Folder dialog box appears. Type your Quality Center ID into the Test
Folder Name edit box. Click on the OK button.
6. Under the folder you have created in the previous step create the test plan
tree based on the data provided in the table below:

Note: You have to follow the hierarchical structure.

Type               Name
Release            OP
Product            ESX
Functional Area    ESX Server
Component          Storage

Part 2: Creating and Modifying Tests


Now you will create a test. The data for your test is provided here; in reality, you
would have to determine this data on your own. Always keep in mind that you are
writing instructions for a test execution engineer, not necessarily for yourself.

Create a Test Case


a. From the test plan tree, select the Storage folder, created in the
previous step.
b. On the test plan tree toolbar, click on the New Test button. The New
Test dialog box appears.
c. Enter all required information into the New Test dialog box:

   Field Name            Field Value
   Test Name             Training
   QC Internal           MANUAL
   Product               ESX
   Functional Area       ESX Server
   Component             Storage
   Automation Level      Manual
   Fully Automated       N
   Status                Draft
   Test Case Developer   [your QC ID]
   Test Case Level       Functional
   Test Case Priority    P1
   Test Case Type        Functional
   Objective             This test case was created for training purposes

d. Click the OK button.

Add Design Steps


a. From the test plan tree, select the test case you have created in step 1.
b. If not selected, click on the Details tab.
c. Change the value of the Status field to Development.
d. Click on the Design Steps tab.

e. On the Design Steps toolbar, click on the New Step button. The Design
   Step Details dialog box appears.
f. Type in Step 1 description goes in here text into the Procedure text
   box area.
g. Type in Step 1 expected result goes in here text into the Expected
   Result text box area.
h. On the Design Step dialog box toolbar, click on the New Step button.
i. Type in Step 2 description goes in here and Step 2 expected result
   goes in here text into the Procedure and Expected Result fields,
   respectively.
j. Click on the OK button.

k. Click on the Details tab.
l. Change the value of the Status field to Execution Ready.

Part 3: Linking Requirements to Tests


1. Create a link between the test you have just created and the Mount
VmfsVolume requirement you created during the lab exercise at the
end of the Working with Requirements lesson.
a. From the Test Plan Tree select the Training test you have created in the
previous part.
b. In the right pane, click the Req Coverage tab and then click the Select
Req button. The requirements tree appears in the right pane of the
Test Details page.
c. From the requirements tree, expand the Requirements folder to find
the Mount VmfsVolume requirement you have previously created.

Note: You must select the requirement located under the requirements
folder named after your Quality Center user ID. Other students also
have a requirement named Mount VmfsVolume under their folders.
d. Click the Add to Coverage button. The selected requirement appears
in the Req Coverage grid as a link.
e. Within the Req Coverage grid, click on the Mount VmfsVolume link.
f. Note that the Training test is displayed within the Test Coverage grid.

Test Execution
Lesson Objective
After completing this lesson, you will be able to:

Create and organize folders in a test sets tree.

Create test sets.

Add test instances into test sets.

Execute manual tests.

Record and review the results of test execution.

Test Management Process


The third stage of the test management process involves organizing tests into test
sets and running the tests. After running the tests, you have complete
documentation that lists the inconsistencies, issues, and defects in the application.
You can subsequently report these problems to a defect tracking system for
further investigation, correction, and retesting.

Using the Test Lab Module


You perform all test execution tasks from the Test Lab module. In the Test Lab
module, you organize tests into test sets. A test set is a group of test instances
designated to achieve specific testing goals. The goals of a test set must align
with the testing goals of the release to which it is related.
After you assign a test set to a release, you schedule the execution of test instances
within the test set. You can also specify the conditions and sequence for test
execution. Based on the execution conditions, you execute manual test instances
within a test set. After test execution is complete, you analyze the test results to
determine whether a defect should be logged for failing steps.
To navigate to the Test Lab module, click the Testing > Test Lab links on the
Quality Center sidebar.

Test Execution Overview


Quality Center provides the framework and tools to efficiently execute tests. The
stages that this framework recommends are:
1. Develop the test set tree: Provides a clear picture of the test building blocks
and assists in the execution of the actual tests. It helps you plan for data
dependencies and identify common scripts that can potentially be reused for
future testing.
To develop a test set tree:
a. Create folders in the test set tree according to the following structure:

b. Create test sets.


c. Add test instances to the test sets.
2. Organize test runs: Enables you to control the execution of test instances by
setting test run conditions and creating a schedule.
3. Run the tests: Enables you to manage the execution of manual and
automated test instances and document their results.

Test Set
A test set is a group of tests designed to achieve specific testing goals. A test set
can contain a combination of manual and automated test instances. You can add
test instances of the same test multiple times to the same test set and across
different test sets so that you can reuse routines.
To understand the concept of test sets, consider smoke testing. Smoke testing is
nonexhaustive software testing that you use to verify the most crucial
functionalities of a software application. Smoke testing does not test the finer
details of a software application.
Consider that you want to test the logon functionality of an online application and
you perform smoke testing for this purpose. You build a test set to include test
instances that validate the logon functionality. This test set includes tests that
validate the user name and password used to log on to the online application.
You build another test set to include test instances that verify the logon functionality
in a particular Windows environment. This test set includes test instances that
validate the logon functionality on different Windows operating systems, such as
Windows XP and Windows Vista, and on different web browsers.
The two test sets combined together test all aspects of the online application.

Test Sets Tree


The test sets tree organizes and displays the test sets hierarchically. Developing a
test sets tree helps you organize your testing process by grouping test sets into
folders and organizing the folders in different hierarchical levels.
To comply with VMware Quality Center standards, the test set tree must be
organized as follows:

Creating a Test Set Folder


The test sets tree always starts with the Root folder. In this folder, you can create
main folders and add subfolders to these main folders.
To add a folder:
1. From the test sets tree, select the Root folder to create a main folder or
select an existing folder to create a subfolder.
2. On the toolbar, click the New Folder button. The New Folder dialog box
appears.
3. In the New Folder dialog box, in the Folder Name field, type a name for the
new folder.

Note: A folder name cannot contain any of the following characters: \ ^ *


4. Click OK to add the folder to the test sets tree.

Note: Folders can contain subfolders, and each subfolder can contain further
subfolders. Each folder or subfolder can contain a maximum of 676
subfolders.
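The forbidden-character rule in the note above (the same restriction applies to test set names and dashboard page names later in this guide) amounts to a simple membership check. The snippet below is an illustrative sketch only, not part of Quality Center:

```python
# Characters that Quality Center forbids in folder, test set, and
# dashboard page names, per the notes in this guide: \ ^ *
FORBIDDEN_CHARS = set(r"\^*")

def is_valid_qc_name(name: str) -> bool:
    """Return True if the name is non-empty and avoids forbidden characters."""
    return bool(name) and not (set(name) & FORBIDDEN_CHARS)

print(is_valid_qc_name("Alpha Cycle 1"))  # a typical valid folder name
```

A check like this is handy when generating folder trees in bulk (for example, from a spreadsheet of planned releases and cycles) before creating them in the tool.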

Creating a Test Set


After creating the test set folders, you create test sets within the folders.
To create a test set:
1. From the test set tree, select the folder to which you want to add a new test
set.
2. On the toolbar, click the New Test Set button. The New Test Set dialog box
appears.
3. In the New Test Set dialog box, in the Test Set Name field, type a name for
the test set.
4. In the Description field, type a description for the test set.
Note: A test set name cannot include any of the following characters: \ ^ *
5. Click OK to close the New Test Set dialog box. The new test set appears in the
test set tree.

Building and Executing Test Sets


The Test Lab module in Quality Center consists of the following test execution
elements, which you use to provide information about test sets:

Details Tab: A description of the test set currently selected in the test set
tree.

Execution Grid: Enables you to declare the test instances that make up
each test set, run test instances, and review the results of these executions.
Displays test instance data in a grid.

Attachments: A list of attachments that provide additional information for
the test set currently selected in the test set tree.

History: Enables you to view the changes made to the selected test set.

Adding Test Set Details


After creating a test set, you can add details to the test set, such as its closure date
or Perforce branch information.
To provide additional information for a test set:
1. From the test sets tree, select a test set.
2. In the right pane, click the Details tab.
3. Under Details, specify the values for the following fields:
a. Close Date: Displays the planned closing date for the test set.
b. Baseline: In Baseline, select a baseline to which to pin the test set, if
any.
c. Open Date: Displays the planned start date for the test set.
d. Perforce Branch: Displays the Perforce path for the automation
scripts.
e. Status: In Status, set the status of the test set to Open or Closed.

Adding Test Instances


After creating a test sets tree, you select and add test instances to each test set.
To add test instances to a test set:
1. From the test sets tree, select a test set.
2. In the right pane, click the Execution Grid tab and click Select Tests. The
Test Plan Tree tab appears on the right side of the screen and displays the
test plan tree.
3. From the test plan tree that appears on the right side of the screen, click a
test folder to add an entire group of test instances or click a test name to add
a specific test instance to the selected test set.
4. Click Add Tests to Test Set. This adds the test instance to the test set. The
test instance name is built from the corresponding test name and a prefixed
number.

Note: If you select a folder containing tests that are already included in the test
set, you are prompted to select the tests in the folder that you still want to add.
Additionally, if the tests have unassigned parameters, you are prompted to
enter values for them. You can also drag and drop tests from the test plan
tree to the Execution Grid to create test instances.

Setting a Test Set Notification


You can configure a test set to automatically send a status alert e-mail to the author
of the test set when it completes execution.
To set a notification for a status alert e-mail:
1. From the test sets tree, select a test set.
2. In the right pane, click the Automation tab and click Notifications.
3. Select events for which an e-mail message should be sent.
Note: You may select the Environment Failure event to send an e-mail if the
test set fails due to reasons other than the test logic itself. This could include
failures due to function calls that do not return results, access violations,
version incompatibility between application components, missing DLL files, or
inadequate permissions.
4. To specify the e-mail recipients, type their valid e-mail addresses, or select
their user or user group names by using the To button.
5. In the Message field, type the body of the e-mail message.

Running Tests Manually


To manually run and record test results:
1. From the test sets tree, select a test set.
2. In the right pane, click the Execution Grid tab and select a manual test.
3. On the Quality Center toolbar, click the Run arrow and select Run with
Manual Runner.
4. To start the test instance run, click Begin Run. The Manual Runner: Test
Set dialog box appears.
5. Perform the test step as outlined in the Procedure field of the Manual
Runner: Test Set dialog box.
6. Record the status and actual result of each step using provided fields.
7. To end the test run, click End Run.

Tools for Recording Results of Manual Tests


While running manual tests, Quality Center provides you different options to record
the results of manual tests.
You use the following options to record the results of a manual test run:
1. Compact View button: Enables you to toggle between Steps Grid and
Compact View. You can use Compact View to individually view and update the
Description, Expected and Actual fields of each test step.
2. Status column: Enables you to record the execution status of a test.
3. Actual field: Enables you to record additional details about the actual test
execution results.

Viewing Test Run Results


To view the results of the test instances, you need to go to the Execution Grid page.
The Execution Grid page is divided into two panes, which display the most recent test
run status. The two panes on the Execution Grid page are:

Last Run Result: Displays the execution results of the selected test from its
last test run.

Step Details: Displays the details of the selected step.

ATLAS Integration
To post execution results into Quality Center from ATLAS, a special web service was
developed. Quality Center results posting is enabled for every ATLAS test (old
and new); no modification is needed to launchers or integrated test scripts. Other
automation frameworks could be integrated in the future using the same web
service.
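The interface of that web service is not documented in this guide, so the following is only a hypothetical sketch of what a posting client might assemble; the field names, status values, and structure are assumptions for illustration, not the real API contract:

```python
import json

def build_result_payload(test_name, status, bug_ids=()):
    """Assemble a hypothetical result record for posting to Quality Center.

    All field names here are invented for illustration; the actual web
    service contract is defined by the service itself.
    """
    if status not in ("Passed", "Failed", "Not Completed"):
        raise ValueError("unexpected run status: %r" % status)
    return {
        "test": test_name,
        "status": status,
        # Mirrors the Bug ID(s) field kept at the Test Run level
        # (see the Bugzilla Integration section below).
        "bug_ids": list(bug_ids),
    }

print(json.dumps(build_result_payload("Training", "Failed", ["12345"])))
```

A client would serialize such a record and send it to the service over HTTP; the point of the sketch is only that each posted result carries the test identity, its run status, and any associated bug IDs.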

Bugzilla Integration
Even though Quality Center has its own Defect Tracking module, all defects will still
be tracked in Bugzilla. A button was created within the Test Lab module to launch
Bugzilla, and a Bug ID(s) field was created at the Test Run level to hold the
corresponding Bug IDs for each failed test case.

Summary
In this lesson you learned to:

Create and organize folders in a test sets tree.

Create test sets.

Add test instances into test sets.

Execute manual tests.

Record and review the results of test execution.

Review Questions
1. Can the same test be added to the same test set multiple times?

2. How can I see the results from the previous executions of the same test
instance?

3. Which result do we see on the Test Set Execution Grid?

4. What is a test set?

Lab Exercise
After creating the test plan tree, you create a test set to group several test
instances of the same test case and run them with different configuration
parameters.

Part 1: Building Test Set Tree


1. First, create folders to hold the test set:
a. Log into Quality Center. Select the TRAINING domain and the
   vSphere_Training project.
b. On the sidebar, click the Testing > Test Lab module.


c. From the test sets tree, select the Root folder and on the Quality
Center toolbar click the New Folder button. The New Folder dialog box
appears.
d. In the Folder Name field, type your Quality Center user ID.
e. Within the folder you have just created, build the test set tree structure
   according to the data listed in the table below.

Note: You have to follow the hierarchical structure.

Type         Name
Release      OP
Milestone    Alpha
Cycle        Alpha Cycle 1

Part 2: Creating a Test Set


1. Create a Training test set under the Alpha Cycle 1 folder:
a. Select the Alpha Cycle 1 folder you have just created and click on the New
   Test Set button. The New Test Set dialog box appears.
b. In the Test Set Name field, type Training and click OK.
c. With the Training test set selected, click the Execution Grid tab, and
then click the Select Tests button in the middle pane to open the test
plan tree in a new pane on the right.
d. In the right pane, navigate to the Training test you created during the
   Test Planning lab exercise. Select the Training test and click the Add
   Tests to Test Set button.
e. Add one more instance of the same test to the test set by clicking the Add
   Tests to Test Set button again.

Note: When you add another instance of the same test to the test
set, you are prompted to confirm your action. Click OK in the
Create Instance dialog box.

Part 3: Modify Test Set


1. To assign test instances to a user, first move the Responsible Tester column to
the left, then modify the field value for each test:
a. In the Execution Grid, click the Select Columns button.
b. In the Visible Columns box, select the Responsible Tester field.
c. To the right of the Visible Columns heading, click the up-arrow button
to move Responsible Tester to just below Status, then click OK.
d. For each test instance within the test set, click in the Responsible
Tester column and select your Quality Center user ID.
e. Click on the Refresh button in the toolbar.

Part 4: Executing Tests

1. To execute all test instances within the test set using Manual Runner:
a. From the test sets tree, select the Training test set, you have
previously created.
b. On the right pane, click on the Run Test Set button.
c. From the Manual Test Run dialog box, select Manual Runner. Click OK.
d. Click on the Begin Run button to start executing the first test
instance.
e. Observe the Procedure and Expected field values.
f. Pass the step by clicking on the Pass Selected button.
g. Click on the End Run button to finish executing the first test instance.
h. Repeat steps d through g to execute the second test instance, only this
   time, fail the selected step.
i. Click on the End Run button to terminate the test set run process.
j. Observe the test instance status change on the Test Execution Grid.

Reporting and Analysis


Lesson Objective
After completing this lesson, you will be able to:

Query Quality Center modules for useful information.

Generate Live Analysis graphs.

Generate reports and graphs.

Analyze reports and graphs.

Generate Excel Reports.

Configure Dashboard.

Test Management Process


You can generate reports and graphs within each Quality Center module to track
and assess the progress of your project. The Requirements, Test Plan and Test Lab
modules of the Quality Center provide predefined report and graph templates. You
can use these templates to retrieve the information that you want to analyze.

Reporting Options Overview


Quality Center provides the following options to generate graphs and reports:

Quality Center Querying: Enables you to view each module in a grid,
where you can apply various filters and change group-by and view-order
settings.

Live Analysis Reports: Enables you to view real-time graphs for the
selected folder. This report type is available within the tree view of the Test
Plan and Test Lab Quality Center modules.

Module Analysis Reports: Enables you to create, save, and reuse
real-time graphs. The data for a report can be filtered by various
conditions. This report type has cross-project reporting capabilities.

Excel Reports: Enables you to export data to an Excel spreadsheet. The
spreadsheet can be formatted with a post-processing macro. This report
type has virtually limitless capabilities within the MS Excel framework, as
long as the metadata exists in Quality Center.

Pre-Defined Reports and Graphs


Each Quality Center module provides predefined templates for generating reports
and graphs specific to that module. You can run these reports and graphs by using
their default settings or modify them to retrieve the information that you need. You
can organize these reports and graphs according to your requirements.

Custom Reports and Graphs


The Quality Center Dashboard module is used to define and manage Quality Center
analysis graphs and reports. To open the Dashboard module, click the Dashboard
button on the sidebar.
The Dashboard module contains the following key elements:
1. Analysis View: Enables you to create and manage graphs, standard reports
and Excel reports.
2. Dashboard View: Enables you to create and manage dashboard pages that
display multiple graphs on a single page.
You can save the reports into favorite views. To create a favorite view:
1. Generate a report and define all the custom settings you need.
2. When you have the report output displayed, click Add to Favorites. The Add
to Favorites dialog box appears.
3. In the Name field, type a name for your favorite view.
4. For Location, select Private or Public. Private stores your favorite view in
your private folder and restricts other users from accessing the report output.
Public stores your favorite views in a common folder and enables all users of
the specified project to access the report output.
5. Click OK to add the name of your favorite view to the favorite view list.

Note: To load a favorite view, click its name from the favorite view list. After
loading a favorite view, click the Generate Report button to display the
updated data.

Quality Center Graph Types


You use Quality Center graphs to analyze the progress of your work and the
relationships between the data that your project has accumulated throughout the
testing process.
The following graph types are available in Quality Center:

Summary Graphs: Each Quality Center module provides summary
graphs specific to the tasks that it supports. This graph type shows the
total count of requirements, tests, or test instances that were defined
throughout the testing process.

Progress Graphs: Each Quality Center module provides progress graphs
specific to the tasks that the module supports. This graph type shows the
accumulation of requirements, tests, or test instances over a specific period.

Trend Graphs: The Requirements and Test Plan modules provide trend
graphs specific to the tasks that they support. This graph type shows the
history of changes to specific fields over a specific period.

Requirements Coverage Graphs: This graph type is specific to the
Requirements module. It shows the total count of requirements, grouped
by test coverage status.

Graph Wizard
You use Graph Wizard to generate a new graph. This wizard takes you through the
steps for generating a new graph.
To run the Graph Wizard:
1. From the menu bar, select Analysis > Graphs > Graph Wizard. The Graph
Wizard dialog box appears.
2. Select a graph type and click Next.
3. Select the projects from which to get data and click Next.
4. Select a filter option and click Next.
5. Select a field by which data should be grouped in the graph.
6. Select the field containing the values that you want to plot on the x-axis and
click Finish to confirm your settings and generate the graph.

Configuring a Graph
You can define what data appears in a graph, and how the data is organized.
To configure a graph:
1. In the analysis view, select the graph you want to configure.
2. Click the Configuration tab.
3. Configure the graph settings.

Elements of the Graph Window


Elements of the graph window enable you to customize your graph.
You can:

Use the Description tab to enter annotations about a graph. Note that this tab
can be edited only for graphs that are saved as favorite views.

Use the Save Graph Data button to save the graph.

Use the Set Graph Appearance button to modify a graph layout.

Use the Copy Graph to Clipboard and Print Graph buttons to reuse a
graph.

Use the Edit Categories button to select the data that is plotted and
organized in a graph. Alternatively, you can use the options on the right side
of the window to change the x-axis, y-axis, and data group settings of the
graph.

Use the Full Screen View button to view a larger display of the graph.

Use the Refresh button to adjust a graph to display the latest data and
settings.

Navigate to the Pie Chart tab to see how data from a Bar Chart is translated
to a Pie Chart. Note that this tab is only available for the Summary and
Requirements Coverage graph types.

Displaying Graph Data


After creating a graph, you can display the data on which the graph is based.
To display the data on which the graph is based:
1. View the graph you want to see the data for.
2. Click on the Data Grid button. The data grid view is displayed.

Excel Reports
You can export Quality Center data to an Excel report to analyze the data and
present it in a graph. The Excel report consists of data defined by Structured Query
Language (SQL) queries on the Quality Center project database. You can execute a
Visual Basic script on the exported data to perform calculations and analyze the
data.
In addition, you can generate a report that contains parameters. Using parameters
in a report enables you to reuse the report for different purposes.
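An Excel report is driven by a SQL query against the project database. The real Quality Center schema is not reproduced in this guide, so the sketch below uses an invented mini-schema (table and column names are hypothetical, and the dates are fabricated to yield durations matching the sample report that follows). It shows the shape of a query that could produce a cycle-duration report, demonstrated with SQLite:

```python
import sqlite3

# Hypothetical mini-schema; the actual Quality Center project database
# tables and columns differ and are not shown in this guide.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE cycle (cycle_name TEXT, open_date TEXT, close_date TEXT)"
)
conn.executemany(
    "INSERT INTO cycle VALUES (?, ?, ?)",
    [
        # Invented dates chosen so the durations match the sample report.
        ("Alpha Cycle 1", "2011-09-01 00:00", "2011-09-10 18:30"),
        ("Alpha Cycle 2", "2011-09-12 00:00", "2011-09-19 10:14"),
    ],
)

# Duration in hours between the open and close dates of each cycle.
QUERY = """
SELECT cycle_name,
       ROUND((julianday(close_date) - julianday(open_date)) * 24, 2)
           AS duration_hr
FROM cycle
ORDER BY cycle_name
"""

for name, hours in conn.execute(QUERY):
    print(name, hours)
```

In Quality Center itself, the query is entered on the report's Configuration tab and the result set is exported to the spreadsheet, where a post-processing macro can format or further analyze it.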

Test Cycle Duration as of 10/10/2011 12:01:57 AM
Project Name: vSphere_Training

Cycle Name       Cycle Duration (hr)
Alpha Cycle 1    234.5
Alpha Cycle 2    178.23
Alpha Cycle 3    235.92
Alpha Cycle 4    241.07
Beta Cycle 1     210.3
Beta Cycle 2     192.28
Beta Cycle 3     212.65
RC Cycle 1       253.58
RC Cycle 2       197.67
RC Cycle 3       213.22
RC Cycle 4       232.07
RTM Cycle 1      229.37
RTM Cycle 2      168.13
RTM Cycle 3      145.67
RTM Cycle 4      212.3
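The post-processing macro for such a report would be written in Visual Basic inside the exported workbook; purely as a language-neutral sketch of the kind of calculation it might perform, the cycle durations from the sample report can be averaged per release phase:

```python
# Cycle durations (hours) taken from the sample report above.
durations = {
    "Alpha Cycle 1": 234.5, "Alpha Cycle 2": 178.23,
    "Alpha Cycle 3": 235.92, "Alpha Cycle 4": 241.07,
    "Beta Cycle 1": 210.3, "Beta Cycle 2": 192.28, "Beta Cycle 3": 212.65,
    "RC Cycle 1": 253.58, "RC Cycle 2": 197.67,
    "RC Cycle 3": 213.22, "RC Cycle 4": 232.07,
    "RTM Cycle 1": 229.37, "RTM Cycle 2": 168.13,
    "RTM Cycle 3": 145.67, "RTM Cycle 4": 212.3,
}

def average_by_phase(durations):
    """Group cycles by their phase prefix (Alpha, Beta, RC, RTM) and average."""
    phases = {}
    for name, hours in durations.items():
        phase = name.split(" Cycle")[0]
        phases.setdefault(phase, []).append(hours)
    return {phase: sum(h) / len(h) for phase, h in phases.items()}

print(average_by_phase(durations))
```

A summary like this (average cycle duration per phase) is typical of what a macro would chart next to the raw export.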

Managing Reports
You define analysis items (graphs and reports) and create dashboard pages using
Dashboard module. To open the Dashboard module, click the Dashboard button on
the sidebar.

Dashboard View
In the Dashboard module, you create, view and manage graphs, standard reports,
and Excel reports, for analyzing Quality Center data. You also create dashboard
pages that display multiple graphs side-by-side.
The Dashboard module includes trees for analysis items and dashboard pages. Each
tree consists of Private and Public root folders. Under each root folder you develop
separate trees. Analysis items or dashboard pages that you create in a public folder
are accessible to all users. Analysis items or dashboard pages that you create in a
private folder are accessible only to the user who created them. Public dashboard
pages can include only public graphs.
Analysis items and dashboard pages in public folders may show different results for
different users, depending on the data hiding definitions for the user group.

Create a Dashboard Page


In the Dashboard View, you can arrange and view multiple graphs on a single page.
You select the graphs to include in the dashboard page from the graphs in the
analysis tree. You can arrange the graphs on the page in any order you like, and you
can expand or reduce their size.
The maximum number of graphs per page is four, but you can add as many pages
as you need. Dashboard pages can be built only from graphs, not from reports or
custom queries, even though the Excel folder appears in the tree.
To create a Dashboard page:
1. Click Dashboard > Dashboard View on the sidebar.
2. In the Dashboard tree, select a public or a private folder.
3. Click the New Page button. The New Dashboard page dialog box opens.
4. Enter a dashboard page name, and click OK.

Note: A Dashboard page name cannot include the following characters: \ ^ *

Configure a Dashboard Page


You configure dashboard pages by selecting and arranging graphs on your page.
Each row on the dashboard page can include one or two graphs. In public dashboard
pages you may include only public graphs.
To configure a Dashboard page:
1. In the Dashboard module, click the Dashboard View.
2. In the Dashboard tree, select the Dashboard page that you want to
configure.
3. Click the Configuration tab. The Select Graphs pane opens.
4. To refresh the graph tree, click the Refresh button.
5. Select a graph, and click the Add Graph to Dashboard Page button.

View a Dashboard Page


After you arrange graphs on your dashboard page, you view the graphs in the View
tab.

Summary
In this lesson you learned to:

Query Quality Center modules for useful information.

Generate Live Analysis graphs.

Generate reports and graphs.

Analyze reports and graphs.

Generate Excel Reports.

Configure Dashboard.

Review Questions
1. Which Quality Center module enables you to generate graphs?

2. Which report type allows you to export data to the Excel spreadsheet?

3. What are the different Quality Center graph types?

4. What is the maximum number of graphs per Dashboard page?

Lab Exercise
Part 1: Querying Quality Center Modules

To generate the Number of Test Cases per Test Developer report for the MN
release:
1. Navigate to the Test Plan module.
2. Switch to the Test Grid view (View > Test Grid View).
3. Clear the filter.
4. Click on the Filter button and set up the following filter conditions:
a. Subject: MN.
b. Status: Execution Ready.
5. Click on the Group by tab and select the Test Case Developer condition.
6. Save the favorite view in a private folder.

Part 2: Generating Live Analysis Reports


To generate the Test Cycle Status and Test Cycle Status by Tester Live Analysis
graphs:
1. Navigate to the Test Lab module.
2. Select MN\Alpha\Alpha Cycle 1 folder.
3. Click on the Live Analysis tab.
4. If any graphs are displayed, remove them by clicking the X button.
5. Create Test Cycle Status graph:
a. Click on the Add Graph button.
b. Select Summary Graph and click on the Next button.
c. Select <None> from the Group by drop-down list.
d. Select Status from the X-axis drop-down list.
e. Click Finish.
6. View the graph in full screen mode.
7. View the graph as a Pie Chart.

8. Click on the Set Graph Appearance button, select the Appearance tab, and
enable the 3D Graph checkbox.
9. Close the Full Screen viewer.
10. Create the Test Cycle Status by Tester graph:
a. Click on the Add Graph button.
b. Select Summary Graph and click on the Next button.
c. Select Status from the Group by drop-down list.
d. Select Tester from the X-axis drop-down list.
e. Click Finish.
11. View the graph in full screen mode.
12. Switch to the Data Grid view.
13. Switch to the Bar Chart view.
14. Close the Full Screen viewer.
15. Click on different folders in the Test Lab tree and observe how graphs are
generated for each folder.

Part 3: Generating Analysis Graphs


To generate Automation Level and Automation Level by Product graphs:
1. Navigate to the Test Plan module
2. Click on AnalysisGraphsGraph Wizard menu
3. Select Summary Graph radio button and click on the Next button
4. Mention that this is the place where you can indicate if youd like to pull the
report from more than one project. Click on the Next button.
5. Select Define a new filter radio button and click on the Filter button next
to it.
6. For the Subject field select Subject\MN folder and click on the OK button
7. Click on the Next button
8. Select <None> from the Group By field drop-down list
9. Select Automation Level from the X-axis field drop-down list
10. Click on the Finish button
11. Maximize the Test Summary Graph window
12. Click on the Pie Chart button
13. Set the graph appearance to 3D
14. Click on the Cancel button
15. Select Automation Level from the Group By field drop-down list
16. Select Product from the X-axis field drop-down list
17. Click on the Finish button
18. Save the graph in the private folder.
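The two wizard passes above differ only in their grouping: the first counts tests per Automation Level, the second breaks that count down by Product. A minimal sketch of the filter-then-group logic follows; field names mirror the wizard choices (Subject, Automation Level, Product), but the records themselves are invented.

```python
from collections import defaultdict

# Illustrative test-plan records (not real project data).
tests = [
    {"subject": "Subject\\MN\\Login",  "automation": "Automated", "product": "MN"},
    {"subject": "Subject\\MN\\Login",  "automation": "Manual",    "product": "MN"},
    {"subject": "Subject\\MN\\Search", "automation": "Manual",    "product": "MN"},
    {"subject": "Subject\\XX\\Other",  "automation": "Manual",    "product": "XX"},
]

# Steps 5-6: filter to the Subject\MN folder.
in_scope = [t for t in tests if t["subject"].startswith("Subject\\MN")]

# Steps 15-16: Group By = Automation Level, X-axis = Product.
# Each (automation, product) cell becomes one bar segment in the graph.
grid = defaultdict(int)
for t in in_scope:
    grid[(t["automation"], t["product"])] += 1

print(dict(grid))  # {('Automated', 'MN'): 1, ('Manual', 'MN'): 2}
```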

Part 4: Generating Requirements Coverage report


To generate the Requirements Coverage report:
1. Navigate to the Requirements module.
2. Make sure that you are in the Requirements Details view.

3. Select any requirement from the Requirements Tree and click on the Test
Coverage tab.
4. Make sure that the Full Coverage checkbox is selected.
5. Demonstrate a list of test cases linked to this requirement and a pie chart at
the bottom of the page.
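The pie chart at the bottom of the Test Coverage tab rolls the statuses of the linked tests up into one coverage value per requirement. The function below is a simplified approximation of that rollup rule, written for illustration only; the exact precedence Quality Center applies may differ.

```python
def coverage_status(test_statuses):
    """Roll linked-test statuses up to a single requirement coverage value.

    A simplified sketch: any failure dominates, full success means Passed,
    and anything mixed is Not Completed.
    """
    if not test_statuses:
        return "Not Covered"          # no tests linked to the requirement
    if "Failed" in test_statuses:
        return "Failed"               # one failing test fails the requirement
    if all(s == "Passed" for s in test_statuses):
        return "Passed"
    if all(s == "No Run" for s in test_statuses):
        return "Not Run"
    return "Not Completed"            # mix of Passed / No Run / other

print(coverage_status(["Passed", "Passed"]))   # Passed
print(coverage_status(["Passed", "Failed"]))   # Failed
print(coverage_status([]))                     # Not Covered
```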

Part 5: Generating Excel report


To generate Dashboard Summary Excel report:
1. Navigate to the Dashboard > Analysis View module.
2. Select the Dashboard Summary Report.
3. Click on the Configuration tab.
4. Observe the SQL query.
5. Observe the post-processing macro.
6. Generate the report.
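The SQL query on the Configuration tab runs against the project database and its result set is what the post-processing macro formats in Excel. The snippet below simulates that flow with an in-memory SQLite database; the table and column names are illustrative stand-ins, not the actual Quality Center schema.

```python
import sqlite3

# Stand-in for the project database (illustrative schema, not QC's own).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE test_runs (run_status TEXT)")
db.executemany(
    "INSERT INTO test_runs VALUES (?)",
    [("Passed",), ("Passed",), ("Failed",)],
)

# The kind of aggregate query an Excel report's Configuration tab holds:
# one summary row per status value.
rows = db.execute(
    "SELECT run_status, COUNT(*) FROM test_runs "
    "GROUP BY run_status ORDER BY run_status"
).fetchall()
print(dict(rows))  # {'Failed': 1, 'Passed': 2}
```

In a real report, each row of the result set becomes a row in the generated worksheet before the macro reformats it.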

Project Administration
Lesson Objective
After completing this lesson, you will be able to:

Add users to the project.

Customize list values.

Navigating to the Project Customization Page


You use the Project Customization page to customize the project.
To navigate to the Project Customization page:
1. Log on to Quality Center. The Quality Center home page appears.
2. From the menu bar, select Tools > Customize. The Project Customization
page appears.

Adding Users to the Project


To add a user to the project:
1. Navigate to the Project Customization page.
2. On the left sidebar click on the Project Users icon.
3. In the right pane, click on the Add User button arrow and select the Add
User from Site option.
4. Select a user you want to add from the Add User from Site dialog box and
click OK.

Assigning a User to a User Group


You can change the access privileges for an existing user at any time by changing
the user group to which the user is assigned.
To assign a user to a specific user group:
1. Navigate to the Project Customization page.
2. On the left sidebar click on the Project Users icon.
3. On the right pane, select a user you want to assign to a group.
4. Click on the Membership tab.
5. Select a group you want to add the user to from the Not Member of list and
move the group name to the Member of list by clicking the > button.

Note: You can only add users to the following groups: VM QE Lead, VM QE
Tester, and VM QE Viewer. If you grant any other privileges to a user, they will
be removed by the automated audit procedure.

Customizing Project Lists

You can create list items specific to your project if they are not regulated by a
standard process.
To add an item to a list:
1. Navigate to the Project Customization page.
2. On the left sidebar click on the Project Lists icon.
3. From the Lists list, select the list you want to create an additional item for.
4. Click New Item. The New Item dialog box appears.
5. In the New Item dialog box, in the New Item Name field, type a name for the
item.
6. Click OK to close the New Item dialog box.

Summary
In this lesson you learned to:

Add users to the project.

Customize list values.

Review Questions
1. To which groups can you assign users on the project?

2. Which lists can you add new entries to?
