
Guidelines for Core Key Performance Indicators

Interim Report on Primary Service Channels

September 2004

Executive Summary
The development of key performance indicators (KPIs) for the Government of
Canada (GoC) became a priority as Canada's Government Online (GOL) initiative
matured from 1998 through 2004. The rapid development of the Internet channel as a
means of providing effective public service delivery created an appetite for
revolutionary change in all types of service delivery. Prior to GOL, large-scale
improvements to service delivery were confined to specific government programs
and services, and interdepartmental projects were rare. The advent of the Internet and
the preference of Canadians for accessing government services online have created
cutting-edge opportunities for change in delivering services to Canadians.
In the past three years, dozens of interdepartmental initiatives have taken hold and
have helped to foster citizen-centred service delivery. As more and more business
improvement opportunities were conceived, it became clear that the Government of
Canada needed a common language for analytical decision making. Many
departments have made significant investments in performance management and
made progress towards the disciplined decision making characteristic of the world's
best corporations. Nevertheless, differences in terminology, definitions, usage, data
collection, and performance frameworks were quickly identified as limiting the ability
to monitor and affect enterprise-level performance.
The genesis of the Core KPI project came from the GoC's Telephony Service
Working Group, an interdepartmental collection of GoC call centre managers and
executives that came together to share best practices, establish consistent service
standards, and generally improve the capabilities of GoC call centre operations. In
2003, this working group quickly identified, and provided precise definitions of,
common KPIs.
Coincident with this achievement, the Treasury Board of Canada Secretariat developed
a modernized approach to the management of public sector organizations and
programs called the Management Accountability Framework (MAF). This
comprehensive set of tools, standards, and processes provided an over-arching
framework for the Core KPI project. The operational nature of KPIs strongly
supported the MAF and provided direct information to two of the primary MAF
categories: stewardship and citizen-focused service.
In 2003, as the GoC's Internet channel rapidly matured and the first significant
transactional capability came online, new interdepartmental working committees
were formed to deal with the complexities of multi-service, multi-channel delivery
alternatives. Internet gateways and clusters rapidly evolved, which helped organize
services in parallel with client segments and life events. This has created
opportunities to effect corresponding changes in how GoC services are delivered
in person and by mail. By 2004, there was a clear need to establish common core KPIs
and a working environment in which to further develop a common performance
language.
The Core KPI project brought together numerous government managers, experts in
delivering services to Canadians, visitors, and businesses. Managers with
operational responsibility for call and mail processing centres, Internet sites, and
in-person locations were engaged in several meetings to identify the KPIs that provide
maximum management value.
The result of these meetings was a small set of channel-specific core KPIs that
reflect specific MAF themes. These KPIs will be used to satisfy a variety of reporting
requirements, Treasury Board submissions, and ongoing reviews. Additional
operational KPIs were identified that are recommended by Treasury Board (but not
required) as effective indicators that provide strong operational benefits to service
delivery organizations.
The Core KPI project is not complete. There is an ongoing requirement for
implementation, improvement, and additions as the GoC service delivery strategy
evolves. Perhaps the most important and lasting benefit is the networking of the best
performance management people in the GoC. These experts continue to develop
new techniques and identify improvements to ensure that Canada remains one of the
world leaders in public sector service delivery, a position that clearly strengthens the
country's competitiveness in the twenty-first century.

Record of Changes

Version   Date              Summary of Changes
V 0.9     August 30, 2004   First draft for formal review
V 1.0     Sept. 30, 2004    Minor edits

Detailed Description of Changes from Previous Version

Acknowledgements

Project Authority: Victor Abele, Director, Service Strategy, CIOB, Treasury Board Secretariat, Canada

Project Analyst: Phillip Massolin, Analyst, Service Strategy, CIOB, Treasury Board Secretariat, Canada

Author: Dan Scharf, Equasion Business Technologies

Contributors: Daryl Sommers, Colin Smith, Reina Gribovsky, Dolores Lindsay, Daniel Tremblay, Kyle Toppazzini, Marg Ogden

Web Content: Morris Miller

Table of Contents

1.0 Introduction
2.0 Defining the Channels
3.0 Management Accountability Framework
4.0 Service Standards
5.0 Accountability and Key Performance Measures
6.0 Policy and Programs
7.0 Risk Management
8.0 Key Performance Indicators - Phone Channel
9.0 Key Performance Indicators - In-Person Channel
10.0 Key Performance Indicators - Internet Channel
11.0 Key Performance Indicators - Mail Channel
12.0 Using Service Delivery KPIs in Departmental Reporting
13.0 Contact Information
Appendix A: Terms and Definitions
Appendix B: References
Appendix C: Summary of Core Key Performance Indicators

Each channel chapter (8.0 through 11.0) groups its metrics under the MAF categories
Citizen-Focused Service (Access, Delay, Quality, Client Satisfaction), Stewardship
(Agent Utilization, Service Effectiveness, Use of Technology, Channel Take-up), and
People.


1.0 INTRODUCTION

Citizens are faced with a greater choice of channels than ever before to access
government services (in-person, phone, Internet, and mail), creating corresponding
challenges for organizations to manage service delivery across all channels.
Key Performance Indicators (KPIs) are increasingly used by the private and public
sectors to measure progress towards organizational goals using a defined set of
quantifiable measures. For the GoC, KPIs are becoming an essential part of
achieving Management Accountability Framework (MAF) compliance.
Once approved, the KPI framework will constitute a key element of departments'
annual monitoring. KPIs can be navigated in two ways:
- by channel, where each channel includes standard metrics (KPIs) for managing
  performance; or
- by Management Accountability Framework category.
A series of government-wide workshops identified the requirement for a consistent
approach to measuring service delivery performance across the GoC. Workshop
results can be used to create baseline frameworks for channel measurement and to
provide input into the process.


2.0 DEFINING THE CHANNELS
Phone: the preferred service channel for most Canadians. Primary modes of
interaction within this channel are:
- Interactive Voice Response (IVR), which provides self-service, routing, and
  broadcast services;
- Agent-based services, the most highly valued mode of service for citizens today.

Internet: most surveys indicate this is the preferred channel of the future.
Modes of interaction include:
- Self-service via online transactions, search, and website navigation strategies;
- E-mail, which provides delayed support, both automated and agent-authored;
- Online chat technologies, which provide real-time, agent-assisted Internet
  service delivery.

Mail: primary indicators suggest that this channel (the paper channel) is
decreasing in popularity. Surveys in the past few years indicate that citizens will
increase their use of more timely and interactive channels. Mail is sent using three
methods (analogous to the modes of interaction in the other channels):
- Fax: instant transmission via facsimile devices over phone or broadband
  networks;
- Courier: expedited delivery via priority parcel carriers (within 48 hours);
- Regular mail: via regular postal services (3 to 10 days).

In-Person: Canada's extensive network of local offices provides a significant
proportion of all government service delivery, primarily using queued and
appointment-based service models. Some in-person points of service also offer
assisted Internet and telephone services through kiosks, publicly available
computers, and public phones. There are four service modes for the In-Person
channel:
- Queued: a managed, multi-agent counter office, which often has a reception
  counter to triage visitors to the correct counter and answer simple questions;
- Scheduled: significant volumes of in-person services are provided on a
  pre-scheduled, one-on-one basis;
- Outreach: several service delivery organizations schedule seminars and
  training sessions in communities throughout Canada;
- Retail: some organizations provide storefront operations where visitors can
  browse publications and use computers for self-service. Service staff are
  available to help and can either approach the visitor directly on the floor or
  respond to visitors' questions at service counters.
Service channels and modes of interaction are affected by accessibility standards
that maximize the availability of channels to people with disabilities. For example,
the use of TTY technology within the Phone channel provides access for hearing-impaired
individuals. In the In-Person channel, ramps, lower counters, and powered doors
facilitate access for people who use wheelchairs. The Internet channel uses W3C
accessibility standards to ensure that government websites are accessible using
assistive technologies such as screen readers, font magnifiers, and speech
recognition.
Several GoC organizations are using multi-channel service strategies to achieve
higher service value with economical investments. For example, Service Canada
uses publicly available computers within its service outlets to provide self-service
and, as required, assisted Internet support to visitors. Several departments are
experimenting with dedicated, specially trained call-centre agents who support
Internet site visitors through toll-free direct assistance lines.
The KPIs listed in this document are not specifically intended to measure the
important service delivery issues of accessibility for people with disabilities or
integrated multi-channel implementation characteristics.


3.0 MANAGEMENT ACCOUNTABILITY FRAMEWORK

The Government of Canada has instituted a consistent management framework for
its programs and services. Comprehensive information on the Management
Accountability Framework (MAF) can be found at Treasury Board's website
(reference: http://www.tbs-sct.gc.ca/maf-crg/maf-crg_e.asp). The MAF provides deputy
heads and all public service managers with a list of management expectations that
reflect the different elements of current management responsibilities. Key
Performance Indicators for Service Delivery are grouped within the MAF categories:
Policy and Programs, People, Citizen-Focused Service, Risk Management,
Stewardship, and Accountability.

The majority of service delivery indicators relate to the operational nature of the
Stewardship category. Additional indicators measure progress towards objectives
under the Citizen-Focused Service and People categories. The Accountability
category provides checklists and processes for establishing effective service level
agreements. Specific assessment tools are used for the Policy and Programs and
Risk Management categories.


4.0 SERVICE STANDARDS
The Service Improvement Initiative defines specific guidelines for all departments
and agencies to establish and publish service standards for citizens, international
visitors, and businesses using GoC services and programs.
The Citizens First survey provides specific information on client service expectations
through a formal, comprehensive survey. Trends over the past five years indicate
shifts in these expectations that give government service delivery managers
effective direction for prioritizing service improvement initiatives.
The overarching goal is to achieve a 10% increase in client satisfaction by 2005.
Departments are required to set standards and measure progress towards this goal
using primary criteria such as:
- Timeliness: the time required to receive a service or product;
- Access: how accessible the service or ordering process is to the client;
- Outcome: whether the client received what was needed;
- Satisfaction: overall client satisfaction with the service/product request.
Common high-level service standards, illustrated in the sketch after this list, include:
- Average speed to answer: 5 minutes;
- Expected answer by e-mail: next business day;
- Queue time for in-person services: 15 minutes.
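As a minimal sketch of how measured performance could be checked against these
published standards (the metric names and measured values are illustrative
assumptions, not GoC data):

```python
# Published standard vs. measured value, in consistent units; all
# figures below are made up for illustration.
standards = {
    "avg_speed_to_answer_min": (5, 4.2),    # (standard, measured)
    "email_response_days":     (1, 1.4),
    "in_person_queue_min":     (15, 11.0),
}

for name, (standard, measured) in standards.items():
    status = "met" if measured <= standard else "MISSED"
    print(f"{name}: standard {standard}, measured {measured} -> {status}")
```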



5.0 ACCOUNTABILITY and Key Performance Measures

As specified in the MAF, service delivery channels must employ clear accountability
frameworks. Most frequently, departments and agencies use a Service Level
Agreement (SLA), both for internally resourced services and for third-party and
partnership teams. The following list presents the minimum set of components that
must be included in a Government of Canada SLA, which provides the foundation for
service delivery to citizens, visitors, and businesses.

Name: The name of the SLA, particularly useful when a single SLA is used across
multiple service offerings.

Service Description: The details of the service the government intends to provide
and the benefits the users can expect to receive.

Service Criticality Level: Normally identified in a Service Catalogue based on
already-defined metrics. The level of criticality should be based primarily on the
service users' requirements.

Service Channels: Identifies the channels through which the service is available
(e.g. telephone, mail, in-person, Internet) and the appropriate contact information
for each channel.

Primary Service Provider: The department or agency that is primarily responsible
for the service.

Service Partner Providers: Other partner departments that provide support to a
Primary Service Provider for a service (e.g. GTIS provides the Internet server to
Health Canada).

Pledge: The details of the quality of service a client can expect. This is frequently
time-based, e.g. "passports will be processed in X number of days."

Delivery Targets: Describe the key aspects of the service provided, such as access,
timeliness, and accuracy.

Dates: The effective start, end, and review dates of the agreement. A review date
must be identified so that performance measurements can be made and the SLA can
be adjusted, or action can be taken to improve performance against the SLA.

Scope: What is covered by the agreement and what is not.

Responsibilities: The obligations of the Service Provider, Partners, and the User.

Costs: Identifies a cost for the service (even when user fees are not required) to
ensure that users understand and form realistic expectations about services offered
by the federal government.

Complaint and Redress: Provides the service user with mechanisms to resolve
their concerns, for example when the SLA has not been met.

Service Hours: Service availability, e.g. 24x7. Service hours should provide
maximum cost-effective access for the service user. Public holidays must be
identified, as well as the hours for each channel.

Throughput: Describes the anticipated volumes and timing of activities within a
specific service (e.g. UI applications: Sep-May = 100,000; Jun-Aug = 50,000). This
is important so that performance issues caused by excessive throughput outside the
terms of the SLA can be identified.

Change Management: Identifies the policies surrounding any changes that will
affect the service provided to the user. For example, if UIC benefits are going to be
mailed out every second month instead of every month, how will the change be
managed to ensure that expectations are met?

Security and Privacy: Identifies interdepartmental policies on the sharing of user
information for various services. Organizations must comply with PIPEDA and
Treasury Board policies.

Service Reporting and Reviewing: Specifies the content, frequency, and
distribution of service reports and the frequency of service review meetings.

Performance Incentives/Penalties: Identifies any agreement regarding financial
incentives or penalties based on performance against service levels. Penalty clauses
can create a barrier to partnership if unfairly invoked on a technicality, and can also
make service providers and partners unwilling to admit mistakes for fear of the
penalties being imposed.
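To make the component list concrete, the following minimal sketch captures an SLA
as a data structure. All field names are illustrative assumptions drawn from the
list above, not a prescribed Government of Canada schema.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List, Optional

@dataclass
class ServiceLevelAgreement:
    # Field names are assumptions based on the component list above.
    name: str                           # Name of the SLA
    service_description: str            # Service details and user benefits
    criticality_level: str              # From the Service Catalogue
    channels: List[str]                 # e.g. ["phone", "mail", "in-person"]
    primary_provider: str               # Department primarily responsible
    partner_providers: List[str] = field(default_factory=list)
    pledge: str = ""                    # e.g. "Processed within X days"
    start_date: Optional[date] = None
    end_date: Optional[date] = None
    review_date: Optional[date] = None  # Needed so the SLA can be adjusted
    service_hours: str = ""             # e.g. "24x7", with holidays noted

# Hypothetical example record:
sla = ServiceLevelAgreement(
    name="Passport Processing SLA",
    service_description="Processing of passport applications",
    criticality_level="High",
    channels=["mail", "in-person"],
    primary_provider="Passport Office",
    pledge="Passports will be processed in X number of days",
    review_date=date(2005, 3, 31),
)
print(sla.name, "review due", sla.review_date)
```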



6.0 POLICY AND PROGRAMS

Policy and Programs in this MAF context refers to relevant lines of activity within
departments and agencies. Departmental reports are the primary reporting tool
used to document the overall policy effectiveness of specific programs. Readers
should consult the MAF, relevant Treasury Board policies, and the Program
Activity Architecture (PAA) for information and guidance in this category.
In the fall and winter of 2004-05, we will be consulting with the service community
with a view to developing a more comprehensive service policy. This work will allow
us to formalize the approach to KPIs, which is currently in draft form.

7.0 RISK MANAGEMENT

The Risk Management category of the MAF specifies a checklist for departmental
management to establish comprehensive and transparent identification of risks,
tolerances, mitigation strategies, and effective communication approaches. For
detailed information, readers should refer to the MAF.



8.0 KEY PERFORMANCE INDICATORS - Phone Channel

MAF CATEGORY: CITIZEN FOCUSED SERVICE

Metrics for Access

KPI: Call Access
Description: Percentage of calls presented that get into the ACD.
Objective: Measures overall service capacity from ACD to agent.
Definition: (Calls Answered + Calls Abandoned) divided by Calls Presented. Busy
signals generated by the switch divided by total calls received in the reporting period.
Derivation: ACD
Suggested benchmark / Range: 40% to 60%
Status: Proposed as a Core KPI

KPI: Caller Access
Description: Percentage of unique callers who attempt and successfully access
service.
Objective: Basic volume measure. Determines service level by counting
unserviced callers. Removes repeat callers from the accessibility measure.
Definition: Total unique phone numbers completed divided by total unique phone
numbers attempted.
Suggested benchmark / Range: 80% to 85%
Status: Proposed as a Core KPI

KPI: Abandoned Calls
Description: Percentage of calls abandoned while in queue due to prolonged delay
waiting for service, typically for a live agent.
Objective: Key measure of overall service level.
Definition: (Calls abandoned within the agent queue + IVR abandons before success
markers) divided by (total calls answered + total calls abandoned).
Derivation: ACD
Suggested benchmark / Range: 10% to 15%
Status: Proposed as a Core KPI
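A minimal sketch of the three access calculations from ACD counters; the counter
names and figures are illustrative assumptions, not a specific ACD vendor's fields.

```python
def call_access(answered: int, abandoned: int, presented: int) -> float:
    """Call Access: (calls answered + calls abandoned) / calls presented."""
    return (answered + abandoned) / presented

def caller_access(unique_completed: int, unique_attempted: int) -> float:
    """Caller Access: unique numbers completed / unique numbers attempted."""
    return unique_completed / unique_attempted

def abandonment(queue_abandons: int, ivr_abandons: int,
                answered: int, abandoned: int) -> float:
    """Abandoned Calls: (agent-queue abandons + IVR abandons before
    success markers) / (total answered + total abandoned)."""
    return (queue_abandons + ivr_abandons) / (answered + abandoned)

# Made-up reporting-period counts:
print(f"Call access: {call_access(9_000, 600, 10_000):.1%}")
print(f"Caller access: {caller_access(8_200, 9_800):.1%}")
print(f"Abandonment: {abandonment(500, 100, 9_000, 600):.1%}")
```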
Metrics for Delay

KPI: Average Speed to Answer (ASA)
Description: The average delay in queue before connecting to an agent, expressed
in seconds.
Objective: Primary indicator of caller satisfaction.
Definition: The total number of seconds from ACD queuing of the call to agent
acceptance, divided by total agent calls.
Derivation: Measured by ACD
Status: Proposed as a Core KPI

KPI: Service Level
Description: Percentage of calls that reach an agent or are abandoned within a
specified time threshold.
Objective: This measure is required in order to set and publish telephone service
standards.
Definition: (Calls answered within threshold + calls abandoned within threshold)
divided by (total calls answered + total calls abandoned).
Derivation: Measured by ACD
Status: Proposed, and required for phone service management
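A minimal sketch of ASA and service level from per-call queue times; the record
layout and the 120-second threshold are illustrative assumptions.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Call:
    queue_seconds: float  # ACD queuing to agent accept (or to abandon)
    answered: bool        # False means the caller abandoned in queue

def average_speed_to_answer(calls: List[Call]) -> float:
    """ASA: total queue seconds of answered calls / total agent calls."""
    answered = [c for c in calls if c.answered]
    return sum(c.queue_seconds for c in answered) / len(answered)

def service_level(calls: List[Call], threshold_s: float = 120) -> float:
    """Calls answered or abandoned within the threshold / all calls."""
    within = sum(1 for c in calls if c.queue_seconds <= threshold_s)
    return within / len(calls)

calls = [Call(35, True), Call(200, True), Call(90, False), Call(15, True)]
print(f"ASA: {average_speed_to_answer(calls):.0f} s")
print(f"Service level (120 s): {service_level(calls):.0%}")
```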
Metrics for Quality

KPI: Answer Accuracy
Description: Consistency of IVR and agent answers.
Objective: To ensure program integrity.
Definition: Local quality scorecard assessed by call monitoring and/or
mystery-shopper approaches.
Derivation: (# of calls answered in the IVR that terminated at success markers + # of
agent calls resulting in success status) multiplied by the accuracy evaluation ratio.
Status: Proposed as a Core KPI

KPI: Professionalism
Description: Encompasses a range of soft skills that govern the approach to
delivering accurate information and reliable services.
Objective: Identifies and reinforces effective communication.
Definition: Best measured through a mystery-shopper program that uses specific
planned calls placed to the call centre by a measurement organization. Can also be
measured by exit surveys performed immediately after call completion.
Status: Recommended as an operational measure



Metrics for Client Satisfaction

KPI: Client Satisfaction Level
Description: Application of the Common Measurement Tool (CMT) to assess and
benchmark client satisfaction.
Objective: Primary indicator of client perception of service quality; measured
repeatedly, trends provide strong evidence of service improvement levels.
Derivation: The multi-channel survey tool will be used (at a minimum) to determine
the core measures relevant to the telephone channel.
Status: Proposed as a Core KPI

KPI: Service Complaints
Description: Count and categorization of complaints received through all channels
concerning the Phone channel.
Objective: Primary indicator of service quality, particularly when measured over
time.
Derivation: Counted by an incident tracking system. Complaints received through
other channels must be added to the total.
Status: Recommended as a Core KPI, but not currently feasible as most GoC
organizations do not integrate service feedback information.
MAF CATEGORY: STEWARDSHIP

Metrics for Agent Utilization

KPI: Cost per Call
Description: The total operational cost of the call centre over the reporting period
divided by total calls handled during the reporting period.
Objective: Provides a high-level indication and trend of overall service performance.
Definition: Will require further working group consultation.
Status: Recommended as a Core KPI

KPI: Agent Capacity
Description: The anticipated number of hours of agent time available for telephone
service for each full-time equivalent (FTE).
Objective: Ensures that agent resources are dedicated to required functions.
Status: Recommended as an operational measure

KPI: Resource Allocation
Description: A management indicator assessing FTEs allocated to service delivery.
Objective: Measures effective use of channel resources.
Definition: Locally defined
Status: Recommended as an operational measure

KPI: Agent Adherence
Description: An assessment of telephone agents' adherence to schedule and
availability during anticipated service periods.
Objective: Contributes to resourcing effectiveness.
Definition: Calculated as total agent login time divided by scheduled work time.
Status: Recommended as an operational measure

KPI: Agent Occupancy
Description: The percentage of agent time spent in direct service, including talk and
wrap-up time.
Objective: Ensures accurate resourcing levels to achieve target service levels.
Definition: (Talk time + after-call wrap-up time) divided by total agent login time
over the measured period.
Suggested benchmark / Range: 85%
Status: Recommended as an operational measure
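A minimal sketch of the adherence and occupancy arithmetic; the per-agent hour
totals are illustrative.

```python
def agent_adherence(login_hours: float, scheduled_hours: float) -> float:
    """Adherence: total agent login time / scheduled work time."""
    return login_hours / scheduled_hours

def agent_occupancy(talk_hours: float, wrap_up_hours: float,
                    login_hours: float) -> float:
    """Occupancy: (talk time + after-call wrap-up) / total login time."""
    return (talk_hours + wrap_up_hours) / login_hours

# One agent's made-up week: 37.5 h scheduled, 36 h logged in,
# 27 h talking and 4.5 h in after-call wrap-up.
print(f"Adherence: {agent_adherence(36.0, 37.5):.1%}")       # 96.0%
print(f"Occupancy: {agent_occupancy(27.0, 4.5, 36.0):.1%}")  # 87.5%
```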

Metrics for Service Effectiveness

KPI: First Call Resolution
Description: The degree to which client needs are met without further referral or
call-back within a designated time interval.
Objective: Minimize cost and maximize client satisfaction.
Definition: Number of single calls by unique phone number within a 48-hour period,
not abandoned.
Status: Recommended as a Core KPI
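A minimal sketch of a first-call-resolution count over a 48-hour call log, assuming
each record is a (phone number, abandoned) pair; this is a simplification of the
definition above.

```python
from collections import Counter

# (phone_number, abandoned) pairs from a 48-hour log; illustrative data.
calls = [
    ("613-555-0101", False),
    ("613-555-0101", False),  # repeat call within the window: not FCR
    ("613-555-0199", False),  # single completed call: counts as FCR
    ("613-555-0150", True),   # abandoned: excluded
]

def first_call_resolution(calls) -> int:
    """Single, non-abandoned calls by unique phone number in the window."""
    counts = Counter(number for number, _ in calls)
    abandoned = {number for number, was_abandoned in calls if was_abandoned}
    return sum(1 for number, n in counts.items()
               if n == 1 and number not in abandoned)

print(first_call_resolution(calls))  # 1 (only 613-555-0199)
```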
KPI: Accurate Referral
Description: A redirect to the correct service for resolution of the client's need (may
be to a higher service tier or to a separate organization/jurisdiction providing the
service).
Objective: Measures the key caller criterion of more than two transfers.
Definition: Will require further working group participation.
Status: Not recommended. Not technically feasible at this time.
Metrics for Use of Technology

KPI: Call Avoidance
Description: A call that quickly exits the system after an introductory message or
bulletin that provides a desired answer for a substantial portion of the calling
population, e.g. related to an immediate but temporary service outage.
Objective: Measures the utility of IVR bulletins to answer high-volume inquiries.
Definition: Calls terminated at a specific IVR marker after the bulletin.
Status: Proposed as a Core KPI

KPI: Calls Answered by IVR Successfully
Description: A call that terminates in the IVR tree after a success marker.
Objective: Measures the utility of the IVR response tree to provide self-service
answers; an important indicator of IVR utility and a secondary indicator of client
satisfaction.
Definition: Calls terminated at all IVR success markers.
Status: Proposed as a Core KPI

Metrics for Channel Take-up

KPI: Calls
Description: Total calls received.
Objective: Measures overall service demand.
Definition: Number of calls received at the switch. Note that this will include repeat
callers who are refused at the switch.
Status: Proposed as a Core KPI

KPI: Callers
Description: Unique callers.
Objective: Measures service demand more accurately.
Definition: Unique phone numbers dialing the service.
Status: Proposed as a Core KPI


MAF CATEGORY: PEOPLE

At publishing time, KPIs for the MAF People category had not yet been proposed to
the working group for review. Some examples of KPIs that might be suitable for this
MAF category include:
Total Months Staff on Strength, Average Months on Strength per Agent: A measure
of the total experience level of the agents within the call centre. Monitoring this over
time provides a measure of the impact of staff turnover.
Staff Turnover Ratio: A measure of the churn rate within the agent team. Provides
a secondary indicator of call centre health and often correlates to overall customer
satisfaction levels.
Agent Coaching Ratio: Number of hours of one-on-one coaching time per agent.
Helps measure the utilization of call centre supervisor time as well as the investment
in agent skill improvement.
Training Days per Agent: Total training days delivered during the measurement
period divided by the number of agents. Training is required for program/service
delivery, for technology, and for the development of skills related to professionalism
and customer interaction.
Further discussion with departments and agencies will be conducted to identify
effective KPIs under this MAF category.


9.0 KEY PERFORMANCE INDICATORS - In-Person Channel

MAF CATEGORY: CITIZEN FOCUSED SERVICE
Metrics for Access

KPI: Visitor Access
Description: Count of visitors who either (a) are serviced at agent stations or
(b) obtain self-service through in-location computers, OR count of visitors entering
the facility, depending on the service model and facility.
Objective: Basic volume measure.
Definition: Total visitors entering the facility over the measurement period.
Suggested benchmark / Range: TBD
Status: Proposed as a Core KPI. Tracked by all operations.

KPI: Visitors Serviced
Description: Ratio of visitors receiving agent service to total visitors. Provides an
indication of the utilization of self-service capabilities and overall operational
capacity.
Definition: Total agent-visitor services divided by total visits.
Status: Recommended as an operational measure.
Metrics for Delay

KPI: Average Wait Time (AWT)
Description: The average delay from time of entering the facility to introduction at
an agent station.
Objective: Primary indicator of visitor satisfaction.
Definition: The total number of minutes from pulling of the service ticket to service,
divided by total visitors served.
Derivation: Measured by the service management system.
Status: Recommended as an operational KPI. Measured by all queued service
operations. Not trackable within the retail service model.

KPI: Service Level
Description: Percentage of visitors who reach an agent within the target wait time.
Objective: This measure is required in order to set and publish in-person service
standards.
Definition: Visitors served within the threshold divided by total visitors serviced.
Derivation: Measured by the service ticketing system.
Status: Recommended as an operational KPI. Measured by all queued service
models. Not relevant to the retail service model.



Metrics for Quality

KPI: Answer Accuracy
Description: Reliability of agent answers.
Objective: To ensure program integrity.
Definition: Local quality scorecard assessed by supervisor monitoring and/or
mystery-shopper approaches and/or exit surveys.
Derivation: # of visitors answered successfully by agents.
Status: Under review. May be impractical in several service models.

KPI: Professionalism
Description: Encompasses a range of soft skills that govern the approach to
delivering accurate information and reliable services.
Objective: Identifies and reinforces effective communication.
Definition: Best measured through a mystery-shopper program that uses specific
planned visits to the service centre by a measurement organization. Can also be
measured by exit surveys, either conducted by staff or at self-service computers.
Status: Under review. May be impractical in several service models.

KPI: Transaction Duration Variability
Description: For operations providing specific transaction services, analysis of the
variance of transaction duration correlates strongly to application accuracy.
Objective: Assess process consistency across agents.
Status: Proposed to the working team. Applicable only to some operations. Possible
as a recommended operational KPI.
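A minimal sketch of the duration-variability idea, comparing the spread of
transaction times across agents; the durations are illustrative.

```python
import statistics

# Transaction durations in minutes per agent; made-up figures.
durations = {
    "agent_a": [8.0, 8.5, 7.5, 8.2],
    "agent_b": [5.0, 12.0, 9.5, 16.0],  # high spread may flag inconsistency
}

for agent, minutes in durations.items():
    mean = statistics.mean(minutes)
    spread = statistics.stdev(minutes)
    print(f"{agent}: mean {mean:.1f} min, stdev {spread:.1f} min, "
          f"cv {spread / mean:.2f}")
```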
KPI: Critical Error Rate
Description: Some operations monitor application/transaction errors (typically
omission of required information) requiring additional interactions with clients.
Objective: Assessment of pre-visit instructions to clients and/or reception desk
triage procedures.
Status: Proposed as an operational measure. Applicability to be reviewed.



Metrics for Client Satisfaction

KPI: Client Satisfaction Level
Description: Application of the Common Measurement Tool (CMT) to assess and
benchmark client satisfaction.
Objective: Primary indicator of client perception of service quality; measured
repeatedly, trends provide strong evidence of service improvement levels.
Derivation: The multi-channel survey tool will be used (at a minimum) to establish
the client satisfaction level for in-person services.
Status: Recommended as a Core KPI.

KPI: Service Complaints
Description: Count and categorization of complaints received through all channels
concerning the In-Person channel.
Objective: Primary indicator of service quality, particularly when measured over
time.
Definition: Total service complaints received during the reporting period divided by
total number of visits.
Derivation: Counted by an incident tracking system. Complaints received through
other channels must be added to the total.
Status: Recommended as a Core KPI. Caveat: members noted that current systems
do not support the collection and categorization of service complaints received
through a wide variety of channels (e.g. Ministers' correspondence, general e-mails,
complaints at the end of a successful service phone call).
MAF CATEGORY: STEWARDSHIP

Metrics for Agent Utilization

KPI: Cost per Contact
Description: Total labour costs divided by total service requests.
Objective: Provides a snapshot of current operational efficiency, specifically related
to agent resources.
Definition: TBD
Status: Recommended as a Core KPI. Definition of labour cost to be determined.

KPI: Agent Capacity
Description: The anticipated number of hours of agent time available for counter
service for each agent.
Objective: Ensures that agent resources are dedicated to required service functions.
Status: Recommended as an operational KPI for queued service models.

KPI: Resource Allocation
Description: A management indicator assessing agent positions allocated to service
delivery.
Objective: Measures effective use of channel resources.
Definition: Locally defined
Status: Recommended as an operational KPI for queued service models.

KPI: Agent Adherence
Description: An assessment of service agents' adherence to schedule and
availability during anticipated service periods.
Objective: Contributes to resourcing effectiveness.
Definition: Calculated as total agent login time divided by scheduled work time.
Status: Recommended as an operational KPI for queued service models.

KPI: Agent Occupancy
Description: The percentage of agent time spent in direct service, including talk and
wrap-up time.
Objective: Ensures accurate resourcing levels to achieve target service levels.
Definition: (Talk time + after-visit wrap-up time) divided by total agent login time
over the measured period.
Suggested benchmark / Range: TBD
Status: Recommended as an operational KPI for queued service models.
Metrics for Service Effectiveness

The working group is asked to contribute suggestions for KPIs in this theme.

KPI: Turn-Around Time
Description: The average time to transaction completion (i.e. receipt by the client),
expressed as a percentage of the target time.
Objective: Measures response time to the client, a primary indicator of customer
satisfaction.
Definition: TBD
Status: Under review.



Metrics for Use of Technology

KPI: Self-Service Ratio
Description: Visitors to the service office who access self-service computers.
Objective: Measures the utility of computer facilities within the service office.
Definition: Count of computer accesses divided by total visitors during the
measurement period.
Status: Proposed as a Core KPI.

Metrics for Channel Take-up

KPI: Visitors
Description: Total visitors entering the office.
Objective: Measures overall service demand.
Definition: See the Visitor Access measure.
Status: Proposed as a Core KPI.
MAF CATEGORY: PEOPLE

Total Months Staff on Strength, Average Months on Strength per Agent: A measure
of the total experience level of the agents/staff within the service centre. Monitoring
this over time provides a measure of the impact of staff turnover.
Staff Turnover Ratio: A measure of the churn rate within the agent team. Provides
a secondary indicator of service centre health and often correlates to overall
customer satisfaction levels.
Agent Coaching Ratio: Number of hours of one-on-one coaching time per agent.
Helps measure the utilization of service centre supervisor time as well as the
investment in agent skill improvement.
Training Days per Agent: Total training days delivered during the measurement
period divided by the number of agents. Training is required for program/service
delivery, for technology, and for the development of skills related to professionalism
and customer interaction.
Further discussion with departments and agencies will be conducted to identify
effective KPIs under this MAF category.



10.0 KEY PERFORMANCE INDICATORS - Internet Channel

The Canadian Gateways team has published a definitive report on Internet
measurement identifying the suitability and meaning of specific web measures (for
example, hits versus visits). Readers are asked to review this document (see
Appendix B).

MAF CATEGORY: CITIZEN FOCUSED SERVICE

Metrics for Access

In the Internet channel, the access theme includes measures concerning the
availability of the site to potential site visitors. There are two primary components to
site availability:
a) How easily can site visitors locate the site through search engines, links from
other sites, or publication of the URL through other channels such as phone and mail?
b) Is the site available to site visitors once it has been located?
Other qualitative characteristics contributing to access include compliance with W3C
accessibility standards to ensure the site is fully inclusive and available to persons
with disabilities.

KPI: Search Engine Ranking
Description: Relevance ranking weighted by the distribution of site visitors who
entered the site through commercial search engines. The metric assumes that a high
search engine rank provides maximum accessibility to those visitors who access the
site via search.
Objective: Measures overall site access through search engines.
Definition: Sum of (relevance ranking multiplied by search engine referring count)
divided by total search engine referrals.
Derivation: Relevance rank from the top five referring search engines, using a
representative sample of visitor search terms.
Suggested benchmark / Range: TBD
Status: Proposed as a Core KPI
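A minimal sketch of the weighted-ranking formula, with made-up relevance ranks
and referral counts per search engine.

```python
# (relevance_rank, referred_visits) per engine; illustrative figures.
engines = {
    "engine_a": (1, 5_000),  # ranked first, 5,000 referred visits
    "engine_b": (3, 2_000),
    "engine_c": (8, 500),
}

def weighted_search_rank(engines: dict) -> float:
    """Sum of (relevance rank x referring count) / total referrals."""
    total_referrals = sum(refs for _, refs in engines.values())
    weighted = sum(rank * refs for rank, refs in engines.values())
    return weighted / total_referrals

print(f"Weighted rank: {weighted_search_rank(engines):.2f}")  # 2.00
```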
KPI: Direct Access Ratio
Description: Percentage of visits that access the site directly via the same or a
known URL, to total visits. This metric assumes that visits accessing the site directly
are either typing or pasting a URL from another source (e.g. a brochure) or have
bookmarked the site as a result of repeated visits.
Objective: Assessment of site memorability through a known URL or bookmarking.
Definition: Visits arriving at any page in the site that do not have a referring URL
associated with the visit.
Derivation: Web traffic statistics counting visits arriving at the site without a
referring URL.
Status: Proposed as a Core KPI
KPI: Server Availability Percentage
Description: Total available server hours over total planned server hours during the
reporting period.
Objective: Indicative of overall Internet service capacity.
Definition: Sum of total available server hours, less scheduled maintenance hours,
divided by total planned server hours.
Derivation: Server/operating system logs
Status: Proposed as a Core KPI
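A minimal sketch of the availability arithmetic for one reporting month; the hour
figures are illustrative.

```python
def server_availability(available_h: float, maintenance_h: float,
                        planned_h: float) -> float:
    """(Available server hours - scheduled maintenance) / planned hours."""
    return (available_h - maintenance_h) / planned_h

# A 30-day month: 720 planned hours, 6 hours of scheduled maintenance,
# 4 hours of unplanned outage (so 716 hours actually available).
print(f"Availability: {server_availability(716, 6, 720):.2%}")
```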
KPI: Referral Percentage
Description: Percentage of total visits arriving at the site from planned referral sites.
This KPI can be further broken down by specific sites, e.g. GoC Gateways, other
GoC sites, other jurisdictions, etc.
Objective: Measures another access route to the site and can be used to adjust
access strategies.
Definition: Total visits arriving from specified websites divided by total visits.
Status: Proposed as a Core KPI
KPI: Conversion Rate
Description: Rate at which visitors initiate transactions and reach the submit page.
Objective: Key measure of overall service level and visitor satisfaction.
Definition: Total visits reaching submit pages divided by total visits viewing
transaction start pages.
Derivation: Web monitoring package
Suggested benchmark / Range: TBD
Status: Proposed as a Core KPI

KPI: Abandonment Rate
Description: Rate at which visitors initiate transactions but do not reach the submit
page, plus visitors exiting the site from non-content pages.
Objective: Key measure of overall service level.
Definition: Visits with unsatisfactory exit pages divided by total visits.
Derivation: Web traffic statistics
Suggested benchmark / Range: TBD
Status: Proposed as an operational measure
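A minimal sketch of the conversion and abandonment arithmetic from page-view
tallies; the counts are illustrative.

```python
def conversion_rate(submit_visits: int, start_visits: int) -> float:
    """Visits reaching submit pages / visits viewing start pages."""
    return submit_visits / start_visits

def abandonment_rate(bad_exit_visits: int, total_visits: int) -> float:
    """Visits with unsatisfactory exit pages / total visits."""
    return bad_exit_visits / total_visits

print(f"Conversion: {conversion_rate(1_800, 2_400):.0%}")   # 75%
print(f"Abandonment: {abandonment_rate(900, 12_000):.1%}")  # 7.5%
```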



Metrics for Delay

KPI: Average Visit Duration
Description: The average duration of a visit. This metric can provide some
indication of visitor need; however, as more and more transactions are put online,
statistics for visit duration may need to be separated according to type of visit (e.g.
transactional, browse, search).
Objective: Assessment of site stickiness, the overall relevance of site content and
transactions to visitors' requirements.
Definition: Total elapsed seconds from site entry at any page to site exit, for all
visits, divided by the number of visits.
Derivation: Measured by web traffic software.
Status: Recommended as an operational measure.
Metrics for Quality

KPI: Site Error Messages
Description: Capture of all computer-identified error conditions, such as "page not
found" messages, invalid links, transaction aborts, etc.
Objective: Improves overall site quality and response.
Definition: Total error page views divided by total visits.
Derivation: Web activity tracking
Status: Recommended as a Core KPI

KPI: Internet Channel Feedback
Description: Total criticisms, complaints, and compliments, categorized into
effective topics, received through all sources. The working group recognizes that
this is difficult to track today; however, it is recognized as high value.
Objective: Contributes to program integrity.
Definition: Count of complaints by topic over the reporting period.
Derivation: E-mail, phone incident tracking system, ministerial correspondence
system
Status: Recommended as an operational measure, but not currently used within
most departments.



KPI: Professionalism
Description: Encompasses a range of soft skills that govern the approach to
delivering accurate information and reliable services.
Objective: Identifies and reinforces effective web design and web authoring skills.
Definition: Best measured through the use of focus groups and independent testing
organizations. Some input may be available from Media Metrics. The quality of
e-mail responses, where implemented, can be verified by an E-mail Response
Management System (# of QA corrections, etc.).
Status: Proposed as a Core KPI but must be further developed by the working group.
Metrics for Client Satisfaction

KPI: Client Satisfaction Level
Description: Application of the Common Measurement Tool (CMT) to assess and
benchmark client satisfaction.
Objective: Primary indicator of client perception of service quality; measured
repeatedly, trends provide strong evidence of service improvement levels.
Derivation: The multi-channel survey tool will be used (at a minimum) to determine
the core measures relevant to the Internet channel. As well, periodic exit surveys
should be conducted upon site exit. Timeliness of e-mail response can also be
measured.
Status: Proposed as a Core KPI

MAF CATEGORY: STEWARDSHIP

KPI: Cost per Visit, Cost per Visitor
Description: The total operational cost of the site over the reporting period divided
by total visits/visitors handled during the reporting period.
Objective: Provides a high-level indication and trend of overall service performance.
Definition: Will require further working group consultation.
Status: Recommended as a Core KPI

Metrics for Agent Utilization

The following four measures can be tracked for agent-assisted calls concerning the
Internet channel and for all messages/e-mails submitted through the Internet site. All
are recommended as operational measures.
KPI: Agent Capacity
Description: The anticipated number of hours of agent time available for service for
each full-time equivalent (FTE).
Objective: Ensures that agent resources are dedicated to required functions.

KPI: Resource Allocation
Description: A management indicator assessing FTEs allocated to service delivery.
Objective: Measures effective use of channel resources.
Definition: Locally defined

KPI: Agent Adherence
Description: An assessment of agents' adherence to schedule and availability
during anticipated service periods.
Objective: Contributes to resourcing effectiveness.
Definition: Calculated as total agent login time divided by scheduled work time.

KPI: Agent Occupancy
Description: The percentage of agent time spent in direct service, including talk and
wrap-up time.
Objective: Ensures accurate resourcing levels to achieve target service levels.
Definition: (Talk time + after-call wrap-up time) divided by total agent login time
over the measured period.
Suggested benchmark / Range: 85%
Metrics for Service Effectiveness

KPI: First Visit Resolution
Description: Unique visitors over an x-day period who exited the site from success
content pages.
Objective: Minimize cost and maximize client satisfaction.
Definition: Number of single unique visits within an x-day period that exited the site
from specific success (i.e. answer found) pages.



Metrics for Use of Technology

As the Internet channel is itself used to provide self-service through technology, this
theme is not applicable within the channel.

Metrics for Channel Take-up

Web channel take-up data is used in comparison with other channels to determine
the impact of website changes.

KPI: Visits
Description: Total site visits accepted.
Objective: Measures overall service demand.
Definition: Number of visit sessions initiated by web servers.
Status: Proposed as a Core KPI

KPI: Visitors
Description: Unique visitors.
Objective: Measures service demand accurately.
Definition: Unique visitors counted either through registration/login processes or via
cookies.
Status: Proposed as a Core KPI

MAF CATEGORY: PEOPLE

At publishing time, KPIs for the MAF People category had not yet been proposed to
the working group for review. Some examples of KPIs that might be suitable for this
MAF category include:
Total Months Staff on Strength, Average Months on Strength per Agent: A measure
of the total experience level of the agents within the call centre. Monitoring this over
time provides a measure of the impact of staff turnover.
Staff Turnover Ratio: A measure of the churn rate within the agent team. Provides
a secondary indicator of call centre health and often correlates to overall customer
satisfaction levels.
Agent Coaching Ratio: Number of hours of one-on-one coaching time per agent.
Helps measure the utilization of call centre supervisor time as well as the investment
in agent skill improvement.
Training Days per Agent: Total training days delivered during the measurement
period divided by the number of agents. Training is required for program/service
delivery, for technology, and for the development of skills related to professionalism
and customer interaction.
Further discussion with departments and agencies will be conducted to identify
effective KPIs under this MAF category.


11.0 KEY PERFORMANCE INDICATORS - Mail Channel

MAF CATEGORY: CITIZEN FOCUSED SERVICE
Metrics for Access

KPI: Applications/Pieces Opened
Description: Count of new envelopes opened during the reporting period.
Objective: Basic volume measure.
Definition: Total envelopes opened, less inappropriate mail (junk mail, wrongly
addressed, etc.).
Suggested benchmark / Range: TBD
Status: Proposed as a Core KPI.

KPI: Applications Completed
Description: Outbound mail for completed files.
Definition: TBD
Status: Proposed as a Core KPI.

KPI: Applications/Mail in Process
Description: All files remaining open at the end of the reporting period. Represents
the work in progress within the processing centre.
Definition: Previous open files + applications received, less applications completed.
Status: Proposed as a Core KPI.
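A minimal sketch of the work-in-progress arithmetic over a reporting period; the
figures are illustrative.

```python
def mail_in_process(open_at_start: int, received: int, completed: int) -> int:
    """Previous open files + applications received, less completed."""
    return open_at_start + received - completed

# Example month: 1,200 files carried in, 5,000 received, 4,800 completed.
print(mail_in_process(1_200, 5_000, 4_800))  # 1400 files still open
```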
Metrics for Delay

KPI: Average Cycle Time (ACT)
Description: The average elapsed time that the application/mail was held within the
processing centre prior to completion.
Objective: Primary indicator of client satisfaction.
Definition: The total number of minutes from opening of the envelope to mailing of
the response, divided by total applications completed.
Derivation: Measured by the mail tracking system.
Status: Recommended as a Core KPI.

KPI: Pass-Through Ratio
Description: Ratio of total handling time to total cycle time.
Objective: Primary indicator of workflow efficiency.
Definition: Total minutes of processing time (time in agent) divided by total elapsed
time. The ratio should approach 1.0 to indicate zero delay between processes.
Derivation: Measured by the mail tracking system.
Status: Recommended as a Core KPI.
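A minimal sketch of average cycle time and pass-through for a batch of completed
files; the minute figures are illustrative.

```python
# (handling_minutes, elapsed_minutes) per completed file; made-up data.
files = [(30, 2_880), (45, 1_440), (25, 4_320)]

total_handling = sum(handling for handling, _ in files)
total_elapsed = sum(elapsed for _, elapsed in files)

avg_cycle_time = total_elapsed / len(files)    # Average Cycle Time (ACT)
pass_through = total_handling / total_elapsed  # nears 1.0 with no idle time

print(f"ACT: {avg_cycle_time:.0f} min, pass-through: {pass_through:.3f}")
```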
KPI: Service Level
Description: Percentage of mail completed within the target processing time.
Objective: This measure is required in order to set and publish mail service
standards.
Definition: Applications completed within the service threshold divided by total
applications completed.
Derivation: Measured by the mail tracking system.
Status: Recommended as an operational KPI.
Metrics for Quality

KPI: Response Accuracy
Description: Reliability of mail response/completion.
Objective: To ensure program integrity.
Definition: Local quality scorecard assessed by quality assurance review of
outbound mail, plus write-backs (one or more subsequent mail receipts for the same
application).
Derivation: QA report.
Status: Under review.

KPI: Professionalism
Description: Encompasses a range of soft skills that govern the approach to
delivering accurate information and reliable services.
Objective: Identifies and reinforces effective communication.
Definition: Best measured through the use of an enclosed feedback postcard or
through alternate-channel surveys (e.g. a post-response phone call).
Status: Under review. May be impractical in several service models.

KPI: Critical Error Rate
Description: Some operations monitor application/transaction errors (typically
omission of required information) requiring additional interactions with clients.
Objective: Assessment of application instructions to clients.
Status: Proposed as an operational measure. Applicability to be reviewed.



Metrics for Client Satisfaction

KPI: Client Satisfaction Level
Description: Application of the Common Measurement Tool (CMT) to assess and
benchmark client satisfaction.
Objective: Primary indicator of client perception of service quality; measured
repeatedly, trends provide strong evidence of service improvement levels.
Derivation: The multi-channel survey tool will be used (at a minimum) to establish
the client satisfaction level for mail services.
Status: Recommended as a Core KPI.

KPI: Service Complaints
Description: Count and categorization of complaints received through all channels
concerning the Mail channel.
Objective: Primary indicator of service quality, particularly when measured over
time.
Definition: Total service complaints received during the reporting period divided by
total mail received.
Derivation: Counted by an incident tracking system. Complaints received through
other channels must be added to the total.
Status: Recommended as a Core KPI. Caveat: members noted that current systems
do not support the collection and categorization of service complaints received
through a wide variety of channels (e.g. Ministers' correspondence, general e-mails,
phone calls).
MAF CATEGORY: STEWARDSHIP

Metrics for Agent Utilization

KPI: Cost per Contact
Description: Total labour costs divided by total service requests.
Objective: Provides a snapshot of current operational efficiency, specifically related
to agent resources.
Definition: TBD
Status: Recommended as a Core KPI. Definition of labour cost to be determined.

KPI: Agent Capacity
Description: The anticipated number of hours of agent time available for mail
service for each agent.
Objective: Ensures that agent resources are dedicated to required service functions.
Status: Recommended as an operational KPI for mail processing service models.

KPI: Resource Allocation
Description: A management indicator assessing agent positions allocated to service
delivery.
Objective: Measures effective use of channel resources.
Definition: Locally defined
Status: Recommended as an operational KPI for mail processing service models.

KPI: Agent Adherence
Description: An assessment of service agents' adherence to schedule and
availability during anticipated service periods.
Objective: Contributes to resourcing effectiveness.
Definition: Calculated as total agent login time divided by scheduled work time.
Status: Recommended as an operational KPI for mail processing service models.

KPI: Agent Occupancy
Description: The percentage of agent time spent in direct mail service, including
wrap-up time.
Objective: Ensures accurate resourcing levels to achieve target service levels.
Definition: (Response time + wrap-up time) divided by total agent login time over
the measured period.
Suggested benchmark / Range: TBD
Status: Recommended as an operational KPI for mail service models.
Metrics for Service Effectiveness

No KPIs had been proposed for this theme at publishing time.

Metrics for Use of Technology

KPI: Automated Response Ratio
Description: Ratio of applications received and completed without agent handling to
total applications received.
Objective: TBD
Definition: TBD
Status: Proposed as an operational KPI.



Metrics for Channel Take-up

KPI: Applications Received
Description: Total applications/mail entering the processing centre.
Objective: Measures overall service demand.
Definition: See the Applications/Pieces Opened measure under Access.
Status: Proposed as a Core KPI.
MAF CATEGORY: PEOPLE

Total Months Staff on Strength, Average Months on Strength per Agent: A measure
of the total experience level of the agents/staff within the service centre. Monitoring
this over time provides a measure of the impact of staff turnover.
Staff Turnover Ratio: A measure of the churn rate within the agent team. Provides
a secondary indicator of service centre health and often correlates to overall
customer satisfaction levels.
Agent Coaching Ratio: Number of hours of one-on-one coaching time per agent.
Helps measure the utilization of service centre supervisor time as well as the
investment in agent skill improvement.
Training Days per Agent: Total training days delivered during the measurement
period divided by the number of agents. Training is required for program/service
delivery, for technology, and for the development of skills related to professionalism
and customer interaction.
Further discussion with departments and agencies will be conducted to identify
effective KPIs under this MAF category.



12.0 USING SERVICE DELIVERY KPIs IN DEPARTMENTAL REPORTING

The MAF provides the primary framework for departments to prepare the required
annual performance reports that provide formal feedback to Deputy Ministers. The
Treasury Board of Canada Secretariat is developing a Web-based approach to
support and streamline departmental performance reporting.
Service delivery key performance indicators will become an important component of
these departmental reports and will contribute significantly to a common
understanding of overall service channel performance across government.


13.0 CONTACT INFORMATION

Further information, suggestions, and contributions can be forwarded to:

Service Delivery Improvement
Treasury Board of Canada, Secretariat
2745 Iris Street
Ottawa, Ontario K1A 0R5

Victor Abele (abele.victor@tbs-sct.gc.ca)
Director, Service Delivery Improvement
Telephone: (613) 946-6264
Fax: (613) 952-7232

Shalini Sahni (Sahni.Shalini@tbs-sct.gc.ca)
Analyst
Telephone: (613) 948-1119
Fax: (613) 952-7232


APPENDIX A: Terms and Definitions

ACD (Automatic Call Distributor): a software/hardware device that manages call
queues, delivers IVR recordings as selected by the caller, and routes calls from the
queue to appropriate agents based on any number of caller parameters.
CTI (Computer Telephony Integration): technology that provides an integrated
phone/computer capability to the service agent. CTI provides features such as
automatic caller file retrieval, soft phone, referral/call-back electronic forms with
response script suggestions, caller wait time, and quick access to the mainframe and
online reference material.
Channel: the primary service channels are telephone, Internet, mail, and in-person.
IVR/VR (Interactive Voice Response/Voice Recognition): two related terms
describing two types of self-service technology employed in the telephone service
channel. Interactive Voice Response provides the caller with a series of options to
be selected using the telephone keypad. Voice Recognition allows the caller to
speak the question or say an option from a recorded list.
KPI (Key Performance Indicator): a measurable objective that provides a clear
indication of service centre capability, quality, customer satisfaction, etc.


APPENDIX B: References

Citizens First 3 Report, 2003. Erin Research Inc., Institute for Citizen-Centred
Service, Institute of Public Administration of Canada.
Common Web Traffic Metrics Standards, Version 1.1, March 21, 2003. Treasury
Board Secretariat, Canada.
Key Performance Indicators Workshop, 2003. Service Delivery Improvement,
Treasury Board Secretariat, Canada.
Performance Management Metrics for DWP Contact Centres, Version 2.0, March 14,
2003. Ivackovic and Costa. Department for Work and Pensions, United Kingdom.
Performance Measures for Federal Agency Websites: Final Report, October 1, 2000.
McClure, Eppes, Sprehe and Eschenfelder. Joint report for the Defense Technical
Information Center, Energy Information Administration, and Government Printing
Office, U.S.A.
Service Improvement Initiative How-to Guide, 2000. Treasury Board Secretariat,
Canada.
Service Management Framework Report, 2004. Fiona Seward. Treasury Board
Secretariat, Canada, and Burntsands Consulting.
Summary Report on Service Standards, 2001. Consulting and Audit Canada
(Project 550-0743).



APPENDIX C: Summary of Core Key Performance Indicators

These core indicators were vetted by the working group and are recommended for
inclusion in the MAF.

Phone:
- Call Access
- Caller Access
- Abandoned Calls
- Average Speed to Answer
- Answer Accuracy
- Client Satisfaction Level
- Cost per Call
- First Call Resolution
- Call Avoidance
- Calls Answered by IVR Successfully
- Calls
- Callers

In-Person:
- Visitor Access
- Client Satisfaction Level
- Service Complaints
- Cost per Contact
- Visitors

Internet:
- Search Engine Ranking
- Direct Access Ratio
- Server Availability Percentage
- Referral Percentage
- Conversion Rate
- Site Error Messages
- Professionalism
- Client Satisfaction Level
- Cost per Visit, Cost per Visitor
- Visits
- Visitors

Mail:
- Applications/Pieces Opened
- Applications Completed
- Applications/Mail in Process
- Average Cycle Time
- Pass-Through Ratio
- Client Satisfaction Level
- Service Complaints
- Cost per Contact
- Applications Received
