Key Performance Indicators
Interim Report on Primary Service Channels
Executive Summary
The development of key performance indicators (KPIs) for the Government of
Canada (GoC) became a priority as Canada's Government Online (GOL) initiative
matured from 1998 through 2004. The rapid development of the Internet channel as a
means of providing effective public service delivery created an appetite for
revolutionary change in all types of service delivery. Prior to GOL, large-scale
improvements to service delivery were confined to specific government programs
and services, and interdepartmental projects were rare. The advent of the Internet and
the preference of Canadians for accessing government services online have created
cutting-edge opportunities for change in delivering services to Canadians.
In the past three years, dozens of interdepartmental initiatives have taken hold and
have helped to foster citizen-centred service delivery. As more and more business
improvement opportunities were conceived, it became clear that the Government of
Canada needed clear communication to support analytical decision making. Many
departments have made significant investments in performance management and
have made progress towards the disciplined decision making characteristic of the world's
best corporations. Nevertheless, differences in terminology, definitions, usage, data
collection, and performance frameworks were quickly identified as limiting the ability
to monitor and affect enterprise-level performance.
The genesis of the Core KPI project came from the GoC's Telephony Service
Working Group, an interdepartmental group of GoC call centre managers and
executives that came together to share best practices, establish consistent service
standards, and generally improve the capabilities of GoC call centre operations. In
2003, this working group quickly identified, and provided precise definitions of,
common KPIs.
Coincident with this achievement, Treasury Board of Canada Secretariat developed
a modernized approach to the management of public sector organizations and
programs called the Management Accountability Framework (MAF). This
comprehensive set of tools, standards, and processes provided an over-arching
framework for the Core KPI project. The operational nature of KPIs strongly
supported the MAF and provided direct information to two of the primary MAF
categories: stewardship and citizen-focused service.
In 2003, as the GoC's Internet channel rapidly matured and significant
transactional capability first came online, new interdepartmental working committees
were formed to deal with the complexities of multi-service, multi-channel delivery
alternatives. Internet gateways and clusters rapidly evolved, helping to organize
services in parallel with client segments and life events. This created
opportunities to effect corresponding changes in how GoC services are delivered in
person and by mail. By 2004, there was a clear need to establish common core KPIs.
Record of Changes

Version   Date               Summary of Changes
V 0.9     August 30, 2004    First draft for formal review
V 1.0     Sept. 30, 2004     Minor edits
Acknowledgements
Project Authority:
Victor Abele
Project Analyst:
Phillip Massolin
Author:
Dan Scharf
Contributors:
Daryl Sommers
Colin Smith
Reina Gribovsky
Dolores Lindsay
Daniel Tremblay
Kyle Toppazzini
Marg Ogden
Web Content:
Morris Miller
1.0 INTRODUCTION
Citizens are faced with a greater choice of channels than ever before to access
government services (in-person, phone, Internet and mail), creating corresponding
challenges for organizations to manage service delivery across all channels.
Key Performance Indicators (KPIs) are increasingly used by the private and public
sectors to measure progress towards organizational goals using a defined set of
quantifiable measures. For the GoC, KPIs are becoming an essential part of
achieving Management Accountability Framework (MAF) compliance.
Once approved, the KPI framework will constitute a key element of departments'
annual monitoring.
The majority of service delivery indicators relate to the operational nature of the
Stewardship category. Additional indicators measure progress towards objectives under
the Citizen-Focused Service and People categories. The Accountability category
provides checklists and processes for establishing effective service level agreements.
Specific assessment tools are used for the Policy and Programs and Risk
Management categories.
COMPONENTS
Service Channels
Primary Service Provider
Partner Service Providers
Pledge
Delivery Targets
Dates
Costs
Service Hours
Throughput
Change Management
Policy and Programs in this MAF context refers to relevant lines of activity within
departments and agencies. Departmental reports are the primary reporting tool
used to document the overall policy effectiveness of specific programs. Readers
should consult the MAF, relevant Treasury Board policies, as well as the Program
Activity Architecture (PAA) for information and guidance in this category.
In the fall and winter of 2004-05, we will be consulting with the service community with
a view to developing a more comprehensive service policy. This work will allow us to
formalize the approach to KPIs, which is currently in draft form.
KPI:
Search Engine Ranking
Description: Relevance ranking weighted by the distribution of site visitors who
entered the site through commercial search engines. The metric assumes that a high
search engine rank provides maximum accessibility to those visitors who access the
site via search.
Objective: Measures overall site access through search engines.
Definition: Sum of (relevance ranking multiplied by search engine referral count)
divided by total search engine referrals (a worked sketch follows this entry).
Derivation: Relevance rank from the top five referring search engines, using a
representative sample of visitor search terms.
Suggested benchmark / Range:
Status:
Proposed as a Core KPI
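To make the weighted-average calculation in the definition above concrete, the following is a minimal sketch in Python. The function name, relevance ranks and referral counts are illustrative assumptions, not measured GoC values or a prescribed implementation.

# Hypothetical sketch of the Search Engine Ranking KPI:
# a referral-weighted average of relevance ranks from the
# top referring search engines (illustrative numbers only).

def search_engine_ranking(engines):
    """engines: list of (relevance_rank, referral_count) pairs."""
    total_referrals = sum(count for _, count in engines)
    if total_referrals == 0:
        return None  # no search engine referrals observed
    weighted = sum(rank * count for rank, count in engines)
    return weighted / total_referrals

# Example: top five referring engines for a sample of search terms.
top_engines = [
    (1, 5200),   # engine A: ranked 1st, 5,200 referrals
    (3, 2100),   # engine B: ranked 3rd, 2,100 referrals
    (2, 1800),   # engine C
    (5, 600),    # engine D
    (4, 300),    # engine E
]
print(f"Search Engine Ranking KPI: {search_engine_ranking(top_engines):.2f}")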
KPI:
Direct Access Ratio
Description: Percentage of visits that access the site directly via a known URL,
relative to total visits. This metric assumes that visits accessing the site directly are
either typing or pasting a URL from another source (e.g. a brochure) or have
bookmarked the site as a result of repeated visits.
Objective: Assessment of site memory through known URL or bookmarking.
Definition: Visits arriving at any page in the site that do not have a referring URL
associated with the visit (a worked sketch follows this entry).
Derivation: Web traffic statistics counting visits arriving at the site without a referring URL.
Status:
Proposed as a Core KPI
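Similarly, a minimal Python sketch of the Direct Access Ratio, assuming visit records that carry an optional referrer field; the record layout and sample values are illustrative only, not actual GoC web statistics.

# Hypothetical sketch of the Direct Access Ratio KPI:
# share of visits with no referring URL (direct entries)
# out of all visits, from illustrative traffic records.

def direct_access_ratio(visits):
    """visits: list of dicts; a missing or empty 'referrer' marks a direct visit."""
    if not visits:
        return None
    direct = sum(1 for v in visits if not v.get("referrer"))
    return direct / len(visits)

sample_visits = [
    {"page": "/home", "referrer": ""},                       # typed or bookmarked
    {"page": "/services", "referrer": "https://google.ca"},  # search engine referral
    {"page": "/home", "referrer": None},                     # direct
    {"page": "/forms", "referrer": "https://canada.gc.ca"},  # referred from another site
]
print(f"Direct Access Ratio: {direct_access_ratio(sample_visits):.0%}")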
The MAF provides the primary framework for departments to prepare the required annual
performance reports that provide formal feedback to Deputy Ministers. Treasury
Board of Canada Secretariat is developing a Web-based approach to support and
streamline departmental performance reporting.
Service delivery key performance indicators will become an important component of
these departmental reports and will contribute significantly to a common understanding
of overall service channel performance across government.
APPENDIX B: References
Citizen First 3 report, 2003. Erin Research Inc, Institute for Citizen-Centred Service,
Institute of Public Administration of Canada.
Common Web Traffic Metrics Standards, March 21, 2003. Version 1.1., Treasury
Board Secretariat, Canada.
Key Performance Indicators Workshop, 2003. Service Delivery Improvement,
Treasury Board Secretariat, Canada.
Performance Management Metrics for DWP Contact Centres, March 14, 2003.
Version 2.0. Ivackovic and Costa. Department for Work and Pensions, United
Kingdom.
Performance Measures for Federal Agency Websites: Final Report, October 1, 2000.
McClure, Eppes, Sprehe and Eschenfelder. Joint report for the Defense Technical
Information Center, Energy Information Administration and Government Printing
Office, U.S.A.
Service Improvement Initiative How-to Guide, 2000. Treasury Board Secretariat,
Canada.
Service Management Framework Report, 2004. Fiona Seward. Treasury Board
Secretariat Canada and Burntsands Consulting.
Summary Report on Service Standards, 2001. Consulting and Audit Canada
(Project 550-0743).
In-person
Visitors
Client Satisfaction Level
Service Complaints
Professionalism
Cost per Contact

Telephone
Calls
Callers
Calls Answered by IVR Successfully
Average Speed to Answer
Answer Accuracy
First Call Resolution
Call Avoidance
Referral Percentage
Client Satisfaction Level
Service Complaints
Cost per Call

Internet
Visits
Visitors
Visitor Access
Search Engine Ranking
Direct Access Ratio
Server Availability Percentage
Conversion Rate
Pass Through Ratio
Site Error Messages
Client Satisfaction Level
Cost per Visit, Cost per Visitor

Mail
Applications/Pieces Opened
Applications Received
Applications Completed
Applications/Mail in Process
Average Cycle Time
Client Satisfaction Level
Cost per Contact