Executive Summary
The development of key performance indicators (KPIs) for the Government of Canada (GoC) became a priority as Canada's Government On-Line (GOL) initiative matured from 1998 through 2004. The rapid development of the Internet channel as a means of providing effective public service delivery created an appetite for revolutionary change in all types of service delivery. Prior to GOL, large-scale improvements to service delivery were confined to specific government programs and services, and interdepartmental projects were rare. The advent of the Internet, and the preference of Canadians for accessing government services online, created new opportunities for change in delivering services to Canadians. In the past three years, dozens of interdepartmental initiatives have taken hold and have helped to foster citizen-centred service delivery. As more and more business improvement opportunities were conceived, it became clear that the Government of Canada needed a clear, common language for analytical decision making. Many departments have made significant investments in performance management and have progressed towards the disciplined decision making characteristic of the world's best corporations. Nevertheless, differences in terminology, definitions, usage, data collection, and performance frameworks were quickly identified as limiting the ability to monitor and affect enterprise-level performance. The genesis of the Core KPI project was the GoC's Telephony Service Working Group, an interdepartmental group of GoC call centre managers and executives that came together to share best practices, establish consistent service standards, and generally improve the capabilities of GoC call centre operations. In 2003, this working group quickly identified, and provided precise definitions of, common KPIs.
Coincident with this achievement, the Treasury Board of Canada Secretariat developed a modernized approach to the management of public sector organizations and programs called the Management Accountability Framework (MAF). This comprehensive set of tools, standards, and processes provided an over-arching framework for the Core KPI project. The operational nature of KPIs strongly supported the MAF and provided direct information to two of the primary MAF categories: stewardship and citizen-focused service. In 2003, as the GoC's Internet channel rapidly matured and the first significant transactional capability came online, new interdepartmental working committees were formed to deal with the complexities of multi-service, multi-channel delivery alternatives. Internet gateways and clusters rapidly evolved, helping to organize services around client segments and life events. This has created opportunities to effect corresponding changes in how GoC services are delivered
in-person and by mail. By 2004, there was a clear need to establish common core KPIs and a working environment in which to further develop a common performance language. The Core KPI project brought together numerous government managers, experts in delivering services to Canadians, visitors, and businesses. Managers with operational responsibility for call and mail processing centres, Internet sites, and in-person locations were engaged in several meetings to identify the KPIs that provide maximum management value. The result of these meetings was a small set of channel-specific core KPIs that reflect specific MAF themes. These KPIs will be required for a variety of reporting requirements, Treasury Board submissions, and ongoing reviews. Additional operational KPIs were identified that are recommended by Treasury Board (but not required) as effective indicators that provide strong operational benefits to service delivery organizations. The Core KPI project is not complete: there is an ongoing requirement for implementation, improvement, and additions as the GoC service delivery strategy evolves. Perhaps the most important and lasting benefit is the networking of the best performance management people in the GoC. These experts continue to develop new techniques and identify improvements to ensure that Canada remains one of the world leaders in public sector service delivery, a position that clearly improves our competitive position in the twenty-first century.
Record of Changes
Version  Date             Summary of Changes
V 0.9    August 30, 2004  First draft for formal review
V 1.0    Sept. 30, 2004   Minor edits
Acknowledgements
Project Authority: Victor Abele, Director, Service Strategy, CIOB, Treasury Board Secretariat, Canada
Project Analyst: Phillip Massolin, Analyst, Service Strategy, CIOB, Treasury Board Secretariat, Canada
Author: Dan Scharf, Equasion Business Technologies
Contributors: Daryl Sommers, Colin Smith, Reina Gribovsky, Dolores Lindsay, Daniel Tremblay, Kyle Toppazzini, Marg Ogden
Web Content: Morris Miller
1.0 INTRODUCTION
Citizens are faced with a greater choice of channels than ever before to access government services (in-person, phone, Internet, and mail), creating corresponding challenges for organizations to manage service delivery across all channels. Key Performance Indicators (KPIs) are increasingly used by the private and public sectors to measure progress towards organizational goals using a defined set of quantifiable measures. For the GoC, KPIs are becoming an essential part of achieving Management Accountability Framework (MAF) compliance. Once approved, the KPI framework will constitute a key element of departments' annual monitoring. KPIs can be navigated by:
* Channel: each channel includes standard metrics (KPIs) for managing performance, or
* Management Accountability Framework category.
A series of government-wide workshops identified the requirement for a consistent approach to measuring service delivery performance across the GoC. Workshop results can be assessed to help create baseline frameworks for channel measurement and to provide input into the process.
The majority of service delivery indicators relate to the operational nature of the Stewardship category. Additional indicators measure progress towards objectives under the Citizen-Focused Service and People categories. The Accountability category provides checklists and processes for establishing effective service level agreements. Specific assessment tools are used for the Policy and Programs and Risk Management categories.
Service Level Agreement Name: The name of the SLA, particularly useful when a single SLA is used across multiple service offerings.
Service Description: The details of the service the government intends to provide and the benefits the users can expect to receive.
Service Criticality Level: Normally identified in a Service Catalogue based on already-defined metrics. The level of criticality should be based primarily on the service users' requirements.
Service Channels: Identifies which channels the service is available through (e.g., telephone, mail, in-person, Internet) and the appropriate contact information for each channel.
Primary Service Provider: The department or agency primarily responsible for the service.
Service Partner Providers: Other partner departments that provide support to a Primary Service Provider for a service (e.g., GTIS provides the Internet server to Health Canada).
Service Pledge: The details of the quality of service a client can expect. This is frequently time-based (e.g., passports will be processed in X number of days).
Delivery Targets: Describe the key aspects of the service provided, such as access, timeliness, and accuracy.
Dates: The effective start and end dates of the agreement. A review date must also be identified so that performance measurements can be made and the SLA can be adjusted, or action can be taken to improve its performance.
Service Hours
Throughput
Change Management
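As an illustration, the SLA components above can be captured in a simple structured record. The following Python sketch is illustrative only; the class name, field names, and all example values are hypothetical and not part of the GoC framework:

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class ServiceLevelAgreement:
    """Minimal SLA record mirroring the components listed above."""
    name: str
    service_description: str
    criticality_level: str                 # e.g., taken from a Service Catalogue
    channels: List[str]                    # telephone, mail, in-person, Internet
    primary_provider: str
    partner_providers: List[str] = field(default_factory=list)
    service_pledge: str = ""               # e.g., "processed within X days"
    delivery_targets: Dict[str, str] = field(default_factory=dict)
    start_date: str = ""                   # effective start of the agreement
    end_date: str = ""                     # effective end of the agreement
    review_date: str = ""                  # when performance is re-assessed

# Hypothetical example record (all values illustrative only)
sla = ServiceLevelAgreement(
    name="Passport Processing SLA",
    service_description="Issuance of passports to eligible applicants",
    criticality_level="High",
    channels=["mail", "in-person"],
    primary_provider="Passport Office",
    delivery_targets={"timeliness": "10 business days", "accuracy": "99%"},
)
```

A record of this shape makes the review-date and delivery-target fields explicit, which supports the periodic measurement and adjustment the Dates component calls for.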
Metrics for Service Effectiveness

KPI: First Call Resolution
Description: The degree to which the client's needs are met without further referral or call-back within a designated time interval.
Objective: Minimize cost and maximize client satisfaction.
Definition: Number of single, non-abandoned calls by unique phone number within a 48-hour period.
Status: Recommended as Core KPI.

KPI: Accurate Referral
Description: A redirect to the correct service for resolution of the client's need (may be to a higher service tier or to a separate organization/jurisdiction providing the service).
Objective: Measures the key caller criterion of more than two transfers.
Definition: Will require further working group participation.
Status: Not recommended; not technically feasible at this time.
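As a sketch of how the First Call Resolution definition above might be operationalized from raw call records, the following Python function counts unique phone numbers whose first call was not abandoned and saw no call-back within the 48-hour window. The tuple layout, field names, and sample data are assumptions for illustration, not a prescribed GoC format:

```python
from collections import defaultdict
from datetime import datetime, timedelta

def first_call_resolution(call_log, window_hours=48):
    """Fraction of unique phone numbers whose first call was not abandoned
    and was not followed by a repeat call within the window.
    call_log: list of (phone_number, timestamp, abandoned) tuples."""
    by_phone = defaultdict(list)
    for phone, ts, abandoned in call_log:
        by_phone[phone].append((ts, abandoned))
    window = timedelta(hours=window_hours)
    resolved = 0
    for records in by_phone.values():
        records.sort()  # order each caller's history by timestamp
        first_ts, first_abandoned = records[0]
        repeat = any(ts - first_ts <= window for ts, _ in records[1:])
        if not first_abandoned and not repeat:
            resolved += 1
    return resolved / len(by_phone) if by_phone else 0.0

# Hypothetical log: caller A resolved on first call; caller B called back
log = [
    ("613-555-0001", datetime(2004, 9, 1, 9, 0), False),
    ("613-555-0002", datetime(2004, 9, 1, 10, 0), False),
    ("613-555-0002", datetime(2004, 9, 1, 12, 0), False),
]
print(first_call_resolution(log))  # 0.5
```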
KPI: Call Avoidance
Description: A call that quickly exits the system after an introductory message or bulletin that provides a desired answer for a substantial portion of the calling population, e.g., one related to an immediate but temporary service outage.
Objective: Measures the utility of IVR/bulletins to answer high-volume inquiries.
Definition: Calls terminated at a specific IVR marker after the bulletin.
Status: Proposed as Core KPI.

KPI: Calls Answered by IVR Successfully
Description: A call that terminates in the IVR tree after a success marker.
Objective: Measures the utility of the IVR response tree to provide self-service answers; an important indicator of IVR utility and a secondary indicator of client satisfaction.
Definition: Calls terminated at all IVR success markers.
Status: Proposed as Core KPI.

Metrics for Channel Take-up

KPI: Calls
Description: Total calls received.
Objective: Measures overall service demand.
Definition: Number of calls received at the switch. Note that this will include repeat callers who are refused at the switch.
Status: Proposed as Core KPI.

KPI: Callers
Description: Unique callers.
Objective: Measures service demand more accurately.
Definition: Unique phone numbers dialing the service.
Status: Proposed as Core KPI.
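The distinction between the Calls and Callers KPIs above can be made concrete with a small sketch (data hypothetical):

```python
def call_volume_metrics(call_log):
    """call_log: one phone number per call received at the switch.
    Returns (calls, callers): total call volume vs. unique callers."""
    return len(call_log), len(set(call_log))

# Three calls from the same number count as 3 Calls but only 1 Caller
log = ["613-555-0001"] * 3 + ["613-555-0002", "613-555-0003"]
calls, callers = call_volume_metrics(log)
print(calls, callers)  # 5 3
```

Tracking both values lets a call centre distinguish raw demand from the number of distinct clients it is actually serving.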
KPI: Visitor Access
Description: Count of visitors who either a) are serviced at agent stations or b) obtain self-service through in-location computers; OR the count of visitors entering the facility, depending on the service model and facility.
Objective: Basic volume measure.
Definition: Total visitors entering the facility over the measurement period.
Suggested benchmark / Range: TBD
Status: Proposed as Core KPI. Tracked by all operations.

KPI: Visitors Serviced
Description: Ratio of visitors receiving agent service to total visitors. Provides an indication of the utilization of self-service capabilities and of overall operational capacity.
Definition: Total agent-visitor services divided by total visits.
Status: Recommended as an operational measure.

Metrics for Delay

KPI: Average Wait Time (AWT)
Description: The average delay from time of entering the facility to introduction at an agent station.
Objective: Primary indicator of visitor satisfaction.
Definition: The total number of minutes from pulling of the service ticket to service.
Derivation: Measured by the service management system.
Status: Recommended as an operational KPI. Measured by all queued service operations; not trackable within a retail service model.

KPI: Service Level
Description: Percentage of visitors that reach an agent within the target wait time.
Metrics for Agent Utilization

KPI: Cost per Contact
Description: Total labour costs divided by total service requests.
Objective: Provides a snapshot of current operational efficiency specifically related to agents/manpower.
Definition: TBD.
Status: Recommended as Core KPI. Definition of labour cost to be determined.

KPI: Agent Capacity
Description: The anticipated number of hours of agent time available for counter service for each agent.
Metrics for Service Effectiveness

The working group is asked to contribute suggestions for KPIs in this theme.

KPI: Turn Around Time
Description: The average time to transaction completion (i.e., receipt by the client), expressed as a percentage of the target time.
Objective: Measures the response time to the client, a primary indicator of customer satisfaction.
Definition:
Status: Under review.
KPI: Self-Service Ratio
Description: A visitor to the service office who accesses the computers.
Objective: Measures the utility of computer facilities within the service office.
Definition: Count of computer accesses divided by total visitors during the measurement period.
Status: Proposed as Core KPI.

Metrics for Channel Take-up

KPI: Visitors
Description: Total visitors entering the office.
Objective: Measures overall service demand.
Definition: See ACCESS measure.
Status: Proposed as Core KPI.

MAF CATEGORY: PEOPLE

Total Months Staff on Strength, Average Months on Strength per Agent: A measure of the total experience level of the agents/staff within the service centre. Monitoring this over time provides a measure of the impact of staff turnover.
Staff Turnover Ratio: A measure of the churn rate within the agent team. Provides a secondary indicator of service centre health, and it often correlates with overall customer satisfaction levels.
Agent Coaching Ratio: Number of hours of one-on-one coaching time per agent. Helps measure the utilization of service centre supervisor time as well as the investment in agent skill improvement.
Training Days per Agent: Total training days delivered during the measurement period divided by the number of agents. Training is required for program/service delivery, for technology, and for the development of skills related to professionalism and customer interaction.
Key Performance Indicators for Service Delivery Channels

10.0 KEY PERFORMANCE INDICATORS: Internet Channel
The Canadian Gateways team has published a definitive report on Internet measurement identifying the suitability and meaning of specific web measures (for example, hits versus visits). Readers are asked to review this document (see Appendix B).

MAF CATEGORY: CITIZEN-FOCUSED SERVICE

Metrics for Access
In the Internet Channel, the access theme includes measures concerning the availability of the site to potential site visitors. There are two primary components to site availability: a) How easily can site visitors locate the site through search engines, links from other sites or via publication of the URL through other channels such as the phone and mail? and b) Is the site available for site visitors once it has been located? Other qualitative characteristics contributing to access include compliance with W3C Accessibility Standards to ensure the site is fully inclusive and available to persons with disabilities.
KPI: Search Engine Ranking
Description: Relevance ranking weighted by the distribution of site visitors who entered the site through commercial search engines. The metric assumes that a high search engine rank provides maximum accessibility to those visitors who access the site via search.
Objective: Measures overall site access through search engines.
Definition: Sum of (relevance ranking multiplied by search engine referral count) divided by total search engine referrals.
Derivation: Relevance rank from the top five referring search engines, using a visitor-representative sample of search terms.
Suggested benchmark / Range:
Status: Proposed as Core KPI.
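The weighted-average definition above reduces to a short calculation. A sketch in Python, where the input format (one rank/referral-count pair per engine) and the example figures are assumptions for illustration:

```python
def search_engine_ranking(referrals):
    """Weighted relevance ranking per the Core KPI definition:
    sum(relevance_rank * referral_count) / total referrals.
    referrals: (relevance_rank, referral_count) per search engine,
    e.g., for the top five referring engines; rank 1 is the top result."""
    total = sum(count for _, count in referrals)
    if total == 0:
        return 0.0
    return sum(rank * count for rank, count in referrals) / total

# Hypothetical: ranked 1st on an engine sending 600 referrals,
# 3rd on an engine sending 400 referrals
print(search_engine_ranking([(1, 600), (3, 400)]))  # 1.8
```

Weighting by referral count ensures that the rank on the engines actually sending visitors dominates the indicator.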
KPI: Direct Access Ratio
Description: Percentage of visits which access the site directly via the same or a known URL, relative to total visits. This metric assumes that visits accessing the site directly are either typing or pasting a URL from another source (e.g., a brochure) or have bookmarked the site as a result of repeated visits.
Objective: Assessment of visitor recall of the site through a known URL or bookmarking.
Definition: Visits arriving at any page in the site with no referring URL associated with the visit.
Derivation: Web traffic statistics counting visits arriving at the site without a referring URL.
KPI: Server Availability Percentage
Description: Total available server hours over total planned server hours during the reporting period.
Objective: Indicative of overall Internet service capacity.
Definition: Sum of total available server hours, less scheduled maintenance hours, divided by total planned server hours.
Derivation: Server/operating system logs.
Status: Proposed as Core KPI.

KPI: Referral Percentage
Description: Percentage of total visits arriving at the site from planned referral sites. This KPI can be further broken down by specific sites: e.g., GoC Gateways, other GoC sites, other jurisdictions, etc.
Objective: Measures another access route to the site and can be used to adjust access strategies.
Definition: Total visits arriving from specified websites divided by total visits.
Status: Proposed as Core KPI.

KPI: Conversion Rate
Description: Rate at which visitors initiate transactions and reach the submit page.
Objective: Key measure of overall service level and visitor satisfaction.
Definition: Total visits reaching submit pages divided by total visits viewing transaction start pages.
Derivation: Web monitoring package.
Suggested benchmark / Range:
Status: Proposed as Core KPI.
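The Server Availability Percentage and Conversion Rate definitions above reduce to simple arithmetic. The sketch below follows one reading of the availability definition (scheduled maintenance hours subtracted from available hours before dividing by planned hours); the example figures are hypothetical:

```python
def server_availability_pct(available_hours, maintenance_hours, planned_hours):
    """Availability as stated in the definition: (total available server
    hours less scheduled maintenance hours) / total planned server hours."""
    return 100.0 * (available_hours - maintenance_hours) / planned_hours

def conversion_rate(visits_reaching_submit, visits_viewing_start):
    """Total visits reaching submit pages divided by total visits
    viewing transaction start pages."""
    if visits_viewing_start == 0:
        return 0.0
    return visits_reaching_submit / visits_viewing_start

# Hypothetical month: server up all 720 planned hours, 8 of them maintenance
print(round(server_availability_pct(720, 8, 720), 1))  # 98.9
# Hypothetical: 300 of 1,200 started transactions reached the submit page
print(conversion_rate(300, 1200))  # 0.25
```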
KPI: Abandonment Rate
Description: Rate at which visitors initiate transactions but do not reach the submit page, plus visitors exiting the site from non-content pages.
Objective: Key measure of overall service level.
Definition: Visits with unsatisfactory exit pages divided by total visits.
Derivation: Web traffic statistics.
Suggested benchmark / Range:
Status: Proposed as Operational Measure.
MAF CATEGORY: STEWARDSHIP

KPI: Cost per Visit, Cost per Visitor
Description: The total operational cost of the site over the reporting period divided by total visits/visitors handled during the reporting period.
Objective: Provides a high-level indication and trend of overall service performance.
Definition: Will require further working group consultation.
Status: Recommended as Core KPI.

Metrics for Agent Utilization

The following four measures can be tracked for agent-assisted calls concerning the Internet channel and for all messages/e-mails submitted through the Internet site. All are recommended as Operational Measures.
Metrics for Service Effectiveness

KPI: First Visit Resolution
Description: Unique visitors over an x-day period who exited the site from success content pages.
Objective: Minimize cost and maximize client satisfaction.
Definition: Number of single unique visits within an x-day period that exited the site from specific success (i.e., answer found) pages.
MAF CATEGORY: PEOPLE

At publishing time, KPIs for the MAF People category had not yet been proposed to the working group for review. Some examples of KPIs that might be suitable for this MAF category include:

Total Months Staff on Strength, Average Months on Strength per Agent: A measure of the total experience level of the agents within the call centre. Monitoring this over time provides a measure of the impact of staff turnover.
Staff Turnover Ratio: A measure of the churn rate within the agent team. Provides a secondary indicator of call centre health, and it often correlates with overall customer satisfaction levels.
KPI: Applications/Pieces Opened
Description: Count of new envelopes opened during the reporting period.
Objective: Basic volume measure.
Definition: Total envelopes opened, less inappropriate mail (junk mail, wrongly addressed, etc.).
Suggested benchmark / Range:
Status: Proposed as Core KPI.

KPI: Applications Completed
Description: Outbound mail for completed files.
Definition:
Status: Proposed as Core KPI.

KPI: Applications/Mail in Process
Description: All files remaining open at the end of the reporting period. Represents the work in progress within the processing centre.
Definition: Previous open files, plus applications received, less applications completed.
Status: Proposed as Core KPI.

Metrics for Delay

KPI: Average Cycle Time (ACT)
Description: The average elapsed time that the application/mail was held within the processing centre prior to completion.
Objective: Primary indicator of client satisfaction.
Definition: The total number of minutes from opening of the envelope to mailing of the response.
Derivation: Measured by the mail tracking system.
Status: Recommended as Core KPI.
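The Applications/Mail in Process and Average Cycle Time definitions above can be sketched as two small calculations (the example figures are hypothetical):

```python
def applications_in_process(previous_open, received, completed):
    """Work in progress at period end, per the definition:
    previous open files plus applications received, less completed."""
    return previous_open + received - completed

def average_cycle_time(cycle_minutes):
    """Average Cycle Time (ACT): mean minutes from envelope opened
    to response mailed, over the completed files in the period."""
    if not cycle_minutes:
        return 0.0
    return sum(cycle_minutes) / len(cycle_minutes)

# Hypothetical period: 1,200 files carried over, 5,000 received, 4,800 completed
print(applications_in_process(1200, 5000, 4800))  # 1400
```

Tracking work in process alongside cycle time shows whether a backlog is growing even while individual files are still being turned around on time.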
Metrics for Agent Utilization

KPI: Cost per Contact
Description: Total labour costs divided by total service requests.
Objective: Provides a snapshot of current operational efficiency specifically related to agents/manpower.
Definition: TBD.
Status: Recommended as Core KPI. Definition of labour cost to be determined.

KPI: Agent Capacity
Description: The anticipated number of hours of agent time available for mail service for each agent.
Objective: Ensures that agent resources are dedicated to required service.
Metrics for Service Effectiveness

KPI:
Description:
Objective:
Definition:
Status:

Metrics for Use of Technology
KPI: Automated Response Ratio
Description: Ratio of applications received and completed without agent handling to total applications received.
Metrics for Channel Take-up

KPI: Applications Received
Description: Total applications/mail entering the processing centre.
Objective: Measures overall service demand.
Definition: See ACCESS measure.
Status: Proposed as Core KPI.

MAF CATEGORY: PEOPLE

Total Months Staff on Strength, Average Months on Strength per Agent: A measure of the total experience level of the agents/staff within the service centre. Monitoring this over time provides a measure of the impact of staff turnover.
Staff Turnover Ratio: A measure of the churn rate within the agent team. Provides a secondary indicator of service centre health, and it often correlates with overall customer satisfaction levels.
Agent Coaching Ratio: Number of hours of one-on-one coaching time per agent. Helps measure the utilization of service centre supervisor time as well as the investment in agent skill improvement.
Training Days per Agent: Total training days delivered during the measurement period divided by the number of agents. Training is required for program/service delivery, for technology, and for the development of skills related to professionalism and customer interaction.

Further discussion with departments and agencies will be conducted to identify effective KPIs under this MAF category.
APPENDIX A: Glossary

ACD (Automatic Call Distributor): A software/hardware device that manages call queues, delivers IVR recordings as selected by the caller, and routes calls from the queue to appropriate agents based on any number of caller parameters.
CTI (Computer Telephony Integration): Technology that provides an integrated phone/computer capability to the service agent. CTI provides features such as automatic caller file retrieval, soft phone, referral/call-back electronic forms with response script suggestions, caller wait time, and quick access to the mainframe and online reference material.
Channel: A means of service delivery. The primary service channels are telephone, Internet, mail, and in-person.
IVR/VR (Interactive Voice Response/Voice Recognition): Two related terms describing two types of self-service technology employed in the telephone service channel. Interactive Voice Response provides the caller with a series of options to be selected using the telephone keypad. Voice Recognition allows the caller to speak the question or say an option from a recorded list.
KPI (Key Performance Indicator): A measurable objective which provides a clear indication of service centre capability, quality, customer satisfaction, etc.
APPENDIX B: References
Citizens First 3 Report, 2003. Erin Research Inc., Institute for Citizen-Centred Service, Institute of Public Administration of Canada.
Common Web Traffic Metrics Standards, Version 1.1, March 21, 2003. Treasury Board Secretariat, Canada.
Key Performance Indicators Workshop, 2003. Service Delivery Improvement, Treasury Board Secretariat, Canada.
Performance Management Metrics for DWP Contact Centres, Version 2.0, March 14, 2003. Ivackovic and Costa. Department for Work and Pensions, United Kingdom.
Performance Measures for Federal Agency Websites: Final Report, October 1, 2000. McClure, Eppes, Sprehe, and Eschenfelder. Joint report for the Defense Technical Information Center, Energy Information Administration, and Government Printing Office, U.S.A.
Service Improvement Initiative How-to Guide, 2000. Treasury Board Secretariat, Canada.
Service Management Framework Report, 2004. Fiona Seward. Treasury Board Secretariat, Canada, and Burntsands Consulting.
Summary Report on Service Standards, 2001. Consulting and Audit Canada (Project 550-0743).
APPENDIX C: Summary of Core Key Performance Indicators
These core indicators were vetted by the working group and are recommended for inclusion into the MAF.

Phone: Call Access; Caller Access; Abandoned Calls; Average Speed to Answer; Answer Accuracy; Client Satisfaction Level; Cost per Call; First Call Resolution; Call Avoidance; Calls Answered by IVR Successfully; Calls; Callers

In-person: Visitor Access; Client Satisfaction Level; Service Complaints; Cost per Contact; Visitors

Internet: Search Engine Ranking; Direct Access Ratio; Server Availability Percentage; Referral Percentage; Conversion Rate; Site Error Messages; Professionalism; Client Satisfaction Level; Cost per Visit; Cost per Visitor; Visits; Visitors

Mail: Applications/Pieces Opened; Applications Completed; Applications/Mail in Process; Average Cycle Time; Pass Through Ratio; Client Satisfaction Level; Service Complaints; Cost per Contact; Applications Received