
governance

From Boardroom to Bedside: How to Define and Measure Hospital Quality

Michael Heenan, Haajra Khan and Dorothy Binkley

Abstract
Following the release of its strategic plan, in which patient
safety and quality were highlighted as key directions,
St. Joseph's Healthcare Hamilton recognized the importance
of engaging its board of trustees to achieve these goals.
Following a collaborative retreat with senior management,
medical staff leadership and professional practice leaders,
the board enhanced its governance oversight on quality. By
removing quality from the consent agenda, defining quality
and selecting a series of big dot measures, the board has
led the development of a culture of quality that cascades
from the boardroom to the bedside. This article describes
how the organization followed a systematic process to
define quality and select big dot quality indicators.

St. Joseph's Healthcare Hamilton (SJHH) is a publicly funded, 700-bed, three-campus academic health sciences centre in Hamilton, Ontario. Serving a population of more than 1.3 million, the organization is the regional lead for chest, head and neck; kidney-urinary; mental health and addictions; and ophthalmology services. St. Joseph's has over 4,000 employees and an annual operating budget of $500 million.
Like most hospital boards in Ontario, the SJHH board had a Quality Committee for years. Often unfocused, the Quality Committee was the last place volunteer board members wanted to serve. Management and medical staff often tried to illustrate for the board that the hospital was providing quality care by inviting clinical or corporate leaders to present on issues of the day using positive stories related to the hospital's mission, a series of rate-based data and complex clinical acronyms that the lay board member rarely understood. The end result was a lack of focus that rarely raised discussion at the main board table or challenged executive and medical leaders to make a substantial change at the bedside. Fortunately, the renewed focus in healthcare on patient safety and quality has also renewed the role of governance.
Following the release of its strategic plan, St. Joseph's recognized that achieving its commitment to become one of Canada's safest hospitals would require strong leadership and engagement across the organization. To accomplish the hospital's strategic directions related to clinical quality and patient safety, the organization began to re-emphasize the partnerships between the board, management and medical staff required for success. To engage the board, the organization educated itself on governance practices and innovated the way in which it governed quality at both the main board and the Quality Committee. By removing quality from the consent agenda, defining quality and selecting a series of big dot measures, the board has led the way to a culture of quality that is cascading from the boardroom to the bedside.


Governing Quality by Defining Quality


To adequately govern and monitor quality, the board first had
to understand what quality really means. Following advice
provided by the Institute for Healthcare Improvement (IHI) and
the Ontario Hospital Association (OHA), the board decided to
begin by defining quality. In doing so, the organization realized
that this was easier said than done.
The first step was to analyze how peer organizations define
quality. Following a review of various teaching hospitals in
Canada, the United States, the United Kingdom and Australia,
we found that few hospitals publicly publish a definition of
quality. A number of organizations mention components
of a framework or list a series of metrics, but few specifically
mention a formal definition. Further reviews of both provider
agencies and policy or think tank organizations such as the
Institute of Medicine (IOM), IHI, OHA, the Ontario Health
Quality Council, the Canadian Patient Safety Institute and the
Quality Healthcare Network revealed that existing definitions of
quality can be divided into two main categories: (1) system-level definitions, those that policy makers use to address population health issues; and (2) provider-level definitions, those reflective of care delivery organizations such as hospitals, long-term care or home care agencies.
Most system-level definitions were based on the IOM's Six Dimensions of Patient Expectations, namely, safe, timely, effective, efficient, equitable and patient centred. Some system definitions incorporated further dimensions such as capacity or appropriately resourced to acknowledge the differences between the American healthcare system (on which the IOM dimensions are based) and the public healthcare systems in the United Kingdom, Canada and Australia. Table 1 presents the matrix created to map the definitions found against the dimensions they mention. The most commonly mentioned dimensions at the system level were safe, equitable and patient centred.
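To make the mapping exercise concrete, the short sketch below tallies which dimensions each published definition mentions, which is essentially how the matrices in Tables 1 and 2 were assembled. The definitions shown are invented placeholders for illustration, not the actual sources reviewed.

```python
# Minimal sketch of mapping quality definitions to the dimensions they mention.
# The definition texts below are invented placeholders, not the sources reviewed.

DIMENSIONS = ["safe", "timely", "effective", "efficient", "equitable", "patient centred"]

definitions = {
    "System 1": "Care should be safe, equitable and patient centred.",
    "System 2": "A quality health system delivers timely, effective, equitable and safe care.",
    "Hospital 1": "We provide safe, patient centred care in a kind and timely way.",
}

def dimensions_mentioned(text: str) -> set[str]:
    """Return the quality dimensions named in a definition (simple keyword match)."""
    lowered = text.lower()
    return {d for d in DIMENSIONS if d in lowered}

# One row per definition, one column (set member) per dimension mentioned.
matrix = {name: dimensions_mentioned(text) for name, text in definitions.items()}

# Count how often each dimension appears, to surface the most common themes.
counts = {d: sum(d in row for row in matrix.values()) for d in DIMENSIONS}
for dimension, count in sorted(counts.items(), key=lambda kv: -kv[1]):
    print(f"{dimension}: mentioned in {count} definition(s)")
```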
A review by SJHH staff also showed that very few hospitals have publicly published their definitions of quality. The definitions at the provider level were not based directly on the IOM's Six Dimensions, but the dimensions still generally encompassed the expectations of patients. Table 2 illustrates the dimensions found in a selection of provider-level definitions. The most common themes in the provider definitions we examined were safe and patient centred.
SJHH discussed the possibility of framing the definition around the patient credo (Heal me, Don't hurt me and Be nice to me, with an additional dimension, Treat me quickly). In the end, however, it was decided that a formal definition was required to educate board members and the public on the minimum expectations all patients should have with regard to quality care. As a result, SJHH developed its own provider definition of quality as a teaching institution: "Quality care at SJHH is safe, kind, effective and timely and is provided in an environment of inquiry and learning."
Measuring and Cascading Quality with Big Dots
As with most volunteer hospital boards, the challenge SJHH board members face is that despite spending only 75-100 hours a year in the organization, they are expected to understand a complex array of metrics and to make decisions affecting quality of care within a two-hour meeting. While SJHH's scorecard system was proving valuable, the board's increased focus on quality and safety required a more useful tool to enable it to govern more effectively and hold senior management accountable for performance on these measures. With a definition of quality in place, SJHH decided it would follow the advice of IHI and select a number of big dot indicators: whole-system measures that address core processes or functions that patients expect the organization to perform in order to improve quality and safety.

Table 1. Dimensions of patient expectations mentioned in various healthcare systems' definitions of quality. Columns: Safe; Timely; Effective; Efficient; Equitable; Patient-Centred; Capacity; Outcomes; Value; Appropriately Resourced; Population Health Focus. Rows: System 1 through System 6.


Table 2. Dimensions of patient expectations mentioned in various providers' definitions of quality. Columns: Safe; Timely; Effective; Efficient; Equitable; Patient-Centred; Capacity; Outcomes; Value. Rows: Hospital 1 through Hospital 4. Hospital 5 provided a narrative definition: "At Hospital 5, quality is not just a simple measure. Quality is a comprehensive look at all aspects of a patient's experience. Quality at Hospital 5 involves the totality of a patient's experience from the first phone call to the last appointment."

Figure 1. Big dot quality cascade pyramid. The board sits at the top of the pyramid and reviews the big dots; the executive team, medical and clinical leadership, and clinical and corporate programs sit below and own the little dots, with measures reported upward through the pyramid.

By monitoring big dot measures, the board is able to ensure that the organization is performing well without getting involved in operational issues. Since big dots are whole-system measures that reflect the overall quality of the healthcare system, and all other smaller measures flow to and from the big dots, if the system is performing well at the highest level of aggregation, then it is likely performing well at the lower levels.
SJHH realized that the key to using big dots for change is
linking them to other measurable process and outcome indicators at the program or unit level: the little dots. At SJHH, we illustrated this by creating the big dot cascade pyramid, presented in Figure 1. At the top of the pyramid is the board, which reviews the system-level measures, or big dots. Moving
down the pyramid, management
and clinical leadership are responsible for overseeing the little dots
and their associated improvement
at the program or unit level. A good
example is the issue of infection,
where the big dot may be a certain
infection rate or outcome, and this
is connected to many little dots,
including hand hygiene compliance,
housekeeping practices and central
line insertion compliance, among
others. Illustrated in this manner,
it is clear how the board's role can
improve care at the bedside.
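One way to picture this cascade is as a simple parent-child structure in which each board-level big dot owns the little dots that feed it. The sketch below is a hypothetical illustration only; the indicator names and owners are assumptions for this example, not SJHH's actual dashboard.

```python
from dataclasses import dataclass, field

@dataclass
class LittleDot:
    """A program- or unit-level process indicator that feeds a big dot."""
    name: str
    owner: str  # the program or unit accountable for improvement

@dataclass
class BigDot:
    """A whole-system, board-level measure with its contributing little dots."""
    name: str
    little_dots: list[LittleDot] = field(default_factory=list)

# Hypothetical infection example, echoing the cascade pyramid.
infection = BigDot(
    name="Hospital-acquired infection rate",
    little_dots=[
        LittleDot("Hand hygiene compliance", owner="All clinical units"),
        LittleDot("Housekeeping audit results", owner="Environmental services"),
        LittleDot("Central line insertion bundle compliance", owner="ICU"),
    ],
)

# The board reviews the big dot; management drills down into the little dots.
print(infection.name)
for dot in infection.little_dots:
    print(f"  -> {dot.name} ({dot.owner})")
```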

Selecting the Big Dots


When first examining the literature
on big dots, SJHH thought it had to select five to seven indicators on which the board could focus, versus the 25-plus it had
on its balanced scorecard. After examining the current indicators, however, it was agreed that a short list of big dot indicators
could not accurately reflect the complexity of the organization. An example of this dilemma is the hospital standardized
mortality ratio (HSMR). It was decided that the HSMR could
not be the only mortality indicator the board examines given
that the HSMR accounts for only about 68% of the organization's annual death count. The second example of this dilemma
was finding a single infection metric, given that certain infection
rates are calculated against denominators such as patient days
and others are calculated against device days.
Thus, it was decided that fewer metrics were needed but that they would have to be presented in newly created big dot categories. By selecting categories, the organization was able to frame
indicators into a language that was easy to comprehend and had
relevance to clinical practice at the bedside. While no single set
of categories is currently recommended in the literature, three
category types emerged for the board to consider. As shown in
Figure 2, these categories included placing indicators around
a safety or communication theme such as the patient credo,
within categories that reflect clinical care or within categories
reflective of the organization's strategic directions.
To illustrate how these categories function, consider the themed categories of the patient credo. A big dot indicator in the Heal me bucket may be an outcome indicator such as readmission rates or functional improvement scores; indicators for Don't hurt me, Be nice to me and Treat me quickly may include reported patient incidents, patient satisfaction and wait times, respectively. At SJHH, a decision to ensure the board's work maintained relevance to clinical practice at the bedside led the board to elect to monitor big dots using the clinical categories of timely access, incidents, infection, mortality and satisfaction.
Selecting Big Dot Indicators: The Criteria
Once the big dot categories were selected, the board began to
select individual indicators to be housed in each category. The
major challenge facing the board, management and medical staff
was distinguishing whole-system measures (big dots) from process
indicators (little dots). In order to facilitate the selection of big
dot indicators, the following criteria were developed and defined:

- Is institution wide. A big dot indicator is a whole-system measure that reflects the overall quality and performance of the organization. It is not program, unit or disease specific.
- Is outcome driven. As opposed to a process indicator, a big dot speaks to the results or effects of the delivery of care. It is not reflective of compliance with a procedure or practice standard.
- Connects to other little dots or processes (multifaceted). A big dot has a variety of little dots that flow from it; these are defined and measurable process indicators operating at the program or unit level.
- Reflects the organization's strategic priorities. A big dot should reflect the organization's corporate priorities.
- Reflects the organization's definition of quality. The dimensions emphasized in the organization's definition of quality should be reflected in the big dots.
By agreeing on a set of criteria, the big dot working group evaluated the metrics on its current scorecard to help narrow the number of big dot metrics it would monitor. This also enabled the working group to identify areas where a previously monitored metric was perhaps not needed. Table 3 (http://www.longwoods.com/product.php?productid=21245) illustrates the big dot categories and indicators selected by SJHH.
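A minimal sketch of how such a criteria screen might be encoded is shown below; the candidate indicators and their yes/no answers are invented for illustration and do not reproduce the working group's actual assessments.

```python
from dataclasses import dataclass

@dataclass
class CandidateIndicator:
    """A scorecard metric with yes/no answers to the big dot selection criteria."""
    name: str
    institution_wide: bool
    outcome_driven: bool
    has_little_dots: bool
    reflects_strategy: bool
    reflects_quality_definition: bool

    def is_big_dot(self) -> bool:
        # A candidate qualifies only if it meets every criterion.
        return all([
            self.institution_wide,
            self.outcome_driven,
            self.has_little_dots,
            self.reflects_strategy,
            self.reflects_quality_definition,
        ])

# Hypothetical screening of two scorecard metrics (answers are illustrative only).
candidates = [
    CandidateIndicator("Hospital standardized mortality ratio", True, True, True, True, True),
    CandidateIndicator("Hand hygiene compliance", False, False, True, True, True),
]

big_dots = [c.name for c in candidates if c.is_big_dot()]
print(big_dots)  # hand hygiene is screened out: process-based, not institution wide
```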

HSMR: A Real Big Dot Example


As stated, the goals of governing big dot measures are not only to have the board examine system-wide measures but also to probe management and medical leadership on quality and safety issues that may be of concern. A tangible example of how the board of SJHH changed practice at the bedside involved analyzing our HSMR.

Figure 2. Types of big dot categories: Themed Categories (the patient credo: Heal Me, Don't Hurt Me, Be Nice To Me, Treat Me Quickly); Clinical Categories (Access, Incidents, Infection, Mortality, Satisfaction); Strategic Categories (Patient Safety, Patient Flow, Service Excellence, Financial Stewardship).

As illustrated in Figure 3, in 2007-2008, management presented the board with an HSMR value of 88 against a target of 76. In the ensuing discussion, a board member queried why the organization's target was 76 when all the organization was required to do was meet the national benchmark of 100. Management responded by indicating that the top quartile of Canadian hospitals measuring HSMR began at 76, and that if SJHH wished to be one of Canada's safest hospitals, its HSMR should be in the top quartile. The board member then asked


Figure 3. HSMR: board quality example. January 2008: an HSMR score of 88 is reported to the board, prompting the questions "Why is the target 76 if the benchmark is 100?" (answer: top quartile) and "How do we get there?" March 2008: executive and MAC discussions. Spring 2008: internal analysis, in which management examines the data by death type and unit location and researches peers across the globe. Fall 2008: MAC and program discussions; management finds that sepsis is the number one cause of death at SJHH and that sepsis and safety campaigns, including infection control and hand washing, are key to lowering the HSMR. January 2009: action, a sepsis management project in the ER and ICU.
ER = emergency room; HSMR = hospital standardized mortality ratio; ICU = intensive care unit; MAC = Medical Advisory Committee; QPPIP = Quality Planning and Performance Improvement Program; SJHH = St. Joseph's Healthcare Hamilton.

management and medical leadership how it planned to get from 88 to 76. Management and medical staff began to examine the literature on HSMR reduction and its own internal data. In addition to learning about the need to focus on infection prevention, hand washing and rapid response teams, the hospital's data indicated an increasing number of deaths related to sepsis. Thus, as a result of the board's discussion, a sepsis management program was initiated.
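For readers unfamiliar with the metric, the HSMR is conventionally reported as observed deaths divided by expected deaths, multiplied by 100, so the national benchmark of 100 simply means observed deaths equal expected deaths. The arithmetic below uses invented counts to show how scores of 88 and 76 relate to that benchmark.

```python
def hsmr(observed_deaths: int, expected_deaths: float) -> float:
    """Hospital standardized mortality ratio: observed / expected deaths, times 100."""
    return 100 * observed_deaths / expected_deaths

# Illustrative numbers only, not SJHH's actual counts.
expected = 500.0                  # deaths predicted by the national risk-adjustment model
print(hsmr(440, expected))        # 88.0 -> observed deaths 12% below expected
print(hsmr(380, expected))        # 76.0 -> roughly the top-quartile threshold cited
```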
By asking two simple questions about a big dot and its target, the board required management and medical staff to discuss clinical practice, to further investigate the hospital's data, to research best practice on sepsis management and to change practice at the bedside, a process that arguably would not have occurred without the board's oversight of a big dot metric.

Success Factors

A number of key success factors helped St. Joseph's Healthcare in its quality governance journey: identifying board champions, promoting team unity, taking methodical and systematic approaches, resourcing the process and reviewing outcomes.

Identification of Board Champions: Ownership

The process was driven and guided by the chair of the board and the chair of the Quality Committee. By involving two board members, the board owned the process and pushed management and medical staff to think about how the board could govern quality.

Team Unity: Education with Management, Medical Staff and Professional Practice

The board began its journey by hosting a retreat on governing quality and patient safety to which senior management, medical staff leadership and professional practice leads were invited. Outside experts advised on the board's moral, ethical and legal role in quality. By involving all four partner groups at the outset, each group was able to identify its different perspectives and the common goals that would help drive quality to the bedside.

Methodical Approach to Selecting Big Dot Categories and Indicators

A working group of board members, members of management, physicians and allied health professionals researched the different definitions of quality and the various categories of big dots, and it evaluated each of the existing scorecard metrics against the set of big dot criteria. All participants agreed that by involving each stakeholder and methodically choosing categories and metrics, the process had credibility and the outcome was representative of the organizational strategy and focus on quality.

Systematic Approach: Quarterly Review

In addition to the Quality Committee's greater focus on data, the board dedicates strategic discussion every quarter to reviewing big dot data. By systematically adopting a stop-and-review point in the work plan, the board created an accountability mechanism for senior management and clinical leadership.


Resourcing the Process: This Is a Journey

Board members are community volunteers and spend only about 75 hours per year in the hospital. The work required to research definitions, summarize the literature and define big dot criteria cannot reasonably be expected from a volunteer or be done off the side of one's desk. Assigning a management lead was crucial to ensuring that the goals of the board were met.
Outcome: Showing That the Board Has Value

Before this focus, the board often asked how its work was impacting care. Now, each quarter, the board is updated on key actions that result from its questions. The board sees how clinical outcomes result from its oversight, and it is more engaged than ever before.
Conclusion
Supported by the research of IHI and the OHA, St. Joseph's enhanced focus on patient safety is proving that hospital boards matter. Boards represent the people we serve: our patients and our communities. By designing a process that engages board members, clinical leadership and management, we have all parties working collaboratively toward a common goal. By defining quality, selecting big dot measures, framing data in the patient credo, using raw numbers and pulling quality from the consent agenda, the board is beginning to hold senior management and medical staff accountable, thereby creating a quality and patient safety culture that is improving care at the bedside.
Bibliography

Baker, G.R., P. Norton, V. Flintoft, R. Blais, A. Brown, J. Cox, E. Etchells, W.A. Ghali, P. Hébert, S.R. Majumdar, M. O'Beirne, L. Palacios-Derflingher, R.J. Reid, S. Sheps and R. Tamblyn. 2004. "The Canadian Adverse Events Study: The Incidence of Adverse Events among Hospital Patients in Canada." Canadian Medical Association Journal 170(11): 1678-86.
Baker, M.A., A. Corbett and J.L. Reinertsen. 2008. Quality and Patient Safety: Understanding the Role of the Board. Toronto, ON: Governance Centre of Excellence and the Ontario Hospital Association. Retrieved May 11, 2009. <http://www.oha.com/client/oha/oha_lp4w_lnd_webstation.nsf/page/Publications+for+Sale>.
Beasley, C. 2003. Leading Quality: Thoughts for Trustees and Other Leaders. Cambridge, MA: Institute for Healthcare Improvement. Retrieved May 11, 2009. <http://www.vahhs.org/EventDocs/Leading%20Quality.ppt>.
Campbell, S.M., J. Braspenning, A. Hutchinson and M. Marshall. 2002. "Research Methods Used in Developing and Applying Quality Indicators in Primary Care." Quality and Safety in Health Care 11: 358-64.
Conway, J. 2008. "Getting Boards on Board: Engaging Governing Boards in Quality and Safety." Joint Commission Journal on Quality and Patient Safety 34(4): 214-20.
Health Quality Council. 2007, April. "RHA Boards Must Lead the Charge on Improving Quality, Safety: Reinertsen." Health Quality Council Review. <https://www.hqc.sk.ca/portal.jsp;jsessionid=5lh5-32fh11?q45RfFNvzgvL+j+W57aw9TBIzBf0QfLQkUwK4QBZaJsNclR3Gq3AV1VvI5thiwzu>.
Institute for Healthcare Improvement. 2003. Move Your Dot: Measuring, Evaluating, and Reducing Hospital Mortality Rates (Part 1) (IHI Innovation Series White Paper). Boston, MA: Institute for Healthcare Improvement.
Institute for Healthcare Improvement. 2006. How Do Hospitals Become Best in Class? Presented at the VHA Georgia CEO Summit on Quality, February 9-10, Georgia.
Kohn, L., J. Corrigan and M. Donaldson. 1999. To Err Is Human: Building a Safer Health System. Washington, DC: National Academy Press.
NHS Quality Improvement Scotland. n.d. NHS Quality Improvement Scotland 2008/09-2010/11 Delivery Plan. Edinburgh, Scotland: Author. Retrieved April 9, 2009. <http://www.nhshealthquality.org/nhsqis/files/NHSQIS_DeliveryPlan_200809-201011.pdf>.
Ontario Health Quality Council. 2006. First Yearly Report. Toronto, ON: Author. Retrieved August 25, 2009. <http://www.ohqc.ca/pdfs/ohqc_report_2006en.pdf>.
Ontario Health Quality Council. 2006. OHQC Reporting Framework: Attributes of a High-Performing Health System. Toronto, ON: Author. Retrieved August 25, 2009. <http://www.ohqc.ca/pdfs/ohqc_attributes_handout_-_english.pdf>.
Peters, W.A. 2008, August 30. "Ten Elements for Creating Solid Health Care Indicators." Quality Digest. Retrieved May 14, 2009. <http://www.qualitydigest.com/magazine/2008/sep/article/ten-elements-creating-solid-health-care-indicators.html>.
Pugh, M. and J.L. Reinertsen. 2007. "Reducing Harm to Patients." Healthcare Executive 22(6): 62, 64-65.
Pulcins, I. 2008. The State of Big Dot Performance Measures in Canada. Ottawa, ON: Canadian Institute for Health Information. Retrieved May 8, 2009. <http://www.qhn.ca/events/OH_Patientsafety_CIHI.ppt>.

About the Authors

Michael Heenan, MBA, CPHQ, is now the director of quality performance and risk management at The Credit Valley Hospital in Mississauga, Ontario. Previously, he worked at St. Joseph's Healthcare Hamilton as the director of the Quality Planning and Performance Improvement Program.
Haajra Khan, MBA, CPHQ, is a performance improvement consultant in the Quality Planning and Performance Improvement Program at St. Joseph's Healthcare Hamilton.
Dorothy Binkley, MBA, CPHQ, worked at St. Joseph's Healthcare Hamilton as a co-op student. She recently completed her MBA at the DeGroote School of Business at McMaster University, in Hamilton, Ontario.
For further information on this article, you may contact Romeo Cercone, VP for quality and strategic planning, St. Joseph's Healthcare Hamilton, at 905-522-1155, ext. 33405, or by e-mail at rcercone@stjoes.ca.


Table 3. SJHH big dot categories and associated indicators

Timely Access
- Total time spent in ED, high acuity (CTAS I, II)
- Total time spent in ED, low acuity (CTAS III, IV, V)
- ED left without being seen
- Mental health outpatient wait times
- Cancer surgery wait times
- Cataract surgery wait times
- MRI wait times
- CT scan wait times
- Readmission rate
- Number of ALC-equivalent beds

Incidents
- Number of serious incidents
- Number of never events
- Seclusion incidents (mental health)

Infections
- Central line infection rate per 1,000 device days
- Clostridium difficile infection rate per 1,000 patient days
- MRSA infection rate per 1,000 patient days
- VRE infection rate per 1,000 patient days
- Surgical site infection prevention rate
- Ventilator-associated pneumonia rate per 1,000 ventilator days

Mortality
- Hospital standardized mortality ratio
- Deaths in acute care
- Deaths in CCC, rehabilitation and mental health

Satisfaction
- Patient satisfaction, acute care
- Patient satisfaction, emergency care
- Patient satisfaction, surgical care
- Patient satisfaction, mental health

ALC = alternative level of care; CCC = complex continuing care; CT = computed tomography; CTAS = Canadian Triage and Acuity Scale; ED = emergency department; MRI = magnetic resonance imaging; MRSA = methicillin-resistant Staphylococcus aureus; SJHH = St. Joseph's Healthcare Hamilton; VRE = vancomycin-resistant Enterococci.
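Several of the infection indicators above are rates normalized to an exposure denominator (patient days, device days or ventilator days). A minimal sketch of that arithmetic, using invented counts, is shown below.

```python
def rate_per_1000(events: int, exposure_days: float) -> float:
    """Events per 1,000 days of exposure (patient days, device days or ventilator days)."""
    return 1000 * events / exposure_days

# Illustrative counts only, not SJHH data.
print(rate_per_1000(events=12, exposure_days=45_000))  # e.g., C. difficile per 1,000 patient days -> ~0.27
print(rate_per_1000(events=3, exposure_days=2_400))    # e.g., central line infections per 1,000 device days -> 1.25
```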
