
Measure for Measure

Developing Benchmarks for Clinical Engineering Activities: A Methodology


Jonathan A. Gaev

In fall 2006, AAMI selected ECRI Institute to perform a study to determine if it is feasible and desirable to develop benchmarks for the activities of hospital-based clinical engineering (CE) departments. The study reported here corresponds to the first phase of the project. The ultimate goal of the project is to enable departments from different institutions to compare their performance.

This study included a review of literature related to benchmarking, field interviews, and an analysis of ECRI Institute's benchmarking projects. ECRI Institute's project also included a series of structured interviews with more than 30 healthcare executives whose job descriptions ranged from CE managers to hospital chief executive officers (CEOs). ECRI Institute then developed criteria to identify a representative set of indicators (RSI) that would highlight the data elements required to establish benchmarks; this step would evaluate the feasibility of collecting the data. We need to emphasize that the purpose of developing this RSI is to evaluate the desirability and feasibility of benchmarking hospital-based CE department activities. The goal of the project was not to develop a final set of recommended indicators for the entire CE community.

The RSI includes the percentage of repairs completed within one working day; total CE cost/device serviced; percentage of preventive maintenance (PM) complete; percentage of technician time spent on maintenance; customer satisfaction; CE department development; and technology management intensity. ECRI Institute proposed that the equipment inventory and hospital parameters representing the frequency of use of the equipment under CE management be used to establish the context in which to interpret the set of indicators to compare CE departments.

Because the RSI was derived primarily from the requirements expressed by decision-makers and is based on data elements that are feasible for most CE departments to obtain, ECRI Institute has concluded that it is feasible and desirable to develop the benchmarks described above.

Methodology
ECRI Institute's method to demonstrate feasibility and desirability is to develop an RSI that could be used as benchmarks, and then show that the indicators are both desirable and feasible. By representative set, we mean that the indicators are meaningful but may not be the final or optimal set of indicators that will be used by the CE community. The final set of indicators may be determined in another project or study. Desirability is determined by identifying the indicators that are of most interest to those who will use the information. Feasibility is determined by the challenges in collecting the data required to determine the indicators under real-life conditions. This study needed to demonstrate both feasibility and desirability, but used desirability as the starting point for the analysis.

The steps in developing the RSI were:

Research: Literature review, interviews, and review of ECRI Institute experience to collect the following:
1. List of indicators
2. Methodology for developing indicators of CE department performance
3. Needs/desires of CE department managers/supervisors for data related to department performance
4. Hospital, department, or inventory characteristics to be considered when comparing indicators
5. Concerns of CE department members
6. Questions for additional investigation
7. CE department activities

Indicator development: From the information obtained during research, we developed a comprehensive list of indicators and applied a logical process developed for this project to narrow this down to a limited, desirable, and feasible RSI.

Validity testing: We conducted a limited check of the RSI by applying several management scenarios to determine how the RSI would respond to real-life problems. Comprehensive data collection and verification was beyond the scope of this project.

Literature Review
We conducted an extensive review of available CE literature and business publications, looking for sources relevant to CE benchmarking. We recorded each indicator, more than 125 in all, exactly as it was stated. There were some significant omissions in the literature; for example, no methodology for developing indicators of CE department performance was found. Almost no literature discussed the needs of hospital managers above the level of CE department director in terms of data related to the department's performance. Although a significant challenge to the adoption of benchmarks is to ensure that fair, apples-to-apples comparisons are made between institutions (i.e., peer groups), we didn't find any literature that described how to develop peer groups. Addressing this point is crucial to the adoption of benchmarks, as organizations won't be motivated to share information if they feel that it won't be used appropriately. These deficiencies in the literature alerted us to information that we would need to obtain through the interview process.

Interviews
We conducted two rounds of interviews. The first was to learn about the concerns of CE department employees and superiors related to benchmarking, and to design the questions used in the second round. Ten people were interviewed in the first round. Their job descriptions included biomedical equipment technician (BMET), clinical engineer, CE manager, director of clinical engineering, and hospital CEO, and also included representatives from third-party service organizations. The institutions, each in a different state, ranged in size from 102 beds to a multi-facility health system with 2,072 beds. The questions that resulted from that process included:
• What information is essential for you to monitor the performance of the CE department?
• How does using measures of performance impact your activities or decisions?
• What information regarding CE department performance is requested by your superiors or peers?
• What are your biggest concerns on benchmarking?

In the second round, we interviewed 22 people (including two who were interviewed in the first round). Their job descriptions included CE manager, director of clinical engineering, vice president of facilities, and members of the executive suite. Among our goals for this second round of interviews was to discover indicators that were not mentioned in the literature.

Summary of Interview Results
Based on our findings and experience, we concluded that hospital managers two levels above the CE director do not typically deal with operational problems of the CE department. They assume that these issues have been handled by the CE director and his or her supervising administrator. The senior hospital managers we spoke with focus on the financial aspects of the CE department and want to know that their hospital is getting good value for the funds expended, judged on an annual basis. From their point of view, CE services would not need to be benchmarked more frequently than that. CE department directors, on the other hand, want their superiors to interpret the value of the CE department in the proper context by only comparing it to the appropriate peer group in other institutions.

ECRI Institute was surprised about what wasn't said in either round of interviews. No group emphasized the CE department's role in patient safety, nor did they mention additional projects that the department did that were not directly related to equipment service (such as investigating a new type of telemetry device or providing equipment planning services for a new emergency department). The emphasis given by all interviewees was that the CE department's goal was to make sure that the equipment was up and running, in good condition, and ready to be used by the clinical staff. We prompted interviewees by asking how the CE department contributes to patient safety. They expected the CE department to keep the equipment safe by performing preventive maintenance, attending Environment of Care meetings, and handling recalls.

Additional Indicators
ECRI Institute's Health Systems Group regularly conducts onsite reviews of CE department performance. For many years, ECRI Institute also sold and serviced a computerized maintenance management system (Hospital Equipment Control System, now sold through Phoenix Data Systems in their AIMS.NET product). It continues to sell and support the Inspection and Preventive Maintenance System. Both systems are still used in hundreds of hospitals. From those experiences and others, we are aware of indicators of CE department performance currently used in the field. We added those indicators to the list, together with indicators mentioned by the interviewees, bringing the total to more than 140 indicators. In the database we noted the source for each indicator. We removed redundant indicators; indicators that could easily be calculated from other indicators (for example, if we know the repair cost per device, we can calculate the repair cost per 100 devices); and indicators that were mentioned in the literature but not by the interviewees or identified by ECRI Institute. The final list contained 42 indicators, so further analysis was required to identify the RSI and evaluate the indicators for desirability and feasibility.

Definitions of Benchmarking and Related Concepts

Benchmark. A reference value for an indicator. The reference value may be asserted (many hospitals set a benchmark of 100% for the monthly PM completion rate of life-support equipment, a rate mandated by the Joint Commission for this type of equipment) or established using other methods such as internal or external benchmarking.

External Benchmark. The reference value for the indicator is established by comparing the performance of one institution to its peer group (see below). For example, if CE departments at similar institutions achieve an average monthly PM completion rate of 95% for general medical equipment, institutions within that peer group may use 95% as a benchmark to measure the performance of their CE departments.

Internal Benchmark. The reference value for the indicator is established by comparing the performance of the institution to itself. For example, if a CE department achieved an average monthly PM completion rate of 97% in 2006, it may choose to use 97% as the reference value for its 2007 monthly PM completion rate.

Indicators. Numbers that measure an activity of a CE department. An example is the monthly PM completion rate (percentage of scheduled PMs that are completed in a month).

Peer Group. A group of institutions with similar characteristics that are relevant to the benchmarking indicators. For example, the number of pieces of medical equipment and the specific types (general biomedical, clinical laboratory, imaging, etc.) in the inventory managed by the CE department may be important factors in establishing a peer group.
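To make these definitions concrete, the following sketch (not part of the original study; all names and numbers are hypothetical) computes the monthly PM completion rate for one department and compares it against an asserted, an internal, and an external benchmark.

```python
# Illustrative sketch of the benchmarking concepts defined above.
# All names and numbers are hypothetical examples, not study data.

def pm_completion_rate(completed: int, scheduled: int) -> float:
    """Monthly PM completion rate as a percentage (an 'indicator')."""
    return 100.0 * completed / scheduled

# This month's performance for one hypothetical CE department.
indicator = pm_completion_rate(completed=188, scheduled=200)   # 94.0%

benchmarks = {
    "asserted (Joint Commission, life support)": 100.0,
    "internal (own 2006 average)": 97.0,
    "external (peer-group average)": 95.0,
}

for name, reference in benchmarks.items():
    gap = indicator - reference
    print(f"{name}: reference {reference:.1f}%, actual {indicator:.1f}%, gap {gap:+.1f}%")
```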


Selection Criteria
Combining the information that we had gathered, we developed a list of characteristics that would be required for the RSI. Each indicator must be:
• Independent of the size of the institution.
• Meaningful and desirable to both the CE department director and supervisor.
• Revealing as internal and external benchmarks.
• Based on data easily collected by the CE department.
• Intuitive in meaning.
• Easy to communicate to the senior management team and clinicians.

Preference was given to indicators that are already used by many institutions, as it is a significant advantage to know the existing strengths and weaknesses of an indicator and to know that the data can be collected and have proven useful.

The RSI, taken together as a set of indicators, must:
• Demonstrate the tradeoffs between performance and cost.
• Address the majority of time spent on CE department activities.

Representative Set of Indicators

The list of 42 indicators was reviewed by two former managers of CE activities, a senior member of ECRI Institute's Health Systems Group, and ECRI Institute's technical director and vice president of health technology evaluation and safety. To ensure that we developed a feasible set of indicators, the review team confirmed that the data required for each indicator typically resided in systems under the control of the CE department, could easily be calculated by the CE department, or could be obtained from other departments with a modest effort. This was an iterative process, as we needed first to ensure that each individual indicator met the selection criteria and then to ensure that the set of indicators as a whole met the selection criteria. In some cases, the definition that was initially stated for an indicator was revised so that the resulting indicator would use data more easily acquired by the majority of CE departments. The indicators are summarized in Table 1.

One of the RSI requirements is that the set represents the majority of CE activities. We compared the proposed indicators with the job responsibilities described in the Journal of Clinical Engineering's salary survey. Table 1 also shows that all activities are covered and that some indicators reflect the impact of more than one activity. We needed to create an indicator that addressed the CE department's technology management activities, as those efforts occupy about 15% of the CE department's time (Journal of Clinical Engineering 2005). This indicator, Technology Management Intensity, is described in the next section.

In this study, we assume that the data are collected for benchmarking purposes on an annual basis. The indicators presented in Table 1 are summarized below. For each indicator, several other candidates were also considered but not included in this presentation. We recognize that if, as a result of the first phase of AAMI's study, a set of indicators is proposed to the CE community for benchmarking CE activities, additional discussion regarding the definitions of each indicator will be required.

To facilitate comparisons among CE departments, each department would need to prepare a complete inventory of all medical equipment under its management (including equipment maintained under service contracts and time-and-materials service provided by outside vendors). The inventory would need to be separated into three categories: general biomedical equipment, imaging equipment, and clinical laboratory equipment. To develop a central database of information, we recognize that more specific definitions regarding the equipment are needed (for example, some CE departments maintain hospital beds and some don't). We believe that the abbreviated definitions presented here are sufficient to determine the desirability and feasibility of benchmarking CE department activities.

The Indicators
% repairs completed in one working day = [number of CM events completed in one working day / total number of CM events] * 100%

This indicator requires that an accurate start and stop time and date be entered for each repair (also called corrective maintenance, or CM). For the purpose of this project, we considered that the time spent on the repair also includes activities required to coordinate outside services and purchasing/requisitions related to the repair. If this indicator is used for benchmarking, CE departments that enter work order information only when they return the equipment to service would need to change their practice. They would need to either enter the dates and times of when the repair began and ended and when the equipment was returned to service, or they would need to create a user-defined field so that the technician could declare that the repair was (or was not) completed within one working day. We would have preferred using an indicator that reflected all of the time that the equipment was unavailable to the clinician, starting when the call was received by the CE department and ending when the equipment was returned to service. We did not propose starting the clock with the receipt of the call, however, as many CE departments are not able to record this precise data.

AAMI's Involvement in Indicator Development
AAMI has been involved in the exploration of CE department indicators for many years. Ted Cohen published the results of the first year of AAMI's Validating Metrics pilot project (BI&T, Jan/Feb 1997). He proposed three indicators for repair and maintenance services: ratio of service cost to acquisition cost, repair requests completed per device, and average turnaround time per repair. To validate the indicators, he requested data from 100 clinical engineers, but only received satisfactory data from seven institutions, which was insufficient for validation. The ratio of service cost to acquisition cost has become widely cited in the literature and is used by many third-party service organizations and some CE departments to benchmark their programs.

Total CE cost / device serviced = total cost for all CE activities / total number of devices receiving service

The denominator, devices serviced, includes all devices that received service through the CE department. The total CE cost includes administrative and management costs, CE department costs not directly related to service, and service contracts. Data from various institutions can be more accurately compared if the internal labor costs are based on salaries without applying overhead, as the overhead varies with each institution.

We did not select the service cost ratio (service cost / device acquisition cost) as an indicator because the research conducted confirms that many CE departments cannot easily obtain the device acquisition cost for the entire inventory under their management. The indicator had desirability, but not feasibility.

% PM complete = [# PM events completed / # PM events scheduled] * 100%

According to our interviews, this indicator is very important to many senior healthcare managers, as they associate safety with making sure that PM is up to date. They also know that equipment needs to be appropriately maintained to comply with Joint Commission requirements. ECRI Institute recommends that CE departments review their PM intervals for each device category, as some departments may find that they could discontinue PM for some types of devices without increasing their failure rate. We have included % PM complete in the representative set of CE department indicators to ensure that the set of indicators appears credible to the senior management team.

% Technician time spent on maintenance = 100% * [Time spent on inspection, incoming testing, PM, and corrective maintenance] / [2,080 hours * number of technicians]

The time spent on maintenance activities also includes coordinating outside services and purchasing/requisitions related to maintenance. To facilitate comparison among facilities, ECRI Institute applied a neutral standard of 2,080 hours (52 weeks * 40 hours/week), but recognizes that this will vary depending on each institution's policies for breaks, vacation time, etc. Technicians spend time on useful activities beyond equipment maintenance, so this indicator will not reach 100%. We recognize that clinical engineers and some CE department directors also spend time maintaining equipment, but did not include their efforts in this indicator as we felt that focusing on BMET maintenance activities would make it easier to collect the information required and interpret the results.

Figure: The methodology used by ECRI Institute in phase 1 of the study.
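As a rough illustration of how the four formula-based indicators above might be computed from exported work-order records, here is a minimal sketch; the field names, figures, and the calendar-day treatment of "one working day" are assumptions for illustration, not a real CMMS schema.

```python
# Hypothetical sketch: computing the CMMS-based indicators from exported
# work-order records. Field names and figures are illustrative only.
from datetime import datetime, timedelta

work_orders = [
    # type: "CM" (repair) or "PM"; times are strings as exported from a CMMS
    {"type": "CM", "start": "2007-03-01 09:00", "end": "2007-03-01 15:30", "labor_hours": 2.5},
    {"type": "CM", "start": "2007-03-02 10:00", "end": "2007-03-05 11:00", "labor_hours": 6.0},
    {"type": "PM", "scheduled": True, "completed": True, "labor_hours": 1.0},
    {"type": "PM", "scheduled": True, "completed": False, "labor_hours": 0.0},
]
total_ce_cost = 410_000.0   # annual cost: labor, parts, contracts, administration
devices_serviced = 2_050    # all devices receiving service through the CE department
technicians = 4             # number of BMETs

def within_one_working_day(order) -> bool:
    # Calendar-day approximation of "one working day".
    start = datetime.strptime(order["start"], "%Y-%m-%d %H:%M")
    end = datetime.strptime(order["end"], "%Y-%m-%d %H:%M")
    return end - start <= timedelta(days=1)

cm = [o for o in work_orders if o["type"] == "CM"]
pm = [o for o in work_orders if o["type"] == "PM" and o["scheduled"]]

pct_repairs_one_day = 100.0 * sum(within_one_working_day(o) for o in cm) / len(cm)
cost_per_device = total_ce_cost / devices_serviced
pct_pm_complete = 100.0 * sum(o["completed"] for o in pm) / len(pm)
# Numerator: recorded PM and CM labor hours; denominator: the neutral 2,080-hour year.
pct_tech_time_on_maintenance = 100.0 * sum(o["labor_hours"] for o in work_orders) / (2080 * technicians)

print(f"% repairs completed in one working day: {pct_repairs_one_day:.1f}%")
print(f"Total CE cost/device serviced: ${cost_per_device:.2f}")
print(f"% PM complete: {pct_pm_complete:.1f}%")
print(f"% technician time spent on maintenance: {pct_tech_time_on_maintenance:.2f}%")
```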


Customer satisfaction (5-point scale)

The customer is usually defined as the clinical staff. Many institutions already have a customer satisfaction survey that meets their needs, and we do not recommend changing established surveys. These survey results are likely to be most helpful for internal benchmarking. Instead of prescribing a single survey or set of questions to be used by all institutions, we propose that the CE director have the results of the facility's internal survey translated into a single numeric score, derived from a 5-point scale where 1 = poor, 3 = average, and 5 = exceptional.

CE department development = hours spent on development activities per year / [# of BMETs + clinical engineers + CE department managers]

CE department development applies to the hospital staff of the CE department. These activities include attending on- or off-site training courses, professional meetings, conferences, and other events that improve staff skills.

Technology Management Intensity = [# hours spent on these activities in one year / total number of working hours for all CE department employees in one year] * 100%

Technology management intensity represents activities that contribute to patient care but have not been captured in the other categories. ECRI Institute prepared a list of technology management activities based on our review of the literature; direct experience reviewing the performance of CE activities; and development of policy and procedures for technology management, inspection, preventive maintenance and repair, and risk management. CE departments can review the list of technology management activities in Table 2 and record the time spent on those activities during the year.
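A minimal sketch of the two time-based indicators just described, assuming the department keeps an annual log of hours by activity category (the categories, hours, and staff count are hypothetical):

```python
# Hypothetical sketch of the CE department development and technology
# management intensity indicators, based on an assumed annual time log.

annual_hours = {
    "maintenance": 6300,            # inspection, incoming testing, PM, CM
    "development": 160,             # training courses, conferences, professional meetings
    "technology_management": 1050,  # activities from the Table 2 list
    "other": 900,
}

staff_count = 5                     # BMETs + clinical engineers + CE department managers
total_working_hours = 2080 * staff_count   # assumes the same 2,080-hour year as above

ce_development_per_person = annual_hours["development"] / staff_count
tech_mgmt_intensity = 100.0 * annual_hours["technology_management"] / total_working_hours

print(f"CE department development: {ce_development_per_person:.1f} hours/person/year")
print(f"Technology management intensity: {tech_mgmt_intensity:.1f}%")
```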

Peer Groups
Benchmarking requires that CE department performance in one institution be compared to similar CE departments in other institutions. To show that it is feasible to benchmark CE department activities, we needed to identify the information required to establish peer groups for CE departments and to show that it is feasible to obtain that information. Two parameters are required to establish peer groups: the equipment inventory (the number of items for imaging, clinical laboratory, and general medical equipment) and the frequency of use for that equipment. For example, if two institutions are both 250-bed facilities with equally skilled staff and similar equipment responsibilities, but one has twice as much imaging equipment as the other, we would expect their Total CE cost/device to differ substantially. If two institutions both have 250 beds, offer similar services, and have similar types and quantities of equipment, but one has many more in-patient admissions and out-patient visits than the other, we would also expect their Total CE cost/device to differ.

To obtain the inventory information, each institution would need to organize that information in a standard format and agree to share it with other institutions (the data would be reported in a confidential manner to ensure anonymity). This could be accomplished by establishing a central database accessible to all institutions who contribute, a project that ECRI Institute believes to be feasible. A rough estimate puts the annual cost at less than $50 per hospital, assuming that half of the hospitals in the United States purchase the information.

We have several examples of implementing this concept. ECRI Institute has designed and maintained databases of confidential hospital information that is shared among other institutions regarding the cost of equipment and reports of problems with medical devices. The Department of Health for the State of Pennsylvania collects information describing CTs and MRIs, the number of cardiac catheterization laboratories, and the utilization of many medical services for hospitals in Pennsylvania. Since databases containing the types of information required to determine equipment inventory have been developed, we conclude that it is feasible to obtain information for the types of hospital equipment.

The frequency of use of the equipment will also impact service activities, but direct measures of frequency of use are not available. For the purpose of determining feasibility, we believe that the frequency of use may be related to hospital parameters that are found in the American Hospital Association (AHA) guide, such as the number of medical services offered, in-patient data (number of admissions), and the number of out-patient visits. Further studies would be required to demonstrate the correlation between equipment maintenance parameters (such as the number of failures) and the related hospital parameters. Since the AHA information is readily available, we conclude that it is feasible to obtain information for the intensity of equipment use.
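To illustrate how the two peer-group parameters could be combined, the sketch below assigns hypothetical hospitals to groups by total inventory size and an AHA-style utilization measure; the thresholds and records are invented and would need to be validated against real data.

```python
# Hypothetical sketch of peer-group assignment from equipment inventory
# and AHA-style utilization data. Thresholds and records are invented
# for illustration only.
from collections import defaultdict

hospitals = [
    {"name": "A", "general": 2400, "imaging": 60, "lab": 150, "admissions": 11000, "outpatient_visits": 90000},
    {"name": "B", "general": 2600, "imaging": 55, "lab": 140, "admissions": 12500, "outpatient_visits": 95000},
    {"name": "C", "general": 7800, "imaging": 190, "lab": 420, "admissions": 38000, "outpatient_visits": 310000},
]

def peer_group(h) -> tuple:
    inventory = h["general"] + h["imaging"] + h["lab"]
    utilization = h["admissions"] + h["outpatient_visits"]
    inventory_band = "small inventory" if inventory < 5000 else "large inventory"
    utilization_band = "low use" if utilization < 150000 else "high use"
    return (inventory_band, utilization_band)

groups = defaultdict(list)
for h in hospitals:
    groups[peer_group(h)].append(h["name"])

for group, members in groups.items():
    print(group, members)
```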

Testing the Indicators

We tested the indicators by proposing hypothetical scenarios that are likely to occur in practice, recording how the indicators change based on the scenario, and evaluating those changes to see if they are consistent with our practical experience. If the indicators did not respond to the scenario, they would be judged to be poor indicators and new indicators would need to be selected. The following simple scenarios are provided for illustrative purposes to represent some of the management scenarios that were considered.

Scenario 1
Management challenge: Your expenses are too high. Reduce your staff!

Result 1: Looking at the technician productivity, it appears that the CE department can reduce staffing by one technician, so one BMET was laid off. Here's the impact:

Indicator: Impact
*% repairs completed within one working day: No Change
*Total CE cost/device serviced: Decreased
% PM complete: No Change
% Technician time spent on maintenance: Increased
Customer satisfaction: No Change
CE department development: No Change
Technology management intensity: No Change

The remaining technicians were able to handle the additional workload, as reflected by the increase in the % technician time spent on maintenance. The PMs that were scheduled were completed on time, and there was no change in the percentage of repairs that were completed in one working day. The overall expenses for the CE department decreased, so the Total CE cost/device decreased. The department really did have too many people. Reducing staff saved money and didn't adversely affect performance.

Result 2: Looking at the technician productivity, it appears that the CE department can reduce staffing by one technician, so one BMET was laid off. Here's the impact:

Indicator: Impact
*% repairs completed within one working day: Decreased
*Total CE cost/device serviced: Decreased
% PM complete: No Change
% Technician time spent on maintenance: Increased
Customer satisfaction: Decreased
CE department development: No Change
Technology management intensity: No Change

In this case, the remaining technicians were not able to handle the additional workload. They spent more time on CM and PM, which was reflected by the increase in % technician time spent on maintenance. Since senior management at the hospital and the Joint Commission place a very high priority on the PM completion rate, the department made sure to complete all of its PMs on time, so there was no change in the % PM complete. Laying off the BMET decreased the overall expenses for the CE department, as reflected in the decrease in Total CE cost/device. The technicians fell behind on their repair work, as shown in the decrease in % repairs completed within one working day, which led to a decrease in customer satisfaction. The department did not have too many people. Reducing staff saved money but adversely affected performance.

Scenario 2
Management challenge: The head of nursing wants to reduce medical errors. Do something!

Result: The CE department spent more time with the procurement process and convinced the hospital to standardize on a particular model of infusion pump.


They also implemented a smart pump system in the areas with the most critical patients. This effort was led by the clinical engineer, who was assisted by one of the BMETs. Here's the impact:

Indicator: Impact
*% repairs completed within one working day: No Change
*Total CE cost/device serviced: No Change
% PM complete: No Change
% Technician time spent on maintenance: No Change
Customer satisfaction: Increased
CE department development: Decreased
Technology management intensity: Increased

The BMET spent a bit less time servicing equipment but still got his work done. The clinical engineer was finishing up an equipment planning project for the emergency department and took on the smart pump project when the other project was finished. The smart pump project required more effort than was anticipated, so the clinical engineer was not able to spend as much time on CE department development activities, which adversely impacted CE department development. For now, the nursing department is pleased with the CE department's response, which is reflected in their increased Customer satisfaction score.

Discussion
The overall goal of phase 1 of this project is to identify the desirability and feasibility of developing benchmarks for CE activities in U.S. hospitals. We spoke with many healthcare professionals while conducting this study, and all of them agreed that it would be helpful to have meaningful benchmarks of CE department performance and that the existing measures were not sufficient. We conclude that benchmarking of CE department activities, primarily maintenance activities, is desirable.

Indicator | CE Department Activity from the Journal of Clinical Engineering Article | Data Source
*% repairs completed within one working day | Repairs | Computerized maintenance management system (CMMS)
*Total CE cost/device serviced | All activities are considered in the cost calculation | CMMS
*% PM complete | Scheduled PM and safety testing | CMMS
% Technician time spent on maintenance | Repairs; scheduled PM and safety testing; incoming testing; coordinating outside services, purchasing/requisitions | CMMS and CE department work time recording system
Customer satisfaction | Most activities | CE department or hospital survey
CE department development | Most activities may be affected | CE department work time recording system
Technology management intensity | Clinical support, design, modifications, research and development, incident investigation/risk management | CE department work time recording system

Table 1. The representative set of indicators and data sources. Indicators with an asterisk are reported for the following equipment categories: imaging, clinical laboratory, and general hospital equipment (such as physiological monitors and infusion pumps).


Technology Evaluation:
• Support of hospital research activities through device design, direct device operation, and other technical support

Technology Management:
• Centralized service contract management
• Clinical department rounds
• Other support for clinical departments

Special Activities:
• Attend and support capital planning meetings and functions
• Development of the CE profession and policy influence (e.g., participation in and leadership of industry associations and standards-setting organizations)
• Community outreach/education
• Device design and development
• Research support
• Wireless network management and support
• Support of specific patient safety activities
• Health information technology and medical device integration
• Incident investigation/risk management
• Hazard and recall management

Technology Planning and Assessment:
• Assistance in equipment standardization efforts
• Providing capital budget assistance
• Supporting new technology forecasting, assessment, and planning efforts
• Attend/participate/manage technology assessment programs
• Conduct/manage technology assessments
• Conduct/manage comparative device evaluations
• Capital equipment process, including multi-year planning
• Systems analysis and support (e.g., technology integration)
• Equipment planning

Table 2. Technology Management Intensity Table (categories and activities).


The key challenge was to find out what is feasible. Benchmarking requires that hospitals provide specific data. Only a program in which it is easy to participate will be feasible, and management (either at the CE director level or above) must allocate the required resources. Because of this, an expensive data collection program will not succeed, and we proposed using indicators that require data that can be obtained from current information sources and systems available at small hospitals (fewer than 200 beds).

Maintenance activities were an obvious focus for this study. Maintenance activities represent the majority of the CE department budget and therefore get more attention from senior managers than other CE department activities. ECRI Institute strongly believes that there are many other activities that may be performed by a CE department that have significant value for the institution, especially activities that improve patient safety, but we recognize the great difficulty in quantifying those other activities and do not feel that meaningful indicators can easily be developed.

In the real world, even with very precise systems for collecting information, the data collected will have some errors. Making sure that the definitions are consistently applied regarding the equipment to be included in the inventory and the services performed is hard to achieve. In practice, the data do not need to be perfect to be useful; benchmarking helps CE managers focus their analysis on the right areas and use their time effectively.

The Veterans Health Administration has effectively used benchmarking data for more than 20 years, even though the data they receive are not perfect. Based on a report that they receive of the performance of VHA institutions, their CE managers identify areas where the performance of their department differs significantly from that of their peer group. Once those areas are identified, the CE manager knows that it will be worthwhile to find out the reasons for the variance, as he or she may discover that changes can be made to improve practices and obtain better results.
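A sketch of how such a variance report might work, assuming a department's indicator values and peer-group means and standard deviations are available (all figures are hypothetical):

```python
# Hypothetical sketch of flagging indicators that differ substantially
# from the peer group, in the spirit of the VHA report described above.

peer_stats = {
    # indicator: (peer-group mean, peer-group standard deviation)
    "% repairs completed within one working day": (72.0, 8.0),
    "Total CE cost/device serviced": (210.0, 35.0),
    "% PM complete": (96.0, 2.5),
}

department = {
    "% repairs completed within one working day": 55.0,
    "Total CE cost/device serviced": 205.0,
    "% PM complete": 97.0,
}

THRESHOLD = 1.5  # flag values more than 1.5 standard deviations from the peer mean

for indicator, value in department.items():
    mean, std = peer_stats[indicator]
    z = (value - mean) / std
    flag = "INVESTIGATE" if abs(z) > THRESHOLD else "ok"
    print(f"{indicator}: {value} (peer mean {mean}, z = {z:+.1f}) -> {flag}")
```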

It would be very interesting to gather data and then test the relationships between the indicators. Testing hospital data would enable us to know if hospitals with higher levels of equipment use (as measured by the number of in-patient and out-patient admissions) really do have higher maintenance costs. That type of knowledge would enable us to confirm the parameters used to establish meaningful peer groups and therefore to develop effective benchmarks for performance within those groups, ensuring a good apples-to-apples comparison. The final and most persuasive test of indicators, though, is whether they are used. If managers find that the indicators can truly be used as benchmarks and that benchmarking helps them to improve their processes and decision-making, then benchmarking would be shown to be useful (as well as feasible and desirable).
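For example, once data were gathered, the relationship between equipment use and maintenance cost could be checked with a simple correlation test, as in the sketch below (hypothetical data; requires Python 3.10+ for statistics.correlation):

```python
# Hypothetical sketch: testing whether equipment use (in-patient plus
# out-patient volume) correlates with annual maintenance cost. Data invented.
from statistics import correlation  # available in Python 3.10+

equipment_use = [95000, 120000, 210000, 340000, 400000]      # in-patient + out-patient volume
maintenance_cost = [310000, 360000, 520000, 780000, 900000]  # annual CE maintenance cost ($)

r = correlation(equipment_use, maintenance_cost)  # Pearson's r
print(f"Correlation between equipment use and maintenance cost: r = {r:.2f}")
```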

Conclusion
We have developed and applied a logical, transparent methodology to the development of CE activity indicators, producing a set of representative indicators that can be used to compare performance across the majority of the activities performed by CE departments. These indicators were selected to demonstrate the feasibility and desirability of benchmarking CE department performance. Although we believe that they are helpful indicators of CE department performance, we do not intend for them to be interpreted as the recommended set of indicators. We want to collect ideas from the CE community on how to develop an RSI. Please send your comments on the development of the RSI, as well as any other comments that you have regarding this study, to the author at jgaev@ecri.org.
Jonathan A. Gaev, MSE, CCE, HEM, PMP, is the director of technical programs, Health Devices Group, ECRI Institute.

Acknowledgments
ECRI Institute's team included Jonathan Gaev, Jim Keller, Harvey Kostinsky, Rob Maliff, Tim Ritter, and Jonathan Treadwell. Significant contributions were also made by Mark S. Brody, CCE, of the Veterans Health Administration (VHA), Department of Veterans Affairs; Phil Englert of Catholic Health Initiatives; and Malcolm Ridgway of Masterplan.

References
AAMI. Design of Clinical Engineering Quality Assurance and Risk Management Programs. 1990.
American College of Clinical Engineering. ACCE Body of Knowledge Survey Results-2005. Submitted June 20, 2006.
American College of Clinical Engineering. Guideline for medical equipment management programs (MEPs). Published January 12, 2006.
American Hospital Association. AHA Guide, 2005 edition (based on data collected as of June 30, 2004). Health Forum LLC, an affiliate of the American Hospital Association, 2004.
ANSI/AAMI EQ56:1999. Recommended practices for a medical equipment management program.
American Hospital Association. Estimated Useful Lives of Depreciable Hospital Assets, revised 2004 edition. Health Forum Inc., 2004.
Bates et al. The impact of computerized physician order entry on medication error prevention. Journal of the American Medical Informatics Association. 6(4):319.
ECRI Institute experience and HECS™ reports.
Autio DD, Morris RL. Clinical engineering program indicators. The Biomedical Engineering Handbook, 2nd Ed. (2000):170-1 to 170-9.
Barber F, Strack R. The surprising economics of a people business. Harvard Business Review (June 2005):81-90.
Bauld TJ. Productivity: standard terminology and definitions. Journal of Clinical Engineering. 12(2):139-145.
Brown S, et al. S-Business: Defining today's technology services business. ASFM, August 2005.
Campbell S, Piotrowski MB, Diez C. Salary survey: benchmarking your employment information. Biomed Instrum Technol. 37(6):398-404.
Cohen J. Statistical Power Analysis for the Behavioral Sciences, 2nd Ed. Lawrence Erlbaum Associates, 1988.
Cohen T, et al. Benchmark indicators for medical equipment repair and maintenance. Biomed Instrum Technol. 29(4):308-321.
Cohen T. Validating medical equipment repair and maintenance metrics: a progress report. Biomed Instrum Technol. 31(1):23-32.
David Y, Rohe D. Clinical engineering program productivity and measurements. Journal of Clinical Engineering. 11(6):435-443.
ECRI Institute. Guidance article: best practices for health technology management. Health Devices. 35(12):437-448.
Fennigkoh L. Cost-effectiveness and productivity. The Clinical Engineering Handbook (2004):199-202.
Flex Monitoring Team Briefing Paper No. 7. Financial Indicators for Critical Access Hospitals. University of Minnesota, University of North Carolina at Chapel Hill, University of Southern Maine (May 2005). Available at http://www.flexmonitoring.org.
Fotopoulos M. Are you benchmarking yet? 24x7 (December 2006):24-31.
Furst E. Productivity and cost-effectiveness of clinical engineering. Journal of Clinical Engineering. 11(2):105-113.
Gater L. The business of running an in-house biomed program. 24x7. (Accessed Sept. 21, 2006).
Gordon GJ. Breakthrough Management: A New Model for Hospital Technical Services. Association for the Advancement of Medical Instrumentation, 1995.

Haas J. Tips from the field: how to strengthen your customer service program. Biomed Instrum Technol. 36(4):231-236.
Halept VA. Department benchmarks. Health Facilities Management (May 2003):28-29.
Hansen DK, Hansen L. A new perspective on clinical technology management. Biomed Instrum Technol. 37(3):181-189.
Health Plan Employer Data and Information Set 2007. National Committee for Quality Assurance, Washington, DC, 2006.
Hinesly D. Safety in numbers. 24x7. (Accessed Sept. 21, 2006).
Health Facilities Management. May 2003, p. 28. Available at http://www.hfmmagazine.com.
Hertz E. Developing quality indicators for a clinical engineering department. Plant, Technology and Safety Management Series: Measuring Quality in PTSM. Chicago: Joint Commission on Accreditation of Healthcare Organizations (1990):29-33.
Joint Commission Primer on Indicator Development and Application: Measuring Quality in Healthcare.
Joint Commission on Accreditation of Healthcare Organizations. Comprehensive Accreditation Manual for Hospitals: The Official Handbook. 2007.
Joint Commission on Accreditation of Healthcare Organizations. A Comprehensive Review of Development and Testing for National Implementation of Hospital Core Measures.
Kaplan RS, Norton DP. The Balanced Scorecard: Translating Strategy into Action. Harvard Business School Press, 1996.
The Leapfrog Group Hospital Quality and Safety Survey. June 8, 2006, V3.2.1.
Stiefel RH. Developing an effective inspection and preventive maintenance program. Biomed Instrum Technol. 36(6):405-408.
Stiefel RH. How to Be in Complete and Continuous Compliance with the JCAHO Standards. AAMI, 2004, p. 19-31.
Tackel IS, et al. Focus on: Thomas Jefferson University Hospital, Department of Biomedical Instrumentation. Journal of Clinical Engineering. 18(6):501-509.
Thomas R. 2005 survey of salaries and responsibilities for hospital biomedical/clinical engineering and technology personnel. Journal of Clinical Engineering. October/December 2005:229-224.
Triola M. Elementary Statistics, 8th Ed. Addison Wesley, NY, 2001.
VHA Directive 2006-15. Benchmarking VHA biomedical engineering operations. March 27, 2006, Department of Veterans Affairs, Washington, DC.
Wang B, et al. Global failure rate: a promising medical equipment management outcome benchmark. Journal of Clinical Engineering. July/September 2006:145-151.
Wang B. Who drank my wine? Biomed Instrum Technol. 40(6):418.

FOR MORE ON BENCHMARKING: please see "On Sculpture, Baseball, and Benchmarking" on page 332.
