Healthcare Policy 3(4) May 2008: e160-e174. doi:10.12927/hcpol.2008.19898
Research Papers

Adapting the Balanced Scorecard for Mental Health and Addictions: An Inpatient Example

Elizabeth Lin and Janet Durbin

Abstract

The Balanced Scorecard (BSC) is a performance-monitoring framework that originated in the business sector but has more recently been applied to health services. The province of Ontario is using the BSC approach to monitor quality of inpatient care in five service areas. Feasibility of the scorecard framework for each area has been assessed using a standard approach. This paper reports results of the feasibility study for the mental health sector, focusing on three issues: framework relevance, underlying strategic goals and indicator selection. Based on a literature review and extensive stakeholder input, the BSC quadrant structure was recommended with some modifications, and indicators were selected that aligned with provincial mental health reform policy goals. The mental health report has completed two cycles of reporting and has received good support from the field.

The Balanced Scorecard (BSC) is an approach increasingly used to monitor the performance of healthcare systems. Introduced and developed by Kaplan and Norton (1992, 1996) for the business sector, the BSC rests on the premise that a company should be evaluated on its progress towards its strategic objectives using both traditional financial measures and measures in three other areas: customer perspective, internal business processes, and organizational learning and growth. Indicators should measure organizational performance on key strategic objectives in all four areas and should be composed of component parts that reflect specific company practices. The BSC thus provides both a comprehensive picture of a business's progress and a guide for targeting interventions. Its advantages have been widely described (Kaplan and Norton 1992, 1996; Meyer 2002).

Baker and Pink (1995) proposed a strategy for adapting the BSC to healthcare organizations (also see Pink et al. 2001; Zelman et al. 2003), and many examples of its use have been reported (Wolfersteig and Dunham 1998; Griffith et al. 2002; ten Asbroek et al. 2004; Auger and Roy 2004; Yang and Tung 2006; Inamdar et al. 2002). However, mental health and addictions have lagged behind other health sectors in adopting the BSC, as evidenced by few published case examples (Coop 2006; Santiago 1999; Schmidt et al. 2006). 

An opportunity to use the BSC for monitoring mental health and addictions inpatient services in Ontario, Canada, emerged in the late 1990s as part of a larger initiative to develop inpatient scorecards for acute, emergency, rehabilitation and complex continuing care (HRRC 2007). All these report cards were based on the BSC framework. A decision was made to add mental health and addictions inpatient care to the suite, pending a feasibility study. This paper describes the process and results of that study, focusing on three issues: assessment of framework relevance, strategic goal selection and indicator selection.

Study Background

The Hospital Report Research Collaborative (HRRC) was a partnership between academic centres, hospital stakeholders and the Ontario Ministry of Health and Long-Term Care, formed in 1997 to develop relevant and scientifically valid report cards for monitoring the quality of inpatient care. The HRRC (2007) focused first on acute care and expanded quickly to other health sectors. Its reports were based on the BSC framework for healthcare organizations proposed by Baker and Pink (1995), who re-conceptualized the four quadrants as clinical utilization and outcomes; system integration and change (the equivalent of Kaplan and Norton's innovation and learning quadrant); patient satisfaction; and financial performance and condition. Report card development for all sectors followed the same sequence of determining feasibility, recommending indicators and then reporting indicators at the regional level before shifting to individual organization reporting.

A study was funded to develop a mental health hospital report following these three steps. The team formed a multi-stakeholder advisory panel to provide consultation and advice throughout the study. Additionally, the team attended HRRC meetings to maintain as much consistency as possible with the definitions and methods used in the other sectors. In developing its report, the study team encountered three significant problems that required resolution. The first was that the four quadrants described by Baker and Pink (1995) did not match commonly used categories in mental health monitoring. The second was the need to identify the strategic objectives that could be measured by the report card. The third was to select indicators that, based on strategic objectives, were meaningful, valid and feasible. This paper reviews each issue, first describing the method used and then reporting the decisions. A final section briefly summarizes the resulting framework and indicators. 

Framework Relevance

A major challenge in applying the BSC to mental health and addictions care is the already long-standing tradition of performance measurement in mental health. One of the oldest and best-known US examples is the Mental Health Statistics Improvement Program (MHSIP) developed by the National Institute of Mental Health (Leginski et al. 1989). The MHSIP began with a focus on the information needed to manage and deliver high-quality mental healthcare in adult community services but has expanded to include children, youth and inpatient services (MHSIP 2007). Data sources were initially administrative but eventually included a consumer survey (MHSIP Task Force 1996). Indicators assessed performance in five areas: access, appropriateness/quality, outcomes, participation and continuity. The MHSIP has continued to evolve through developing and piloting mental health service indicators (Lutterman et al. 2003) and producing mental health quality reports (Ganju 2006; Smith and Ganju 2006).

The distinct approaches represented by the MHSIP and the BSC posed a dilemma when choosing or developing an approach for mental health performance monitoring. Toolkits and frameworks such as the MHSIP provide a common language and are generally accepted within the mental health community, whereas the BSC, because of its business management origins, is relatively foreign and thus risks poor credibility with mental healthcare providers (Coop 2006). However, because of its growing application to healthcare monitoring, especially in Ontario, adoption of the BSC provided an opportunity for mental health and addictions to share a common language with other healthcare measurement strategies. 

The study team reviewed mental health monitoring frameworks to assess for similarities to the BSC. A snowball sampling method was used. A list of key performance initiatives (such as the MHSIP) was created through a literature review and suggestions from international experts. Follow-up on this list led to identification of other candidates. Search criteria included

  • focus on mental health and addictions sectors;
  • development and refinement using wide stakeholder participation;
  • coverage of a range of service and system functions consistent with those available in Ontario, and an underlying health system similar to that of Ontario;
  • detailed descriptions of the framework, including a rationale and discussion of limitations; and
  • inclusion of at least one completed cycle of implementation.

Nine frameworks met most or all of the criteria (Table 1). Of these, two were not specific to mental health but were included because of their Canadian relevance. A third was at the conceptual stage only, but was included because it was comprehensive and well known nationally. Print and Web-based documents for each framework were reviewed, with any needed clarification sought by telephone. Follow-up interviews also sought feedback on developmental and implementation challenges, especially related to use of data for decision-making. The major conceptual areas or domains from each framework were recorded, along with the rationale for their inclusion and any recommended or calculated indicators. 


[Table 1]


Table 1 shows the domains assessed in each of the nine initiatives. The most striking feature is the high degree of consistency, with the majority of frameworks sharing more than half the domains. The second is the limited correspondence between these domains and the BSC quadrants. This lack of an easy equivalence led to the development of a matrix (Table 2) representing the BSC quadrants and five mental health domains. Domains were selected based on their near-universality across the reviewed frameworks (Accessibility, Appropriateness and Outcomes) or their particular relevance to Ontario healthcare policy (Participation and System Management). 


[Table 2]
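
To make the quadrant/domain matrix concrete, it can be read as a cross-classification in which every indicator carries two labels, one from each perspective. The following is a minimal sketch in Python, our own illustration rather than anything produced by the study; the indicator names in it are hypothetical placeholders.

    # Minimal sketch of the Table 2 idea: each indicator is tagged with
    # both a BSC quadrant and a mental health domain, so the same set of
    # indicators can be read from either perspective.

    BSC_QUADRANTS = {
        "clinical utilization and outcomes",
        "system integration and change",
        "patient satisfaction",
        "financial performance and condition",
    }

    MH_DOMAINS = {
        "accessibility", "appropriateness", "outcomes",
        "participation", "system management",
    }

    def build_matrix(indicators):
        """Group (name, quadrant, domain) triples into matrix cells."""
        matrix = {}
        for name, quadrant, domain in indicators:
            assert quadrant in BSC_QUADRANTS and domain in MH_DOMAINS
            matrix.setdefault((quadrant, domain), []).append(name)
        return matrix

    # Hypothetical entries, for illustration only.
    cells = build_matrix([
        ("30-day readmission rate", "clinical utilization and outcomes",
         "appropriateness"),
        ("follow-up after discharge", "system integration and change",
         "accessibility"),
    ])
    for (quadrant, domain), names in cells.items():
        print(f"{quadrant} / {domain}: {names}")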

 

Selection of Strategic Objectives

The project mandate was to develop a BSC for individual hospital use. However, the targeted users for the report were the 56 Ontario hospitals that provide psychiatric care in designated mental health beds, and representing the strategic goals of all 56 in a single BSC was not feasible. Fortunately, the province, like many other jurisdictions, has developed a series of policy documents spanning nearly two decades that outline the goals of mental health reform and the roles that different sectors are expected to play (OMH 1988, 1993, 1999). These policies elaborate different aspects of a consistent vision in which the central goals mark a shift from treating symptoms to treating the whole person, from institutional to community-based care and from "silos" of care to integrated and seamless services.

Given Ontario's universal and largely single-payer healthcare system, it seemed appropriate to treat these policies as the mental healthcare system's strategic plan that should strongly influence the practices of its constituent providers and organizations. This decision was supported by senior hospital administrators on the advisory panel, who felt that a provincial-level report card would be a useful complement to individual hospital monitoring efforts. The decision to apply the strategic plan of a larger system to smaller operational units differs from the approach taken in the United Kingdom (Schmidt et al. 2006) and New Zealand (Coop 2006), where the scorecard was driven by the system's own strategy rather than that of a larger entity in which the system was embedded. A strength of our approach is that it creates a multi-level framework that allows the common goals of units (e.g., hospitals) and their larger environment to be considered in tandem with those goals that are unit-specific (Lin et al. 2002).

A review of Ontario's mental health reform policies (OMH 1999) yielded four objectives specifically relevant to inpatient care:

  1. targeted and appropriate use of inpatient services, that is, care delivered in the least restrictive setting, based on need;
  2. a comprehensive continuum of services and supports that are linked and coordinated, allowing individuals to move easily from one part of the system to another;
  3. services based on current evidence about best practices; and
  4. consumer-centred care, that is, tailored to the needs and preferences of the individual to support an improved quality of life.

Indicator Selection

The framework and four strategic goals provided one set of guidelines for indicator selection. Other criteria commonly used to assess performance indicators include scientific soundness, meaningfulness or relevance, feasibility and actionability (Rosenheck and Cicchetti 1998; Hermann and Palmer 2002; Hermann et al. 2004; Larson and Mercer 2004). This project gave particular emphasis to feasibility and relevance because of feedback from other jurisdictions that lack of data and user buy-in limited what could actually be reported or used. Potential indicators were selected using the following criteria:

  • Did the measure reflect one of the four provincial strategic objectives?
  • Could it be calculated (or was it already being calculated) from available data?
  • Did it have a clearly desirable direction or pattern that identified better performance?
  • Was it actionable?
  • Were the BSC quadrants and mental health domains represented in the final indicator set?
  • Was the final indicator set manageable (e.g., relatively short, understandable)?

 

These criteria were applied in a three-part, iterative process to select indicators. First, the project team evaluated the measures used in the reviewed frameworks against the four strategic objectives and available data in provincewide sources such as health insurance claims and hospital discharge abstracts. Indicators not measured by existing provincial data were retained and bookmarked if a suitable data source was expected soon. 
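
The logic of this first screening pass can be sketched as a simple filter. The sketch below is our own schematic, not code from the study, and the field names (objective, data_available, data_expected_soon) and example candidates are assumptions introduced for illustration.

    # Schematic of the first screening step: a candidate must reflect one
    # of the four strategic objectives; if it is computable from existing
    # provincial data it is selected, and if a suitable data source is
    # merely expected soon it is bookmarked rather than discarded.

    from dataclasses import dataclass
    from typing import List, Optional, Tuple

    @dataclass
    class Candidate:
        name: str
        objective: Optional[str]   # which strategic objective, if any
        data_available: bool       # computable from current provincial data
        data_expected_soon: bool   # suitable data source anticipated

    def screen(candidates: List[Candidate]) -> Tuple[List[Candidate], List[Candidate]]:
        selected, bookmarked = [], []
        for c in candidates:
            if c.objective is None:
                continue           # fails the strategic-objective criterion
            if c.data_available:
                selected.append(c)
            elif c.data_expected_soon:
                bookmarked.append(c)
        return selected, bookmarked

    # Hypothetical candidates, for illustration only.
    picks, later = screen([
        Candidate("readmission within 30 days",
                  "targeted and appropriate use", True, False),
        Candidate("consumer-rated quality of life",
                  "consumer-centred care", False, True),
    ])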

Next, the project advisory panel reviewed both existing and bookmarked indicators against three criteria: meaningfulness vis-à-vis provincial strategy, whether there was a desired value or direction, and hospital control over results. Consultation with the panel continued as data became available, this time shifting the evaluation from the theoretical to the actual numbers. At this point, data quality was also considered, as was performance variation across hospitals. Fortunately, the advisory panel, composed of policy makers, planners, hospital administrators and senior management, providers and consumers, has remained largely intact throughout the project (Lin et al. 2002, 2005).

The third step involved end-user feedback. Hospitals were surveyed after they received their individual draft results, and site visits were conducted to elicit feedback and suggestions for improvement. These processes were particularly critical because the use of provincial strategic directions to guide indicator selection created the risk that hospitals would find the results irrelevant or not actionable. Because hospitals were the intended end users (Brown et al. 2004), an important ingredient in the successful adaptation of the BSC was user acceptance.


[Table 3]


Table 3 reports survey results. Hospitals rated each indicator on its relevance to their own strategic goals, whether they were already calculating and using it and the extent to which they had control over the indicator's value. Of the 56 canvassed hospitals, 41 (73%) responded. Overall, there was solid endorsement of the relevance of the indicators ("very relevant" judgments averaged 69% across all indicators and ranged between 39% and 91%). There was a similar finding for the numbers of hospitals calculating these or similar indicators at least yearly (average 66%, range 20%-96%), but there was also a drop-off in the proportion judging that the indicators were completely (average 38%, range 10%-74%) or somewhat (average 54%, range 26%-75%) under their control. These results influenced subsequent indicator refinement. However, their primary value may lie in the accompanying discussions about local factors that might affect hospital performance, processes or structures that might be changed, and the hospital and governing bodies that should be at the accountability and quality improvement tables.
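
As a simple worked illustration of how the summary figures above are derived, each per-indicator percentage is averaged across indicators, and the minimum and maximum define the range. The percentages below are invented, not the study's data.

    # Invented per-indicator "very relevant" percentages; the reported
    # statistics are the cross-indicator mean and the min-max range.

    def summarize(pct_by_indicator):
        values = list(pct_by_indicator.values())
        return sum(values) / len(values), min(values), max(values)

    very_relevant = {"indicator A": 39.0, "indicator B": 85.0, "indicator C": 91.0}
    mean, low, high = summarize(very_relevant)
    print(f"average {mean:.0f}%, range {low:.0f}%-{high:.0f}%")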

Resulting Framework and Indicators

The result of the feasibility study has been an endorsement of the BSC framework for mental health inpatient reporting in Ontario. A report structure and indicators were proposed, and two cycles of reporting (first provincial results only, then hospital-level reporting) have since occurred. The most recent report includes 29 indicators. These are shown in Table 4, organized by the quadrant/domain matrix and labelled with the strategic objective that they reflect. Table 4 represents the current state of a process that began with 40 recommended indicators (Lin et al. 2002); these were reduced to 31 three years later (Lin et al. 2005) and were subsequently decreased again. Reasons for removing indicators include very small numbers and hence little variation (e.g., rate of formal complaints), unreliable measures (e.g., percentage of emergency room discharges admitted to "no available inpatient bed") and no clearly desirable direction or pattern in terms of quality of care (e.g., average length of stay). The expectation is that the process of indicator selection and refinement will continue as new data sources become available, strategic objectives are accomplished or provincial directions change.


[Table 4]

 

Discussion and Conclusions

Zelman et al. (2003: 12) point out the necessity of modifying the BSC to fit "industry and organizational realities." Our project found three critical points where adaptation was required. Like other implementations of the BSC in healthcare, we encountered concepts that did not map easily onto the BSC. Solutions reported in the literature have included modifying the scope of the original quadrants (Baker and Pink 1995), adding new quadrants (e.g., Santiago 1999) or changing the expected sequence or causal relationship among them (Rimar 2000). Because of our interest in maintaining consistency with the other HRRC report cards, yet remaining on familiar conceptual territory with mental healthcare providers, we created a matrix that would accommodate both BSC and mental health and addictions perspectives rather than force-fit one onto the other. 

Our second critical point was deciding which strategic goals should drive indicator selection. Zelman et al. (2003) distinguish between scorecards for healthcare organizations and for healthcare sectors. In both cases, the strategy of the relevant unit of analysis (i.e., the organization or the sector) drives indicator selection. However, the organizational-level scorecard is internally applied using specific indicators and a focus on quality improvement. The healthcare sector scorecard is externally applied using general system indicators with a focus on public accountability. Our use of provincial policy as a systemwide "strategic plan" is similar to the latter, with an important difference. The role of inpatient care as one point on the care continuum implies a strong concordance between system-level and hospital-level strategic objectives. It also implies that hospital performance is at least partially contingent on strong and coordinated performances by other sectors. These implications are consistent with the hospitals' evaluations of the relevance of our chosen indicators to their own strategic goals, as well as with their perceived degree of control over the indicator values. Under these circumstances, the distinction between internal and external is not always straightforward, and perhaps some concept of shared quality improvement or mutual accountability may be appropriate.

The third critical point was selecting the indicators. Our iterative use of information from existing frameworks, available data and feedback from our advisory panel and the end users allowed us to apply multiple criteria in a more complex way than using a serial set of filters. This process has also resulted in an ongoing relationship with Ontario hospitals that should assist in future performance monitoring using the BSC. 

There is still insufficient information from the field to allow us to judge which of our modifications may be useful to healthcare in general, which are specific to mental health and addictions and which are even more specific to Ontario. Other reports indicate that the divide between managerial and provider perspectives pervades many healthcare sectors (Schiff 2000; Horwitz 2005) and that the choices of what should be monitored, and how, are ubiquitous challenges (Campbell et al. 2003; Hermann and Palmer 2002). As more monitoring initiatives are reported, a broader range of potential solutions will become available. The variations in their purpose, strengths and limitations will be useful information for those developing and implementing new performance monitoring systems.



About the Author(s)

Elizabeth Lin, PhD
Assistant Professor, Department of Psychiatry
University of Toronto
Research Scientist, Health Systems Research and Consulting Unit
Centre for Addiction and Mental Health
Toronto, ON

Janet Durbin, PhD
Assistant Professor, Department of Psychiatry
University of Toronto
Research Scientist, Health Systems Research and Consulting Unit
Centre for Addiction and Mental Health
Toronto, ON

Correspondence may be directed to: Elizabeth Lin, Research Scientist, Health Systems Research and Consulting Unit, Centre for Addiction and Mental Health, 33 Russell Street, T313, Toronto, ON M5S 2S1; tel.: 416-535-8501 x4102; fax: 416-979-4703; e-mail: Elizabeth_Lin@camh.net.

Acknowledgment

This study was funded by a joint initiative of the Government of Ontario and the Ontario Hospital Association, 2001-2007. We would like to acknowledge the helpful comments of Carol Strike, PhD, and the assistance of Natalia Zaslavska, MPH.

References

Auger, N. and D.A. Roy. 2004. "The Balanced Scorecard: A Tool for Health Policy Decision-Making." Canadian Journal of Public Health 95(3): 233-34.

Baker, G.R. and G.H. Pink. 1995. "A Balanced Scorecard for Canadian Hospitals." Healthcare Management Forum 8(4): 7-13.

Brown, A.D., M. Alikhan, G.M. Anderson, G.R. Baker, R. Croxford, I. Daniel, P. Lindsay, F. Markel, I. McKillop, C. Moty, G.H. Pink, M.J. Schull and D. Tregunno. 2004. Hospital Report 2003: Emergency Department Care. Joint initiative of the Ontario Hospital Association and the Government of Ontario. Toronto: Hospital Report Research Collaborative, University of Toronto.

Campbell, S.M., J. Braspenning, A. Hutchinson and M.N. Marshall. 2003. "Research Methods Used in Developing and Applying Quality Indicators in Primary Care." British Medical Journal 326(7393): 816-19.

Canadian Council on Health Services Accreditation (CCHSA). 2001. Indicators and the AIM Accreditation Program. Ottawa: Author. 

Canadian Institute for Health Information (CIHI). 2001. Mental Health and Addiction Services Phase One Indicators, Data Dictionary, and Conceptual Framework. Ottawa: Author. 

Center for Mental Health Policy and Services Research. 2001 (May). PERMES, Georgia's Performance Measurement and Evaluation System: 2000-2001 Performance Profile HMRSA Statewide Summary. Philadelphia: University of Pennsylvania.

Coop, C.F. 2006. "Balancing the Balanced Scorecard for a New Zealand Mental Health Service." Australian Health Review 30(2): 174-80.

Eisen, S.V., M. Wilcox, H.S. Leff, E. Schaefer and M.A. Culhane. 1999. "Assessing Behavioral Health Outcomes in Outpatient Programs: Reliability and Validity of the BASIS-32." Journal of Behavioral Health Services and Research 26: 5-17.

Ganju, V. 2006. "Mental Health Quality and Accountability: The Role of Evidence-Based Practices and Performance Measurement." Administration and Policy in Mental Health and Mental Health Services Research 33(6): 659-65. 

Ghinassi, F. 2000. "Pittsburgh Systems Benchmarking Plan Is Performance 'Dashboard' for Providers." Managed Behavioral Health News 6: 1-3. 

Griffith, J.R., J.A. Alexander and R.C. Jelinek. 2002. "Measuring Comparative Hospital Performance." Journal of Healthcare Management 47(1): 41-57.

Hermann, R.C. and H.R. Palmer. 2002. "Common Ground: A Framework for Selecting Core Quality Measures for Mental Health and Substance Abuse Care." Psychiatric Services 53: 281-87. 

Hermann, R.C., H. Palmer, S. Leff, M. Shwartz, S. Provost, J. Chan, W.T. Chiu and G. Lagodmos. 2004. "Achieving Consensus across Diverse Stakeholders on Quality Measures for Mental Healthcare." Medical Care 42(12): 1246-53.

Horwitz, J.R. 2005. "Making Profits and Providing Care: Comparing Nonprofit, For-Profit, and Government Hospitals." Health Affairs 24(3): 790-801.

Hospital Report Research Collaborative (HRRC). 2007. Retrieved January 19, 2008. <http://www.hospitalreport.ca>.

Inamdar, N., R.S. Kaplan and M. Bower. 2002. "Applying the Balanced Scorecard in Healthcare Provider Organizations." Journal of Healthcare Management 47(3): 179-95.

Kaplan, R.S. and D.P. Norton. 1992. "The Balanced Scorecard - Measures that Drive Performance." Harvard Business Review 70(1): 71-79.

Kaplan, R.S. and D.P. Norton. 1996. "Why Does Business Need a Balanced Scorecard?" In The Balanced Scorecard: Translating Strategy into Action (pp. 21-41). Boston: Harvard Business School Press.

Larson, C. and A. Mercer. 2004. "Global Health Indicators: An Overview." Canadian Medical Association Journal 171(10): 1199-1200.

Leginski, W.A., C. Croze, J. Driggers, S. Dumpman, D. Geertsen, E. Kamis-Gould, M.J. Namerow, R.E. Patton, N.Z. Wilson and C.R. Wurster. 1989. Data Standards for Mental Health Decision Support Systems. Series FN No. 10. DHHS Pub. No. (ADM) 89-1589. Washington, DC: National Institute of Mental Health.

Lin, E., N. Degendorfer, J. Durbin, P. Prendergast and P. Goering. 2002. Hospital Report 2001: Mental Health. Joint initiative of the Ontario Hospital Association and the Government of Ontario. Toronto: Hospital Report Research Collaborative, University of Toronto. 

Lin, E., J. Durbin, C. Koegl, M. Murray, T. Tucker, I. Daniel, F. Markel, L. McGillis Hall, I. McKillop, G. Pink, C. Layne, P. Prendergast and P. Goering. 2005. Hospital Report 2004: Mental Health. Joint initiative of the Ontario Hospital Association and the Government of Ontario. Toronto: Hospital Report Research Collaborative, University of Toronto.

Lutterman, T., V. Ganju, L. Schacht, R. Shaw, K. Higgins, R. Bottger, M. Brunk, J.R. Koch, N. Callahan, C. Colton, D. Geertsen, J. Hall, D. Kupfer, J. Letourneau, J. McGrew, S. Mehta, J. Pandiani, B. Phelan, M. Smith, S. Onken, D.C. Simpson, A.D. Rock, J. Wackwitz, M. Danforth, O. Gonzalez and N. Thomas. 2003. "Sixteen-State Study on Mental Health Performance Measures." In R.W. Manderscheid and M.J. Henderson, eds., Mental Health, United States, 2002 (Chapter 6). DHHS Publication No. SMA04-3938. Rockville, MD: Center for Mental Health Services, Substance Abuse and Mental Health Services Administration.

McEwan, K. and E. Goldner. 2001. Accountability and Performance Indicators for Mental Health Services and Supports: A Resource Kit. Ottawa: Health Canada. 

Mental Health Statistics Improvement Program (MHSIP). 2007. MHSIP Online. Retrieved January 19, 2008. <http://www.mhsip.org>.

Mental Health Statistics Improvement Program (MHSIP) Task Force. 1996. Consumer-Oriented Mental Health Report Card. Rockville, MD: Center for Mental Health Services, Substance Abuse and Mental Health Services Administration, US Department of Health and Human Services.

Meyer, M.W. 2002. Rethinking Performance Measurement: Beyond the Balanced Scorecard. Cambridge, UK: Cambridge University Press. 

Ontario Ministry of Health (OMH). 1988. Building Community Support for People (Graham Report). Toronto: Government of Ontario.

Ontario Ministry of Health (OMH). 1993. Putting People First: The Reform of Mental Health Services in Ontario. Toronto: Government of Ontario.

Ontario Ministry of Health (OMH). 1999. Making It Happen: Implementation Plan for Mental Health Reform. Toronto: Government of Ontario.

Pink, G.H., I. McKillop, E.G. Schraa, C. Preyra, C. Montgomery and G.R. Baker. 2001. "Creating a Balanced Scorecard for a Hospital System." Journal of Health Care Finance 27(3): 1-20.

Provincial Performance Monitoring Reference Group. 2000 (May). Assessing the Performance of the Mental Health System: A Provincial Report 1997/1998. Victoria, BC: British Columbia Ministry of Health. 

Rimar, S. 2000. "Strategic Planning and the Balanced Scorecard for Faculty Practice Plans." Academic Medicine 75(12): 1186-88.

Rosenheck, R. and D.A. Cicchetti. 1998. "Mental Health Program Report Card: A Multidimensional Approach to Performance Monitoring in Public Sector Programs." Community Mental Health Journal 34: 85-109.

Rosenheck, R.A. and D. DiLella. 2000. Department of Veterans Affairs National Mental Health Program Performance Monitoring System: Fiscal Year 1999 Report. VA Connecticut Healthcare System.

Santiago, J.M. 1999. "Use of the Balanced Scorecard to Improve the Quality of Behavioral Health Care." Psychiatric Services 50: 1571-76.

Schiff, G. 2000. "Fatal Distraction: Finance vs. Vigilance in Our Nation's Hospitals." Journal of General Internal Medicine 15(4): 269-70.

Schmidt, S., I. Bateman, J. Breinlinger-O'Reilly and P. Smith. 2006. "A Management Approach That Drives Actions Strategically: Balanced Scorecard in a Mental Health Trust Case Study." International Journal of Health Care Quality Assurance 19(2): 119-35.

Smith, M.E. and V. Ganju. 2006. "The MHSIP Mental Health Quality Report: The Next Generation of Performance Measures." In R.W. Manderschied and J.T. Berry, eds., Mental Health, United States, 2004 (Chapter 9). DHHS Pub. No. (SMA)-06-4195. Rockville, MD: Center for Mental Health Services, Substance Abuse and Mental Health Services Administration.

ten Asbroek, A.H., O.A. Arah, J. Geelhoed, T. Custers, D.M. Delnoij and N.S. Klazinga. 2004. "Developing a National Performance Indicator Framework for the Dutch Health System." International Journal for Quality in Health Care 16(Suppl. 1): i65-71.

Wolfersteig, J. and S. Dunham. 1998. "Performance Improvement: A Multidimensional Model." International Journal for Quality in Health Care 10(4): 351-54.

Yang, M.C. and Y.C. Tung. 2006. "Using Path Analysis to Examine Causal Relationships among Balanced Scorecard Performance Indicators for General Hospitals: The Case of a Public Hospital System in Taiwan." Health Care Management Review 31(4): 280-88.

Zelman, W.N., G.H. Pink and C.B. Matthias. 2003. "Use of the Balanced Scorecard in Health Care." Journal of Health Care Finance 29(4): 1-16.
