Objective: To develop a measure of cancer services integration (CSI) that can inform clinical and administrative decision-makers in their efforts to monitor and improve cancer system performance.
Methods: We employed a systematic approach to measurement development, including review of existing cancer/health services integration measures, key-informant interviews and focus groups with cancer system leaders. The research team constructed a Web-based survey that was field- and pilot-tested, refined and then formally conducted on a sample of cancer care providers and administrators in Ontario, Canada. We then conducted exploratory factor analysis to identify key dimensions of CSI.
Results: A total of 1,769 physicians, other clinicians and administrators participated in the survey, responding to a 67-item questionnaire. The exploratory factor analysis identified 12 factors that were linked to three broader dimensions: clinical, functional and vertical system integration.
Conclusions: The CSI Survey provides important insights on a range of typically unmeasured aspects of the coordination and integration of cancer services, representing a new tool to inform performance improvement efforts.
For more than a decade, health services researchers have focused on the integration of health services as a means to improve performance. Measures have been developed that assess both provider- and patient-derived aspects of the coordination and continuity of health services within and across sectors (Gillies et al. 1993; Burns et al. 2001; Alexander et al. 2001; Fairchild et al. 2002; Ware et al. 2003; Durbin et al. 2004; Dolovich et al. 2004). Cancer systems, representing a microcosm of broader healthcare systems (including health promotion, cancer prevention/screening, surgical interventions, radiation and systemic therapies, supportive and palliative care), present a particularly challenging context for service integration (Sullivan et al. 2008). Cancer patients are often cared for by multiple providers (e.g., surgeons, medical oncologists, radiation oncologists, nurses, radiation therapists, social workers, community healthcare providers, etc.) in multiple care settings (e.g., at specialized/comprehensive cancer centres, teaching and community hospitals, primary care settings and/or at home). In Ontario, Canada, a 2001 review of cancer services highlighted their fragmented nature and recommended "ways to improve the integration of cancer services at the local and regional levels, the quality of patient care, and the productivity and efficiency in the cancer service component of the Ontario health system" (Cancer Services Implementation Committee 2001). While this review led to major reorganization of the Ontario cancer system (Sullivan et al. 2004; Dobrow et al. 2006), a specific measure of cancer services integration (CSI) to guide restructuring and monitor performance improvement did not exist. This paper reports on efforts to develop a measure of integration specific to cancer services as part of a broader undertaking to monitor and improve cancer system performance in Ontario.
We employed a systematic approach to survey development, including a scan for existing models of "integrated" cancer services, a literature review of concepts and measures of health services integration, key-informant interviews and focus groups with cancer system clinicians and administrators. These were followed by item generation, testing and reduction, including pilot surveys and feedback interviews with cancer system decision-makers, before the launch of the CSI Survey in February 2007.
Scan for models of integrated cancer services
Through the mid- to late 1990s, the Veterans Health Administration in the United States went through a period of major restructuring, including the realignment of its cancer services (Wilson and Kizer 1998). Decision-making was decentralized and a system of integrated service networks was developed. This included primary, secondary and comprehensive cancer centres, local cancer registries, a research partnership with the National Cancer Institute and a standard electronic data infrastructure that supported a program of performance accountability and quality improvement (Wilson and Kizer 1998). Similarly, England and Wales recently went through a process of redesigning their cancer services (Department of Health 2000; Griffith and Turner 2004). Their ambitious reforms coincided with broader reforms in the National Health Service (Department of Health 1997, 1998), with a comprehensive cancer plan promoting collaborative partnerships and focused on improving the patient experience (Department of Health 2000). In Canada, British Columbia has developed an integrated cancer system based on central program/network infrastructure, a research centre, a comprehensive cancer registry and a network of service organizations and practice leaders to drive development of standardized processes of care (Carlow 2000).
While these illustrations of evolving cancer systems in different jurisdictions help to characterize important elements of an integrated cancer system, none provided specific definitions of, or tools for measuring, CSI. To augment the jurisdictional scan, we conducted a broader literature review.
Review of measures of cancer/health services integration
A search focusing specifically on published measures of CSI did not yield relevant findings. This was consistent with the findings of two recent reports, one a synthesis on health systems integration research (Suter et al. 2007) and another a systematic review of health system integration measures (Raina et al. 2006). Both identified a number of general and disease-/condition-specific measures of integration; however, none were specific to cancer services. Therefore, to inform our work, we first examined non-cancer measures and drew on the evolving body of research on health services integration to provide a conceptual basis for development of a measure of CSI.
Some of the best-known work comes from the Health Systems Integration Study (Shortell et al. 2000), which characterized health system performance as an output of integration, linking a system's vision, culture, strategy and leadership with three main dimensions of integration (Gillies et al. 1993):
Functional integration is defined as the extent to which key support functions and activities (such as financial management, strategic planning, human resource management, and information management) are coordinated across operating units of a system.
Physician-system integration is defined as the extent to which physicians are economically linked to a system, use its facilities and services, and actively participate in its planning, management and governance.
Clinical integration is defined as the extent to which patient care services are coordinated across the various functions, activities and operating units of a system.
In their extensive review of this work, Shortell and colleagues (2000) suggested that functional integration was most important for financial management and operating policies, information systems, resource allocation, quality improvement and strategic planning, and less important for administrative support, human resources and marketing. Physician-system integration reflected physician remuneration, incentives, interdisciplinary care and accountability models, with physicians under pressure to contain costs, shift focus from individual to population levels and provide public accountability for performance. Shortell and colleagues (2000) described three levels of clinical integration: a corporate level, where structural, systemic and cultural factors influence clinical integration; an intermediate/managerial level, where economies of scope or scale influence the standardization or duplication of clinical services; and a technical level that reflects the use of practice guidelines or protocols to influence care delivery. These authors suggested that clinical integration is the most challenging and important component of an organized delivery system.
Leatt and colleagues (2000) described characteristics of integrated service delivery that reflect health system structures in Canada. They emphasized focus on the individual patient experience, starting with primary healthcare, sharing and utilizing information, creating virtual coordination networks at the local level, revising funding methods and developing performance monitoring capacity (Leatt et al. 2000). In a review of 41 studies, Leatt (2002) recommended that integrated service delivery should be characterized along three key dimensions: clinical, information and vertical integration. Clinical integration was linked to disease management programs, reflecting use of clinical protocols, pathways, guidelines and multidisciplinary teams, along with participatory structures and policies, and communication strategies to ensure stakeholder acceptance (Leatt 2002). Information integration focused on information management and technology that allows timely information sharing across traditional organizational and professional boundaries for all stakeholders (Leatt 2002). Vertical integration was linked to the patient experience, described as interorganizational arrangements across the continuum of care that allow improved coordination of patient care (Leatt 2002).
Leatt's patient-centred focus on integration differs somewhat from other views (Conrad and Dowling 1990; Hernandez 2000; Budetti et al. 2002; Burns and Pauly 2002), raising a fundamental conceptual question regarding the measurement of integration: Should measures of integration be derived from provider or patient perceptions? Interest in continuity of care dates back more than 30 years (Mindlin and Densen 1969; Bass and Windle 1972), yielding diverse patient-derived conceptions of what it is and how it can be measured (Reid et al. 2002; Freeman et al. 2001). In a multidisciplinary review, Haggerty and colleagues (2003) suggested that the concept of continuity of care should capture aspects of informational continuity (use of information on past events and personal circumstances to make current care appropriate for each individual), relational continuity (ongoing therapeutic relationship between a patient and one or more providers) and management continuity (a consistent, coherent approach to management of a health condition that is responsive to a patient's changing needs). More fundamentally, they suggested that "[c]ontinuity is not an attribute of providers or organisations ... [it] is how individual patients experience integration of services and coordination" (Haggerty et al. 2003). Conrad (1993) cautioned, however, that focus ultimately needs to be at the level of the system:
[t]he essence of a system is the ability to aggregate up individual level care coordination and clinical processes into a system level capacity to plan, deliver, monitor, and adjust the structures and strategies for coordinating the care of populations over time. The coordination of care for individual patients is a necessary but not sufficient condition to realizing system level clinical integration.
Despite these apparent contradictions, both provider-derived conceptions of health services integration and patient-derived conceptions of continuity of care are related. With a survey of ambulatory oncology patient satisfaction already underway in Ontario, which included questions on continuity and coordination of care, our intent was to develop a provider-derived measure of CSI that would complement data and insights drawn from the patient-derived measure.
Interviews, focus groups and survey-item generation
We next looked to local cancer system leaders to examine what aspects of existing health services integration measures were relevant to cancer services. Interviews were conducted with clinical program leaders (i.e., systemic therapy, radiation oncology, surgical oncology, nursing, health human resources, clinical guideline development, prevention/screening, palliative care, supportive care, pathology and social work) from Ontario's cancer system. Each informant was asked to describe key challenges or barriers to the integration of cancer services, and to formulate three potential survey items. Focus groups were conducted with members of Cancer Care Ontario's Clinical Council (including clinical program leaders) and Provincial Leadership Council (including regional administrative heads for each Regional Cancer Program and Cancer Care Ontario's executive team). In both cases, council members were asked to identify examples of effective and ineffective integration in the Ontario cancer system and desired features reflecting integrated cancer services.
Survey items were generated iteratively, initially drawing on the 54-item survey instrument produced through the Health Systems Integration Study (Gillies et al. 1993) and supplemented by items suggested by key informants. After field testing and a pilot survey, the survey instrument was further refined, resulting in a 67-item questionnaire (13 demographic and 54 Likert scale items) with specific versions of each item tailored for the three main participant groups (i.e., physicians, other clinicians, administrators) to improve relevance and comprehension (item descriptions provided in Appendix).
Healthcare providers and administrators who had regular opportunities to interact with the cancer system were the primary focus of the survey (Table 1 describes the target population). Given cost considerations, an electronic survey was selected as the distribution mode, allowing a much larger sample of cancer care providers and administrators to be surveyed than would have been possible with more traditional paper- or telephone-based surveys. The electronic survey allowed real-time data collection and customized survey design, including use of conditional (skip/jump) logic to ensure that respondents were asked questions relevant to their position and region. However, the target population did require Internet or e-mail access at work.
|Table 1. Target population for the CSI Survey|
|Physicians|Other clinicians|Administrators|
|Medical Oncologist|Pharmacist|Corporate Leadership (e.g., CEO, Executive Director)|
|Radiation Oncologist|Systemic Therapy Clinic Nurse|Cancer Services|
|Paediatric Oncologist|Chemotherapy Nurse|Case Management|
|Radiologist|Inpatient Oncology Nurse|Client/Patient Services|
|Surgical Oncologist|Radiation Therapy Nurse|Clinical Programs|
|Surgeon - General|Advanced Practice Nurse|Finance|
|Surgeon - Gynaecologist|Clinical Trials Nurse|Human Resources|
|Surgeon - Urologist|OBSP Nurse|Information Technology/Management|
|Surgeon - Thoracic|Social Worker|Nursing|
|Surgeon - Otolaryngologist|Dietician|Prevention/Screening|
|Respirologist|Community Care Planners| |
|Palliative Care Physician| | |
The sampling frame was constructed from a variety of sources, including the Canadian Medical Directory, Cancer Care Ontario's e-mail directories and direct contact with provider organizations, including hospitals and community care access centres (CCACs). In addition to the inclusion of all 14 CCACs in Ontario, 63 Ontario hospitals were selected based on the following criteria:
1. all Regional Cancer Program host hospitals
2. all teaching hospitals
3. all children's hospitals
4. all Cancer Surgery Agreement (CSA)/Systemic Therapy Agreement (STA) hospitals1
5. all hospitals performing over 100 cancer surgeries per year (2005/06)2
6. a minimum of three hospitals per geographically defined Local Health Integration Network (where criteria 1 through 5 did not provide this, up to two additional hospitals were selected in order of highest cancer surgery volume).
Because there were only minimal cost implications of expanding the sample size when using the electronic survey, the sample included the entire target population of identifiable cancer care providers and administrators in Ontario that had Internet/e-mail access at work.
The survey was launched on February 26, 2007 with responses accepted at any time over a three-week period. An e-mail introduction to the survey was sent to all study subjects from the appropriate Regional Cancer Program leader. This mailing was followed by an automated e-mail invitation and three automated reminder e-mails, each with a link to the Web-based survey and co-signed by the appropriate Regional Cancer Program leader and two members of Cancer Care Ontario's executive team. These e-mail invitations described the study, provided contact details for further information and offered an explicit option for the study subject to decline participation and be removed from the reminder list. All respondents were offered a $5 electronic gift certificate for participating. Ultimately, the survey was received by 5,366 cancer care providers and administrators throughout Ontario.
Data were captured automatically through a Surveymonkey.com database and downloaded for analysis using SPSS (version 15). An exploratory factor analysis was the main analytical approach taken to guide identification of CSI dimensions (Harman 1976; Rummel 1970). The factor structure of the full 54-item scale was assessed through unweighted least squares analysis with varimax rotation (Jöreskog 1977). Resultant factors were then interpreted by examining item content and pattern of coefficients.
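The analysis itself was carried out in SPSS. Purely as an illustrative sketch (not the authors' code), the core steps of this approach — inspecting eigenvalues of the item correlation matrix and applying an orthogonal varimax rotation to extracted loadings — can be expressed in Python/NumPy. The varimax routine below is a standard textbook algorithm; the synthetic data, and the use of principal-component-style loadings in place of the paper's unweighted least squares extraction, are simplifying assumptions for demonstration.

```python
import numpy as np

def varimax(loadings, gamma=1.0, max_iter=100, tol=1e-6):
    """Orthogonal varimax rotation of an (items x factors) loading matrix."""
    p, k = loadings.shape
    R = np.eye(k)
    d = 0.0
    for _ in range(max_iter):
        L = loadings @ R
        # SVD of the gradient of the varimax criterion
        u, s, vt = np.linalg.svd(
            loadings.T @ (L ** 3 - (gamma / p) * L @ np.diag((L ** 2).sum(axis=0)))
        )
        R = u @ vt
        d_old, d = d, s.sum()
        if d_old != 0 and d / d_old < 1 + tol:
            break
    return loadings @ R

# Synthetic stand-in for the survey data (illustrative only):
# 722 complete responses to 10 items
rng = np.random.default_rng(0)
data = rng.normal(size=(722, 10))
corr = np.corrcoef(data, rowvar=False)

# Eigenvalues in descending order drive the scree-plot inspection
eigvals = np.sort(np.linalg.eigvalsh(corr))[::-1]

# Unrotated loadings for the leading factors, then varimax rotation
vals, vecs = np.linalg.eigh(corr)
order = np.argsort(vals)[::-1][:3]
unrotated = vecs[:, order] * np.sqrt(vals[order])
rotated = varimax(unrotated)
```

Because varimax is an orthogonal rotation, each item's communality (the row sum of squared loadings) is unchanged by the rotation; only the distribution of loading across factors changes, which is what makes the rotated factors easier to interpret.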
Ethics approval for the study was granted by the University of Toronto's Research Ethics Board.
Participation rates and participant characteristics
Of the 5,366 e-mail invitations sent to valid e-mail addresses, there were 2,031 responses (i.e., the survey was accessed via the Web link). For the purposes of this study, we defined "participation" as those respondents who completed question 10, which required identification of the Regional Cancer Program most relevant to the respondent's clinical or professional work. According to this criterion, there were 1,769 participants, resulting in a participation rate of 33%. Provincially, 47% of administrators participated in the survey, while participation rates for physicians (25%) and other clinicians (32%) were considerably lower. A detailed analysis of participation rates has been reported elsewhere (Dobrow et al. 2008).
Of the 1,769 participants, 28% were physicians, 35% were other clinicians and 37% were administrators, with the majority female (69%) between the ages of 40 and 60 (71%) (Table 2). Participants represented all 13 Regional Cancer Programs in Ontario, identifying teaching hospitals (47%), community hospitals (37%), CCACs (13%) or other locations (3%) as their primary place of work. A Regional Cancer Program host hospital (teaching or community) was the main location of work for 50% of participants, suggesting that participants provided good representation for both cancer centre and non-cancer centre based individuals.
|Table 2. Participant characteristics (n=1,769)|
|Characteristic|Category|n|%|
|Regional Cancer Program (RCP)*†|RCP A|79|4.5%|
|Location of work|Teaching Hospital|835|47.2%|
| |Community Hospital (100 or more beds)|613|34.7%|
| |Community Hospital (less than 100 beds)|43|2.4%|
| |Community Care Access Centre|230|13.0%|
| |Other (e.g., Private Practice Clinic, Public Health Unit)|39|2.2%|
|Distance from main RCP in region|At main RCP hospital|878|49.6%|
| |Less than 10 km but not at main RCP hospital|285|16.1%|
| |Between 11 and 20 km from main RCP hospital|132|7.5%|
| |Between 21 and 100 km from main RCP hospital|255|14.4%|
| |More than 100 km from main RCP hospital|195|11.0%|
* Answer to item required.
† Sample size for each RCP varied.
It was possible to compare a few characteristics of the survey participants (n=1,769) and the full sample (N=5,366), with no major differences detected. Comparing regional response, 11 of the 13 regions had participation rates within 1% (with all 13 within 3%) of the regional breakdown for the full sample. Compared with the full sample, participants included relatively more administrators and fewer physicians.
Item response distribution and missing data
For the 54 Likert scale items, a five-point scale was used ("strongly agree" to "strongly disagree"), along with a "not applicable" option. Missing responses were relatively low for all items, with non-response rates not higher than 10% for any one item and combined missing and "not applicable" response rates not higher than 20% for any one item. Frequency distributions indicated a full range of responses for all items, with no floor or ceiling effects noted. Therefore, all 54 items were retained for further analysis.
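The item-retention rule described above (non-response no higher than 10% for any item, and combined missing plus "not applicable" responses no higher than 20%) can be sketched as a simple screening computation. The thresholds come from the paper; the response matrix below is simulated, and the coding of "not applicable" as 0 and missing as NaN is our own illustrative convention.

```python
import numpy as np

rng = np.random.default_rng(0)
n_resp, n_items = 1769, 54

# Simulated responses: 1-5 Likert; NaN = missing; 0 = "not applicable"
# (illustrative codes, not the survey's actual export format)
responses = rng.integers(1, 6, size=(n_resp, n_items)).astype(float)
mask_missing = rng.random((n_resp, n_items)) < 0.04   # ~4% missing
mask_na = rng.random((n_resp, n_items)) < 0.05        # ~5% "not applicable"
responses[mask_missing] = np.nan
responses[mask_na & ~mask_missing] = 0

# Per-item rates
missing_rate = np.isnan(responses).mean(axis=0)
combined_rate = (np.isnan(responses) | (responses == 0)).mean(axis=0)

# Apply the paper's retention rule: <=10% missing and <=20% missing + NA
retained = (missing_rate <= 0.10) & (combined_rate <= 0.20)
print(f"{retained.sum()} of {n_items} items retained")
```

With item-level rates well under the thresholds, as reported in the paper, all items survive this screen, which is why the full 54-item set proceeded to the factor analysis.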
Exploratory factor analysis
Because the EFA required complete responses (i.e., no missing or "not applicable" responses) across all 54 items, it was ultimately based on 722 valid responses. Following examination of eigenvalues, the scree plot and factor loadings, a 12-factor (36-item) solution was determined to provide the best fit. Eigenvalues for the 12 factors ranged from 11.6 to 1.1, accounting for 51% of the common variance. While factor loadings above 0.32 can be considered meaningful (Tabachnick and Fidell 2007), 49 of the 54 items had loadings greater than 0.32, complicating interpretation of the resultant factors. Therefore, a higher threshold of 0.5 was used to allow clearer interpretation of the resultant factors (Table 3). Internal consistency reliability for each of the resultant factors was estimated using Cronbach's coefficient alpha, with values ranging from 0.74 to 0.90, all within acceptable limits (Table 3).
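Cronbach's coefficient alpha, used here to estimate each factor's internal consistency, has a simple closed form: for k items, alpha = (k/(k-1)) x (1 - sum of item variances / variance of the summed scale). As an illustrative sketch (the data below are simulated, not the survey's), it can be computed as follows:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x k_items) array."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Illustrative: three Likert items driven by one latent trait
# yield a high alpha, as for the well-formed factors in Table 3
rng = np.random.default_rng(1)
latent = rng.normal(size=1000)
items = np.clip(
    np.round(3 + latent[:, None] + 0.5 * rng.normal(size=(1000, 3))), 1, 5
)
alpha = cronbach_alpha(items)
```

Values of alpha near 1 indicate that the items in a factor move together across respondents; the 0.74 to 0.90 range reported above falls within conventionally acceptable limits.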
Various methods of imputation were performed, including substitution and stochastic regression imputation, to assess the impact of missing data on the resultant factor structure (Little and Rubin 2002). This included recoding "not applicable" responses to "neither agree nor disagree" or extreme values (e.g., "strongly agree" or "strongly disagree") and using regression residuals to impute values for missing data. This approach allowed data from all 1,769 responses to be analyzed. This sensitivity analysis showed that while imputing extreme values did, as expected, produce inconsistent factor structures, recoding of "not applicable" to "neither agree nor disagree" and stochastic substitution using regression residuals resulted in factor structures highly consistent with the initial approach taken.
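The stochastic regression imputation described above — predicting a missing value from the other items and adding a randomly drawn residual so that imputed values retain realistic variability — can be sketched as follows. This is a minimal single-column illustration with simulated data, not the authors' procedure; the function name and setup are our own assumptions.

```python
import numpy as np

def stochastic_regression_impute(X, target_col, rng):
    """Impute missing values in one column by regressing it on the other
    (complete) columns, adding a randomly drawn observed residual to each
    prediction to preserve variability."""
    X = X.copy()
    obs = ~np.isnan(X[:, target_col])
    predictors = np.delete(X, target_col, axis=1)

    # Fit ordinary least squares on the observed rows
    A_obs = np.column_stack([np.ones(obs.sum()), predictors[obs]])
    beta, *_ = np.linalg.lstsq(A_obs, X[obs, target_col], rcond=None)
    residuals = X[obs, target_col] - A_obs @ beta

    # Predict the missing rows and add a resampled residual (the "stochastic" step)
    miss = ~obs
    A_miss = np.column_stack([np.ones(miss.sum()), predictors[miss]])
    X[miss, target_col] = A_miss @ beta + rng.choice(residuals, size=miss.sum())
    return X

rng = np.random.default_rng(2)
X = rng.normal(size=(500, 4))
X[:, 0] = X[:, 1] + 0.3 * rng.normal(size=500)   # column 0 depends on column 1
X[rng.random(500) < 0.1, 0] = np.nan             # ~10% missing in column 0
X_imp = stochastic_regression_impute(X, 0, rng)
```

Unlike deterministic regression imputation, the resampled residual keeps the imputed column's variance from being artificially deflated, which is one reason this method produced factor structures consistent with the complete-case analysis.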
Overall, the EFA produced a consistent factor structure, with the interpretation of the 36 items loading to one of the 12 factors relatively clear and each of the inferred themes addressing important aspects of CSI (Table 3).
Dimensions of CSI
Our intent was to develop a measure of CSI that could provide insights on typically unmeasured aspects of the coordination and integration of cancer services. The 12 factors were compared to the dimensions of integration identified in the literature review, with particular focus on the provider-derived dimensions of health services integration (Table 3). Four factors (factors 1, 2, 3 and 12) reflect key elements of clinical integration (i.e., clinical responsiveness to requests for advice from medical/radiation oncologists, surgeons and pathologists; effectiveness of multidisciplinary clinical teams; and clinical leadership/guidance regarding best practices). Each of these factors directly influences patient care services and directs attention to different aspects of clinical integration, including informal clinical interactions (factors 1 and 12), formal multidisciplinary clinical conferences (factor 2) and the role of clinical leadership in facilitating best practice (factor 3). These factors include the top three in terms of common variance explained, a result consistent with the findings of Shortell and colleagues (2000), who suggested clinical integration was the most challenging and important component of an organized delivery system. These findings suggest that efforts to improve clinical integration would have the greatest impact on overall service integration.
Four other factors (factors 4, 7, 10 and 11) reflect elements of functional integration (i.e., regional coordination of resources; awareness of whom to contact for advice regarding palliative/supportive care, public health and community-based services; existence of standardized policies and training programs; and access to computers/Internet). These functional integration factors reflect the potential to facilitate patient care activities, representing a mix of communication and information infrastructure and coordination or standardization of policies and programs. It should be noted that while some of these functional integration factors directly reflect Leatt's (2002) conceptualization of information integration, overall the study's findings suggest that information integration was relevant, and often essential, to most of the 12 identified factors, and therefore difficult to categorize exclusively. Therefore, our interpretation of functional integration is more consistent with that of Shortell and colleagues (2000), which focused on the coordination of key support functions and activities.
The four remaining factors (factors 5, 6, 8 and 9) constitute the final dimension of CSI. These factors primarily reflect elements of system leadership, including support for the role of a system leadership entity (i.e., the Regional Cancer Program in the Ontario context), with specific focus on its awareness of comparative performance (i.e., practice variation within and among regions) and its influence over key stakeholder relationships (i.e., resource allocation, regional coordination of promotion and prevention activities). Consistent with Leatt's (2002) conception of vertical integration, these four factors emphasize the importance of governance and accountability issues and extend Gillies and colleagues' (1993) conception of physician-system integration, which reflects individual and organizational roles and relationships within a broader system. These four factors also emphasize system-level capacity to coordinate services, reflecting Conrad's (1993) attention to aggregated rather than individual-level coordination processes. Therefore, considering these four factors together, we have characterized this third dimension as vertical system integration.
The CSI Survey tool
Improving service integration is a key component of performance improvement efforts in many areas of healthcare, and particularly important for cancer services given the challenges of multiple providers and multiple care settings (Sullivan et al. 2008). However, given the lack of a measure of CSI, an important gap exists for decision-makers interested in improving system performance. Our findings suggest that clinical, functional and vertical system integration represent the key elements of variation that influence CSI.
The CSI Survey provides decision-makers with the ability to measure 12 key components of service integration, representing an important tool to make informed performance improvement decisions. The 12 CSI factors and three dimensions provide direction for decision-makers, both in terms of targeting where efforts are needed to achieve performance improvements in CSI and in identifying appropriate levels of responsibility for cancer system leaders. Ultimately, the 36 Likert scale items contributing to the 12 factors can detect the majority of variation in CSI, representing a more concise tool for measuring service integration in cancer systems (Appendix).
Preliminary work to disseminate findings from the CSI Survey with cancer system leaders in Ontario has been encouraging. However, to validate the tool further, application of the CSI Survey in other jurisdictions is needed. With most of the identified factors representing aspects of service integration relevant to other complex disease management areas, the CSI Survey may also have broader application beyond a specific focus on cancer services.
Given the low clinician participation rate for the CSI Survey, a common problem with surveys of clinicians (Schoenman et al. 2003), caution should be exercised when extrapolating these results to broader populations of cancer care providers in Ontario or elsewhere. Similarly, while the requirement that participants have an e-mail address and Internet access may have introduced a selection bias, our numerous interactions with provider organizations raised no concerns that specific groups of providers or administrators had been excluded.
It should also be noted that the sample did not include family physicians. While we acknowledge the contribution that family physicians make in the care of cancer patients, our survey development work suggested that most family physicians in Ontario typically care for only a limited number of cancer patients. Therefore, as the survey was designed and relevant for healthcare providers who routinely provide care to a large number of cancer patients, family physicians were excluded. However, despite their exclusion, the survey still produced several important factors related to the coordination of health promotion, cancer prevention/screening activities, the awareness of primary care contacts and the responsiveness of palliative and supportive care (factors 6, 7 and 12).
Although missing data also presented challenges, the EFA was analytically sound, producing consistent results using various imputation methods and assumptions. Finally, it should be noted that the CSI Survey was developed in the context of a large, publicly funded healthcare system. However, the integration dimensions are broadly relevant and should be largely transferable to other types of healthcare systems.
We set out to develop a measure of CSI that can inform clinical and administrative decision-makers in their efforts to monitor and improve cancer system performance. Through the development of the CSI Survey, we have created a provider-derived survey tool that provides insights on 12 key factors across three dimensions of integration (i.e., clinical, functional and vertical system). The CSI Survey provides an important starting point for measuring the coordination and integration of cancer services, establishing a tool to guide cancer system leaders on how to target efforts and resources in the ongoing pursuit of high performance.
To view the appendix, please click here.
Measuring cancer services integration to support performance improvement: the Cancer Services Integration Survey
About the Author(s)
Mark J. Dobrow, PhD
Scientist, Cancer Services and Policy Research Unit, Cancer Care Ontario
Assistant Professor, Department of Health Policy, Management and Evaluation
University of Toronto
Lawrence Paszat, MSc, MD
Senior Scientist, Institute for Clinical Evaluative Sciences
Associate Professor, Department of Health Policy, Management and Evaluation
University of Toronto
Brian Golden, PhD
Sandra Rotman Chair in Health Sector Strategy
Rotman School of Management
University of Toronto and University Health Network
Adalsteinn D. Brown, DPhil
Assistant Deputy Minister, Health System Strategy, Ontario Ministry of Health and Long-Term Care
Assistant Professor, Department of Health Policy, Management and Evaluation
University of Toronto
Eric Holowaty, MSc, MD
Senior Consultant, Population Studies and Surveillance, Cancer Care Ontario
Associate Professor, Department of Public Health Sciences
University of Toronto
Margo C. Orchard, MSc
Policy Advisor, Strategy, Canadian Partnership Against Cancer
Neerav Monga, MSc
Biostatistician, Cancer Care Ontario
Terrence Sullivan, PhD
President and Chief Executive Officer, Cancer Care Ontario
Associate Professor, Departments of Health Policy, Management and Evaluation and Public Health Sciences
University of Toronto
Correspondence may be directed to: Mark J. Dobrow, PhD, Assistant Professor, Dept. of Health Policy, Management and Evaluation, University of Toronto, 620 University Avenue, Toronto, ON M5G 2L7; tel.: 416-217-1380; fax: 416-217-1294; e-mail: firstname.lastname@example.org.
Acknowledgment
The authors acknowledge exceptional support for the project from cancer care providers and administrators representing Regional Cancer Programs, hospitals, community care access centres and public health units across Ontario. Funding for this project was provided by a grant from the Canadian Health Services Research Foundation (#RC1-1071-06), with matching funds provided by Cancer Care Ontario.
Alexander, J.A., T.M. Waters, L.R. Burns, S.M. Shortell, R.R. Gillies, P.P. Budetti and H.S. Zuckerman. 2001. "The Ties That Bind: Interorganizational Linkages and Physician-System Alignment." Medical Care 39(7): I30-I45.
Bass, R.D. and C. Windle. 1972. "Continuity of Care: An Approach to Measurement." American Journal of Psychiatry 129(2): 196-201.
Budetti, P.P., S.M. Shortell, T.M. Waters, J.A. Alexander, L.R. Burns, R.R. Gillies and H. Zuckerman. 2002. "Physician and Health System Integration." Health Affairs 21(1): 203-10.
Burns, L.R., J.A. Alexander, S.M. Shortell, H.S. Zuckerman, P.P. Budetti, R.R. Gillies and T.M. Waters. 2001. "Physician Commitment to Organized Delivery Systems." Medical Care 39(7):
Burns, L.R. and M.V. Pauly. 2002. "Integrated Delivery Networks: A Detour on the Road to Integrated Health Care?" Health Affairs 21(4): 128-43.
Cancer Services Implementation Committee. 2001. Report of the Cancer Services Implementation Committee. Toronto: Ontario Ministry of Health and Long-Term Care.
Carlow, D.R. 2000. "The British Columbia Cancer Agency: A Comprehensive and Integrated System of Cancer Control." Healthcare Quarterly 3(3): 31-45.
Conrad, D. 1993. "Coordinating Patient Care Services in Regional Health Systems: The Challenge of Clinical Integration." Hospital and Health Services Administration 38(4): 491-508.
Conrad, D.A. and W.L. Dowling. 1990. "Vertical Integration in Health Services: Theory and Managerial Implications." Health Care Management Review 15(4): 9-22.
Department of Health. 1997. "The New NHS: Modern, Dependable." London, UK: The Stationery Office.
Department of Health. 1998. "A First-Class Service: Quality in the New NHS." London, UK: The Stationery Office.
Department of Health. 2000. "The NHS Cancer Plan: A Plan for Investment, A Plan for Reform." London, UK: The Stationery Office.
Dobrow, M., B. Langer, H. Angus and T. Sullivan. 2006. "Quality Councils as Health System Performance and Accountability Mechanisms: The Cancer Quality Council of Ontario Experience." HealthcarePapers 6(3): 8-21.
Dobrow, M.J., M.C. Orchard, B. Golden, E. Holowaty, L. Paszat, A.D. Brown and T. Sullivan. 2008. "Response Audit of an Internet Survey of Health Care Providers and Administrators: Implications for Determination of Response Rates." Journal of Medical Internet Research 10(4): e30.
Dolovich, L.R., K.M. Nair, D.K. Ciliska, H.N. Lee, S. Birch, A. Gafni and D.L. Hunt. 2004. "The Diabetes Continuity of Care Scale: The Development and Initial Evaluation of a Questionnaire That Measures Continuity of Care from the Patient Perspective." Health and Social Care in the Community 12(6): 475-87.
Durbin, J., P. Goering, D.L. Streiner and G. Pink. 2004. "Continuity of Care: Validation of a New Self-Report Measure for Individuals Using Mental Health Services." Journal of Behavioral Health Services and Research 31(3): 279.
Fairchild, D.G., J. Hogan, R. Smith, M. Portnow and D.W. Bates. 2002. "Survey of Primary Care Physicians and Home Care Clinicians: An Assessment of Communication and Collaboration." Journal of General Internal Medicine 17(4): 253-61.
Freeman, G., S. Shepperd, I. Robinson, K. Ehrich and S. Richards. 2001. Continuity of Care: Report of a Scoping Exercise for the National Co-ordinating Centre for NHS Service Delivery and Organisation R&D (NCCSDO). London, UK: National Co-ordinating Centre for NHS Service Delivery and Organisation R&D.
Gillies, R.R., S.M. Shortell, D.A. Anderson, J.B. Mitchell and K.L. Morgan. 1993. "Conceptualizing and Measuring Integration: Findings from the Health Systems Integration Study." Hospital and Health Services Administration 38(4): 467-89.
Griffith, C. and J. Turner. 2004. "United Kingdom National Health Service. Cancer Services Collaborative 'Improvement Partnership': Redesign of Cancer Services. A National Approach." European Journal of Surgical Oncology 30(Suppl. 1): 1-86.
Haggerty, J.L., R.J. Reid, G.K. Freeman, B.H. Starfield, C.E. Adair and R. McKendry. 2003. "Continuity of Care: A Multidisciplinary Review." British Medical Journal 327(7425): 1219-21.
Harman, H.H. 1976. Modern Factor Analysis. Chicago: University of Chicago Press.
Hernandez, S.R. 2000. "Horizontal and Vertical Healthcare Integration: Lessons Learned from the United States." HealthcarePapers 1(2): 59-65.
Jöreskog, K.G. 1977. "Factor Analysis by Least-Square and Maximum-Likelihood Method." In K. Enslein, A. Ralston and H.S. Wilf, eds., Statistical Methods for Digital Computers. New York: John Wiley and Sons.
Leatt, P. 2002. Integrated Service Delivery. Ottawa: Minister of Public Works and Government Services.
Leatt, P., G.H. Pink and M. Guerriere. 2000. "Towards a Canadian Model of Integrated Healthcare." HealthcarePapers 1(2): 13-35.
Little, R.J.A. and D.B. Rubin. 2002. Statistical Analysis with Missing Data. Hoboken, NJ: John Wiley and Sons.
Mindlin, R.L. and P.M. Densen. 1969. "Medical Care of Urban Infants: Continuity of Care." American Journal of Public Health and the Nation's Health 59(8): 1294-301.
Raina, P., P.L. Santaguida, M. Rice, M. Gauld, S. Smith and J. Hader. 2006. "A Systematic Review of Health Care System Performance Indicators: Measures of Integration." Hamilton, ON: McMaster Evidence-Based Practice Centre.
Reid, R., J. Haggerty and R. McKendry. 2002. Defusing the Confusion: Concepts and Measures of Continuity of Healthcare. Ottawa: Canadian Health Services Research Foundation. Retrieved May 26, 2009. <http://www.chsrf.ca/final_research/commissioned_research/programs/pdf/cr_contcare_e.pdf>.
Rummel, R.J. 1970. Applied Factor Analysis. Evanston, IL: Northwestern University Press.
Schoenman, J.A., M.L. Berk, J.J. Feldman and A. Singer. 2003. "Impact of Differential Response Rates on the Quality of Data Collected in the CTS Physician Survey." Evaluation and the Health Professions 26(1): 23-42.
Shortell, S.M., R.R. Gillies, D.A. Anderson, K.M. Erickson and J.B. Mitchell. 2000. Remaking Health Care in America: The Evolution of Organized Delivery Systems. San Francisco: Jossey-Bass.
Sullivan, T., M. Dobrow, L. Thompson and A. Hudson. 2004. "Reconstructing Cancer Services in Ontario." HealthcarePapers 5(1): 69-80.
Sullivan, T., M.J. Dobrow, E. Schneider, L. Newcomer, M. Richards, L. Wilkinson, L. Borella, C. Lepage, G.P. Glossmann and R. Walshe. 2008. "Améliorer la responsabilité clinique et la performance en cancérologie" ["Improving Clinical Accountability and Performance in the Cancer Field"]. Pratiques et Organisation des Soins 39(3): 207-15.
Suter, E., N.D. Oelke, C.E. Adair, C. Waddell, G.D. Armitage and L.-A. Huebner. 2007. Health Systems Integration: Definitions, Processes and Impact. A Research Synthesis. Calgary: Health Systems and Workforce Research Unit, Calgary Health Region.
Tabachnick, B.G. and L.S. Fidell. 2007. Using Multivariate Statistics. Boston: Allyn and Bacon.
Ware, N.C., B. Dickey, T. Tugenberg and C.A. McHorney. 2003. "CONNECT: A Measure of Continuity of Care in Mental Health Services." Mental Health Services Research 5(4): 209-21.
Wilson, N.J. and K.W. Kizer. 1998. "Oncology Management by the 'New' Veterans Health Administration." Cancer 82(10) (Suppl.): 2003-9.
1. CSA and STA hospitals were identified through their contract status with Cancer Care Ontario to provide incremental service volumes for cancer surgery and/or systemic therapy.
2. Hospital-specific cancer surgery volumes were obtained from Cancer Care Ontario data sources.