Healthcare Policy

Healthcare Policy 19(Special Issue) October 2023: 88–98. doi:10.12927/hcpol.2023.27176
Research Paper

Lack of Publicly Available Documentation Limits Spread of Integrated Care Innovations in Canada

Tara Stewart, Émilie Dionne, Robin Urquhart, Nelly D. Oelke, Jessie Lee McIsaac, Catherine M. Scott and Jeannie Haggerty

Abstract

As healthcare in Canada is provincially operated, the program innovations in one jurisdiction may not be readily known in other jurisdictions. We examine the availability of implementation-specific data for 30 innovative Canadian programs designed to integrate health and social services for patients with complex needs. Using publicly available data and key informant interviews, we were able to populate only ~50% of our data collection tool (on average). Formal program evaluations were available for only ~30% of programs. Multiple barriers exist to the compilation and verification of healthcare programs' implementation data across Canada, limiting cross-jurisdictional learning and making a comparison of programs challenging.

Introduction

The 2001 First Ministers Health Accord launched an $800 million federal investment over five years to renew primary healthcare in the provinces and territories (Health Canada 2006). One of the five objectives was to facilitate the coordination and integration of primary healthcare with services offered in other institutions and in the community (Health Canada 2007: 1). This investment enabled the development and rollout of primary care–specific “integrated care” programs. However, the Canadian healthcare “system” comprises a collection of separate provincially run healthcare systems – meaning that the program innovations in one province are not immediately known (or well understood) in another jurisdiction. This reality limits both the scale-up and spread of innovative primary care models (Greenhalgh and Papoutsi 2019).

Among the initiatives to promote integrated health-related services in Canada, the Canadian Primary Care Research Network (CPCRN) promoted cross-jurisdictional research and knowledge exchange “to accelerate the pace of integrated care solutions” (CIHR 2022). Within this initiative, our pan-Canadian team (see p. 105) conducted a descriptive policy and program analysis across the 10 Canadian provinces to generate information that could support the scale-up and spread of innovative models of integrated primary healthcare between Canadian jurisdictions. We focused on two patient groups with complex care needs who are managed in primary care and who require other medical, social and community-based services to maintain functional health or mitigate its decline: children and youth (0–25 years) with high functional health needs and community-dwelling older adults (≥ 65 years) experiencing functional decline.

Two research teams identified publicly funded programs across all provinces that integrate primary care with additional health and social services for children and youth and for older adults. Our goal was to generate analytic insight into the conditions of success (or failure) in achieving comprehensive integrated primary healthcare by collecting and analyzing detailed implementation data from exemplar programs for both patient groups.

However, in the course of data collection and analysis, we encountered a challenge that threatened the achievement of this objective: the lack of publicly available information on the selected programs. This article presents the tools we developed for data collection (which we view as contributions to knowledge) and describes how and why we were unable to apply our data collection tools to satisfactorily address the original objectives due to the lack of publicly available data. We then discuss why this incidental finding of the paucity of publicly available implementation information on exemplary programs is an issue and what might be done about it.

Method

Our team of 46 researchers, clinicians, decision makers and patients across 10 Canadian provinces was asked to nominate exemplar publicly funded programs implemented since the 2001 First Ministers Health Accord (Health Canada 2006) that were designed to connect primary care to social services, public health and/or community supports for either of the two populations of interest (children and youth; and older adults). Team members were also asked to provide websites for the nominated programs and the names of one or more key informants who could provide information about program implementation. After several rounds, the team nominated 99 programs across the 10 provinces. Co-investigators from our pan-Canadian team then reviewed summaries of each of the nominated programs from their home province, rating each program using the Innovative Practices Evaluation Framework (Health Quality Ontario 2016). Thirty programs were selected (14 for children and youth and 16 for older adults) whose brief descriptions suggested: (1) novelty and innovativeness in the approach to integration and (2) the potential for scalability. We selected at least two programs from every province (i.e., one designed for children and youth and one designed for older adults).

Data collection tools

Parallel to the identification and selection of programs, a small working group (JH, ED, TS, Shelley Doucet [SD], RU, NO) designed a data collection tool to gather program implementation data across all selected programs. This Program Implementation Data Collection Tool (see Appendix 3, available online here) itemized service-level implementation factors identified as particularly important to service integration, as per Suter et al.'s (2009) "Ten Key Principles for Successful Health Systems Integration" and Damschroder et al.'s (2009) Consolidated Framework for Implementation Research. Specifically, we aimed to gather program information on each of the following dimensions of implementation and integration: (1) program overview and context, (2) history, (3) program goals, (4) design, (5) governance structure, (6) patient focus, (7) information systems, (8) trialability, (9) policy instruments, (10) financial management and (11) performance management. Next, a Program Integration Rating Tool (see Appendix 4, available online here) was designed to rate the extent to which each element of program implementation and integration was achieved; it was inspired by Hébert and Veil's (2004) tool for assessing the degree of integration achieved by a model of integrated service delivery for frail older people.
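As a purely illustrative aid (not part of the tool or the original study), the Python sketch below shows one way the tool's 11 dimensions could be organized as a per-program record. The class and field names are our own, hypothetical inventions; the actual tool is a structured document with multiple items per dimension, not software.

```python
# Illustrative sketch only: a minimal way to hold, per program, whatever information
# could be found for each of the 11 dimensions of the Program Implementation Data
# Collection Tool (Appendix 3). Names are hypothetical; the real tool contains
# multiple items under each dimension.
from dataclasses import dataclass, field
from typing import Dict, Optional

DIMENSIONS = (
    "program_overview_and_context",
    "history",
    "program_goals",
    "design",
    "governance_structure",
    "patient_focus",
    "information_systems",
    "trialability",
    "policy_instruments",
    "financial_management",
    "performance_management",
)


@dataclass
class ProgramImplementationRecord:
    """Information gathered for one program, keyed by dimension (None = nothing found)."""
    program_name: str
    province: str
    responses: Dict[str, Optional[str]] = field(
        default_factory=lambda: {d: None for d in DIMENSIONS}
    )
```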

Data collection

Data were collected from September 2017 to May 2018, and the analysis was conducted from May 2018 to February 2019. Research assistants drew on publicly available information such as health system and government websites, as well as grey and published literature. For both pragmatic and budgetary reasons, we allotted no more than 20 hours of research assistant time per program to complete the Program Implementation Data Collection Tool (Appendix 3, available online here) and to engage a key informant (i.e., an individual who was closely involved with the design, implementation and/or evaluation of the program). Key informants were interviewed by phone to validate and complete the Program Implementation Data Collection Tool. Research assistants then wrote narrative summaries of each program (five to eight pages in length), which were reviewed by a team of three co-principal investigators (ÉD, SD, TS) for consistency and completeness. Narrative summaries were sent back to program key informants for validation and were then finalized.

Next, the Program Integration Rating Tool (Appendix 4, available online here) was applied to the narrative summaries in order to assess the degree to which each program had been able to successfully achieve integration. Two to three raters (i.e., project researchers) were assigned to each program. Raters supplied their data back to the leads of each patient group team, who consolidated the data (Dionne et al. 2023; Stewart et al. 2023).

Analysis

The original intent was to conduct a comparative analysis across the selected programs – synthesizing across the program integration ratings and the program implementation data – and identify the dimensions of implementation and integration that appeared to contribute most to program success (or failure). However, as data collection began, it quickly became apparent that we did not have comparable information across programs due to limitations to the publicly available information we required to populate the Program Implementation Data Collection Tool (Appendix 3 available online here). The paucity of publicly available program data meant that it was no longer feasible to address our original objective of assessing the degree of integration and implementation in a valid and comparable manner.

Instead, we chose to focus our analysis on the extent to which the Program Implementation Data Collection Tool (Appendix 3 available online here) could be completed for each program and on identifying the information that was most likely to be missing. We examined the finalized program implementation data (and narrative summaries) for each program and extracted data on four indicators of data availability and data quality: (1) whether or not we were able to locate publicly available program evaluation findings (yes/no), (2) the percentage of our tool we were able to complete within a time-limited (i.e., 20 hours) search of publicly available information, (3) whether or not we were able to secure the engagement of a suitable key informant to review the tool (yes/no) and (4) the percentage of our tool that was completed following participation of the key informant (if applicable).
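To make the tabulation concrete, the sketch below shows, under our own assumptions, how these four indicators could be computed for a single program. The function and variable names are hypothetical, and completeness is counted here at the field level for brevity, whereas the study tracked completion across the tool's individual items.

```python
# Illustrative sketch only: tabulating the four data-availability indicators for one
# program. 'responses' maps each tool field to the collected text, or None if nothing
# was found; the published percentages were derived from the completed tools themselves.
from typing import Dict, Optional


def completion_percentage(responses: Dict[str, Optional[str]]) -> float:
    """Percentage of tool fields for which any information was recorded."""
    filled = sum(1 for value in responses.values() if value)
    return round(100 * filled / len(responses), 1)


def availability_indicators(
    responses_public_only: Dict[str, Optional[str]],
    responses_after_informant: Dict[str, Optional[str]],
    evaluation_publicly_available: bool,
    key_informant_engaged: bool,
) -> dict:
    """Return the four indicators described above for a single program."""
    return {
        "evaluation_publicly_available": evaluation_publicly_available,              # (1)
        "pct_complete_public_search": completion_percentage(responses_public_only),  # (2)
        "key_informant_engaged": key_informant_engaged,                               # (3)
        "pct_complete_after_informant": completion_percentage(
            responses_after_informant if key_informant_engaged else responses_public_only
        ),                                                                            # (4)
    }
```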

Findings

The 30 programs are described in more detail in companion papers (Dionne et al. 2023; Stewart et al. 2023), and the complete program narratives are available upon request.

Availability of program implementation data

Tables 1 and 2 list the selected programs for children and youth (n = 14) and older adults (n = 16), respectively. These tables document the extent to which we were able to find implementation-related data for each of the 30 programs. Across all 30 programs, completeness of our data collection tool ranged from 0% to 78% after a maximum of 20 hours of searching and summarizing publicly available information (mean = 35% for children and youth; 27% for older adults). We were able to secure the participation of a key informant to validate and add to this information for 11 of the 14 children and youth programs and 11 of the 16 older adults programs (73% overall). Where input from a key informant was available, the completeness of our tool increased from an average of 20% to 61%.


TABLE 1. Indicators of data availability and data quality for selected programs designed to provide integrated care for children and youth
Province | Program name | % of data collection tool completed before key informant involvement | Key informant engagement | % of data collection tool completed after key informant involvement
British Columbia | ON TRAC | 38 | Yes | 49
Alberta | Primary care networks | 38 | Yes | 89
Alberta | Regional collaborative service delivery | 60 | Yes | 69
Saskatchewan | Cognitive Disability Strategy | 45 | Yes | 68
Manitoba | Specialized Services for Children and Youth | 11 | Yes | 78
Manitoba | United Referral and Intake System | 20 | Yes | 51
Ontario | Family Health Teams | 63 | No | 63
Ontario | Good 2 Go Transition Programs | 18 | No | 30
Quebec | Community Social Pediatric Centres | 62 | No | 62
Quebec | Programme d'aide personnelle, familiale et communautaire | 91 | No | 91
New Brunswick | NaviCare | 0 | Yes | 82
New Brunswick | Integrated Service Delivery | 0 | Yes | 47
Nova Scotia | SchoolsPlus | 24 | Yes | 61
Prince Edward Island | Best Start | 20 | Yes | 45
Summary | 14 programs | Average: 35 (0–60) | 11/14 | Average: 63 (30–91)

 


TABLE 2. Indicators of data availability and data quality for selected programs designed to provide integrated care for community-dwelling older adults
Province | Program name | % of data collection tool completed before key informant involvement | Key informant engagement | % of data collection tool completed after key informant involvement
British Columbia | Kamloops Seniors Health and Wellness Centre | 13 | No | 13
Alberta | Seniors' Community Hub | 18 | Yes | 22
Alberta | Primary Health Care Integrated Geriatric Services Initiative | 11 | Yes | 69
Saskatchewan | Seniors House Calls Program | 13 | Yes | 100
Saskatchewan | Connecting to Care (Hotspotting) | 22 | Yes | 89
Manitoba | My Health Teams: Financial health promoters | 29 | Yes | 49
Manitoba | Program of Integrated Managed Care of the Elderly | 7 | Yes | 31
Ontario | Health Links | 35 | Yes | 71
Ontario | Community Hubs (Langs) | 67 | No | 67
Quebec | Réseau des services intégrés pour personnes âgées en perte d'autonomie cognitive | 78 | No | 78
Quebec | Québec Alzheimer Plan in family medicine groups and family medicine units | 62 | No | 62
New Brunswick | Rehabilitation and Reablement with the Extra-Mural Program | 47 | No | 47
Nova Scotia | Community Health Teams | | Yes | 47
Newfoundland and Labrador | Community Supports Program | 11 | Yes | 27
Prince Edward Island | Caring for Older Adults in the Community and at Home | | Yes | 78
Prince Edward Island | East Prince Seniors Initiative | 13 | Yes | 67
Summary | 16 programs | Average: 27 (7–78) | 11/16 | Average: 57 (13–100)

 

At the time of our search, only 13 of 30 programs had robust evaluation findings that were publicly available (often via journal publications). Several other programs had been evaluated, but the findings were not publicly available or could not be shared cross-jurisdictionally due to the evaluation findings being embargoed or “internal.” Not surprisingly, the initial rate of tool completion was higher for those programs that had publicly available evaluation findings (43% vs. 21%).

In terms of the 11 dimensions included in our Program Implementation Data Collection Tool (Appendix 3 available online here), the three dimensions where we found the greatest amount of publicly available information were (1) program overview and context, (2) history and (3) program goals. This general program information was often publicly available; in addition, key informants (where involved) were able to make significant contributions to correct or contextualize the information in these sections. However, as the tool items became more nuanced, it became much less likely that information was publicly available or that key informants could supply detail. The three dimensions with the least amount of publicly available information (in decreasing order) were (10) financial management, (7) information systems and (5) governance structure.

Discussion

In the current project, we found an overall paucity of implementation-specific data on primary healthcare programs designed to integrate care for patients with complex care needs. Although it was feasible to document general program information (such as program overview, history and goals), more nuanced implementation factors (such as those related to program financing, information systems and governance structure) were difficult to find publicly. Indeed, large-scale repositories of program implementation data do not exist in Canada (in contrast to, say, administrative or clinical health data). Despite the strong desire of Canadian research funding organizations, policy makers and providers to learn from one another and share lessons across provinces in the area of primary healthcare service delivery, the capacity to enact inter-jurisdictional learning continues to be limited by a paucity of data about program implementation (PHAC 2022).

Why is this a problem?

With the advancement of the field of implementation science, it has become more widely understood that a nuanced understanding of program outcomes depends upon not only knowing the “what” (program design) but also knowing the “how” (program implementation) (Damschroder et al. 2009). Transferable knowledge about how a program was implemented is critical for program scale-up and spread (Bellg et al. 2004; Ginsburg et al. 2021); without it, programs imported from one jurisdiction to another frequently fail (Greenhalgh and Papoutsi 2019). Low availability of program implementation data has implications for both intra-jurisdictional scalability and inter-jurisdictional replicability. Nuanced details about implementation are critical for policy makers to consider when deciding whether to import a program from another jurisdiction and, more specifically, in determining whether/how a program might need to be adapted so as to achieve successful outcomes in their own local context.

What can be done about this problem?

Primary care, as a sub-field of Canadian healthcare, is not immune to the root causes that generally underlie limited healthcare data sharing in Canada. These include outmoded, siloed data collection systems that obstruct interoperability; privacy laws and policies developed in a pre-digital-records era; powerful disincentives for sharing data (e.g., risk aversion); and a lack of accountability for the collection and analysis of standard data (PHAC 2022). What strategies might be pursued to make cross-jurisdictional learning about primary healthcare policies and programs a reality? We discuss two general areas of reform that seem relevant: the collection of data in primary care settings and the analysis and reporting of data via service/program evaluations. We note that we are not the first to make recommendations in these areas and that progress is indeed taking place in both.

DATA COLLECTION: DEVELOPMENT OF COMMON PRIMARY HEALTHCARE PROGRAM INDICATORS/TOOLS

Record keeping and data collection in primary care are, historically, notoriously unstandardized (Bergman 2007). The lack of standardized performance measurement in primary care is a well-recognized issue that is beginning to attract more research (Wong et al. 2019). At a national level, a recent update of the Canadian Institute for Health Information's "Pan-Canadian Primary Healthcare EMR Minimum Data Set" for performance measurement (CIHI 2022) defines a focused set of primary healthcare electronic medical record data elements to guide the creation of a comparable set of primary healthcare performance data across Canada. For more immediate comparisons of programs and policies, we also offer, as a contribution, the Program Implementation Data Collection Tool (Appendix 3, available online here) that we developed to collect implementation data across primary care programs; it may be of use to other research teams seeking to make such comparisons.

DATA ANALYSIS: INCREASING CAPACITY FOR EVALUATION IN PRIMARY CARE

We found program evaluation reports/publications to be a rich source of implementation data. Yet evaluation findings are made publicly available only at the discretion of the commissioning program/organization. To support innovation, provinces/territories need to invest in evaluation and make the results public (Naylor et al. 2015). Understandably, the primary mission of most healthcare providers is to care for patients rather than to plan and execute comprehensive program evaluations. It is not uncommon, however, for program evaluation to be a condition of funding for auxiliary primary care programming. Yet beyond mandating program evaluation, little infrastructure exists to support primary care providers to evaluate effectively. The call to increase evaluation capacity in primary care is not a new one (Bergman 2007). Suggested solutions centre on designated provider time for evaluation and the development of evaluation resource toolkits designed especially for healthcare providers (Bergman 2007; Peek et al. 2014).

Finally, we offer the Program Integration Rating Tool (Appendix 4, available online here) that we developed for rating program integration as a contribution to knowledge. Although we were unable to adequately test and validate this tool in the current study due to data limitations, it represents an expansion of the seminal integration scoring tool developed by Hébert and Veil (2004), which was narrowly applicable to services designed for frail older people. In contrast, our tool applies to health services more broadly, building on Suter et al.'s (2009) "Ten Key Principles for Successful Health Systems Integration" and incorporating elements of the Consolidated Framework for Implementation Research (Damschroder et al. 2009). The dimensions and items listed in our Program Integration Rating Tool can also serve to inform the type of information that is needed to support the scale-up and spread of integrated care delivery solutions.

Limitations

The findings of this study must be interpreted with several limitations in mind. First, we may have underestimated the amount of relevant program information that was available; different or additional search strategies might have yielded more data. For example, funding constraints and administrative factors limited our search to 20 hours per program, and additional search time may have uncovered more information. Next, although all program information searches were guided by standardized parameters (i.e., the Program Implementation Data Collection Tool [see Appendix 3, available online here]), no formal attempt was made to validate the search efforts of individual data collectors. Finally, the level of engagement between researchers and local healthcare leaders was not consistent across jurisdictions, which likely contributed to disparities in the mobilization of key informants and in access to internal evaluation findings. Regardless, we would assert that access to program implementation data that is predicated on close working relationships with individual healthcare leaders is not a sustainable model of data sharing, nor is relying on the recollections and impressions of key informants – many of whom are no longer involved with the program being studied. Rather, implementation data for publicly funded healthcare programs need to be more straightforwardly accessible if jurisdictions are truly to learn from one another and adapt successful programs from one jurisdiction to another.

Conclusion

Our analysis suggests that Canada's progress toward integrating health and social care remains largely in a state of "perpetual pilot projects" (Bégin et al. 2009: 1185), in part due to a lack of publicly available documentation and evaluation of innovations. Detailed and shared data on program implementation are vital if primary healthcare services are going to move beyond pilot projects and scale innovations across provinces to benefit Canadian patients (Bégin et al. 2009). Policy makers and government funders have the authority to mandate data collection and are ultimately responsible for setting the direction in terms of what to measure and how transparent to be, and for providing adequate resources to enable evaluation. Regional leaders and program providers are then ideally responsible for collecting data, recruiting evaluation expertise and reporting transparently – all of which would support, among other things, the scale-up and spread of promising primary care innovations.

Correspondence may be directed to Tara Stewart by e-mail at tstewart3@wrha.mb.ca.

Lack of Publicly Available Documentation Limits the Spread of Integrated Care Innovations in Canada

Abstract

Because healthcare in Canada is administered by the provinces, innovations in one jurisdiction may go unnoticed in others. We examine the availability of implementation data for 30 innovative Canadian programs designed to integrate health and social services for patients with complex needs. Using publicly available data and interviews with key informants, we were able to complete only about 50% of our data collection tool (on average). Formal program evaluations were available for only about 30% of programs. Many barriers stand in the way of compiling and verifying data on the implementation of healthcare programs in Canada, which limits learning across jurisdictions and complicates comparison between programs.

About the Author(s)

Tara Stewart, PhD, Assistant Professor, Department of Community Health Sciences, University of Manitoba; Researcher/Evaluator, George & Fay Yee Centre for Healthcare Innovation, Manitoba SPOR Support Unit, Winnipeg, MB

Émilie Dionne, PhD, Researcher and Adjunct Professor, Vitam – Centre de recherche en santé durable, Department of Sociology, Faculty of Social Sciences, Laval University, Quebec City, QC

Robin Urquhart, PhD, Endowed Chair in Population Cancer Research, Department of Community Health and Epidemiology, Dalhousie University, Halifax, NS

Nelly D. Oelke, PhD, Associate Professor, School of Nursing, Faculty of Health and Social Development, University of British Columbia–Okanagan, Kelowna, BC

Jessie Lee McIsaac, PhD, Research Chair in Early Childhood: Diversity and Transitions, Faculty of Education, Department of Child and Youth Study, Mount Saint Vincent University, Halifax, NS

Catherine M. Scott, PhD, Adjunct Professor, University of Calgary and University of British Columbia–Okanagan; Executive Coach and Knowledge Mobilisation Consultant, K2A Consulting, Calgary, AB

Jeannie Haggerty, PhD, McGill Research Chair in Family and Community Medicine, McGill University and St. Mary's Hospital Research Centre, Montréal, QC

Acknowledgment

The authors thank Yves Couturier, Shauna Zinnick and Leanne Dunne for assistance with data collection and synthesis.

References

Bégin, M., L. Eggertson and N. Macdonald. 2009. A Country of Perpetual Pilot Projects. CMAJ 180(12): 1185. doi:10.1503/cmaj.090808.

Bellg, A.J., B. Borrelli, B. Resnick, J. Hecht, D.S. Minicucci, M. Ory et al. 2004. Enhancing Treatment Fidelity in Health Behavior Change Studies: Best Practices and Recommendations from the NIH Behavior Change Consortium. Health Psychology 23(5): 443–51. doi:10.1037/0278-6133.23.5.443.

Bergman, J.S. 2007, March. Primary Health Care Transition Fund: Evaluation and Evidence. Health Canada. Retrieved August 24, 2023. <https://www.canada.ca/content/dam/hc-sc/migration/hc-sc/hcs-sss/alt_formats/hpb-dgps/pdf/prim/2006-synth-evaluation-eng.pdf>.

Canadian Institute for Health Information (CIHI). 2022. Pan-Canadian Primary Healthcare EMR Minimum Data Set, Version 1.1.

Canadian Institutes of Health Research (CIHR). 2022. Pan-Canadian SPOR Network in Primary and Integrated Health Care Innovations. Retrieved August 21, 2023. <http://www.cihr-irsc.gc.ca/e/49554.html>.

Damschroder, L.J., D.C. Aron, R.E. Keith, S.R. Kirsh, J.A. Alexander and J.C. Lowery. 2009. Fostering Implementation of Health Services Research Findings into Practice: A Consolidated Framework for Advancing Implementation Science. Implementation Science 4: 50. doi:10.1186/1748-5908-4-50.

Dionne, É., N.D. Oelke, S. Doucet, C.M. Scott, W. Montelpare, P. Charlton et al. 2023. Innovative Programs with Multi-Service Integration for Children and Youth with High Functional Health Needs. Healthcare Policy 19(Special Issue): 65–77. doi:10.12927/hcpol.2023.27178.

Ginsburg, L.R., M. Hoben, A. Easterbrook, R.A. Anderson, C.A. Estabrooks and P.G. Norton. 2021. Fidelity Is Not Easy! Challenges and Guidelines for Assessing Fidelity in Complex Interventions. Trials 22: 372. doi:10.1186/s13063-021-05322-5.

Greenhalgh, T. and C. Papoutsi. 2019. Spreading and Scaling Up Innovation and Improvement. BMJ 365: l2068. doi:10.1136/bmj.l2068.

Health Canada. 2006, May 8. Archived – 2003 First Ministers Health Accord. Retrieved August 21, 2023. <www.canada.ca/en/health-canada/services/health-care-system/health-care-system-delivery/federal-provincial-territorial-collaboration/2003-first-ministers-accord-health-care-renewal/2003-first-ministers-health-accord.html>.

Health Canada. 2007, March. Primary Healthcare Transition Fund: Summary of Initiatives, Final Edition. Retrieved August 21, 2023. <https://www.canada.ca/content/dam/hc-sc/migration/hc-sc/hcs-sss/alt_formats/hpb-dgps/pdf/phctf-fassp-initiatives-eng.pdf>.

Health Quality Ontario. 2016, June. Evaluation of Innovative Practices: Process and Methods Guide. Retrieved August 21, 2023. <https://www.hqontario.ca/Portals/0/documents/qi/health-links/evaluation-innovative-practices-process-guide-en.pdf>.

Hébert, R. and A. Veil. 2004. Monitoring the Degree of Implementation of an Integrated Delivery System. International Journal of Integrated Care 4(20): e05. doi:10.5334/ijic.106.

Naylor, D., N. Fraser, F. Girard, T. Jenkins, J. Mintz, C. Power et al. 2015, July. Unleashing Innovation: Excellent Healthcare for Canada: Report of the Advisory Panel on Healthcare Innovation. Health Canada. Retrieved August 21, 2023. <https://healthycanadians.gc.ca/publications/health-system-systeme-sante/report-healthcare-innovation-rapport-soins/alt/report-healthcare-innovation-rapport-soins-eng.pdf>.

Peek, C.J., D.J. Cohen and F.V. deGruy III. 2014. Research and Evaluation in the Transformation of Primary Care. American Psychologist 69(4): 430–42. doi:10.1037/a0036223.

Public Health Agency of Canada (PHAC). 2022, May 3. The Pan-Canadian Health Data Strategy: Expert Advisory Group Reports 1, 2, & 3. Retrieved August 21, 2023. <https://www.canada.ca/en/public-health/corporate/mandate/about-agency/external-advisory-bodies/list/pan-canadian-health-data-strategy-reports-summaries.html>.

Stewart, T., É. Dionne, R. Urquhart, N.D. Oelke, Y. Couturier, C.M. Scott et al. 2023. Integrating Health and Social Care for Community-Dwelling Older Adults: A Description of 16 Canadian Programs. Healthcare Policy 19(Special Issue): 78–87. doi:10.12927/hcpol.2023.27177.

Suter, E., N.D. Oelke, C.E. Adair and G.D. Armitage. 2009. Ten Key Principles for Successful Health Systems Integration. Healthcare Quarterly 13(Spec No): 16–23. doi:10.12927/hcq.2009.21092.

Wong, S.T., J.M. Langton, A. Katz, M. Fortin, M. Godwin, M. Green et al. 2019. Promoting Cross-Jurisdictional Primary Healthcare Research: Developing a Set of Common Indicators Across 12 Community-Based Primary Health Care Teams in Canada. Primary Health Care Research and Development 20: e7. doi:10.1017/S1463423618000518.
