
Healthcare Policy 10(SP) September 2014: 56–66. doi:10.12927/hcpol.2014.23918
Research Paper

Home and Community Care Sector Accountability

Carolyn Steele Gray, Whitney Berta, Raisa B. Deber and Janet Lum


This paper focuses on accountability for the home and community care (HCC) sector in Ontario. The many different service delivery approaches, funding methods and types of organizations delivering HCC services make this sector highly heterogeneous. Findings from a document analysis and environmental scan suggest that organizations delivering HCC services face multiple accountability requirements from a wide array of stakeholders. Government stakeholders tend to rely on regulatory and expenditure instruments to hold organizations to account for service delivery. Semi-structured key informant interview respondents reported that the expenditure-based accountability tools being used carried a number of unintended consequences, both positive and negative. These include an increased organizational focus on quality, a shift of care time away from clients (particularly problematic for small agencies), discouragement of innovation, and reliance on performance indicators that do not adequately support the delivery of high-quality care.

The home and community care (HCC) sector in Ontario has characteristics that create a number of challenges to accountability policy. As noted in the introduction to this issue (Deber 2014), the accountability literature suggests that accountability is best understood by three components: accountability "to whom" (the parties involved in the accountability relationship), accountability "for what" (the activities for which parties in the relationship are responsible) and accountability "at what cost" (the sanctions associated with failure to meet responsibilities) (Bergsteiner and Avery 2009; Brinkerhoff 2003; Thomas 1998). Accountability can serve three main purposes: (a) financial accountability (focuses on financial procedural compliance), (b) performance accountability (focuses on outputs and results) and (c) political/democratic accountability (focuses on fulfilling public trust) (Brinkerhoff 2003, 2004).

Using document analysis and key informant interviews, we define the accountability landscape for the HCC sector; this analysis is part of a larger study (Steele Gray 2014).

What Is Home and Community Care?

HCC refers to a basket of support services that can be delivered to clients in their home. The mix of HCC services is provided by both regulated and unregulated care providers. Nursing, physiotherapy, occupational therapy, speech therapy, social work and dietetic services are offered by regulated workers. Unregulated workers, including personal support workers and other community services staff, offer personal care (bathing, dressing and feeding), homemaking, respite services, Meals on Wheels, friendly visitor programs, transportation, security checks, recreation/social programs, lawn and home services, as well as day programs, in community settings. HCC services support a diverse population of clients including seniors, children with complex care needs, those with physical or mental disabilities, and individuals with mental health issues. HCC services have been classified as (a) acute care substitution (supporting individuals who would otherwise have to enter or remain in an acute care facility); (b) long-term care substitution (supporting individuals who would otherwise need to be institutionalized); and (c) maintenance and prevention (helping individuals stay independent in their current living environments) (Anderson and Parent 2000; Baranek et al. 2004; Canadian Healthcare Association 2009; Dumont-Lemasson et al. 1999; Hollander and Walker 1998). The types of services and the methods through which they are delivered differ for varying client populations. This study focused on services provided to seniors over the age of 65, including the frail elderly.

As identified in the Introduction (Deber 2014), we anticipate that "production characteristics" of a sector may affect accountability. HCC services would be classified as highly contestable, in that there are low barriers to market entry and exit (Preker et al. 2000); as low in observability (i.e., it is not conducive to direct oversight), particularly when delivered in a home setting; and as low in measurability (i.e., it is difficult to measure performance in the sector).

Home and Community Care in Ontario

Both private not-for-profit and private for-profit organizations deliver HCC in Canada. Payers for HCC vary considerably by jurisdiction, type of service and type of client. HCC does not fall under the comprehensiveness requirements of the Canada Health Act, which specify full coverage only for medically necessary services provided in hospitals or by physicians; however, provinces and territories are able to go beyond these "floor" requirements (Marchildon 2013). In Ontario, public funds may pay for certain professional services (particularly for clients in the acute care substitution category, where policy makers can defend these services as cost-effective by speeding discharge from hospital), while clients and their families, private insurance and sometimes charities pay for most non-professional services. HCC services are available to clients through a variety of different access points, which may vary in their eligibility requirements and costs, and are funded in a variety of ways (Williams et al. 2009).

The Ontario Ministry of Health and Long-Term Care (MOHLTC) flows its HCC funding through 14 geographically based local health integration networks (LHINs), Ontario's regional health authorities created in 2006. The LHINs flow these funds to certain local health services including hospitals, long-term care, mental health and addictions services, community health centres, community care access centres (CCACs) and community support services. CCACs, in turn, purchase professional home care services for eligible clients on a competitive basis under capped budgets set by the province. Services are allocated to individuals, but there is a ceiling on the amount or units of services that individuals may receive (Williams et al. 2009). The LHINs also fund some community care services through multi-service accountability agreements (MSAAs), multi-year agreements for service delivery between LHINs and health service providers (HSPs).1 The MSAA provides funding for a basket of services to be delivered by the HSP in a particular geographic area. Individuals can also purchase HCC services privately from a wide array of HCC service providers. The Ontario Home Care Association (OHCA) reports that MOHLTC purchases 28.6 million visits/hours of home care per year, with another 20 million visits/hours being purchased privately (OHCA 2010).

Accountability Instruments

This study draws on Doern and Phidd's (1992) model of policy instruments to classify and examine the accountability instruments in place for HCC agencies. The model helps to understand why some tools are chosen over others (tools may be chosen based on level of coerciveness, costs of implementation and political acceptability) and the policy trade-offs associated with those choices. Doern and Phidd's model suggests five policy instruments, which can be used as both the means and the ends through which policy goals are pursued. These include self-regulation (private behaviour), exhortation, expenditure, regulation (including taxation) and public ownership. Definitions of these tools are included in the introduction to this Special Issue (Deber 2014).


Methods

This paper presents selected findings gathered through an environmental scan, document analysis and semi-structured interviews. The environmental scan was used to identify accountability requirements imposed on HCC agencies. Documents were gathered primarily from websites, including Ontario e-laws, Ontario ministry and agency sites and accreditation sites. In some cases, research partners or key informants with insight into accountability demands on HCC organizations in Ontario provided additional documents.

Key informant interviews were conducted with representatives from one urban and one rural CCAC and LHIN (organizations holding HCC agencies to account) and with HCC agencies from those two regions as part of a larger study (Steele Gray 2014). Interviews were conducted between September and December 2011. Findings presented here flow from questions regarding challenges and unintended consequences of accountability tools. Purposive criterion sampling (Patton 2001; Teddlie and Yu 2007) was used to identify individuals and organizations to include in the sample, allowing us to target individuals with the knowledge required to answer interview questions. Interviews were conducted with four individuals (one from each LHIN and CCAC) who were involved at the managerial or directorial level in administering the MSAAs or CCAC contracts, and 20 individuals representing 13 different community service agencies (CSAs) delivering HCC services in the two regions. Organizations were identified with the help of the LHIN and CCAC interviewees as examples of different types of organizations involved in MSAA and CCAC contracts.


Participants were given the opportunity to review their transcripts and provide feedback on whether they felt their views were reflected. Documents relating to the LHIN MSAA and CCAC contract and interview transcripts were thematically coded. Code themes were identified from the literature,2 with additional codes being included as new themes emerged in the coding process. A subsample of three documents and four interviews was double-coded by the primary investigator and by a colleague to validate the coding scheme. Once researchers agreed on the set of themes, analytic coding was applied (Creswell 2003) using NVivo 10 software. Further details regarding the analysis can be found in Steele Gray (2014).

The findings from our document and environmental scan serve to identify the accountability mechanisms in place, while findings from our key informant interviews shed light on implementation issues associated with dominant accountability tools. To preserve anonymity, respondents are identified by number (e.g., CSA 2) in the quotations presented below.

Document and Environmental Scan Findings: Accountability Mechanisms in Place

The environmental scan revealed a wide array of accountability requirements imposed on HCC agencies; they include examples of regulation, expenditure and exhortation policy instruments. Exhortation tools are used primarily by non-governmental stakeholders, such as clients, families, caregivers, shareholders (in the case of for-profit providers) and volunteers (in the case of not-for-profit providers). HCC agencies believe that they are accountable to these groups through their mission and value statements; however, it is not clear how they in fact demonstrate accountability to these non-governmental stakeholders other than through such mechanisms as annual reports or meetings.

Government agencies rely more heavily on regulation and expenditure instruments to hold HCC organizations to account. Regulation instruments include government legislation and regulations that apply broadly to most organizations operating in Ontario (e.g., Occupational Health and Safety Act, 1990), social regulations that are imposed on healthcare organizations more specifically (e.g., Personal Health Information Protection Act, 2004) and regulations concerning controlled acts performed by healthcare professionals (Regulated Health Professions Act, 1991).

Regulation tools are sometimes combined with expenditure tools; for example, HCC organizations that receive government funding for delivering certain services may be bound by regulations and policies linked to funding for those services (e.g., Assisted Living Services for High Risk Seniors Policy, 2011). Expenditure tools commonly rely on performance measurement to hold organizations to account for the funding they receive. Formalized performance reporting systems are particularly important as they enhance the strength of accountability relationships (Bergsteiner and Avery 2009). In Ontario, HCC organizations receiving funding from government agencies are required to report on performance measures on a fixed schedule (which may be quarterly, yearly or both), and missed performance targets can result in reduction or loss of funding. Expenditure tools, in particular LHIN and CCAC funding for HCC services, represent the primary method through which the Ontario government holds HCC agencies to account, and as such are the focus of the remainder of this paper.

Document analysis revealed that the MSAA and CCAC contracts primarily seek to ensure financial and performance accountability. However, production characteristics do appear to play a role. Performance accountability tends to focus on process-level performance (such as access to services, number of services delivered, number of clients), with little attention to potentially important but difficult-to-measure outcomes (such as maintaining a person's independence and capacity to stay at home and delaying functional decline in everyday activities).

Key Informant Interview Findings: Implementation Issues Associated with LHIN MSAAs and CCAC Contracts

Key informants were asked about the implementation issues associated with the MSAA and CCAC contract accountability requirements and specifically about any unintended consequences and challenges associated with performance indicators. The following sections summarize the central themes in these two areas identified through analysis of interview findings.

Unintended consequences

The analysis of interview findings suggested that there are both positive and negative unintended consequences associated with the MSAA and CCAC contracts. Participants described positive unintended consequences as unexpected but welcome effects of the accountability agreements on their organizations. For example, one positive unintended consequence identified by interview participants was that the MSAA and CCAC contracts increased the focus on quality across their organizations:

[The MSAA] focus us probably a little bit more [on quality] like [another interviewee] said and make us more aware of things that we probably could do better or differently. (CSA 2)

I think that the fact that we report on quality indicators quarterly and monthly to the CCAC is known from top to bottom in the organization. They see their team reports. They get competitive between the teams about who is going to perform better. … If I was trying to engender that much focus on quality by myself as the director of quality, I think it would probably be a little bit harder. (CSA 8)

Participants also reported negative unintended consequences: unexpected effects of the accountability agreements that yielded negative changes in the organization. Negative unintended consequences included (a) shifting front-line staff time and (b) hindering innovation. The shift of front-line staff time away from client care and towards reporting was amplified for smaller organizations, which have fewer resources to devote to reporting requirements. This shift of time away from client care was one of the main reasons one organization abandoned its MSAA, returning the funding and deciding not to apply for an MSAA in the future. The participant explained that meeting MSAA requirements as part of reporting for strategic planning or updates took too much time away from clients:

… if I'm wasting 12 hours in a meeting, [and] a drive, it's gone. It's 12 hours that are not being given to my client. (CSA 1)

Participants also pointed to the negative effect of the MSAA on innovation. Some participants identified that they were not comfortable innovating with respect to service delivery, such as adopting new delivery processes or programs that could negatively affect performance targets:

There [are] very few people that are going to take a lot of risk in this kind of environment. … I don't think we are going to take too many risks. (CSA 10)

The MSAA, they are so criteria-based, you hit the mark, or you didn't. They don't allow that developmental learning to go on. … And they are very clear to me. If I don't hit my targets, it has implications on [my] receiving the funding again. (CSA 13)

Problems with performance indicators

In addition to unintended consequences, interview participants noted that the performance indicators in the MSAA and CCAC contracts focused heavily on process indicators to the exclusion of outcome measures. While they recognized that health outcomes in community care are difficult to measure, they nonetheless felt that such indicators were important to demonstrate the quality and the funding "worth" of HCC services:

We don't do a lot of outcome measurements. I don't think there are any actually related to client outcomes or achieving the client goals. … Outcome [indicators], although difficult … would be very important to ensure that we are using the CCAC dollars appropriately. (CSA 7)

The outcome measures are things like satisfaction with the service. Was the personal care that we are providing, is it increasing the person's independence? Those are harder to measure. (Urban CCAC)

Participants were also concerned that indicators did not take into consideration contextual factors that may affect the meeting of performance targets. Of particular concern to rural organizations is that current indicators merely count the number of home care visits without considering the travel time to get to the visit. Participants also identified that indicators reflected aspects of service delivery that providers cannot control, such as staff turnover rate, which is affected by severe labour shortages that are characteristic of the HCC sector (in large part because this sector pays lower wages than do hospitals or even LTC institutions):

Continuity is definitely something that is important to capture in terms of providing services to the client, but the targets that are established by the CCACs sometimes do not reflect the reality of the service providers in the ability to find staff to meet the targets. Same with the ability to accept referrals. As the service volumes increase and increase from CCACs, when we are experiencing severe human resource challenges there's a bit of a disconnect there. (CSA 4-1)

Participants further reported that some CCAC indicators actually compete against one another, such that meeting a target in one area may well mean sacrificing targets in another. One participant describes how meeting referral targets can affect continuity targets:

… sometimes we don't meet target because targets compete with each other; for instance, sometimes we compromise continuity because we want to take a referral. … A referral will come in, in the evening because the client has been discharged from hospital stay. We will send the evening nurse … or somebody that could go [in order to meet the referral request] … but on an ongoing basis [the] initial evening nurse won't be on [the service team for the client]. (CSA 5)

All respondents, even those from the LHINs and CCACs, reported that the MSAA and CCAC contract performance indicators do not accurately capture their perception of high-quality HCC services.


Discussion

In general, the delivery of HCC services is more tightly controlled when agencies receive government funding; otherwise, agencies are subject only to existing legislation, which does not cover many aspects of service delivery, and to a variety of voluntary tools that organizations may or may not choose to follow. Expenditure instruments carry a number of advantages, including encouraging certain activities, permitting flexibility, encouraging innovation and potentially reducing costs. Expenditure instruments also tend to be politically acceptable to policy makers (Howlett and Ramesh 1993).

Despite the number of advantages and the political appeal of expenditure instruments, these tools may be difficult to establish, the associated information costs are high (i.e., they require substantial reporting of information) and in some instances they may be redundant (i.e., where the activity would have occurred without any funding) (Hood and Margetts 2007; Howlett and Ramesh 1993). By definition, expenditure tools often come with a set of conditions to control how that funding is spent. As previously indicated, performance reporting is considered to strengthen accountability; however, our respondents suggested that this advantage may be offset by the implementation issues identified in our analysis. As such, the problem may not be the expenditure tool itself, but rather the way in which it is implemented. It is possible that expenditure instruments could be implemented differently in order to support attributes such as innovation, for example, by building in incentives for innovative practice or developing performance indicators related to innovation.

While performance reporting strengthens financial accountability in funding relationships, our findings suggest that the other purposes of accountability (performance and political) are not well supported by the CCAC contracts and MSAAs. A particular concern is the problem with performance indicators, which do not adequately capture the quality aspect of service delivery. Furthermore, we need to recognize that performance may not always be equated with quality. As noted by Thomas (1998: 379): "Not all types of programs are equally amenable to results-based accountability. There are definitional and technical problems with performance measurement, especially when 'soft' services are being delivered, and these call into question the validity of such measures." One encouraging development is a push by government funders for improved quality measurement in the HCC sector (although, like existing measures, these new outcome measures will cover only care providers in Ontario that receive government funding).


Conclusion

Study findings suggest that HCC agencies face a number of accountability requirements, many of which do not cover specifics about HCC service delivery. While expenditure tools more directly hold HCC agencies to account for service delivery, the poor measurability and low observability of the sector mean that the accountability frameworks in use focus more heavily on financial performance than on quality performance; this is tempered by the high contestability in urban areas, which may make it easier to replace providers who do not wish to work within these rules. Expenditure tools additionally come with a number of negative unintended consequences that can affect quality service delivery, particularly for small agencies. Careful attention to how these accountability tools are implemented may minimize some of these unintended consequences while retaining the advantages.





About the Author(s)

Carolyn Steele Gray, PhD, Post-Doctoral Fellow, Bridgepoint Collaboratory for Research and Innovation, Health System Performance Research Network, University of Toronto, Toronto, ON

Whitney Berta, PhD, Associate Professor, Institute of Health Policy, Management & Evaluation, University of Toronto, Toronto, ON

Raisa B. Deber, PhD, Professor, Institute of Health Policy, Management & Evaluation, University of Toronto, Toronto, ON

Janet Lum, PhD, Professor and Associate Dean of Arts, Research and Graduate Studies, Ryerson University, Co-Chair, Canadian Research Network for Care in the Community, Toronto, ON

Correspondence may be directed to: Carolyn Steele Gray, PhD, Bridgepoint Collaboratory for Research and Innovation, 14 St. Matthews Rd., Toronto, ON M4M 2B5; tel.: 416-461-8252, ext. 2908; e-mail:


Acknowledgements

This study was funded by a CIHR PHSI grant (CIHR Grant Number PHE-101967).

We would like to thank Anne Wojtak, Bill Manson, Debra Bell, Shaheena Mukhi and Angèle Albert-Ritchie for sharing insights and documents with the authors; Seija Kromm for double-coding interviews; and Stephanie Ma for supporting the environmental scan. We would also like to acknowledge research funding from the Ontario Ministry of Health and Long-Term Care.


References

Anderson, M. and K. Parent. 2000. Care in the Home: Public Responsibility – Private Roles? Paper prepared for the Dialogue on Health Reform, June, Toronto, Ontario. Retrieved March 20, 2014.

Baranek, P.M., R. Deber and A.P. Williams. 2004. Almost Home: Reforming Home and Community Care in Ontario. Toronto: University of Toronto Press.

Bergsteiner, H. and G.C. Avery. 2009. "A Generic Multiple Constituency Matrix: Accountability in Private Prisons." Journal of Public Administration Research and Theory 19(3): 631–60. doi: 10.1093/jopart/mun011.

Brinkerhoff, D. 2003 (January). Accountability and Health Systems: Overview, Framework and Strategies. Bethesda, MD: PHR Plus, Abt Associates, U.S. Agency for International Development. Retrieved March 20, 2014.

Brinkerhoff, D.W. 2004. "Accountability and Health Systems: Toward Conceptual Clarity and Policy Relevance." Health Policy and Planning 19(6): 371–79. doi: 10.1093/heapol/czh052.

Canadian Healthcare Association. 2009. Home Care in Canada: From the Margins to the Mainstream. Retrieved March 20, 2014.

Creswell, J.W. 2003. Research Design: Qualitative, Quantitative and Mixed Methods Approaches (2nd ed.). Thousand Oaks, CA: Sage Publications.

Deber, R.B. 2014. "Thinking about Accountability." Healthcare Policy 10(Special Issue): 12–24.

Doern, G.B. and R.W. Phidd. 1992. Canadian Public Policy: Ideas, Structure, Process (2nd ed.). Toronto: Nelson Canada.

Dumont-Lemasson, M., C. Donovan and M. Wylie. 1999. Provincial and Territorial Home Care Programs: A Synthesis for Canada. Ottawa: Health Canada. Retrieved March 20, 2015.

Hollander, M.J. and E.R. Walker. 1998 (March). Report of Continuing Care Organization and Terminology. A report prepared for the Division of Aging and Seniors. Ottawa: Health Canada. Retrieved March 20, 2014.

Hood, C.C. and H.Z. Margetts. 2007. The Tools of Government in the Digital Age (2nd ed.). Hampshire, UK and New York: Palgrave Macmillan.

Howlett, M. and M. Ramesh. 1993. "Patterns of Policy Instrument Choice: Policy Styles, Policy Learning and the Privatization Experience." Policy Studies Review 12(1–2): 3–24. doi: 10.1111/j.1541-1338.1993.tb00505.x.

Marchildon, G.P. 2013. Canada: Health System Review (vol. 15). Copenhagen: European Observatory on Health Systems and Policies. Retrieved March 24, 2014.

Ontario Home Care Association (OHCA). 2010. "Private Home Care – A Vital Component of the Health Care Continuum in Ontario." Retrieved March 20, 2014.

Patton, M.Q. 2001. Qualitative Research and Evaluation Methods (3rd ed.). Newbury Park, CA: Sage Publications.

Preker, A.S., A. Harding and P. Travis. 2000. "'Make or Buy' Decisions in the Production of Health Care Goods and Services: New Insights from Institutional Economics and Organizational Theory." Bulletin of the World Health Organization 78(6): 779–89. doi: 10.1590/S0042-96862000000600010.

Steele Gray, C. 2014. Accountability in the Home and Community Care Sector in Ontario. Doctoral dissertation, University of Toronto. Retrieved March 20, 2014.

Teddlie, C. and F. Yu. 2007. "Mixed Methods Sampling: A Typology with Examples." Journal of Mixed Methods Research 1(1): 77–100. doi: 10.1177/2345678906292430.

Thomas, P.G. 1998. "The Changing Nature of Accountability." In G. Peters and D.J. Savoie, eds., Taking Stock: Assessing Public Sector Reforms (vol. 2) (pp. 348–93). Montreal: McGill–Queen's University Press.

Williams, A.P., D. Challis, R. Deber, J. Watkins, K. Kuluski, J.M. Lum et al. 2009. "Balancing Institutional and Community-Based Care: Why Some Older Persons Can Age Successfully at Home While Others Require Residential Long-Term Care." Healthcare Quarterly 12(2): 95–105. doi: 10.12927/hcq.2009.3974.


Notes

1. See

2. Codes for the document analysis were derived from the accountability literature. Key domains included the purpose of accountability, responsibilities of each party, policy instruments used and sanctions for non-compliance. Codes for the interview analysis relevant to this paper pertained to unintended consequences of accountability.

