Healthcare Policy 4(3) February 2009: 87-102. doi:10.12927/hcpol.2009.20538
Research Papers

More Than "Using Research": The Real Challenges in Promoting Evidence-Informed Decision-Making

Sarah Bowen, Tannis Erickson, Patricia J. Martens and Susan Crockett

Abstract

Objectives and Methods: Seventeen focus groups and 53 semi-structured individual interviews involving 205 planners and decision-makers were conducted in all 11 Regional Health Authorities (RHAs) in the province of Manitoba, Canada. Objectives were to explore perspectives on the nature and use of "evidence," and barriers to evidence-informed decision-making (EIDM).

Results: In spite of almost universal support in principle for using evidence in decision-making, there was little consensus among participants on what evidence is, what kind of evidence is most appropriate and how "using evidence" can best be demonstrated. Significant skepticism about EIDM was expressed. Issues related to workload, politicized decision-making and organizational factors dominated the discussion of decision-makers. Barriers to EIDM were commonly attributed to factors external to the RHAs.

Conclusion: Effective strategies to promote EIDM must address the multiple barriers experienced by decision-makers in a complex decision-making environment. Rather than simply focusing on issues of access to evidence or development of individual capacity, strategies must focus on changing decision-making processes to support appropriate use of evidence.

This paper summarizes Phase 1 results of From Evidence to Action, a project that explored perspectives of Regional Health Authority (RHA) planners and decision-makers on the nature of "evidence," the use of evidence in decision-making and barriers to evidence-informed decision-making (Bowen and Erickson 2007). From Evidence to Action (funded by the Canadian Institutes of Health Research, 2005-2008) evolved from our earlier CIHR-funded The Need to Know project, which engaged researchers at the Manitoba Centre for Health Policy, the Department of Health and Manitoba RHAs in creating new knowledge of relevance to RHAs, increasing capacity and disseminating and applying research findings. The evaluation component of this project highlighted the importance of not simply involving individuals in capacity building and research activities, but of addressing organizational barriers to research use in RHA planning and decision-making - of moving from evidence to action (Bowen et al. 2005; Bowen and Martens 2006).

There is an emerging literature providing evidence on the optimal management of people and performance in health services organizations (Michie and West 2004). Studies have identified organizational factors - such as employee involvement, creation of a learning culture and institution of good management - that promote better decision-making, as revealed in improved organizational performance (Bradley et al. 2004; Mitton and Patten 2004; Michie and West 2004; Carney 2006). As well, although there is lack of consensus on the concept of organizational culture (Scott et al. 2003), some studies have suggested that the culture of senior management affects health system performance (Gerowitz et al. 1996; Mannion et al. 2005). Mitton and Patten (2004) identified management operations as a factor in managers' ability to apply evidence effectively. Some studies have also explored what types of research are most likely to be utilized by decision-makers; for example, social science research appears to face greater barriers to utilization than natural science research (Hanney et al. 2003). On the other hand, research considered to be part of a larger policy trajectory and linked with broad organizational agendas (such as improving patient safety) may be more likely to be used (Lavis et al. 2002; Rosenheck 2001). However, compared to the large body of research on evidence-based clinical decision-making, there has been little research on evidence-informed management (CHSRF 2004; Lavis et al. 2002; Walshe and Rundall 2001).

Past research has identified both similarities and differences in the barriers to using evidence in clinical versus policy and planning decisions. For example, time and workload, user capacity and evidence availability emerge as key factors in both forms of decision-making. However, there are important differences between clinical and management decision-making in culture, research base and decision-making processes (Walshe and Rundall 2001). In addition, organizations are complex, different kinds of decisions are made at different levels and many types of evidence may be used (Lomas 1990; Lavis et al. 2003; Walshe and Rundall 2001). Because RHAs are responsible for the implementation of policies and allocation of resources within a framework established at the provincial level, they can be seen as making decisions at the administrative policy level as well as at various program planning levels. Decisions may be related to core business transactions, operational management or strategic management (Kovner and Rundall 2006). Decision-making at the RHA board level should focus on strategic management; however, there may be considerable variability among boards in types of decisions made and the extent to which these decisions are informed by senior management.

Another source of complexity is the multiplicity of types of evidence that decision-makers might weigh. It is increasingly recognized that "evidence" in planning and policy decisions must include more than research, and that such factors as resource availability, political context, values, client/community experience, clinical expertise and context-specific evidence such as performance measurement or evaluation activities must also be considered (Baker et al. 2004; CHSRF 2006; Rycroft-Malone et al. 2004). There are important limitations of a strictly rational approach to "evidence-based" decision-making in the complex world of organizational policy and planning decisions (Baker et al. 2004).

Initiatives to increase use of evidence in decision-making have tended to focus on making information more available, accessible and attractive to decision-makers, and more recently, on increasing decision-maker capacity to use research. This approach reflects the assumption that the major barriers to decision-makers' use of evidence are data availability, accessibility and user capacity. However, as the organizational research described above suggests, the situation may be much more complex.

While there has been some research on Canadian RHA decision-makers' and managers' use of evidence in decision-making (CHSRF 2005; Lavis et al. 2005; Mitton and Patten 2004), there has been limited exploration of how these managers view evidence or experience barriers to its use, and the extent to which this research has informed decision-makers' understanding of evidence use. Because the purpose of the From Evidence to Action proposal was to develop strategies for addressing barriers to evidence-informed decision-making faced by decision-makers in RHAs, it was critical to understand these barriers from their perspective.

Methods

Project partners included all 11 Manitoba RHAs as well as researchers with the Manitoba Centre for Health Policy and Department of Community Health Sciences.

Following the official project launch in fall 2005, consultations were held in Manitoba's 11 RHAs. A project coordinator (TE) was hired to undertake the interviews and focus groups. The Need to Know team members were incorporated into the project as "knowledge translation experts" for their region and served as the project's advisory committee. Ethical approval was obtained from the Health Research Ethics Board of the University of Manitoba.

Between November 2005 and April 2006, 17 focus groups and 53 semi-structured individual interviews were conducted with a total of 205 participants. (Table 1 presents the interview/focus group questions that are the focus of this report; other questions focused on perspectives of RHA accomplishments and suggestions for development of an assessment instrument and project evaluation.) Because the intent was to understand how participants perceived evidence and its use, questions were open-ended. The vast majority of participants were senior managers; however, some middle managers and board members were also represented. Focus groups were audiotaped and transcribed; interview notes were taken and transcribed. Both principal investigators (SB, PM) and the project coordinator were involved in the analysis of data. Transcripts were independently analyzed by two researchers (SB, TE), and the themes and emphases were compared. Analyses consisted of both cross-case analysis (comparing responses to specific questions) and open-coding to identify unique themes. Finally, following development of the draft report, one researcher (PM) compared conclusions and themes with the original transcripts.


 

Table 1. Focus group/interview guide
Conceptualization of EIDM
  1. The term evidence-informed decision-making is used a lot these days. What does this term mean to you?

Assessment of Current EIDM Practice
  2. In your opinion, to what extent is EIDM demonstrated in the day-to-day operations of your RHA?
     a. In what ways does your RHA practise evidence-informed decision-making?
     b. If the board/senior management was faced with a decision (e.g., whether or not to institute a certain program or service), what information would be used to assist in decision-making?
  3. What actions has your RHA taken to date to support evidence-based planning throughout the organization?
     a. How does the organizational structure in your RHA facilitate/support evidence-based decision-making? Are there any ways in which the structure hinders EIDM?
     b. What supports are in place to promote EIDM? (Probes: first note what they say, then probe, e.g., access to reports, library resources, Internet access, training opportunities, environment that encourages discussion/debate, etc.)

Barriers to EIDM
  4. What are the barriers to effective decision-making that you have experienced, either in your current role or in previous positions?


 

Findings

Responses by different types of participants

While it was recognized that decision-makers at different levels are responsible for different types of decisions and may use evidence in different ways, no differences were observed in the responses of different types of participants. This finding could be attributable to the general nature of the questions, as well as to the difficulty of categorizing managers given the significant variation in size and complexity of participating RHAs.

Perspectives on evidence and evidence-informed decision-making

In spite of almost universal support in principle for the importance of using evidence in decision-making, there was little consensus among participants on what evidence is, what kind of evidence is most appropriate and how "using evidence" can best be demonstrated. Although there was good recognition of the concept of evidence-based clinical decision-making, evidence-informed decision-making (EIDM) at the organizational (planning/policy) level was poorly understood. It was commonly assumed that only "research" was considered evidence. This assumption, combined with awareness of the limited research available to guide key decisions facing the healthcare system and the need for "context-sensitive" evidence, appeared to contribute to reluctance to fully embrace the concept of EIDM.

Many different sources of evidence commonly used in planning were identified: most often cited were Manitoba Centre for Health Policy (MCHP) reports, information provided by Manitoba Health, and Community Health Assessment reports. However, there was significant variation in perspective regarding the extent to which evidence is currently being used. Most commonly, evidence was defined simply as quantitative data. Many participants appeared unaware that qualitative methods also require systematic evaluation of data, or that such methods are appropriate for exploring many of the questions facing the health system; in fact, many respondents appeared to equate qualitative evidence with anecdotal evidence. This "data-driven" (rather than "evidence-informed") approach was described by some as privileging health areas with established data collection systems (e.g., health services) over others (e.g., community-based or preventive health issues), contributing to a tendency for "new money in the system going to support the status quo" rather than new areas, and to pressure not to ask questions for which there is no "answer," that is, no quantitative data.

Barriers to evidence-informed decision-making

Participants readily identified a number of barriers to using evidence at the practice, program and policy levels. In addition, analysis of consultation data across all 11 regions provided insight into the complexity of these barriers as perceived and experienced by senior RHA decision-makers.

(a) Politics trumps evidence

A theme raised consistently throughout the consultations was the political context of decision-making. While not the most commonly identified barrier, this perception provided the context in which the other barriers were framed. Reactivity to public perception ("government is more concerned with public views than good patient care"; "the minute someone makes a fuss about something there is hesitancy to make a decision") and the impact of the media, professional organizations, unions and special-interest groups were described as creating a political context that worked against an RHA's ability or willingness to practise EIDM.

There was also significant cynicism about "using evidence" and skepticism about whether, at higher levels, evidence was actually used. There was a feeling that decisions were made "at the top" and that using evidence was an expectation but that it could be "gamed."

You really can't get anything unless you have some kind of documentation to support your proposals, so any of our briefing notes and stuff like that are based on a review of situations ... to support it. Mind you, you can probably cheat on this evidence too, because you try to get the evidence that supports you so it could be skewed. So it's always a danger.

I thought it was to use evidence to support decisions - as it turns out a lot of decisions are already made. Now it's about finding evidence to support the decisions that have already been made.

(b) Lack of time and resources

Lack of time and resources emerged as key barriers. Under-resourcing was described as resulting in poor decisions ("what makes sense is too expensive"), an inability to allocate resources to research or evidence-related positions and (perhaps most importantly) workload pressures that were described as actively working against the thoughtful reflection essential for EIDM. This lack of time for researching, weighing and reflecting on evidence emerged as a significantly more important issue than lack of relevant research or research capacity. Further "drilling down" within this theme provided additional insights. There appeared to be a tendency to view EIDM as an "add-on" requiring additional time, rather than as a change in the way business is done. The "crisis-management" culture within healthcare, so often referenced by informants, makes it difficult for decision-makers to prioritize important but non-urgent issues. A minority of respondents, however, did recognize that the issue of time was also an issue of organizational priorities: appropriate resources would be allocated if EIDM were an organizational priority.

An additional need identified by participants was to address the gap between "making a decision" and the "implementation" of a concrete plan, highlighting the challenges in getting a decision translated into effective action:

To develop an action plan is not the issue. To find resources, the time and resources to implement the way it's supposed to be implemented and not just pay lip service on paper, is what I find challenging sometimes.

There is not a good recognition of what it takes to implement a new initiative.

(c) External versus internal barriers

In the vast majority of cases, barriers to EIDM were identified as being external to the organization. However, further analysis indicated that many barriers identified as external in fact have both external aspects (not readily amenable to intervention by an individual RHA) and internal aspects (issues that an individual RHA does have some power to address). For example, lack of time and resources was a barrier for which government was usually blamed, with less attention directed to how RHAs allocate the resources at their disposal.

(d) Leadership, communication and organizational structures

A number of factors related to leadership were identified. Centralized decision-making, lack of appropriate consultation and lack of senior-level support for EIDM were identified as key barriers. A few respondents noted that, unlike managers in many other areas, healthcare managers have often "risen through the ranks" of various disciplines and may not have management training.

Closely related to the issue of leadership is that of communication. A key issue in this category was identified as "lack of clear channels for input." However, broader "communication processes" were also identified.

Getting info filtered down to field staff level; ... they [managers] parcel it out, and by the time it gets down to that person that's actually going to meet that standard or do that thing, it's lost somewhere.

Factors related to organizational structure and process were also identified. Sometimes these were generally worded (e.g., "structural barriers to smooth decision-making [waiting for approval]"); in other cases, specific examples of barriers were given, including:

  • a matrix organizational structure, common to many RHAs;
  • lack of research structure, research, planning or decision-support positions;
  • issues related to RHA boards (e.g., role, models of board functioning, agendas of board members);
  • planning processes, including the relationship of decision-making and financial models;
  • program "silos" and variability among programs.

Many respondents felt they did not have the authority to make decisions, an interesting finding given that the majority of participants were senior managers. Some of this was attributed to incomplete regionalization - devolution of responsibility for health services planning and management to the regions without the accompanying authority to make the decisions that would enable them to do so effectively.

(e) Crisis management, constant change

A number of subthemes related to "organizational factors" were also identified. Overall, the key organizational barrier relates to what many informants referred to as a "crisis management" culture, where people were "too busy dealing with the urgent, can't get to the important." In a crisis management culture, "research," or more broadly, "developing processes for ensuring use of evidence in decision-making," is a lower priority. This culture also was viewed as resulting in constantly changing priorities, consequent fatigue and an environment that did not support EIDM.

A number of respondents (including both staff and management) also referenced the challenge of promoting a culture of evidence, and fear of, or resistance to, change:

[There is an] old mindset thinking from way, way back ... because we've always done it that way.

[There is] nervousness in senior management in the area of research. ... Convincing staff that things need to be evidence-based [is a barrier].

(f) More than workload

Workload and a resulting inability to focus were identified as interacting in important ways. The theme of workload was described as more than simply the amount of work. A critical factor was the fracturing of attention by multiple and competing projects and activities.

People are expected to do 100 things badly versus one or two things well.

I have far too many plates in the air and one of these days they may crash.

There are so many things coming down the pipe sometimes.

In doing research in client service planning, it was very clear that you don't want to overwhelm people, and so you should be at maximum only working on two to three goals, projects, outcomes, whatever at a time. And comments from staff were, why don't we do that?

(g) Technology - too much, too little?

Exploration of issues around information technology identified two major, yet distinct themes. The first related to the lack of IT resources. This included lack of databases or staff to support them and ensure data quality, lack of IT staff in smaller RHAs to provide direct desktop and system support, and lack of computer hardware and software. The other, less anticipated theme related to "too much IT" and its intrusiveness. Modern technology, particularly e-mail and Blackberry technology, was identified as contributing to an additional fracturing of attention, leaving "no time to think." Some felt they spent an inordinate amount of time "keeping up" with e-mail, and that the e-mail culture demanded an instant, rather than thoughtful, response. The common practice of having senior managers always connected (via cellphone and Blackberry), even during meetings where important decisions were being made, was viewed by many as antithetical to EIDM.

(h) Research capacity and data availability

Research capacity and data availability were also recognized barriers, but were not emphasized. Lack of understanding of research, and of the benefits of research and its applicability to the "real work" people were doing, was commonly expressed. Sometimes research-related activities were described as being viewed as "administrative workload." Analysis of issues related to data resulted in identification of four main components:

  1. lack of data (availability and timeliness);
  2. lack of systems and resources for tracking, organizing and retrieving data;
  3. data overload ("we're drowning in paper"); and
  4. lack of access to library resources, or capacity to conduct literature searches.

The effect of RHA size on barriers to EIDM

Little difference was found either in perspectives on evidence or in barriers to evidence-informed decision-making among RHAs of varying size and complexity. We had anticipated that issues facing the Winnipeg Regional Health Authority (WRHA) might be distinct from those facing other regions, as it is home to well over half the province's residents and most of the tertiary and specialized services. Contrary to expectation, however, we found that while there are some important differences between the WRHA and other RHAs, there are more similarities, and that many of the differences relate more to scope and intensity than to substance.

Discussion

While the barriers identified by RHA decision-makers showed some consistency with the published knowledge translation (KT) literature, there were also some important differences. Issues related to workload, politicized decision-making and organizational factors dominated the discussion of decision-makers, whereas data availability and research-related capacity were given relatively less weight. This pattern suggests that while strategies to increase data availability, research relevance and user capacity may be important, they are unlikely to succeed unless the barriers identified as more important - and the interacting nature of many barriers - are addressed. The politicized nature of decision-making was viewed as a pervasive barrier to evidence-informed decision-making: the tone of many responses indicated profound skepticism about the decision-making process. This skepticism suggests a need not only to explore further how and when political judgment may be legitimate in evidence-informed decision-making, but also to examine the strategies needed to make the role of political judgment in decision-making transparent (CHSRF 2004).

While there was strong consensus among decision-makers that various forms of evidence beyond research were important, there was no evidence of awareness of the growing public discussion regarding the value of "evidence-based" thinking in the fields of health policy and management (Grypdonck 2006; Smith et al. 2001; Walshe and Rundall 2001) and recent initiatives such as the CHSRF workshop Weighing Up the Evidence: Making Evidence-Informed Guidance Accurate, Achievable and Acceptable (CHSRF 2006) and related work (Bowen and Zwi 2005).

The lack of awareness of the potential role of program evaluation as a source of evidence was evident throughout this consultation. Because "evaluation research" can combine research rigour with the need of decision-makers for context-sensitive information, more attention should be directed to building capacity for program evaluation.

One finding of concern was the common attribution of most barriers to EIDM to factors external to the RHA. Because there will always be limitations on resource availability in a complex health system, one strategy to promote EIDM is to encourage RHAs to direct attention to those issues they do have the authority to address.

The "crisis management" culture described as pervasive in healthcare was often viewed as "given" by participants. It would perhaps be useful to attempt to disentangle workload (which at the current time individual RHAs may have limited ability to address) and acceptance of a crisis management culture.

Many participants had difficulty applying the concepts of evidence-informed decision-making to their own work, instead focusing on clinical issues. This tendency may arise in part because of the limitations of evidence-based decision-making referenced earlier. Some participants, however, indicated an interest in more evidence on management practices, specifically evidence related to individual and organizational ability to undertake effective decision-making.

The issue of evidence-informed implementation (as opposed to evidence-based decision-making) requires further attention. The actual capacity to carry out a decision effectively was identified as a concern, and has been a neglected area of research to date (Bowen and Zwi 2005).

It is not known to what extent factors unique to the Manitoba environment may have contributed to our findings. The Need to Know project activities, combined with the nine-year history of MCHP-sponsored Rural and Northern Healthcare Days (and the role of these seminars in increasing decision-makers' awareness of resources and building capacity among key individuals), may have contributed to the finding that the need for data and research capacity was not emphasized. The same activities may also have contributed to the common assumption that research meant "numbers," as well as to some of the concern that decision-makers expressed around this issue. Because MCHP (which specializes in secondary analysis of administrative claims data) had sponsored The Need to Know project, the "capacity building" had focused on quantitative methodology, and the collaboratively developed research reports had relied on administrative data (Fransoo et al. 2005; Martens et al. 2003). Because no other comparable health research initiative had been undertaken, there had been less development of capacity in other areas. Similar findings have, however, been reported by other authors (Jack 2006).

It is important to stress that the purpose of this research was to understand barriers from the perspective of decision-makers, not to provide an objective analysis of all evidence on barriers to evidence-informed decision-making. We propose that any strategies to address barriers to EIDM must take into account and respond to these decision-makers' perspectives. An important limitation of this research, however, is its reliance on self-reported data on the extent to which strategies to address barriers to EIDM are being used. The findings may therefore be biased by our informants' perceptions of social desirability, particularly as they reported that EIDM is considered "an expectation."

Conclusions

The "real challenges" to using evidence are structural/contextual/system-level barriers, not simple barriers to research transfer. Findings support the position that knowledge translation is not a single event, but a process (Bowen and Zwi 2005; Lomas 1997) that must include recognition of the varied sources of appropriate evidence, and the complexities of applying research in a specific setting in the face of multiple and interacting barriers. Our results redirect attention from individual decision-making, and use of results from individual research studies, to issues of organizational design - the culture, structure and processes that are needed to support EIDM. Evidence-informed decision-making requires a change in how business is done, and the environment in which this business is conducted: a far more complex undertaking than simply promoting research utilization. While a common strategy to date has been to address data/research accessibility and relevance, or individual capacity to use research (or both), our research suggests that a significant shift in emphasis and orientation is needed.

Decision-makers describe an environment where there is confusion about the nature and appropriate use of evidence - and where they often feel that "using evidence" means simply "using formal research findings and quantitative data" to support their position. While they recognized that evidence is "more than research" and that relevant research is often not available, they did not feel this view was supported. However, our findings also indicate a need for managers to develop (a) skills in weighing various types of evidence, (b) tools that facilitate appropriate use of evidence, (c) strategies for combining various sources of evidence and (d) resources to provide supplementary sources of evidence appropriate to the local context (such as program evaluation). Equally important is the recognition that the "evidence" needed by decision-makers is not limited to health services or clinical research; it also includes evidence related to organizational design and management.

Phase 1 of the From Evidence to Action project has resulted in a redefinition of the research problem from "using research to support decision-making" to "establishing and using processes that facilitate evidence-informed decision-making": a significant shift. Phase 2 is focused on developing and evaluating strategies to address the barriers identified. Rather than developing a tool to assess barriers to EIDM (as identified in the original proposal), project objectives have been refocused on developing a "toolkit" of resources to address barriers as experienced by managers. Examples of strategies can be found in Table 2. Results will be reported in a subsequent publication.

The extent to which healthcare regionalization has provided a potential to promote evidence-informed decision-making (e.g., consolidation of resources that facilitates creation of roles with research or decision-support functions that would not be possible in a single facility), or conversely, created additional challenges (e.g., increasing the number of projects for which an individual is responsible), requires further exploration, as does the issue of the optimal size of regional health authorities to support this work.


 

Table 2. Summary of findings, implications and potential actions
Key finding: Perception that evidence-informed decision-making equates with "using research" (primarily quantitative results)
  Implications for next steps: Develop strategies/tools to promote more comprehensive understanding of the meaning of "evidence" in decision-making; focus on the process of decision-making vs. specific content (research used); reframing of the research question for the research project
  Examples of action taken: "What is Evidence" (one-page tool developed to address these perceptions directly) developed and circulated through participating RHAs

Key finding: Skepticism because "politics trumps evidence"
  Implications for next steps: Develop strategies to frame political judgment as a recognized form of evidence in decision-making, while promoting transparency on how various forms of evidence are used in decision-making
  Examples of action taken: See above

Key finding: Lack of time and resources a major barrier
  Implications for next steps: Develop strategies (e.g., redefine roles) to allow "protected time"; develop strategies to integrate evidence into existing processes vs. viewing it as an "add-on"
  Examples of action taken: In one RHA, revising resource allocation processes to promote evidence use; developing tools to aid in this process

Key finding: Focus on "external" barriers - issues that individual RHAs cannot address alone
  Implications for next steps: Develop tools to differentiate between internal and external barriers, and encourage RHAs to focus on barriers they can affect
  Examples of action taken: Internal/External Barriers framework presented at Rural and Northern Health Care Day

Key finding: Issues related to leadership, communication and organizational structure
  Implications for next steps: Increase awareness of the importance of these factors
  Examples of action taken: Presentation of Phase 1 report at senior management tables

Key finding: Culture of crisis management, constant change
  Implications for next steps: Promote questioning of the inevitability of a crisis management approach; disentangle workload from acceptance of a crisis management culture
  Examples of action taken: As above

Key finding: More than workload - fractured attention
  Implications for next steps: Provide protected "space" for reflective decision-making
  Examples of action taken: As above

Key finding: Technology - too much, too little
  Implications for next steps: Ensure that strategies both to (a) improve IT support and (b) minimize potential disruptive effects of communication technology are promoted
  Examples of action taken: One RHA instituted a "no cellphone/no Blackberry" rule at senior management meetings

Key finding: Research capacity and data availability viewed as less important barriers to evidence-informed decisions
  Implications for next steps: Strategies to increase use of evidence should focus on barriers viewed as more important by RHA planners and decision-makers
  Examples of action taken: Library access identified as a key issue: trial membership with university library instituted; need for skills in weighing evidence identified: guide developed

Key finding: Few differences in identified barriers related to RHA size, complexity
  Implications for next steps: Further research required to explore transferability of findings


 


Résumé

Objectives and Methods: Seventeen focus groups and 53 semi-structured individual interviews were conducted with 205 planners and decision-makers in all 11 Regional Health Authorities of Manitoba, Canada. The objective was to examine perspectives on the nature and use of "evidence," as well as barriers to evidence-informed decision-making.

Results: Despite almost unanimous support in principle for the use of evidence in decision-making, there was little consensus among participants on what constitutes "evidence," what type of evidence is most appropriate and how "using evidence" can best be demonstrated. Substantial skepticism was expressed about the concept of evidence-informed decision-making. Discussions with decision-makers focused mainly on workload, the politicization of decision-making and organizational factors. Barriers to evidence-informed decision-making were mostly attributed to factors external to the Regional Health Authorities.

Conclusion: Effective strategies to promote evidence-informed decision-making must take into account the multiple barriers faced by decision-makers in a complex decision-making environment. Rather than focusing simply on questions of data, access to research or capacity building, these strategies must aim to change decision-making processes in order to support appropriate use of evidence.

About the Author(s)

Sarah Bowen, PhD
School of Public Health
University of Alberta
Edmonton, AB

Tannis Erickson, BComm
Project Coordinator
Interlake Regional Health Authority
Eriksdale, MB

Patricia J. Martens, PhD
Manitoba Centre for Health Policy
Department of Community Health Sciences
University of Manitoba
Winnipeg, MB

Susan Crockett, PRS
Executive Director, Planning, Research and Development
Nor-Man Regional Health Authority
Flin Flon, MB

Correspondence may be directed to: Sarah Bowen, PhD, Department of Public Health Sciences, 13-103 Clinical Sciences Building, University of Alberta, Edmonton, AB T6G 2G3; e-mail: sbowen@ualberta.ca.

Acknowledgment

Ms. Crockett represents The Need to Know team, active participants in this research.

References

Baker, G.R., L. Ginsburg and A. Langley. 2004. "An Organizational Science Perspective on Information, Knowledge, Evidence and Organizational Decision-Making." In L. Lemieux-Charles and F. Champagne, eds., Using Knowledge and Evidence in Health Care: Multidisciplinary Perspectives. Toronto: University of Toronto Press.

Bowen, S. and T. Erickson. 2007. From Evidence to Action: Review of Phase 1 Activities. Winnipeg: Manitoba Centre for Health Policy. Retrieved January 12, 2009. < www.rha.cpe.umanitoba.ca/E2A/Phase_1_full_report_PS_edit_50707.pdf > .

Bowen, S. and P.J. Martens. 2006. "A Model for Collaborative Evaluation of University-Community Partnerships." Journal of Epidemiology and Community Health 60: 902-7.

Bowen, S., P.J. Martens and The Need to Know Team. 2005. "Demystifying Knowledge Translation: Learning from the Community." Journal of Health Services Research and Policy 10(4): 203-11.

Bowen, S. and A.B. Zwi. 2005. "Pathways to 'Evidence-Informed' Policy and Practice: A Framework for Action." PLoS Medicine 2: e166.

Bradley, E.H., T.R. Webster, D. Baker, M. Schlesinger, S.K. Inouye, M.C. Barth, K.L. Lapane, D. Lipson, R. Stone and M.J. Koren. 2004 (July). "Translating Research into Practice: Speeding the Adoption of Innovative Health Care Programs." Issues Brief. The Commonwealth Fund. Retrieved December 16, 2008. < www.commonwealthfund.org/usr_doc/Bradley_translating_research_724_ib.pdf?section=4039 > .

Canadian Health Services Research Foundation (CHSRF). 2004. "What Counts? Interpreting Evidence-Based Decision Making for Management and Policy." Retrieved December 16, 2008. < www.chsrf.ca/knowledge_transfer/pdf/2004_workshop_report_e.pdf > .

Canadian Health Services Research Foundation (CHSRF). 2005. Conceptualizing and Combining Evidence for Health System Guidance. Retrieved January 12, 2009. < www.chsrf.ca/other_documents/pdf/evidence_e.pdf > .

Canadian Health Services Research Foundation (CHSRF). 2006. "Weighing Up the Evidence: Making Evidence-Informed Guidance Accurate, Achievable and Acceptable." Retrieved December 16, 2008. < www.chsrf.ca/other_documents/pdf/weighing_up_the_evidence_e.pdf > .

Carney, M. 2006. "Understanding Organizational Culture: The Key to Successful Middle Manager Strategic Involvement in Health Care Delivery?" Journal of Nursing Management 14: 23-33.

Fransoo, R., P. Martens, The Need to Know Team, E. Burland, H. Prior, C. Burchill, D. Chateau and R. Walld. 2005. Sex Differences in Health Status, Health Care Use, and Quality of Care: A Population-Based Analysis for Manitoba's Regional Health Authorities. Manitoba Centre for Health Policy. Retrieved December 16, 2008. < http://mchp-appserv.cpe.umanitoba.ca/reference/sexdiff.pdf > .

Gerowitz, M.B., L. Lemieux-Charles, C. Heginbothan and B. Johnson. 1996. "Top Management Culture and Performance in Canadian, UK and US Hospitals." Health Services Management Research 9: 69-78.

Grypdonck, M.H. 2006. "Qualitative Health Research in the Era of Evidence-Based Practice." Qualitative Health Research 16: 1371-85.

Hanney, S., M. Gonzalez-Block, M. Buxton and M. Kogan. 2003. "The Utilization of Health Research in Policy-Making: Concepts, Examples and Methods of Assessment." Health Research Policy and Systems 1(1): 2.

Jack, S.M. 2006. "Utility of Qualitative Research Findings in Evidence-Based Public Health Practice." Public Health Nursing 23: 277-83.

Kovner, A.R. and T.G. Rundall. 2006. "Evidence-Based Management Reconsidered." Frontiers of Health Services Management 22(3): 3-22.

Lavis, J.N., D. Robertson, J.M. Woodside, C.B. McLeod and J. Abelson. 2003. "How Can Research Organizations More Effectively Transfer Research Knowledge to Decision-Makers?" Milbank Quarterly 81(2): 221-48.

Lavis, J., H. Davies, A. Oxman, J.L. Denis, K. Golden-Biddle and E. Ferlie. 2005. "Towards Systematic Reviews that Inform Health Care Management and Policy-Making." Journal of Health Services Research and Policy 10(Suppl. 1): 35-48.

Lavis, J.N., S.E. Ross, J.E. Hurley, J.M. Hohenadel, G.L. Stoddart, C.A. Woodward and J. Abelson. 2002. "Examining the Role of Health Services Research in Public Policymaking." Milbank Quarterly 80(1): 125-54.

Lomas, J. 1990. "Finding Audiences, Changing Beliefs: The Structure of Research Use in Canadian Health Policy." Journal of Health Politics, Policy and Law 15(3): 525-42.

Lomas, J. 1997. Improving Research Dissemination and Uptake in the Health Sector: Beyond the Sound of One Hand Clapping. Hamilton, ON: McMaster University Centre for Health Economics and Policy Analysis.

Mannion, R., H. Davies and M. Marshall. 2005. Cultures for Performance in Health Care. Maidenhead, Berkshire: Open University Press.

Martens, P.J., R. Fransoo, The Need to Know Team, E. Burland, L. Jebamani, C. Burchill, C. Black, N. Dik, L. MacWilliam, S. Derksen, R. Walld, C. Steinbach and M. Dahl. 2003. The Manitoba RHA Indicators Atlas: Population-Based Comparisons of Health and Health Care Use. Manitoba Centre for Health Policy. Retrieved December 16, 2008. < http://mchp-appserv.cpe.umanitoba.ca/reference/rha2.pdf > .

Michie, S. and M.A. West. 2004. "Managing People and Performance: An Evidence Based Framework Applied to Health Service Organizations." International Journal of Management Reviews 5/6: 91-111.

Mitton, C. and S. Patten. 2004. "Evidence-Based Priority-Setting: What Do the Decision-Makers Think?" Journal of Health Services Research and Policy 9: 146-52.

Rosenheck, R.A. 2001. "Organizational Process: A Missing Link between Research and Practice." Psychiatric Services 52: 1607-12.

Rycroft-Malone, J., K. Seers, A. Titchen, G. Harvey, A. Kitson and B. McCormack. 2004. "What Counts as Evidence in Evidence-Based Practice." Journal of Advanced Nursing 47(1): 81-90.

Scott, T., R. Mannion, H.T. Davies and M.N. Marshall. 2003. "Implementing Culture Change in Health Care: Theory and Practice." International Journal for Quality in Health Care 15: 111-18.

Smith, G.D., S. Ebrahim and S. Frankel. 2001. "How Policy Informs the Evidence: 'Evidence-Based' Thinking Can Lead to Debased Policy Making." British Medical Journal 322: 184-85.

Walshe, K. and T.G. Rundall. 2001. "Evidence-Based Management: From Theory to Practice in Health Care." Milbank Quarterly 79: 429-57, IV-V.
