The Nature of Practice in Evidence-Based Practice

Jane F. Gilgun

University of Minnesota, Twin Cities, USA

Running Head: Evidence-Based Practice

Jane F. Gilgun, Ph.D., LICSW, is a professor, School of Social Work,

University of Minnesota, Twin Cities, USA. jgilgun@umn.edu. This is a

paper to be presented at the Theory Construction and Research

Methodology Pre-Conference Workshop, National Council on Family

Relations, Minneapolis, Minnesota, November 3, 2010.



ABSTRACT

Best research evidence is a cornerstone of evidence-based practice

(EBP). Few dispute the importance of research to applied fields such as

family policy and practice. Restricted understandings of the nature of

practice in EBP, however, are having damaging effects on individuals

and families in the United States, on what counts as credible research,

on which programs and policies are funded, and on whether insurance

companies pay for services. The purpose of this paper is to review

assumptions about the nature of practice in EBP and to show how

symbolic interaction theory, pragmatism, reflective practice, and the

common factors model of practice can contribute to a fuller model of practice. With a fuller model of practice in mind, researchers may increasingly produce research that is compatible with practice.

Practitioners, in turn, may increase their use of research in practice,

which is the point of EBP.



The Nature of Practice in Evidence-Based Practice

Evidence-based practice (EBP) is a movement whose purpose is to encourage the inclusion of research in practice in order to improve practice and to discourage interventions that do not work or

that have damaging effects. Few dispute the potential of research to

contribute to effective practice within fields such as public policy,

family therapy, family and children’s health and mental health, child

welfare, family education, and prevention studies. Many aspects of

evidence-based practice (EBP), however, are characterized by restricted

understandings of practice that are having damaging effects on

individuals and families in the United States and on what counts as

credible research.

In this paper, I broaden conceptualizations of practice in EBP by

including symbolic interaction theory, pragmatism, reflective practice,

and the common factors model. Except for the common factors model,

these are a theory, a philosophy of science, and reflections on practice

that are foundational to or have a place in the human sciences. Briefly,

the human sciences are concerned with the meanings that persons

attribute to the events in their lives and are premised on the following:

human beings are best understood within the situations in which their

actions take place, beliefs are actions and actions show beliefs, human

experience is a proper focus of intellectual inquiry, experience is not

only personal but also representative of culture-wide themes and

practices, understanding arises through researcher immersion in the

situations of interest, and research serves emancipatory purposes;

that is, improves human lives (Gilgun, in press).

I apply perspectives from the human sciences to begin to

articulate a model of practice that incorporates how practitioners

experience practice. This more richly conceptualized model feeds into

a critique of EBP and its underdeveloped understanding of practice. An

assumption underlying this augmented model of practice is that the kinds of research available to practitioners must fit the complexities of practice.

When and if models of EBP incorporate an accurate picture of practice, then integration of research with practice may occur to a greater degree than it does today. This can only happen by identifying,

acknowledging, and respecting the complexities of practice.

A second assumption of this paper is that the kinds of research

that restricted models of EBP offer, which can be thought of as

rational-technical research, have a place in an augmented model of

practice when experienced practitioners have the capacities to adapt

rational-technical research findings to the complex contingencies of

practice.

Restricted Understandings of EBP

Restricted understandings of EBP and of practice itself provide


guidelines for which programs and policies are created and funded and for which medical and social services insurance companies pay (Tannenbaum, 2005). Evidence-based practice can appear to be what Tannenbaum has called a “knowledge regime” (p. 163) that not only

affects funding and dissemination of findings in peer-reviewed journals,

but that also privileges some epistemological perspectives over others.

The epistemology of restricted definitions is of concern to many

(Aisenberg, 2008; Yunong & Fengzhi, 2009). This epistemology is

rational and technical and does not account for the complex

phenomenological aspects of practice. Tannenbaum noted that some

proponents of EBP claim to advocate for the replacement of practice

based on authority with practice based on best research evidence, but

in the process they are asserting authority themselves.

For practitioners, EBP has a top-down feel, where academic

researchers tell practitioners that they must use research in their

practice (Aisenberg, 2008; Pollio, 2006). Indeed, practitioners have

responded to calls for EBP with what Zayas, Gonzalez, & Hanson

(2003) have termed “lower than anticipated enthusiasm” (p. 63). There

is ample documentation to show that a lag exists between the development of new knowledge and practitioner uptake across a range of disciplines (Jensen & Foster, 2010; Tannenbaum, 2005). This lag could be as long

as 17 years in medical practice (Institute of Medicine, 2001). Epstein

(1995) saw a crisis of relevance of research to practice that leads to



barriers to integration of research into practice. Despite descriptions of

the complexities of practice (Schön, 1983) and strategies for bridging

the gap (Hess & Mullen, 1995), the gap remains (Henderson et al,

2006; Jensen & Foster, 2010; McNeill, 2006).

Scholars have identified additional difficulties within current

models of EBP. Aisenberg (2008), for example, noted issues that arise

from restricted understandings of practice, including epistemological

narrowness in assumptions that exclude the norms and values of

ethnic minority communities, exclusion of ethnic minority people in

sufficient numbers in research, and the assumption that studies

based on majority populations will fit ethnic minority populations. The

latter, Aisenberg said, represents an over-stepping of evidence and

science. He concluded that restricted understandings of EBP are a

continuation of the domination of indigenous communities by Western

cultures. Other practitioners, too, are concerned with the fit of research

with the worldviews of clients. Blow, Sprenkle, and Davis (2007), writing in the context of marital and family therapy, pointed out the importance of fit with the worldviews of service users, in addition to fit with the worldviews of professionals, when deciding upon a model of practice.

The worldview of service users is beyond the scope of the present

paper, but it is a topic of importance in EBP.

Some wonder why researchers do not make more of an effort to

understand practitioner perspectives. Yunong & Fengzhi (2009), for



example, pointed out that when consumers do not buy products from

businesses, the businesses seek to make their products more

attractive. They do not blame consumers for not buying what they sell.

While these concerns and others continue to be expressed,

insurance companies, governmental and foundation funders, and

administrators are requiring the use of EB practices, despite ample

evidence that, besides lack of practitioner uptake in many practice

domains, EB practices required for specific conditions do not fit every

service user with that condition because service users often have other

conditions that may be unique to them (Blow et al, 2007). In addition,

some practices are effective, according to practitioner and service user judgment, with certain service users, but they have not undergone the randomized controlled trials required for the label evidence-based; in other cases, there are few if any evidence-based practices for particular conditions. Lack of evidence and lack of relevant evidence

continue to dog practitioners in a variety of fields (Littell, 2008;

Simons, Shepherd, & Munn, 2008; Jensen & Foster, 2010).

The drive behind applications of restricted definitions of practice

in EBP may serve the interests of insurance companies who want to

fund the most efficient interventions in order to maximize profits.

Concern for service users is secondary at best. This drive also

appears to dovetail with an agenda of conservatives who want to



“starve the beast,” that is, to shrink government services (Gilgun,

2010; Krugman, 2010). If so, researchers who advocate for restricted

understandings and applications of EBP are playing into the hands of

insurance companies and conservative politicians and pressure groups.

If the drive behind restricted models is political, then political efforts, in addition to papers such as the present one, are necessary.

Despite these multiple issues—and undoubtedly, there are many

others—research can add important components to practice. Factors

that interfere with the adoption of research into practice must be

identified and dealt with in order to close the research-practice gap.

Jensen & Foster (2010) made such an effort. They discussed many

factors that must be addressed, including

difficulties customizing the intervention to real-world practice

settings, intervention specificity to a particular setting,

differences between intervention demands of a pharmacologic

versus a psychotherapeutic intervention...as well as problems

with studies’ research designs (e.g., study populations not

relevant or representative….) (p. 112).

The present paper focuses on the nature of “real-world practice

settings,” a discussion to which symbolic interactionism, pragmatism,

reflective practice, and the common factors model can contribute.

McNeill (2006) suggested that tensions between restricted

understandings of EBP and the nature of practice can be reframed in



terms of tensions between the epistemologies of the natural sciences,

on which much thinking on EBP rests, and the human sciences, which provide an epistemology of human experience, including the

experience of practice. McNeill’s perspectives dovetail with mine in the

present paper. Instead of McNeill’s term “natural sciences,” I use the

term “rational-technical” perspectives or models and use the term

“human sciences,” as does McNeill.

Four Broad Understandings of Evidence-Based Practice

Scholars define evidence-based practice in many ways (Gambrill,

2006; Satterfield et al, 2009). I have developed a four-part typology

that summarizes these definitions: the four-cornerstone, the three-

cornerstone, the two-cornerstone, and the one-cornerstone definitions.

The four cornerstones are research and theory, service user situations

and preferences, practitioner expertise/practice wisdom, and the

personal and professional values and personal experiences of

practitioners that include reflective practice (Gilgun, 2005b). I coined

the term “four cornerstones of evidence-based practice” after a review

of the discussions present in medical journals. Physicians originated

EBP, calling it evidence-based medicine (EBM).

The four-cornerstone definition rests on a model of practice based on the human sciences, but that model requires further elaboration. It

assumes that practitioners are active, reflective agents who continually

evaluate and test their assumptions and decisions throughout the



course of their practice. When warranted by service user responses

and other contingencies, they modify their actions to fit their

interpretations of what is going on. Within this understanding, EBP

becomes a process with the four cornerstones as interactive

components. These interactions occur within a complex set of influences that compose the ecologies of practitioners and service users.

The processes of practice take place in particular situations at

particular times with particular persons within a complex set of

contingencies that affect practitioners, service users, their

interpretations and meanings, and how they engage with each other.

This view of evidence-based practice fits well within symbolic

interaction theory (Blumer, 1986), pragmatism (Menand, 1997),

reflective practice (Schön, 1983), and the common factors model of

social services (Blow et al, 2007; Cameron & Keenan, 2010; Drisko,

2004; Lambert, 1992). The human sciences draw upon pragmatism

and symbolic interaction theory (Gilgun, in press), and reflective practice fits well with these perspectives. The common factors model has elements of rational-technical assumptions, but its complex understandings of practice indicate an appreciation of human sciences perspectives.

The three-cornerstone definition excludes the fourth cornerstone

of the above definition and uses different language, such as “best



research evidence,” “clinical expertise,” and “client” or “patient.” The

American Psychological Association’s statement on EBP (2006) has a

three-cornerstone definition, which is the following: “the integration of

the best available research with clinical expertise in the context of

patient characteristics, culture, and preferences” (p. 273). Within their

definition of clinical expertise is acknowledgment of reflective practice

and the significance of personal experience, although the statement

does not discuss the roles and importance of practitioner values. In the

framework I developed, the latter is part of a separate, fourth

cornerstone. Epstein (1999), a physician who writes in the medical

literature, also advocated for a fourth cornerstone that he called

mindful practice, which is similar to reflective practice.

Most academic researchers espouse the three-cornerstone

definition, and it is, therefore, common in scholarly discussions. The

three cornerstones are best research evidence; client preferences, values, and wants; and practitioner expertise (APA, 2006; Gibbs, 2003;

Straus et al, 2005; Satterfield et al, 2009). Like the APA statement,

many who hold to the three-cornerstone definition nonetheless

recognize reflective practice, but I have yet to see any definitions that

acknowledge that personal values influence practitioner actions. The

APA statement is one of the few instances I found that acknowledges that

practitioners weave personal experiences into their work.

The two-cornerstone definition includes best research evidence



and clinical expertise, while the one-cornerstone definition

encompasses research evidence only. The one-cornerstone definition

states that EBP is composed of the judicious use of best research

evidence (Gambrill, 2001, 2006; Sackett et al, 2000). The two-

cornerstone definition mentions “clinical expertise” but does not

elaborate. The one- and two-cornerstone definitions appear to be the

most common in contemporary public policy that influences funding,

program development, and training.

The one- and two-cornerstone definitions--and sometimes

proponents of the three-cornerstone definition--take rational-technical

perspectives, where practitioners are recipients of knowledge, and

their jobs are to carry out the mandates of the agencies in which they

work. These mandates also come from funders such as governmental bodies, foundations, and insurance companies that subsidize services

and pay for research. Rational-technical models view practice as a

simple exchange between service users and practitioners without

recognition that practice is messy, ambiguous, contextual,

constructed, complicated, and negotiated (Gilgun, 2005b, 2010;

Parton, 2008; Ruch, 2005; Schön, 1983).

The “top-down” sense that practitioners have identified in

restricted models is reflected in language. For instance, clinicians

deliver services to clients or to patients, just as researchers deliver

knowledge to practitioners. The language of “clinicians,” “patients,”



and “clients” implies a hierarchical arrangement where practitioners

are powerful and knowledgeable and clients are subordinated and at

the receiving end of services (Mantzoukas, 2008). This view overlooks

the significance of relationships and motivations for successful

outcomes that include service user willingness to engage with

practitioners, service user trust of practitioners, practitioner

trustworthiness, and myriad other issues related to autonomy, respect,

and freedom of choice.

Further Observations on Restricted Models

While much has been written about EBP, all of the versions

require further thought and observations of practice itself, such as

implementation processes and how and where to attribute change and

lack of change. Perspectives of service providers and service users

would carry some weight in such observations, but other factors are

significant as well. Some scholars who hold to the three-cornerstone

definition are working on taking into account the multiple contextual

issues that shape how and whether research evidence finds its way

into practice. For instance, the APA statement on evidence-based

practice (2006), Gambrill (2006), Jensen and Foster (2010), and Henderson, MacKay, and Peterson-Badali (2006) go into detail about

these factors that include access to research, training in the use of

particular findings, who makes decisions about what research is

available to practitioners and which is not, and how use of particular



kinds of research is rewarded or not. Henderson et al cited

dissemination issues as factors, as well, such as the technicality and

density of research reports and conference papers, the training and

control necessary to implement some forms of validated treatments,

and, in general, lack of resources to fund the efforts necessary to

disseminate and to foster uptake of findings. Surprisingly absent in

these discussions of contextual issues is the possibility that fit between

research findings and practitioner and service user experiences of

practice might also be an issue.

These more restricted understandings of EBP rarely address the

necessity of accurate assessments or of understanding service users’

and service providers’ experience of interventions. Best research

evidence is addressed as evidence for intervention only, or techniques

that practitioners use to bring about client change.

Proponents of the one- and two-cornerstone definitions, and many proponents of the three-cornerstone definition, emphasize intervention over assessment. These versions of EBP have developed a hierarchy of research methods thought to

yield the most valued research evidence. Randomized controlled trials

(RCTs) are the “gold standard” (Straus, Richardson, Glasziou, & Haynes, 2005). At the bottom are qualitative studies, which, proponents say, give little information about the effects of interventions on large

samples of randomized groups, which, of course, is correct. Few

quarrel about the usefulness of RCTs for testing effectiveness on large



samples and for quantifying these effects, but information on

technique whose worth resides in effect sizes is insufficient for

practice.

Little or no attention is given to understanding service users and

service user situations in all their multiple dimensions, perspectives

that practitioners pay attention to as they work with service users

(Gilgun, 2005a). Thus, research evidence for assessment—whose root

meaning is “sit beside”—goes largely unaddressed. Such research is

descriptive, which has a denigrated status in EBP, because of the

overestimation of the importance of intervention and the

underconceptualization of the importance of practitioners’

assessments of service users’ situations, or understanding the multiple

influences on service users’ situations and functioning.

In addition, RCTs cannot answer important questions about

interventions, such as what works for whom under what conditions.

These questions require observations of interactions between service

users and service providers, observations of service users as they live

their everyday lives, and interviews with service users and service

providers. This kind of evidence will inform practice in ways that

rational-technical evidence cannot.

Tutorials for Dissemination

Tutorials disseminated on the internet epitomize common

conceptualizations of EBP, conceptualizations that pay little attention



to the interactive nature of service provision and how much service

user issues and contextual factors contribute to practice processes and

outcomes. These tutorials also do not acknowledge that practitioners make many decisions on the spot and may not have the time to follow the rather leisurely steps linked to doing EBP. The official EBP

website (http://www.cebm.utoronto.ca/) at the Centre for Evidence-

Based Medicine at the University of Toronto (Straus, Richardson,

Glasziou, & Haynes, 2005) has one such tutorial that many other

websites have replicated. (See a tutorial at the Bio-Med Library at the

University of Minnesota, Twin Cities, as one example of many:

www.biomed.lib.umn.edu/learn/ebp/). The definition of EBP at the

official website is “the integration of best research evidence with

clinical expertise and patient values.” The tutorial then describes how

to do EBP, beginning with formulating a researchable

question, locating articles and reports, critically appraising them,

integrating them with clinical expertise and patient biology, values,

and circumstances, and evaluating the effectiveness and efficacy of

the application.

This sounds simple, and in some cases, the process is. These

steps require the time to pursue this information, the training to

interpret it, and the skills to apply it. Practitioners in social services

rarely have time to do this kind of search and they typically do not

have access to scholarly journals. Interventions for the human services



may take considerable training to apply well. Training funds are not

readily available in most social service settings.

Even if practitioners had these resources, the guidelines for EBP

do not help practitioners with questions that Zayas, Gonzalez, and Hanson (2003) identified, such as “What do I do now?” The rational and technical models that EB practices assume do not match the streams of

interactions and interpretations practitioners and service users

experience in their day-to-day work together. Certainly, making an

accurate diagnosis or assessment requires the time and effort involved

in the identification of best research evidence, so as to formulate broad

treatment goals and strategies. Service users have the right to the

most up-to-date information on their conditions.

The actual implementation, however, happens on a micro,

experiential, and interactive level, where a multitude of macro-level

influences impinge on every micro-action and interpretation. Pollio

(2006), a proponent of EBP and a practitioner (Howard, McMillan,

Pollio, 2003; Pollio, 2002), illustrated this point when he reported he

had to answer “It depends” (p. 225) when his students asked him how

EBP applies in particular situations. Furthermore, practitioners of all

sorts revise their diagnoses, assessments, and actions (interventions)

when new information arises during the course of interactions with

service users. Jacqueline Campbell, who spends considerable effort working to integrate her research on woman abuse into medical and



nursing practice, noted that practitioners “know how to solve problems

on the fly” and “are looking for short-cuts based on research”

(Campbell, personal communication, September 20, 2010). What

models of practice and models of evidence-based practice fit these

observations?

A More Fully Conceptualized Model

A more fully conceptualized model of practice will result in

increased practitioner use of research. The principle on which I base

this conviction is “start where practitioners are,” which, in the present discussion, means that if researchers are to influence practitioner actions,

they have to understand practitioners’ experiences of practice, the

processes in which practitioners engage during interactions with

service users, and what these experiences mean to them.

Symbolic Interaction Theory

Symbolic interaction theory (SI) provides language, concepts,

and principles that can enlighten processes of practice. SI focuses on

the meanings and interpretations that individuals make of their

interactions with others in particular situations, at particular times, and in particular places (Blumer, 1986). SI assumes that individuals are

embedded in multiple, interacting contexts and events. They interpret

events through attribution of meanings. On the basis of their

interpretations and meanings, individuals act. As Thomas & Thomas

(1928) said, “If men [sic] define situations as real, they are real in their

consequences” (p. 572).

Pragmatism

Pragmatism and SI share many ideas (Bulmer, 1984). In

addition, pragmatism has a moral dimension based upon concern for

individual and common good (Dewey, 1958; Menand, 1997; West,

1989), a principle that fits with the emancipatory perspectives of most

human services disciplines. It is not surprising that SI and pragmatism

are related. John Dewey, an originator of pragmatism, was a colleague

of some of the originators of SI, including George Herbert Mead and W.

I. Thomas at the University of Chicago. In addition, Dewey had a close relationship with Mead at the University of Michigan before they both came to Chicago. Robert Park, an early interactionist, was a student at Harvard of the pragmatist philosopher William James (Bulmer, 1984).

A central principle of pragmatism is the notion that actions--

which are inseparable from beliefs--are to be judged on their

consequences for the common good. Human actions often are

responses to problematic situations. Individuals must act in order to

solve problems. To know whether actions are effective, pragmatic

actors observe the consequences of their actions. They learn what

works and what does not, develop new beliefs, and modify their

actions in order to solve problems more effectively (Menand, 1997;

Rorty, 1982a, 1999). Rorty (1982b), a pragmatist philosopher, saw



this series of actions as part of Dewey’s notion of experimentalism,

where “knowledge-claims” are “proposals about what actions to try

next” (p. 204).

Oliver Wendell Holmes articulated these principles in his

interpretations of judicial and practical decision-making. He observed

that individuals make decisions and then reflect upon the principles

behind the decisions (Menand, 1997). He believed that the basis of decisions is experience—not so much personal experience or

individual life histories, but the experiences of the culture in which one

lives. This is a more general understanding of experience that includes

the beliefs and assumptions of the collective experience that is the

equivalent of culture. Individuals often are unaware of the various

ways that cultural beliefs and practices influence their thinking and

actions. Although Holmes did not consider himself a pragmatist,

Menand said that his disciples considered him one. Years before he

became an attorney and then a Supreme Court justice, he was part of a philosophical discussion group at Harvard that included the originators of pragmatism, who were Charles Sanders Peirce and William James.

Pragmatism also emphasizes the precariousness and instability

of experience and the difficulties of understanding it (Dewey, 1958).

Understanding human experience can be so much work that some

philosophers have abandoned the attempt and substituted a “theoretical security and certainty” (p. xi, emphasis in original). The



result, according to Dewey, is the crafting of laws of nature, universals,

and systems that show unity among entities. Individuals who crave and

create certainty back away from pluralism, change, and particulars,

which are hallmarks of pragmatism.

Commentary. I do not wholly agree with Dewey. On the one

hand, understanding experience and then communicating these

understandings are difficult in my personal experience. On the other

hand, some features of experience can be constructed as mechanistic and invariant: if you tell children they are worthless, they will be hurt; if you throw a stone at a person’s head, you risk causing serious injury. I believe that we need both mechanistic universals and recognition of the unstable, complex particulars that compose experience.

Applications to Practice

These ideas have applications to practice. Practice happens

because individuals are caught in problematic situations. Professionals

provide service because they want to contribute to individual and

common good. Values, such as do no harm, underlie many practice

disciplines. As another example, social justice is a value in social work

that many believe undergirds social work practice in its many forms

(IFSW, 2000; Code of Ethics, 2008). Practice is particular, taking place

with particular people at particular times in particular situations.

Professionals continually hypothesize about how to respond to

service user issues, perform actions meant to be responsive, observe

consequences, and then either continue these actions if consequences

are favorable, or revise them if not. They call upon stores of knowledge

they have developed over years of practice and of living their own

lives. Professionals know, as does anyone who reflects upon the events

and interactions in their everyday lives, that experience is

complicated, difficult to understand, indeterminate, and pluralistic.

Professionals cannot reduce these complexities to simple,

straightforward modes of thinking and acting. If they did, they would overlook significant aspects of service users’ lives and be closer to robots than to thinking, feeling, acting human beings who continually

attempt to make meanings, communicate meanings, and to respond to

situations in ways that contribute to enhancements of service users’



lives and to the common good.

Human services practice, then, is an application of this general

model of interaction and meaning-making that is at the center of SI

and pragmatism. Schön (1983), who wrote about reflective

professional practice, appears to work within this view of practice and within the SI-pragmatist tradition. He pointed out that practitioners work with unique cases that they understand through interactions with the persons and settings involved, by drawing out the various dimensions and meanings that impinge on practice issues, by listening and noticing carefully and holding back on interpretations until there is sufficient evidence for them, by using a storehouse of knowledge based on previous practice and training when they make interpretations, by acting on the basis of these interpretations, and by being willing to revise interpretations as new information arises.

Practitioners also must “set the problem,” that is, conduct a thorough assessment or diagnostic work-up that shapes the kinds of

interventions and other actions that practitioners take. Assessments

and diagnoses are pivotal because the framing of issues guides

practitioners’ evaluations of the effects of their interventions and the

adjustments they make in their actions and in the frameworks

themselves when they perform actions. In other words, assessments

become guidelines for the evaluation of whether interventions are

effective. Schön (1983) joined others who are concerned that some

practitioners move into intervention without understanding the issues.

He also is clear that some practitioners make interpretations that are

not based on the available information about particular situations but

are based upon practitioner expectations. This observation is one of

the major justifications for EBP, with which I concur. Schön views the

joining of the particularities of practice with general knowledge to be a

skill that takes years to develop. He said repeatedly that particular

situations are messy and characterized by uncertainty, contingency,

and murkiness.

Practitioners notice the implications and consequences of their

interpretations and actions. They continually adjust their interpretations

and actions in response. In other words, they continually reframe the

problem set. For Schön (1983), individual interpretations and actions

are “local experiments nested within larger ones” (p. 131).

Practitioners’ sense of which activities “fit” a situation is based upon their framing of the situation. They seek to solve whatever problems they have put in their frame. On the

basis of the frame, practitioners make interpretations and perform

actions, observe implications and consequences, and then reframe the

situation.

In summary, framing, acting, interacting, noticing implications

and consequences of actions, interpreting, reframing, acting, and

interacting within complex, “messy” particular situations compose a



cycle that appears to summarize much of Schön’s (1983) epistemology

of practice.

Technical-Rational Perspectives on Practice

Schön (1983) contrasted his epistemology of practice with

technical rationality, which he views as “the dominant epistemology of

practice” (p. 21) that has shaped how we think about professional

practice and the relationships between research and theory, education,

and practice. Technical-rational perspectives view professional practice

as building upon a body of scientific research. The “major” professions

such as medicine and engineering have a huge repository of

“specialized scientific knowledge” on which they build their practice,

and they operate within institutions that have stable purposes and

goals, such as physical health and bridges that do not fall down. These

major professions are grounded in knowledge that has four

characteristics: “specialized, firmly bounded, scientific, and

standardized” (p. 23).

Other professions, such as social work, clinical psychology,

psychotherapy, and education have a weak research base, operate

within unstable, shifting institutions with ambiguous ends and

therefore are “minor professions” (pp. 21-22). Thus, there is a

hierarchy of the professions, based upon the assumed worth of their

knowledge base.

In rational-technical perspectives, competent professional



practice occurs through knowledge transfer where practitioners draw

upon systematic, scientific research and apply research to the

situations with which they deal. Applied sciences build upon basic

sciences, with basic scientists having more status than applied

scientists. Practitioner skills are based upon both basic and applied

sciences. Learn the basics and principles of application, the thinking

goes, and practitioners become scientific practitioners. In rational-technical thinking, however, skills are lower in worth than basic and applied science. The ambiguity of skill sets earns this lower status. Thus, in

higher education, professional schools came to have lower status than

departments that created basic science and other non-skills based

products.

Yet, from the point of view of pragmatist philosophy, the “know-how” that individuals can claim is of value. Furthermore, “know-how” is

a point of pride in American self-image and has a place beside rational-

technical knowledge, a perspective that the human sciences bring to

discussions of EBP.

Applications to EBP

These two views, one of reflective practitioners who construct

their understandings through multiple, contingent, messy, cyclical

interactions and processes that take place in contexts, and the other of

technicians who have basic and applied knowledge and transform this

knowledge into skills that bring about desired ends, underpin tensions

between restricted models of EBP and practitioner actions. Schön

(1983) wrote long before researchers coined the terms evidence-based

medicine and evidence-based practice, which occurred in the early 1990s, showing that these tensions are long-standing.

The rational-technical model is helpful in human services

practice situations where such models fit, such as in behavior

modification programs when service users respond favorably to

systems of rewards, and in medicine where patients want the most up-to-date rational techniques as part of their care. Even then, however,

relationships between practitioners and service users and the person

of the practitioner, which are ambiguous by nature and difficult to

measure, are important factors in favorable outcomes in technical

models (Blow et al, 2007). In summary, rational-technical knowledge is

important to practice in multiple domains, but practice also requires

relationship building, empathy, and capacities for recognizing the

major influences of environmental events on service users’ situations

and responses to intervention. The common factors model clarifies this

point.

The Common Factors Model

The common factors model contributes to an expanded

conceptualization of practice. Based upon the meta-analysis of hundreds

of research reports on the outcomes of psychotherapy, the common

factors model (Blow et al, 2007; Cameron & Keenan, 2010; Drisko,

2004; Lambert, 1992) shows that the largest source of positive change

in therapy is external influences. These influences include positive and

negative changes in service user lives, institutional factors such as

agency policies, social policy, economic conditions, opportunity

structures, and natural disasters. The next largest source is the

relationship between service users and service providers. Together,

relationships between service users and service providers and external influences, which are complex, contingent, and messy contextual factors that impinge on particular situations, account for 70% of

outcome. Another 15% can be attributed to service user and service

provider optimism and service user motivation to make use of services.

These percentages are rough estimates of the common factors’ contributions to outcomes and may vary according to situations. The

model, however, has held up over decades of research on therapy

(Blow et al, 2007) and other practice situations as well (Cameron & Keenan, 2010). It is unlikely that technique will ever account for most

of the variance in outcomes.

In fact, there is widespread recognition that many factors

account for outcomes besides interventions themselves. Jensen,

Weersing, Hoagwood, & Goldman (2005), for example, did a

systematic review of child psychotherapy outcome studies and found that the studies did not account for factors other than treatment that might have influenced outcomes. They listed factors such as



treatment intensity, the therapeutic alliance, and other processes that

might affect outcome, factors that the common factors model has

identified.

These empirical findings support the principles under discussion,

namely, the centrality of interactions in context. These factors account

for five times more of outcome than technique. Yet, proponents of restricted models of EBP advocate for guidelines and policies that appear to support the idea that evidence-based interventions are the most important factors in successful outcomes.

Discussion

This paper critiqued contemporary understandings of practice in

EBP and offered a fuller model of practice based upon symbolic

interaction theory, pragmatist philosophy, reflective practice, and the

common factors model. The first three bodies of thought are within human sciences traditions, while the fourth combines technical-rational and human sciences perspectives. Rather than pitting rational-technical models of practice against human sciences models, I attempted to show how a fuller model of practice incorporates both.

Within the domain of direct services, few service users would choose providers who are incompetent technically, and they would choose those who are excellent technicians and who also recognize and are willing to work with a complex set of contingencies that influence service users’ engagement in services. Relationships between

practitioners and service users and external contingencies on service

users appear to be the most powerful factors in successful service

provision, except perhaps when surgeons excise a tumor or place pins

in bones to hold broken pieces together, or a case manager locates a

food voucher for a homeless family. Technical problems require

technical solutions, but in most practice situations, especially social

services, the technical problems, when they exist, are embedded in

complex and messy contingencies that require the building of trusting

relationships for services to have the desired outcomes.

The common factors model shows that external influences on

service users and relationships between service users and service

providers are most powerful in shaping outcomes, a finding that SI and

pragmatism anticipated and that more recent empirical research on

practice demonstrates (Jensen et al, 2005). Practice, in fact, appears to be just one of the many forms of interaction on which these two bodies of thought shed light.

This paper focused on practice as service providers may

experience it. Much thought will have to go into constructing a model

of practice that attempts to account for service user experiences.

Research in this area, of course, is underway (cf. Galanter & Jensen,

2009; Perkins et al, 2007), but I believe that these efforts will be

enhanced by the incorporation of fuller models of practice as discussed



in this paper. Efforts to understand the nature of external influences

and service user perspectives, in addition to service provider

perspectives, may be helpful to researchers who want practitioners to use their research and service users to respond to it.

I am optimistic that proponents of EBP are open to efforts at

understanding the nature of practice as service providers and service users experience it. Scholars continually revise their understandings of

EBP (Gilgun, 2005a; Mantzoukas, 2008) and how to disseminate

research to practitioners. For instance, the Centre for Evidence-Based

Medicine (http://www.cebm.utoronto.ca/), which is the official website of EBM (Straus, Richardson, Glasziou, & Haynes, 2005) and was once

free-standing, has recently become a program of the Knowledge Translation Clearinghouse (http://ktclearinghouse.ca/). Since its

origins at McMaster University, in Hamilton, Ontario, Canada, in the

early 1990s, EBM has undergone many transformations, one of the latest being the view that EBM is actually about the transfer of knowledge from “the bench to the bedside,” hence the now widely used terms

“knowledge translation” and “translational research.” EBM at the

University of Toronto is now a subsidiary of the Knowledge Translation Clearinghouse.

The subsuming of EBP into translational research is evident in

the United States, as well. The National Cancer Institute is one of many

governmental funding agencies that have taken on the tasks of translational research through the Working Group on Translational Research, which gives the following definition.

Translational research transforms scientific discoveries arising

from laboratory, clinical, or population studies into clinical

applications to reduce cancer incidence, morbidity, and mortality

(http://www.cancer.gov/researchandfunding/trwg/TRWG-definition-and-TR-continuum)

The National Institute of Mental Health has a Division of

Developmental Translational Research. One of its goals is the “Design

and testing of innovative and personalized preventive and treatment

interventions”

(http://www.nimh.nih.gov/about/organization/ddtr/index.shtml).

The language describing translational research, like much of the

language used in discussions of EBP, is rational-technical, and the models of practice behind translational research may also be largely

rational-technical. Yet, as efforts continue to understand practice,

there is reason to hope that those who hold rational-technical

perspectives will realize that practitioners and service users may

experience practice as discussed in this paper: complex, messy,

contingent, and situated in particular places, with particular

individuals, in particular times. That practitioners adapt rational-

technical knowledge, as Pollio (2006) explained, to fit the

contingencies of situated practice may become a given.



Other contextual features of practice also must be

acknowledged, such as the need for practitioners to have the time to

read about and be trained on the most up-to-date research and the desirability of funding research that fits practice.

Any approach whose purpose is to encourage practitioners to use research must recognize that practice, whether it is policy,

community organization, advocacy, education, or direct practice, is

situated, contingent, and flexible in response to the issues that have

prominence at any given time. The integration of research into practice

requires that decision-makers and researchers recognize the particular

features of practice, such as those I discussed in this paper. If such

recognition translates into modification of existing policies and

practices, it is likely that practitioners will incorporate research into

practice. The more systematic and system-wide the recognition of how practitioners do their work, the more systematic practitioners’ use of research will be.

References

Aisenberg, Eugene (2008). Evidence-based practice in mental health

care to ethnic minority communities: Has its practice fallen short

of its evidence? Social Work, 53(4), 297-306.

American Psychological Association. (2006). APA presidential task

force on evidence-based practice. Washington, DC: Author.

Blow, Adrian J., Douglas Sprenkle, & Sean D. Davis (2007). Is who

delivers the treatment more important than the treatment itself?

The role of the therapist in common factors. Journal of Marital &

Family Therapy, 33(3), 298-317.

Blumer, Herbert (1986). Symbolic interactionism: Perspective &

method. Berkeley: University of California Press.

Bulmer, Martin (1984). The Chicago School of Sociology:

Institutionalization, diversity, and the rise of sociological

research. Chicago: University of Chicago.

Cameron, Mark, & Elizabeth King Keenan (2010). The common factors

model: Implications for transtheoretical clinical social work

practice. Social Work, 55(1), 63-73.

Code of Ethics of the National Association of Social Workers (2008).

http://www.naswdc.org/pubs/code/code.asp/.

Cooley, Charles Horton (1934). Human nature and the social order.

New York: Scribner’s. First published in 1902.

Dewey, John (1958). Experience and nature. New York: Dover.

Drisko, James W. (2004). Common factors in psychotherapy outcome.

Families in Society, 85(1), 81.

Epstein, Irwin (1995). Promoting reflective social work practice:

Research strategies and principles. In Peg M. Hess & Edward J.

Mullen (Eds.), Practitioner—researcher partnerships: Building

knowledge from, in, and for practice (pp. 83-112). Washington,

DC: NASW Press.



Epstein, Ronald M. (1999). Mindful practice. JAMA, 282(9), 833-839.

Galanter, Cathryn A. & Jensen, Peter S. (Eds.)(2009). DSM-IV-TR

casebook and treatment guide for child mental health. Arlington,

VA: American Psychiatric Publishing.

Gambrill, Eileen (2006). Evidence-based practice and policy: Choices ahead. Research on Social Work Practice, 16(3), 338-357.

Gambrill, Eileen (2001). Social work: An authority-based profession.

Research on Social Work Practice, 11(2), 166-175.

Gibbs, Leonard (2003). Evidence-based practice for the helping

professions. Pacific Grove, CA: Brooks/Cole.

Gilgun, Jane F. (2010). The method to the madness: Republicans do

know what they are doing. Scribd.com.

http://www.scribd.com/doc/27340683/The-Method-to-the-Madness-Republicans-Know-What-They-are-Doing

Gilgun, Jane F. (2006). The four cornerstones of qualitative research.

Qualitative Health Research, 16(3), 436-443.

Gilgun, Jane F. (2005a). Evidence-based practice, descriptive research,

and the resilience-schema-gender-brain (RSGB) assessment.

British Journal of Social Work. 35 (6), 843-862.

Gilgun, Jane F. (2005b). The four cornerstones of evidence-based

practice in social work. Research on Social Work Practice, 15(1),

52-61.

Henderson, J. L., MacKay, S., & Peterson-Badali, M. (2006). Closing the



research–practice gap: Factors affecting adoption and

implementation of a children’s mental health program. Journal of

Clinical Child & Adolescent Psychology, 35, 2–12.

Hess, Peg McCartt & Edward J. Mullen (1995). (Eds.). Practitioner—

researcher partnerships: Building knowledge from, in, and for

practice. Washington, DC: NASW Press.

Howard, Matthew O., Curtis McMillan, & David E. Pollio (2003).

Teaching evidence-based practice: Toward a new paradigm for

social work education. Research on Social Work Practice, 13,

234-259.

International Federation of Social Workers (2000). Definition of

Social Work. Adopted by the IFSW General Meeting in Montréal,

Canada, July 2000. http://www.ifsw.org/f38000138.html.

Institute of Medicine. (2001). Crossing the quality chasm: A new health

system for the 21st century. Washington, DC: National Academy Press.

Jensen, Peter S. & Michael Foster (2010). Closing the research to

practice gap in children’s mental health: Structures, solutions, and

strategies. Administration and Policy in Mental Health and Mental Health Services Research, 37, 111-119.

Jensen, Peter S., Robin Weersing, Kimberly Hoagwood, & Eliot Goldman

(2005). What Is the evidence for evidence-based treatments? A

hard look at our soft underbelly. Mental Health Services Research, 7(1), 53-74.

Krugman, Paul (2010). The bankruptcy boys. New York Times, Monday, February 22, p.

A17.

Lambert, Michael. (1992). Implications of outcome research for

psychotherapy integration. In John C. Norcross & Marvin R.

Goldfried (Eds.), Handbook of psychotherapy integration (pp. 94-

129). New York: Basic.

Lewin, Kurt (1946). Action research and minority problems. Journal of

Social Issues 2(4), 34-46.

Littell, Julia H. (2008). Evidence-based or biased? The quality of

published reviews of evidence-based practices. Children and

Youth Services Review, 30(11), 1299-1317.

Littell, Julia H., Jacqueline Corcoran, & Vijayan Pillai. (2008). Systematic

reviews and meta-analysis. New York: Oxford University Press.

Mantzoukas, Stefanos. (2008). A review of evidence-based practice,

nursing research and reflection: Leveling the hierarchy. Journal

of Clinical Nursing, 17(2), 214-223.

McNeill, Ted. (2006). Evidence-based practice in an age of relativism:

Toward a model for practice. Social Work, 51, 147-156.

Mead, George Herbert (1934). Mind, self, & society. Chicago: University

of Chicago Press.

Menand, Louis (Ed.) (1997). Pragmatism: A reader. New York: Vintage.

Parton, Nigel (2008). Changes in the form of knowledge in social work:



From the “social” to the “informational.” British Journal of Social

Work, 38, 253-269.

Perkins, Matthew B., Peter S. Jensen, James Jaccard, Peter Gollwitzer,

Gabriele Oettingen, Elizabeth Pappadopolos, & Kimberly E.

Hoagwood. (2007). Applying theory-driven approaches to

understanding and modifying clinicians’ behavior: What do we

know? Psychiatric Services, 58(3), 342-348.

Pollio, David E. (2006). The art of evidence-based practice. Research

on Social Work Practice, 16, 224-232.

Rorty, Richard (1982a). Pragmatism, relativism, irrationalism. In

Richard Rorty, Consequences of pragmatism (pp. 160-175).

Minneapolis: University of Minnesota.

Rorty, Richard (1982b). Method, social science, social hope. In Richard

Rorty, Consequences of pragmatism (pp. 191-210). Minneapolis:

University of Minnesota.

Rorty, Richard (1999). Philosophy and social hope. New York: Penguin.

Ruch, Gillian (2005). Relationship practice and reflective practice:

Holistic approaches to contemporary child care social work. Child

and Family Social Work, 10, 111-123.

Sackett, David L., Sharon E. Straus, W. Scott Richardson, William

Rosenberg, & R. Brian Haynes (2000). Evidence-based medicine:

How to practice and teach EBM (2nd ed.). Edinburgh: Churchill Livingstone.

Satterfield, Jason M., Bonnie Spring, Ross C. Brownson, Edward J.

Mullen, Robin P. Newhouse, Barbara B. Walker, & Evelyn P.

Whitlock (2009). Toward a transdisciplinary model of evidence-

based practice. Milbank Quarterly, 87(2), 368-390.

Schön, Donald (1983). The reflective practitioner: How professionals

think in action. New York: Basic.

Simons, Kelsey, Shepherd, & Munn (2008). Advancing the evidence base

in social work in long-term care: The disconnect between

practice and research. Social Work in Health Care, 47(4), 392-

415.

Straus, Sharon E., W. Scott Richardson, Paul Glasziou, & R. Brian Haynes (2005). Evidence-based medicine: How to practice and teach EBM (3rd ed.). Edinburgh: Elsevier.

Tannenbaum, Sandra A. (2005). Evidence-based practice as mental

health policy: Three controversies and a caveat. Health Affairs,

24(1), 163-173.

Thomas, William I. & Dorothy Swaine Thomas (1928): The child in

America: Behavior problems and programs. New York: Knopf.

West, Cornel (1989). The American evasion of philosophy: A

genealogy of pragmatism. Madison: University of Wisconsin.

Yunong, Huang & Ma Fengzhi (2009). A reflection on reasons,

preconditions, and effects of implementing evidence-based

practice in social work. Social Work, 54(2), 177-181.



Zayas, L. H., Gonzalez, M. J., & Hanson, M. (2003). “What do I do

now?”: On teaching evidence-based interventions in social work.

Journal of Teaching in Social Work, 23, 59-72.
