March 2, 2013 in Featured Articles
By Luis Cabrera
When it comes to influencing government anti-poverty efforts, the policy climate matters, Fred Carden notes, but so does a researcher’s focus on actually having an impact.
“If you’re not trying to do it you are not very likely to do it,” said Carden, who heads evaluation and impact efforts at the Canadian government-sponsored International Development Research Centre (IDRC, http://www.idrc.ca/EN/Pages/default.aspx) in Ottawa. “People are often not very intentional. They want to address poverty but they don’t have a clear intent about what they want to do.”
Carden led team efforts to assess policy influence in 23 IDRC-sponsored research studies in developing countries worldwide. The findings were presented in his book, Knowledge to Policy: Making the Most of Development Research (Sage, 2009), and he has continued to refine the framework.
KEY IMPACT VARIABLES
Carden’s overall conclusion, from the case studies and subsequent work, is that two sets of contextual variables are crucial in determining whether impact-minded researchers will be able to influence policy outcomes. These are:
General Context: This includes a government’s actual capacity to apply research findings, the stability of decision-making institutions, and how centralized governance is in the country. It also includes general economic conditions and whether a country is in crisis or otherwise undergoing a dramatic transition, which can open opportunities for influence.
Decision Context: Here, the key is government appetite for research. In descending order of interest, Carden found case-study situations of clear demand from government, demand but a leadership gap in realizing it, and demand but a lack of resources to act on it. In a number of cases, he found great interest from researchers in sharing new findings but little interest from policy makers. In some cases there was open hostility from the policy community.
In cases of strong demand, he said, “it was often where it was a brand new problem they didn’t know how to address. Often in IT [information technology] policy, a lot of countries didn’t know what to do about it. They were a lot more willing to ask researchers for advice, where they were less willing in areas like education and health where they purported to already know what should be done.”
Some cases found a very different climate, where policy makers simply weren’t receptive to research, regardless of the strength of its findings.
In Guatemala, for example, where IDRC funded research on unequal access to education by women and members of indigenous groups, the findings fell on deaf ears. “The government was actually in a mode where they were saying ‘we are one country’. They were coming out of civil strife, and they were putting out the message that ‘we are all the same, we are all Guatemalans,’” and findings that identified a need to devote more resources to particular groups were not well received, he said.
“They could have presented their research differently, and really taken the tack that in order to be one Guatemala we have to bring them in more directly,” Carden said. “I think they just missed that. They didn’t actually sit down and think about, ‘what’s the ability of policy makers, what’s the capacity, and if they’re not asking us for advice on this, how are we going to frame it in a way that supports what they are trying to do?’”
Other cases were drawn from IDRC-funded studies in developing or lower-income countries worldwide — all led by nationals from those countries — including Peru, the Philippines, Bangladesh, India, Vietnam, Senegal, Tanzania, Uganda, Jordan, Tunisia, and Ukraine.
DIVERSE STUDY SUBJECTS
The subjects and aims of the studies varied widely. They included research on water resources and irrigation, mining, enhancing influence on international trade issues, health issues, promoting traditional knowledge, increasing access to new technologies, and addressing ‘brain drain’ issues.
In approaching an assessment of impact in such a variety of individual studies in diverse locations, Carden said, he sought to take as much input as possible on research design. “I brought together case study writers, IDRC programme staff. I didn’t give them a framework, but said ‘look at the cases.’ That’s how we developed a way to analyze across cases. We took detailed notes at workshops, looked at what’s coming out over and over again.”

That process, and the ongoing findings around impact, “has influenced how people ask questions at IDRC, and how they provide advice to researchers,” he said.
EVOLVING IMPACT-STUDY METHODS
Carden, who holds a PhD from the University of Montreal and joined IDRC in 1993, sees the policy influence work as a natural outgrowth of his evaluation design work for the center, including outcome mapping.
“That’s an approach to planning, monitoring and evaluation — relationships, exposure and activities. A lot of work can’t be defined as direct impact, but you can look at what are the changes in relationships between the people — are they finding different ways to interact with policy makers or are they staying in their own little research world?” he said.
“How do they transmit their messages? How do they build the relationships they need to influence people — with media, policy makers and others? Outcomes are actually in them making those efforts and beginning to build those relationships. So, outcome mapping is actually a methodology for designing your work around those outcomes you are trying to achieve, that will support, you think, the change you want to see happen ultimately. A lot of that is around the boundaries, because you can only talk about changing the behavior and activity of those you actually are interacting with.”
BUILDING IMPACT IN FROM THE GROUND UP
Carden also encourages researchers to think about impact beyond specific policy influence, to include impact on civil society efforts and deep engagement with the subjects of research studies themselves. Such efforts can pay important dividends to the researcher, he said, in terms of strengthening a study but also in some cases realizing significant positive change.
“I really think researchers have to get more directly engaged with the people who are directly affected by the research,” he said. “People who are poor have a huge amount of intelligence about why they’re poor and what’s going on around them.”
He noted one case study from the book, aimed at enhancing the organization and sustainability of the Honey Bee Network, a grassroots group focused on support for India’s traditional small farmers.
The study highlighted ways to have impact “not in talking to policy makers directly but getting the community engaged and then getting community members to go talk to policy makers. It’s changing the mindset of researchers that’s key and making it legitimate for them, giving them permission almost to go out and talk to people in the community.”
Carden exhorts researchers to work closely with the groups and individuals they study, including in the research design process and data analysis, to gain insight into the broader context in which findings are embedded.
“Avoid doing the research in isolation. Avoid big pronouncements and research studies about people that don’t involve those people. You’ll have numbers that are consistent but not necessarily a good understanding of the implications of that research,” he said. “A lot of times, the data all looks very clean but nobody actually sees the truth. That kind of back and forth, in a very iterative exchange, could be very valuable.”
That kind of deep engagement can be built into funding applications as well, he noted, and it often is well received by funders such as IDRC.
On applications, “don’t be afraid to expand beyond the typical academic response, of preparing policy briefs and doing presentations to ministers,” he said. “We get quite frustrated that what’s coming in doesn’t try to move beyond the typical response. I’d say be creative, say we actually want to get out there in the community.”