Saturday 4 May 2013

Quotes from the "Politics of Evidence" Conference

In late April, the "Politics of Evidence" conference brought together development and evaluation practitioners from a rich mix of professional backgrounds and countries at the IDS in Brighton. Some of the papers presented are available online, such as Rosalind Eyben's paper on the power of results and evidence artefacts. Eyben's paper describes how tools initially designed to stimulate critical thinking have mutated into instruments of confusion: for example, the log-frame was designed to encourage people to reflect on the assumptions behind their "project logic" - but in most contemporary versions, the "assumptions" column has disappeared.

The push to demonstrate quick results may encourage development agents to privilege actions that yield easily achievable results over more complex work that aims for social transformation.
A few stories and "soundbites" from the conference (no attribution - the Chatham House Rule applies):
  • The move from a "trust me" to a "show me" culture may not be entirely unjustified. But: External validity is often sacrificed on the altar of internal validity - i.e., people sometimes feel pushed to conduct "super-rigorous" research even if it is not particularly interesting or useful for their work.
  • One participant described how several external consultants, in successive rounds, modified "her" project to match (perceived) donor expectations of what a proper proposal should look like - "suddenly the whole project was a complete stranger to our themes and our organisation". I suspect that was not the donor's intention. But in a situation of unequal power, organisations at the receiving end of development funding may be tempted to bend their projects to what they consider to be donor preferences.
  • A participant with extensive experience in evaluations using randomised controlled trials (RCTs) said that RCTs worked well in certain situations: for example, to find out whether people would use mosquito nets more systematically if they paid for them (rather than getting them for free). But scientific institutes that use experimental designs turn down requests for RCT-based evaluations when the questions are complex - for example, whether the availability of toilets for girls at schools improves girls' educational achievement. That participant stated that in most cases (97% of the requests his institute received), RCTs were inappropriate and systematic monitoring would be the most effective way to assess impact. In short: utility (what development practitioners need most) vs. accuracy (what academic researchers need most).
  • The positive effects of evidence artefacts depend on the user - not on the artefact. Anyone who genuinely wishes to create opportunities for critical reflection and discussion will find a way to do so, regardless of the specific tool they use. I would say, though, that some tools are more conducive to promoting reflection than others. "Don't start with the tools - start with the thinking!"
About one day of the two-day conference was devoted to exchanging experience in small groups. The groups came up with similar conclusions:
  • A constructive way to deal with the "politics of evidence" is to create systems that foster critical reflection - in the way one's own organisation or programme needs it. For example, one participant explained that his (large) organisation measured "outputs" only - but each of its initiatives was well documented as part of a "jolly good knowledge management system", which could easily respond to queries from donors or other stakeholders.
  • One thing that is happening right now: scientists are being brought into the discussion to show that there is more to science than experimental designs (such as RCTs), and to point out the limits and risks (ethical issues, inter alia) of experiments.
  • Some "results artefacts" tend to be more adaptive and more likely to answer "how" and "why" questions than others. Participatory action research and the "action learning cycle" were cited as effective ways of gathering, analysing and disseminating key information.
  • Creative compliance, rigorous relevance, relevant rigour - watch this space for more on that!
