Saturday 20 April 2013

Evidence artefacts - an example

This is a case study prepared for the "Big Push Forward" conference next week (for links to the conference, see the extra post below). 
The example is from a real organisation, a group working on human rights.
 
The document which defined the organisation’s 3-year programme described the types of work it would carry out, within its mission and values.

The appraisal document and contract which governed the funding relationship between the human rights group and the funding organisation (donor) included a list of intended results (“results list”). All results came with a form of quantification. Many were very indirectly connected to the advocacy and legal aid carried out by the human rights group. For example (from memory – inaccurate, but close to the original):


  • At least 10,000 persons (30% female) will have basic knowledge of their civil rights.
  • The percentage of rape cases brought to court will increase to at least 30%.
  • No more extrajudicial killings will happen.
When I asked the human rights group how they recorded progress against those targets, which had been defined at the donor’s request, it took them days to locate their copy of the “results list”. Annual reports to the donor did not refer to that list. Instead, they contained dozens of pages of logframe-like matrices, which presented every activity as a mini results chain. For example (from memory, i.e. inaccurate):
  • Activity: 200 awareness raising sessions
  • Output: 212 awareness raising sessions conducted
  • Outcome: 12,034 participants (31.4% female)
  • Objective: 80% have basic knowledge of civil rights
  • Goal: Rights awareness has increased
When I asked why “they” sliced up their experience in that way, I was told that a different donor had asked them “to do results-based management”. Yet another organisation - a third donor - had dispatched a consultant to deliver results-based management (RBM) training, which focused on that particular type of results chain. The course materials I saw were all about the correct use of terminology – for example, the difference between an “outcome” and an “objective”, according to that particular agency (or that particular consultant). In those days, every single donor seemed to develop its own results chain.

All senior and middle managers of the human rights group had to attend the week-long (5 full days) training course – a huge transaction cost for an organisation whose managers were involved in urgent human rights work every day. The only visible results of the course were annual progress reports full of matrices which I found hard to understand. The English was excellent, though.

Independently of any donor intervention, the organisation ran its own, parallel knowledge management system around monthly meetings, where all of the group’s professionals came together for a full day to discuss external and internal developments. That is where the group made sure that (i) learning was shared, and (ii) plans were made on the basis of experience and… evidence the way they saw it. The reports produced for the donors played no role in those meetings.

Apparently, no donor had ever asked the group about its own ways of planning and monitoring – let alone tried to organise reporting so that it would build on (or further improve) the existing system, which appeared quite well suited to the group's needs.
