Tuesday, 19 July 2016

Homa Hoodfar

A few days ago a friend forwarded a message to my inbox - "Homa Hoodfar indicted on unknown charges", it said.

I met Professor Homa Hoodfar in 2009, at a conference on gender and religion hosted by the Böll Foundation in Berlin. Impressed by the workshop facilitated by Homa, I wrote about that event here. The group #FreeHoma offers a site that shows a selection of her publications.

I do not understand why an internationally respected academic gets arrested when visiting her homeland. What are the charges brought against her? Why? 

My thoughts are with Homa and all those who want to see her freed.

Sadly, women human rights defenders are threatened throughout the world. For more reading, I recommend AWID's site on the issue, which includes recommendations for holistic protection. 

Thursday, 26 May 2016

Let's evaluate together

This is the time of the year when I would like to be able to clone myself, so that I could respond to all those requests for evaluation proposals (RFPs) while busily working away on on-going jobs that need to be completed before the Northern hemisphere summer break begins. List servers publish new RFPs every day; as July approaches, the deadlines become increasingly adventurous. In late May, RFPs ask for offers to be submitted by the first week of June; the selected evaluation team would start working right away. It seems many of those who publish those last-minute quick-quick RFPs assume evaluation consultants spend their days sitting in their offices, twiddling their thumbs, chewing their nails or randomly surfing the web, waiting for that one agency to call them up and put them to work right away, tomorrow! Drop everything and work for us!

Many of those evaluations are mid-term or end-of-project evaluations, which tend to happen at highly predictable moments (in the middle or near the end of project implementation) and could be planned many months, even years ahead. But this is not what worries me most about the seasonal avalanche of RFPs. What worries me most is that they tend to produce evaluations of questionable value.

Often, those last-minute RFPs concern projects of modest size, with meagre resources for evaluation. In that situation, the evaluation terms of reference (TOR) typically ask for 20-40 consulting days to cover the entire set of OECD/DAC criteria - relevance, effectiveness, efficiency, impact and sustainability - all within 2-3 months and on a shoestring budget. As someone who has reviewed a couple of hundred evaluations, I know that the resulting evaluation reports tend to be a bit on the shoddy side. With some luck, the participants in the evaluation might have found the evaluation process useful. But don't look for ground-breaking evidence in quick and dirty single-project evaluations.

It does not have to be that way. For instance, organisations that receive money from several funders can convince their funders to pool resources for one well-resourced evaluation of their overall activities rather than a bag of cheap three-week jobs. Funders who support several complementary initiatives in the same geographical region, or who support the same kind of project in many different places, can commission programme evaluations to better understand what has worked and what hasn't, under what circumstances.

It makes more sense to take a step back and look at the bigger picture anyway, because no development intervention happens in isolation. Project X of NGO Y might yield excellent results because NGO Z runs project ZZ in the same region, and project X wouldn't have the slightest chance of succeeding if project ZZ weren't there. You need time and space to find out that kind of thing.

And last but absolutely not least, there is no reason why evaluation should only happen in the middle or at the end of an intervention. Some of the most useful evaluations I have come across have been built into the project or programme from the beginning, supporting programme managers in setting up monitoring systems that worked for those involved in the programme and for those evaluating it, and accompanying the project with on-going feed-back. This doesn't need to be more expensive or more complicated than the usual end-of-project 40-day job. But it can provide easy-to-use information in time to support well-informed decision-making while the project is being implemented - not just when it's over.

Monday, 18 April 2016

Work to be done: Ending violence against children

A recent report on the global prevalence of violence against children has shown that more than half of the children in 96 countries across the world - 1 billion children aged 2–17 years - experienced violence in the past year. Violence against children is a human rights violation. It makes people more likely to fall ill, and to experience and perpetrate violence in their adult lives. In other words, violence is passed on through generations - possibly even biologically, as exposure to violence can alter the way a child's genes are expressed.

The Sustainable Development Goals (SDGs) call for an end to "abuse, exploitation, trafficking and all forms of violence against and torture of children" (SDG 16.2) and for the elimination of "all forms of violence against all women and girls in the public and private spheres, including trafficking and sexual and other types of exploitation" (SDG 5.2). SDG 4 on education refers to the importance of promoting non-violence in several of its targets, e.g. by calling for non-violent learning environments (SDG 4.a).

With probably more than half of the world's children experiencing violence, major efforts are needed to attain the SDGs. For inspiration, have a look at UNICEF's Six Strategies for Action. If you know of other useful resources, please share them in a comment.

Friday, 1 April 2016

3500 evaluation reports for everyone - really everyone?

It is delightful to see that more and more agencies are publishing more and more evaluation reports on-line. Now, UNDP has announced, in a pretty infographic, its revamped Evaluation Resource Centre (ERC), which gives access to more than 3500 reports. A bounty for meta-evaluators!

But one thing puzzles me: the video spot that explains the ERC, with its suave male speaker and friendly ambient music in the background, suggests that only men - or, say, short-haired trouser-wearing necktie-bearers - make decisions. Look at the visuals around minutes 0:57 and 2:18. Little skirt-bearers are only acknowledged as members of "the public". What time and place do we live in? Dear UNDP! We know you can do much better on promoting gender equality, so why not flaunt it and show at least equal numbers of male and female decision-makers in your PR materials?

Thursday, 31 March 2016

More basic terminology

Here's another set of concepts that seem to cause a great deal of confusion. They are much used in results-oriented planning (often called results-based management or RBM). I like to explain them as follows:

Output = The direct result of an activity - something that is under your/your project's control. For instance, I brush and floss my teeth several times a day, and the output is a clean set of teeth.

Outcome = Something that your activity is designed to help produce - but it takes some more factors for that kind of result to come about. For instance, I clean my teeth to avoid getting caries, so healthy teeth are my desired outcome. But my chances of having good teeth are much better if I also avoid eating sweets or very acidic food, if I have healthy gums, if I have the right kind of genes, and so on. Even people with clean teeth can get caries.

Impact = A long-lasting result that can be directly traced to an intervention. For example, if my dentist extracts a tooth, the impact is a gap in my mouth. 

Tuesday, 15 March 2016

Evaluation terminology

Today a friend asked me about the difference between findings and conclusions. I put it this way:

Findings:
  • Dust has gathered into small woolly clouds in the corner of the room.
  • Crumbs are scattered all over the floor.
  • There are a couple of spiderwebs in the corners of the ceiling.
Conclusion: This room is dirty.
Recommendation: Clean it.

This is also a nice way to explain indicators: the findings above are the indicators you would use to judge whether the room is clean.

Busy!

Deep into the evaluation of this exciting project: www.womenonthefrontline.eu. I will be back by April with new posts...

Thursday, 21 January 2016

Happy New Year, good new reading

Happy new year! For me, 2016 starts with an exciting evaluation assignment spanning some 30 organisations in 7 countries. This means I have a whole collection of topics I would like to write about here, but no time to do so at this point.

So I would like to recommend some good new reading: DFID has just published the guidance note Shifting Social Norms to Tackle Violence against Women and Girls, which draws on the growing body of literature on the topic. In my view, the best parts are chapters 3-6 on Social Norms Theory and how to integrate it into programme design, all explained in relatively clear, straightforward terms.

Saturday, 12 December 2015

Writing cultures

A few weeks ago I attended another public discussion on the (potential) role of evaluation in policy making. The brief conference - basically, a panel discussion followed by a question-and-answer session - was hosted in Berlin by the Hanns Seidel Stiftung and CEVAL, the Centre for Evaluation at Saarland University. The panel was made up of German-speaking evaluation specialists from Austria, Germany and Switzerland.

Policy uptake of evaluation findings has been a main topic of the International Year of Evaluation 2015 - for instance at the Paris conference I wrote about in October. Evidence gathered in evaluations and research is supposed to support political decision-making.

Monday, 23 November 2015

Virtual Workshopping

Earlier this week I facilitated an internal reflection and planning meeting with evalux, a Berlin-based evaluation firm which celebrated its 10th anniversary this year. One of the workshop participants was based in Beijing. It would have been too onerous to fly her over to Berlin, so we found a way to beam her into the workshop via the internet.

I like highly participatory workshops, where people work in alternating configurations –

Wednesday, 14 October 2015

Participatory research!

Wow - read this presentation of participatory research by 16-24-year-old girls and young women in Kinshasa. It is an exciting piece of work supported by the UK Department for International Development (DFID) and Social Development Direct (SDD). The initiative turns the "objects" of research into researchers. I trust it will yield much richer information than a "top-down", externally designed survey on young women in Kinshasa would. And the young people who collect and analyse the information will gain skills, knowledge and strength in the process! I would expect their interviewees to benefit from the process, too.

Governments that fund development want to see "evidence-based" approaches - that is, research needs to be built into development work. Fortunately, the widespread misconception that only large-scale quantitative surveys and experiments yield reliable evidence appears to be fading.

Monday, 12 October 2015

Good things happen in the short term and bad things happen in the long term

This is a long title, but I love that sentence, culled from Elliot Stern's intervention on the Benefits and Barriers to Evaluation Use at the recent evaluation conference in Paris. The one-day conference, convened jointly by the European Evaluation Society, France's evaluation society, the United Nations Educational, Scientific and Cultural Organization (UNESCO) and the Organisation for Economic Co-operation and Development (OECD), took place at the quite extraordinary UNESCO headquarters in Paris.

Wednesday, 23 September 2015

My data

A couple of days ago a colleague working on an interesting new e-learning tool invited me to test an initial, as yet unofficial version of that tool. I clicked on the link they had sent me. A screen appeared asking for my full name, my e-mail address and my company. Every single field was mandatory - that is, I could not move to the next screen without providing my name, my e-mail address and a company name.

That is a threshold. When you open a book or a newspaper, no-one asks you for your name, your e-mail address or other personal data. You open the thing and you read it. The publisher can track the number of books sold and - to some extent - the places where they have been sold, and that's it. Has anyone ever complained about that?

Wednesday, 16 September 2015

Interesting debate on evaluating human rights work

Who is evaluation of human rights work for? How about "strategic plausibility" as an evaluation criterion? How do we measure success when protecting civilians in conflict? These are the kinds of questions discussed in this web debate on evaluating human rights work. Very commendable!

Workshops that work: Six essential tips for facilitators

It is delightful to get plenty of positive feed-back on the workshops I design and/or facilitate. A few weeks ago one participant even described the workshop I facilitated as a "once-in-a-lifetime experience"! Since I would love all workshops people attend to be useful, I have started asking participants to tell me what exactly they like about "my" workshops, so that I can share it here. Some of the points below have been made in earlier posts on this blog; others have come up in recent conversations.