Monitoring is also supposed to make international development more effective. The trouble is that many people seem to think you can enhance the effectiveness of virtually anything by producing tables with figures in them, preferably SMART ones. Opinions differ, even within a single agency, as to whether it is the objectives, the indicators, the assumptions, the results or something in between that has to be SMART (specific, measurable, achievable, realistic and time-bound/timely). The consensus appears to be: whatever we talk about, it had better be SMART, and SMART is when there are figures attached.
In response to this trend, larger NGOs have designated specific people whose main task is to produce SMART figures to impress donors. One large and much admired agricultural organisation I have worked with employed a small crowd of good-looking young women, much appreciated in donor meetings and fluent in several languages. They wrote proposals and reports to the donors, in English, French, Spanish and even Italian. They set the numbers of farmers to be trained, of seedlings to be planted and of acres to be watered. The rest of the organisation, as a senior member once confessed to me, continued to do their work, to the best of their knowledge and skills, and in blissful ignorance of those figures. The report-writing team produced its reports to the donors, never to be translated into the local language, and estimated the attainment of the targets at, say, 71% for the farmers, at 103% for the seedlings and maybe 94% for the acres.

When donors visited, the people who did the actual work in the fields picked the sites they considered particularly successful or interesting and took the donors there. I was among the donors and found those visits far more enlightening and convincing than the figures in the reports. What the visits tended to leave unclear was what had not worked well, and why. In that respect, I guess I was hardly more ignorant than the report-writing staff. But the people who did the actual work knew very well what they were doing, and could see the differences between successful and less successful parts of their project. They were in touch with farmers, discussing what worked out nicely and what the difficulties were, and they looked around to spot any interesting opportunity, or some risk that needed to be reckoned with. That was real monitoring. The tables were fake.
I believe donors can be convinced to pay more attention to real monitoring. One key step is to recognise that personal accounts are a legitimate way to find out about facts, and that figures can only give a fragmentary and distorted image of reality. Numbers are useful for budgeting, accounting, physics, engineering and other activities that require maths. Numbers can give an idea of the size or scope of something - but they don't tell us what exactly that "something" is. A classic example: a support centre for survivors of violence receives 40 client visits in February and 20 in March. Does that mean that February is a more dangerous month than March? Or does it mean that the centre's activities have caused a 50% drop in violence in the community in one month's time? Does it mean that 20 of the 40 February clients have been killed, leaving only 20 for March? You can spend weeks interpreting such figures, and twist them either way to match your argument.
But the support centre in this example could also organise monthly staff meetings where everyone shares good and bad experiences from the past month. Such meetings are SMART if they focus on specific topics and are run at regular intervals. They don't necessarily produce figures, but they generate knowledge and disseminate it throughout the organisation. Regular exchanges between the women and men who actually do the work are an excellent way to ensure an organisation keeps learning and adapting itself to new challenges.
I know a human rights organisation which has played a consistently important role in its society over decades. I believe that a main factor in the organisation's success is its system of monthly day-long staff conferences, which maintain a constant flow of internal learning. In those meetings, people do not fill in charts of benchmarks and indicators. They monitor, in a qualitative way that is adapted to the complexity of reality, what happens within and around their organisation. Such regular, appropriate monitoring processes keep engaging all members of the organisation in reviewing their work, thus enhancing the organisation's effectiveness. Reports to donors that are based on knowledge distilled through such a process give a precise, comprehensive and eminently legible idea of an organisation's work. Of course you may still need to include figures - but only those that matter, and only as one part of a bigger picture.