Saturday, 25 May 2013
A rough guide to building monitoring systems
Today, I wrote up a few lines for a colleague to summarise the key steps of building monitoring systems. In case the same question pops up again, I have saved my response here:
Indicators and data collection instruments are just one aspect of a good monitoring system. Most importantly, early in (or ideally before) the actual project, people should sit down and think:
Wednesday, 15 May 2013
Practical tips: note-taking during interviews
Taking notes in "qualitative" interviews, for example during evaluations, seems a fairly simple and straightforward thing to do. Over the years, I have discovered that people take notes in many different ways. The way that works best for me is demanding on the interviewer, but quite efficient: little to no need for extra transcription time after the interview!
Monday, 13 May 2013
Free webinars on evaluation
...now available on the YouTube Channel of the American Evaluation Association.
The webinars are in the first row of the page that opens when you click on "YouTube Channel" above. The second row is filled with other interesting videos, such as Julia Coffman's speech on evaluators' mistakes and the importance of sharing them with others so that others don't repeat them. Enjoy!
Monday, 6 May 2013
What men can do to end gender violence
An impressive TED talk, not only for beginners - click HERE to see it. Jackson Katz encourages "bystanders" to stop "laughing along or pretending not to hear it". The speech is not totally respectful of feminist organisations (shame on you if you're more interested in the video now ;-) but it is definitely worth watching.
Saturday, 4 May 2013
Quotes from the "Politics of Evidence" Conference
In late April, the "Politics of Evidence" conference brought together (at the IDS in Brighton) development and evaluation practitioners from a rich mix of professional backgrounds and countries. Some of the papers presented are available online, such as Rosalind Eyben's article on the power of results and evidence artefacts. Eyben's paper is about tools initially designed to stimulate critical thinking which have mutated into instruments of confusion: for example, the log-frame was built to encourage people to reflect on the assumptions behind their "project logic" - but in most contemporary versions, the "assumptions" column has disappeared.
Labels:
big push forward,
IDS,
politics of evidence,
RCT,
rigour