Friday 25 September 2020

More handy tips for videoconferences

An addendum to yesterday's post - ICA:UK, a reliable source of materials and training on highly participatory facilitation, has summarised 10 principles to prevent online fatigue. I've been using all of them. They work. 

Whatever you do, avoid text-filled slide shows with voices droning on in the background! Visual aids are great, but if you just show page after page of text that you read out to your audience, they'll end up muting you and joining a different event on their other computer. I admit that is what I did at a recent conference full of half-hour, text-heavy presentations by invisible voices. I couldn't help it. And since the other event followed the same format, I still felt I would have been better off reading an article, in my own time, at my own (rather energetic) pace.

Tuesday 22 September 2020

Easy socialising in tight video conferences

How to recreate a sense of a "real life" team event in a video conference? In real life (IRL, as nerds put it), people usually linger near the coffee/tea kitchen or in the hallway for a quick chat - one reason why it tends to be so hard to get participants back from "real" breakout rooms. 

"Random" virtual breakout rooms - if they don't come with too burdensome assignments - can recreate this atmosphere.

Like most facilitators I know, I have facilitated more video conferences in 2020 than ever before. I have discovered that participants tend to hijack virtual breakout rooms: Before getting started on the small group assignment, they'd have an informal chat on totally different subjects. Or, in other cases, they'd get the assignment done as fast as possible so as to spend the rest of the small group chat on their own agendas.

Thursday 10 September 2020

Know what you need to know

Evaluations often come with terms of reference (TOR) that discourage even the most intrepid evaluator. A frequent issue is a long list of evaluation questions that oscillates between the broadest questions – e.g. “what difference has the project made in people’s lives?” – and very specific aspects, e.g. “what was the percentage of women participating in training sessions?”. Sometimes I wonder whether such TOR actually state what people really want to find out.

I remember the first evaluation I commissioned, back in the last quarter of the 20th century. I asked my colleague how to write TOR. She said, “Just take the TOR from some other project and add questions that you find important”. I picked up the first evaluation TOR I came across, found all the questions interesting and added lots, which I felt showed that I was smart and interested in the project. Then I shared the TOR in our team and others followed suit, asking plenty more interesting questions.

I wonder whether this type of process is still common. Typically, you end up with a long list of “nice to know” questions that make it very hard to focus on the questions that are crucial for the project.

I know I have written about this before. I can’t stop writing about it. It is very rare that I come across TOR with evaluation questions that appear to describe accurately what people really want and need to find out. 

If, as someone who commissions the evaluation, you are not sure which questions matter most, ask those involved in the project. It is very useful to ask them, anyway, even if you think you know the most important questions. If you need more support, invite the evaluator to review the questions in the inception phase – with you and all other stakeholders in the evaluation – and be open to major modifications.

But please, keep the list of evaluation questions short and clear. Don’t worry about what exactly the evaluator will need to ask or look for to answer your questions. It is the evaluator’s job to develop indicators, questionnaires, interview guides and so forth. She’ll work with you and others to identify or develop appropriate instruments for the specific context of the evaluation. (The case is somewhat different in organisations that attempt to gather a set of data against standardised indicators across many evaluations – but even then, they can be focused and parsimonious to make sure they get high-quality information and not just ticked-off boxes.)

Even just one or two evaluation questions can be perfectly sufficient. Anything more than ten gets confusing. And budget time for a proper inception phase, when the evaluation specialists will work with you on designing the evaluation. Build in joint reflection loops. You’ll get so much more out of your evaluation.