Getting evaluation right

Evaluation should be part of planning from the start, so that you know what you are evaluating and why, and so that you can test progress along the way. Establishing objectives and a story or theory of change should guide what sort of evidence you collect, alongside the requirements of the tender.

Strategic evaluation

Commissioners will need to see evidence of impact and an assessment against outcomes. Increasingly, commissioners are looking for rigorous longitudinal research, which asks the same participants for feedback before and after a service is delivered. Two fundamental evaluation principles are to collect a baseline so you can show progress, and to establish what would have happened anyway, so that you do not overclaim. If you are working in partnership, it is also important to attribute how much of the impact is due to each partner.

Depending on timescales, impact could again be shown through 'leading' or 'lagging' indicators. Lagging indicators are easy to evidence but come into play too late to make changes: for example, the number of participants in a skills programme who go on to get a job. Leading indicators give you time to make improvements: for example, checking part way through whether participants feel confident enough to make an application.

Another leading indicator might be a quality assessment. There are plenty of quality frameworks to draw on, from international standards like ISO 9001, through non-profit approaches like PQASSO, to culturally specific tools like Arts Council England's Quality Principles for work with children and young people, which are useful when looking at any participatory work.

Assuming there are good results, it's worth seeking visibility for what you've done, perhaps nationally; sharing results benefits the wider sector. How you plan to disseminate your results and share your learning should feature in your communications plan.

Using the right approaches

The tools used for gathering evidence need to be appropriate to the communities you are working with, and should allow everyone's voice to be heard. Evidence-gathering that is embedded in the activities themselves is usually the least burdensome approach for participants.

Blogs and social media are increasingly useful ways to keep communities involved, especially when they link to pictures and film. Website analytics can also provide evidence of interest and reach.

At the same time, the evidence generated needs to be right for the commissioner: for example, using recognised health measures for health projects. Sometimes this will mean compromising between the methods that suit the commissioner and those that suit participants.

You might want external expertise; there are plenty of resources available, from organisations offering guidance to specific tools you can use.