Dr Daniel Turner, director of Quirkos

It might not have crossed your radar, but 2015 has been declared the International Year of Evaluation, around which a series of events and publications aim “to advocate and promote evaluation and evidence-based policy making at international, regional, national and local levels”.

In Local Authorities, appraisal, monitoring and evaluation exercises are a common source of requests for research and data. However, there is increasing demand for evaluations of externally commissioned projects, as more services are run by local community-based organisations. These may link with activities provided across any area of Local Authority responsibility. The Whole Place Community Budgets programme is just one framework through which smaller service providers are being supported by local government.

Laudable though these efforts may be, a common worry from LA commissioners is that many local organisations they would like to collaborate with lack the capacity to engage with the funding and reporting requirements of essentially being government contractors. While financial reporting obligations are often familiar enough for established charities (which must be independently audited every year) and social enterprises (with annual Companies House submissions), demonstrating impact is a challenge for all stakeholders.

With small pilot schemes, demonstrating measurable quantitative improvement is very difficult to achieve, especially to the level of statistical significance that reassures researchers and funding panels. However, these projects can still contribute good qualitative evidence to their evaluation, from testimonials of service users and staff to workshops, interviews and focus groups with stakeholders. Having a rigorous process for analysing and presenting this qualitative data can be a challenge for groups that are less familiar with qualitative data than with quantitative data in a spreadsheet.

That was one of the reasons we designed Quirkos: we felt there was a need for software to manage and analyse qualitative data that was easy enough for third-sector providers, commissioners and funding boards to use to explore and present their findings.

Yet there is also a need to share evaluations in an accessible format so that other departments and regions can learn from them. Mini-case studies are a good way to do this: a format that presents learning points as well as summary evaluations on a single page. I was involved in developing a series of case studies on commissioning health and social care services for minority ethnic populations, which neatly shows 10 examples of how largely qualitative insights demonstrated the effectiveness of changes to service provision.

Evaluations can be powerful tools for spreading best practice, and integrating qualitative data can provide humanising detail as well as deep insight into the best ways to improve service delivery. There is no one approach to integrating qualitative data into evaluations, but there are tips for qualitative evaluations in our recent blog post.

And a series of links to other useful resources and guides:

http://www.civilservice.gov.uk/wp-content/uploads/2011/09/Qualitative-Appraisal-Tool_tcm6-7385.pdf
http://ec.europa.eu/regional_policy/sources/docgener/evaluation/doc/performance/Vanclay.pdf
http://ncsu.edu/ffci/publications/2011/v16-n1-2011-spring/vaterlaus-higginbotham.php

Photo credit: London by Roberto Trm