Charities are increasingly required to demonstrate that they are achieving impact, based on robust evaluation of their work. While it is important to ensure resources are being used effectively, undertaking evaluation often poses challenges for charities, which may well have limited resources or experience with which to do this.
Funded by Nesta’s Centre for Social Action Innovation Fund, the National Institute of Economic and Social Research (NIESR) has recently been working with The Access Project (TAP), with the aim of strengthening TAP’s capacity to evaluate its own effectiveness. TAP supports young people from disadvantaged backgrounds to progress to top universities through one-to-one tutorials, aimed primarily at helping students to achieve better grades, along with multi-faceted university application support, enabling students to make informed choices. NIESR delivered guidance and tools for evaluating both of these elements. In this blog post, NIESR and TAP reflect on the project, with the aim of sharing their experiences with others embarking on similar work.
1. Why do you think impact evaluation is important in the social sector?
NIESR: Charities need to know about the impact of their activities so that they can ensure their efforts are having their intended effect. That’s not to say that impact evaluation will necessarily be able to capture all of the valuable outcomes produced as a result of a charity’s work – some outcomes may be extremely difficult or impossible to measure, and it doesn’t mean those outcomes are not important. However, just because it may be difficult to measure the impact of something, it doesn’t mean we shouldn’t try!
TAP: We really think that, as a charity, social impact is our bottom line. We exist to make a difference in the lives of the young people we work with and need to be able to evidence that difference. More generally, there’s been an increased push across the sector for evidence-based interventions. In education particularly, as a data-rich field, organisations are now expected to be able to show the effects of their work based on credible methodologies. Working with research institutions such as NIESR then becomes key; they have the expertise that small charities lack and can offer support not just with one-off evaluations but, more importantly for us, with building our own evaluation capacity.
2. What were some of the challenges you encountered during this project?
NIESR: A key challenge in this project was thinking about what form of evaluation would be feasible for TAP to carry out themselves and sustainable long after the project finished. This meant taking into account data availability, selecting methods that were relatively straightforward to implement without requiring specialist software, and avoiding placing undue burden on TAP’s staff. Meeting this challenge required close collaboration to fully understand TAP’s needs and operation. Given that TAP is continuing to evolve, we also needed to ensure our work would be relatively adaptable to changes that TAP faces in the future.
TAP: While we’ve invested considerable resources internally on monitoring and evaluation before, this has been the first time we looked externally for expertise. We were particularly aware of a couple of potential challenges: mainly, speaking a different ‘language’ from the researchers, and balancing robustness with the messy reality of delivering an intervention. In practice, these challenges were smoothly ironed out in the course of our collaboration. NIESR were very sensitive to our working realities and our ability to implement the tools they were developing. In fact, most of our conversations ended up being around finding ways to marry science with the real world in terms of everything from data availability and statistics expertise to the challenges of surveying young people in a school context.
3. What was different/special about this project?
NIESR: TAP were great to work with as they are clearly very passionate about their extremely worthwhile work. It was relatively unusual to work on a project that was all about supporting another organisation to develop their skills and capacity to do their own evaluation, rather than conducting a one-off evaluation ourselves. However, TAP were already interested in and understanding of the issues involved in evaluation, which made our task much easier.
TAP: We patently did not seek a one-off evaluation. What we wanted was a set of tools we could take on and integrate into our yearly operations, both in terms of impact measurement and in terms of programme development and learning. In the end, what we discovered was that impact measurement can also be a great tool for impact management. Instead of only looking at our results once a year, many of the conclusions NIESR’s work allowed us to draw were put to use in tweaking and improving our model.
4. What were you hoping to get out of taking part in the project? How did what you actually got out of it differ, if it did?
NIESR: As well as university access being a particular area of interest for us, a strong attraction for taking on the project was the opportunity to have a direct and immediate impact on TAP’s ability to understand the effect of their work. TAP have already told us about work they’ve done or changes they’ve made that have helped them to understand and target their work better, allowing them to be as effective as they can possibly be. That has been extremely rewarding for us.
TAP: I think we definitely got what we sought – a robust set of tools for analysing our impact that we could realistically employ ourselves on an annual basis. What we also got (and did not seek) was a sounding board for our programme, our ideas around measurement and impact and our standards for evaluation.
5. What advice would you give to other charities/research organisations embarking on a similar journey?
NIESR: From the researchers’ perspective, it’s important to “embed” yourself in the work that the organisation is doing at an early stage. Three particularly important elements to understand are: what information and resources the organisation has (and what it might be able to obtain), what it’s able to do in order to achieve its goals, and, above all, what it cares about achieving. Without a strong understanding of these elements your work won’t be useful to the organisation you’re supporting. If resources had allowed, it would have been great to build a longer formal mentoring component into the project to provide more hands-on support to TAP as they undertake the first couple of evaluations.
TAP: We’ve learned a lot about cross-sector collaborations in this process and we’ve definitely become believers in reaching out to where the expertise is. Be very clear on what your aims are, both in terms of the project itself and the partnership as a whole. This involves clarity around deliverables, deadlines etc. but, equally importantly, around communication lines. Finally, I think one of the opportunities in working with external partners is that they bring a different perspective to the project and your work more generally. That is where the learning comes from!
TAP: We’re in the full swing of putting NIESR’s tools to work, gathering the data and running the analyses. So far it’s been insightful, useful and, for the evaluation geeks at TAP, pretty fun as well! It would also be great to find ways to continue our collaboration with NIESR in the future; we feel we still have plenty to learn on the impact measurement front.
Jake Anders and Lucy Stokes are both researchers at NIESR. Jake’s research seeks to understand the causes and consequences of inequality in education, and to evaluate policies and programmes aiming to reduce it. Lucy’s research interests within the field of education include issues relating to inequalities in educational outcomes, the school workforce, and school effectiveness.
Tamara Balenu is the Programme Development Manager at The Access Project, working to improve the model and delivery of TAP’s university access programme. For the last two years, Tamara has been wearing two different hats at TAP, combining frontline delivery in a London school with developing Monitoring and Evaluation systems across the organisation.