How to measure your outcomes when you've only partly funded a project

Jen Riley, chief impact officer at SmartyGrants, spoke with Squirrel Main about reporting on part-funded projects.

Squirrel Main, Evaluation Manager, Paul Ramsay Foundation

As a grantee, you might have wondered: how are we supposed to report on the outcomes of a $10,000 grant that we combined with a bunch of other grants to fund a project worth $100,000?

Or perhaps your organisation has self-funded the construction of a new building with a budget of $1 million – except for a $100,000 contribution from council. Now council wants to know the outcomes of its grant. What will you tell them?

What can a funder reasonably ask of grantees in terms of outcomes measurement, and what outcomes can the funder claim to have contributed to?

It's like asking how to assess individual contributions when you're playing a team sport.

The evaluation manager at the Paul Ramsay Foundation, Squirrel Main, deals with scenarios like these regularly, and we asked her for some insights.

SmartyGrants chief impact officer Jen Riley

As a general rule, Main said, it is sensible to report on the whole rather than try to disaggregate. The work of trying to “unpick” a project is akin to trying to work out which ingredient in the vegetarian laksa makes it amazing (is it the ginger, the fish sauce or the garlic?).

Occasionally there are exceptions to this rule, where it’s easy to identify the difference made by one funder’s $10,000. For example, if a grant enables you to expand your literacy program from Bathurst to Orange, then you can report on the program’s Orange outcomes.

But more often, it’s not that simple. A grant of $10,000 might enable you to purchase some new books or laptops for the literacy program, or a new oven or a saw for the men’s shed. It might enable you to offer a higher salary for the position of program coordinator, which means you draw more and better applicants.

Most likely, that $10,000 grant goes towards the overall project budget (along with six other grants of various amounts), and the thought of trying to work out exactly which line item it goes against makes you want to cry.

So what do you do? The geographic option is off the table, the did-we-buy-something-specific option is not useful because the oven and the saw by themselves did not achieve outcomes, and your new program coordinator is away on long-term carer’s leave. How do you report?

Top tips for calculating outcomes for part-funded projects

Main suggests calculating a proportion, and she recommends two methods (a short worked example follows the list):

  • Calculate a fiscal proportion, so if the grant was 10% of the budget, then you can report that 10% of the outcomes were contributed by that $10,000 grant.
  • Calculate a stated proportion, meaning you work out a proportion that the funding “contributed to”. For example, if the funding went towards paying the rent, which meant you actually had somewhere to run the program, you could state a proportion (say 50%) and make the argument that 50% of the outcomes came as a result of that funding.
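To make the arithmetic concrete, here is a minimal Python sketch of both methods. All of the figures (the budget, the grant amount, the outcome count and the 50% stated share) are invented purely for illustration.

    # A minimal sketch of the two approaches, using made-up figures for illustration.

    grant = 10_000           # the individual grant being reported on
    total_budget = 100_000   # the whole project budget
    total_outcomes = 250     # hypothetical: total participants with improved literacy

    # 1. Fiscal proportion: the grant's share of the overall budget.
    fiscal_share = grant / total_budget              # 0.10, i.e. 10%
    fiscal_outcomes = fiscal_share * total_outcomes  # 25 participants

    # 2. Stated proportion: a share you argue for with evidence,
    #    e.g. "the rent made it possible to run the program at all".
    stated_share = 0.50                              # your evidence-based estimate
    stated_outcomes = stated_share * total_outcomes  # 125 participants

    print(f"Fiscal proportion: {fiscal_share:.0%} of the budget -> "
          f"{fiscal_outcomes:.0f} of {total_outcomes} outcomes")
    print(f"Stated proportion: {stated_share:.0%} -> "
          f"{stated_outcomes:.0f} of {total_outcomes} outcomes")

The fiscal proportion is mechanical; the stated proportion is a judgment call you need to be able to defend, which is where the next step comes in.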

There are various ways of working out this stated proportion.

One is to ask the beneficiaries: on a scale of one to five, how much did the saw contribute to you being here at the men’s shed? Then work through some maths. If on average the response across 10 men was “three”, then three out of five or 60% is the figure you could report to the funder.
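If it helps to see that averaging step written out, here is a small Python sketch; the ten individual responses are made up for illustration.

    # Hypothetical 1-to-5 responses from ten men's shed participants to
    # "How much did the new saw contribute to you being here?"
    responses = [3, 4, 2, 3, 3, 5, 2, 3, 4, 1]

    average = sum(responses) / len(responses)   # 3.0 with these made-up numbers
    stated_share = average / 5                  # three out of five = 60%

    print(f"Average response: {average:.1f} out of 5 "
          f"-> stated contribution of {stated_share:.0%}")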

Whether or not you do the maths on splitting your outcomes and reporting on 60% is up to you and the funder. You could report the whole and specify in your report to the funder that you have undertaken contribution analysis, and that your beneficiaries and other sources of evidence suggest that 60% of the results can be linked back to the funding.

Main said she would avoid using the word “attributed” here. Attribution in social outcomes reporting is very complicated (it is easier to attribute causal effects in the controlled environment of a petri dish than among humans in the messy social world), so instead we talk about contribution analysis.

In 2008, Canadian evaluator John Mayne wrote an article called “Contribution analysis” to help with this challenge of how we “claim” that x caused y. He produced a methodology that allowed evaluators to put forward a reasonable case, rather than conclusive proof, that interventions or certain elements within an intervention led to the change.

The point John Mayne makes, and Squirrel Main echoes, is that it is useful to come up with a plausible, evidence-based narrative about why $x contributed to y% of your program’s outcomes.

The evidence base can come from asking your beneficiaries, looking at counterfactuals (for instance, hardly anyone turned up at the men’s shed before the new saw was installed), and plain common sense. For instance, without the rent being paid on the men’s shed building there would be no outcomes at all, but to say the rent contributed to all the outcomes discounts the impact of the other inputs. So work out a “stated proportion” and state your case as to why it passes the pub test.

In all of these scenarios, there is a considered dialogue about outcomes. In some cases, it may be okay to state the overall outcomes and report to the grantmaker that their funding contributed to those outcomes without stating a proportion.

The point is that grantmakers and grantees are talking about what works, and all play a role.

Successful evaluation is less about “claiming success” than it is about being curious about what contributes to change, so we can cooperate to understand what works and apply the lessons we learn.

Find out more about outcomes

Measuring outcomes for grantseekers webinar recording

When your 'outcomes' are way off target

Ask Jen more about outcomes and evaluation

SmartyGrants’ chief impact officer Jen Riley has more than 20 years’ experience in the social sector, having worked with government and large not-for-profits of all kinds in that time, and been part of leading firm Clear Horizon Consulting. She’s a specialist in social sector change with skills in strategic planning and in program and product design and management. If you’ve got a pressing question about evaluation and outcomes measurement, ask here! You’ll find the answers on the SmartyGrants forum (available to SmartyGrants users).