For the City of Greater Dandenong, the Outcomes Engine is worth the cost of fuel
When the City of Greater Dandenong Council's grants team first signed up to use the SmartyGrants Outcomes Engine, the team knew it had some challenges – in fact, it had already begun a radical overhaul of its grants program in an attempt to address them.
The program consisted mainly of many small grants, and almost all applicants were focused on the small picture, on the activities they were undertaking, rather than on the big-picture changes those activities were making for the community.
For the council’s part, its team used a lengthy internal process to assess grants, and the structure of the applications meant that pulling out relevant data for comparison took a lot of time from a lot of people.
When the team learned of the Outcomes Engine, it seemed like a godsend, and the council quickly signed on as a beta user.
The Outcomes Engine is a SmartyGrants feature that helps grantmakers to answer the perennial question “Did our grants make a difference?”
The system helps grantmakers tackle social impact measurement by providing the ability to upload an outcomes framework (a list of outcomes and associated metrics) that can be applied to any programs and rounds and linked to forms and acquittals.
But as the City of Greater Dandenong Council's team quickly found, it still requires grantmakers to put in the grunt work needed to be very clear about their intended outcomes.
When the grants team went looking for a plan from which to draw its outcomes, it was faced with many, many internal council plans. After much discussion, the team selected nine major plans containing more than 200 objectives and outcome statements. It took the grants team and SmartyGrants staff days of work to turn these into a list of 200-plus outcome statements suitable for the grants program. Among other things, this meant making sure the outcomes were focused on changes in the community, not changes in the council.
Community development funding officer Monique O’Keeffe describes it as a “learning process”.
“We wanted to stay true to council plans and strategies, so we didn’t want to change the intent of the objectives or delete any, but there was a lot. As we started to pull together a list of outcomes, we found crossovers between the different plans and strategies, leading to similar outcomes in different areas. However, what this did was start conversations internally with subject matter experts in each department for a more consistent approach across each plan and to focus on priority outcomes for this current period.”
Meanwhile, the date for opening the next round was approaching fast, and the team decided that the final list of outcomes (200-plus) was way too long to insert as drop-down choices in the application forms. Instead, the team decided to allow applicants to suggest their own outcomes. The grants team would then update each application at the back end by selecting the City of Greater Dandenong Council's outcome that best matched each applicant’s outcome.
It sounded good in theory, but the quality of the applications surprised the grants team.
“What we found was the majority of applicants did not write outcome statements as we would have expected,” Monique said.
“Many of the outcome statements were, in fact, activity statements, and if they were outcomes they were often multi-barrelled, containing more than two outcome statements, or just unrealistic for a two-year grant – for example, ‘solving poverty in Dandenong’.”
“So our plan was not going to work. We had to revisit our approach and we decided to run a series of workshops with the successful applicants.”
These workshops aimed to help grantees to:
- identify and articulate outcomes and measures for their projects
- improve their understanding of data collection
- update their applications with something that the council could work with internally.
The grants team decided that each successful applicant (there were 15 in total) would be required to attend a workshop, and scheduled three sessions.
But first, SmartyGrants’ chief impact officer, Jen Riley, ran a one-hour online training session to review some of the basics of outputs versus outcomes, and the types of outcomes that can be expected in the short and medium term.
“What is challenging about the outcomes space is that the language seems so confusing,” Monique said. “It was good to have a session that provided clarity on what we meant by outputs and outcomes and also spoke about what sorts of outcomes are realistically achievable in the two-year granting window.”
Almost all the applicants attended the workshops, and their feedback was very positive. They felt that getting a deeper understanding of outcomes, and how to measure them in their programs, was something that should be available to all community organisations wanting to access grants.
After the workshops, applicants were asked to resubmit their outcomes, and the grants team has since reviewed each application to align those outcomes with the City of Greater Dandenong Council's outcomes.
Grant applications are now assessed by community panels, and opportunities on offer include grants of significant amounts – up to $160,000 over two years. These grants are aligned with council outcomes.
Monique said the whole process of implementing the Outcomes Engine had been educational.
“For council, we really want to continue the conversation across departments around consistency in language … and the priority outcomes for funding … rather than the extensive list of council outcomes. This would make the structure of the outcomes framework easier to translate and clearer to report upon in our council systems, which presents the opportunity to standardise not just in council but potentially Australia-wide for comparison of data.
“Our diverse community is at the heart of what we do and we invest a lot of time in actively engaging with organisations that serve our community in our grants process. This provides so much on-the-ground feedback and knowledge that feeds back into how we can improve the program, our responsiveness and ultimately the outcomes for our community.”
More than six months on from those workshops, Monique and her team have found that the 15 grantees are focusing on quite a small number of outcomes, which raises questions about what is not being focused on or funded.
“There are so many learnings to take from this experience for when we do it again,” Monique said. “My keyword on this journey has been ‘iterate’ – reflecting on the grants program and understanding our own theory of change for community development and grants funding was part of this process and we have a number of learnings to now integrate into the upcoming policy review.”
Ask Jen more about outcomes and evaluation
SmartyGrants’ chief impact officer Jen Riley has more than 20 years’ experience in the social sector, having worked with government and large not-for-profits of all kinds in that time, and been part of leading firm Clear Horizon Consulting. She’s a specialist in social sector change with skills in strategic planning and in program and product design and management. If you’ve got a pressing question about evaluation and outcomes measurement, ask here! You’ll find the answers on the SmartyGrants forum (available to SmartyGrants users).