Who is evaluating the evaluators?

Posted on 17 Aug 2018

By Matthew Schulz, journalist, Our Community

Social impact measurement isn't new, but the demand for quality practitioners is growing rapidly.

Once, impact measurement was an afterthought to grants, but now it's regarded as a critical issue that needs to be factored into planning any program or social investment.

Evaluations are now integrated into funding models and expected at every stage of projects. Yet finding the right talent is a big issue.

So how does someone become an evaluation specialist? And how can you spot a good one?

Those questions are at the forefront of the minds of members of the Social Impact Measurement Network Australia (SIMNA), founded in 2012 and now connected to Social Value International, which links practitioners in 45 countries.

SIMNA is active in sharing knowledge and resources through events and training, setting standards, hosting annual awards for good evaluators, and pushing for better policies and debate.

Social Impact Measurement Network Australia practitioners discuss the state of the industry. Picture: Marli White.

Its membership of more than 1000 ranges from dabblers to leaders in standards and practice.

They are in no doubt that good evaluators are needed now more than ever. At a networking session late last year, members asked:

  1. What are the barriers to qualified evaluators?
  2. What are we doing to build expertise?
  3. How can we build a pipeline for practitioners?

Systems thinking an important attribute for evaluators

Many practitioners admit having "fallen into" impact evaluation from other fields.

Kelly Sparke is a case in point, having worked in juvenile justice, the corporate sector and engineering before becoming a data analyst and community insights manager at the Lord Mayor's Charitable Foundation in Melbourne.

She helped produce Melbourne's Vital Signs report, which featured in the November 2017 edition of Grants Management Intelligence, and was among panellists at the networking session.

Some practitioners have occupied program management roles, or have stepped into the breach after holding technology or research-related positions.

Trends in data, design and theory come and go, but a common thread among many practitioners and recruiters is the importance of systems thinking. Although people drawn to the field tend to have an aptitude for it, it's a skill that can be learnt.

MIT's Peter Senge says systems thinking is underpinned by the notion "that we live in webs of interdependence".

He says a systems thinking approach has three crucial characteristics:

  • A deep commitment to learning
  • Being prepared to be wrong, and challenge your own "mental models" to allow you to find unexpected areas of "leverage"
  • Being able to "triangulate"; that is, getting different people with different views to come together to see collectively something that they might miss individually.

Good evaluation draws on collective intelligence

Melbourne University's Dr Ghislain Arbour.

Dr Ghislain Arbour, coordinator of the Master of Evaluation at Melbourne University, is among those shepherding some of the best evaluation practitioners through a more formal training system, even though he says impact evaluation remains "a very immature field".

Originally qualified in law and public policy analysis, Dr Arbour told the group that his first ambition was to be an inventor, but he says a problem-solving spirit continues to drive his evaluation philosophy: to "know things better … to do better".

He said one of the keys to impact evaluation is that "it doesn't put all the onus on one smart individual, but draws on collective intelligence".

Dr Arbour said the keys to good impact assessment are method, design and theory.

Mastering method meant being proficient in the "old-fashioned but essential" techniques of research and data collection. And a good evaluator is not only an expert in their chosen field, but is also able to think theoretically and to design research well, he said.

Recruiting for tomorrow

Another leader in the field at the session was Dr Jess Dart, the founding director of Clear Horizon Consulting, who has more than 20 years of experience in the field, including significant work in designing and conducting large-scale evaluation reviews.

Dr Dart says the skills Clear Horizon seeks have significantly changed, and advances in technology and techniques mean the focus is now "recruiting for tomorrow".

Alongside the ability to learn and a great work ethic, Clear Horizon looks for the "ability to step beyond the obvious" using such methods as systems thinking and theory of change.

That's not to say talent is hard to find: its entry-level research assistant roles have attracted up to 70 applicants.

Dr Dart believes good evaluators need to be talented facilitators, particularly those involved in developing a "theory of change".

She says would-be practitioners should be conversant with the monitoring and evaluation standards used by the Department of Foreign Affairs and Trade (DFAT) to guide its overseas aid program. The 42-page document details a process of planning and review, one that sets a quality-assurance baseline for evaluations, monitoring and evaluation systems, and design.

The document, she says, emphasises the need for both flexibility and rigour in practice.

But Dr Dart accepts that not all organisations need to reach those "higher levels" of evaluation, nor make their evaluation designs unnecessarily elaborate.

Revisiting the concept of collective intelligence, Dr Dart says not all evaluators have all the "hard and soft" skills needed for effective evaluation, and creating a team can enable an organisation to stitch together the qualities needed. The team's skills must bridge the following:

  • technical (quantitative and qualitative)
  • facilitation
  • project management
  • engagement.

Dr Dart says recruiters wanting to round out their teams could look for talent with experience in program management, or knowledge of the social sciences and politics, for example.

Understanding your organisation

Picking up on Dr Dart's point about keeping evaluation appropriate to the organisation, Vanessa Power, the manager of performance and systems for the Brotherhood of St Laurence, said that for NFPs and smaller groups it was critical that evaluators ensure they are "collecting information in a way that's not overly burdensome", and that they are "respecting what you don't know".

Partly, this reflects the fact that grantmaking can occur over long cycles, which also means contracts and evaluations must be considered well in advance.

She said it was important to regard evaluation as not just "celebrating achievements", but as an objective and impartial analysis that leads to improvements.

SIMNA Victorian steering committee member Marli White and others at the session relayed a string of useful resources for evaluators.