How will grantmakers cope with AI-generated applications?

Posted on 01 May 2023

By Matthew Schulz, journalist, SmartyGrants

ChatGPT can be unsophisticated, yet can be a powerful tool for not-for-profits.

Not-for-profits and charities are leaping onto the ChatGPT bandwagon for a range of reasons, and among the top uses is generating grant applications, faster.

Our Community is taking a keen interest in the field of artificial intelligence, and its four-person data team is using machine learning and other techniques to develop new products and insights for grantseekers and funders.

In fact, the team has been using ChatGPT’s powerful cousin, GPT-3, for testing and demonstration purposes for more than a year, using the language model to generate mock applications on its SmartyGrants platform.

Our Community data scientist Nathan Mifsud said ChatGPT had seen “the fastest uptake of a technology platform in history”, with more than 100 million people using the chatbot in its first two months.

The latest and fastest version of the platform, ChatGPT Plus, is now available at $22 per month, although free versions are likely to remain available for the foreseeable future.

Dr Mifsud said ChatGPT and other AI “language models” had the potential to assist groups with basic writing tasks, including grant applications.

These tasks include:

  • brainstorming ideas
  • summarising and synthesising material
  • rephrasing pitches or reports to serve different audiences
  • conducting research of all kinds
  • matching applications to funding criteria
  • understanding funder requests.

While Dr Mifsud said the AI output could be unsophisticated and “boilerplate”, for some not-for-profits, “it might eliminate the ‘blank page’ problem”.

The chatbot can “remember” the earlier conversation, so users can gradually improve its output by directing it to modify its responses in various ways.
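For teams that want to script this back-and-forth rather than use the ChatGPT website, the sketch below is a minimal illustration using the OpenAI Python library (the pre-1.0 interface current at the time of writing). The model name, prompts and placeholder API key are assumptions made for the example, not details drawn from Our Community’s work.

    # Minimal sketch of iterative prompting with the OpenAI Python library
    # (pre-1.0 interface). Model name and prompts are illustrative assumptions.
    import openai

    openai.api_key = "YOUR_API_KEY"  # in practice, load this from an environment variable

    # The full message history is re-sent on every turn: this is how the chatbot
    # "remembers" the earlier conversation.
    messages = [
        {"role": "user", "content": "Draft a 150-word pitch for a community garden grant."},
    ]
    reply = openai.ChatCompletion.create(model="gpt-3.5-turbo", messages=messages)
    draft = reply["choices"][0]["message"]["content"]

    # Direct the model to modify its own response, improving it gradually.
    messages += [
        {"role": "assistant", "content": draft},
        {"role": "user", "content": "Rewrite that for a local council audience and add one measurable outcome."},
    ]
    reply = openai.ChatCompletion.create(model="gpt-3.5-turbo", messages=messages)
    print(reply["choices"][0]["message"]["content"])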

He said the system had “been fed most of the internet”, some 300 billion words, and had learned “statistical relationships” between all those words and phrases.

“What sets ChatGPT apart from earlier language models is that it’s been provided human feedback,” he said.

But Dr Mifsud cautioned that ChatGPT can be an unreliable assistant, with errors liable to be hidden by a “veneer of perfect English” (or any of the 100 or so other languages ChatGPT can generate). He suggested users cross-check any material they generate against trusted sources.

Grantseekers, for example, should also seek input from the Funding Centre’s “answers bank”, which provides sample answers and questions for not-for-profits.

Dr Mifsud said the technology was also fraught with ethical issues, such as the use of poorly paid human labour to weed out “toxic responses”, the amplification of bias, and the domination of the technology by corporations.

Despite these concerns, many not-for-profit groups are keen supporters.

Digital fundraising entrepreneur Howard Lake

The founder of the consultancy UK Fundraising, Howard Lake, is upbeat about the potential of ChatGPT for not-for-profits, writing on his firm’s website that ChatGPT is akin to a “team of assistants”.

He said the system could produce material, including grant applications, in a variety of ways and in many languages, and was also set to mesh with major software platforms.

He said ChatGPT was just one of the first iterations of an AI revolution that would see many more tools and bots appear on the scene, and probably quickly.

In a follow-up commentary, Lake stressed the importance of “prompt engineering”, such as accurately framing questions, to guide your selected AI to produce the kind of content needed.

In The Conversation, University of Sydney lecturer Marcel Scharth explored that theme further, describing the value of setting key parameters to steer AIs toward a desired result.

He suggested that “prompt engineering is essential for unlocking generative AI’s capabilities” but described the process as largely “trial and error”. Prompts can also help avoid inaccuracies by instructing ChatGPT to use only reliable sources, he said.
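As a rough illustration of that idea, a prompt can spell out the role, audience, format and source constraints up front rather than posing an open-ended question. The template below is hypothetical, written in Python only for convenience; its wording is not drawn from the article.

    # Hypothetical prompt template illustrating "prompt engineering": the role,
    # audience, format and source constraints are stated explicitly before the task.
    def build_prompt(project_summary: str, funder_criteria: str) -> str:
        return (
            "You are an experienced grantwriter for a small not-for-profit.\n"
            "Audience: a philanthropic foundation's assessment panel.\n"
            "Task: draft a 300-word project description.\n"
            "Constraints: use only the information provided below; if something "
            "is missing, say so rather than inventing it.\n\n"
            f"Project summary:\n{project_summary}\n\n"
            f"Funder criteria to address:\n{funder_criteria}\n"
        )

    print(build_prompt("After-school literacy program for 40 primary students.",
                       "Community benefit; measurable outcomes; value for money."))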

ChatGPT grant-writing tools are breeding like rabbits

For fundraisers seeking guidance, $25 courses on “using ChatGPT to write your grant applications” are already popping up. Boultons Multimedia, for example, offers online workshops and promises that “writing successful grant applications has never been easier”.

Media commentator Tim Burrowes asked the AI to generate a fake film funding pitch “guaranteed to win funding from Screen Australia”.

Tongue-in-cheek applications aside, grantwriters across the globe are increasingly using AI for funding applications.

In late March 2023, UK organisation Charity Excellence launched a free grantwriting tool.

The GPT-3-powered service offers “a free charity and non-profit grant application bid writing service using our AI tech bunny, primarily aimed at small charities that cannot afford a professional bid writer”.

While the tool doesn’t claim to match the services of a good grantwriter, it promises to “save them time by collecting the information they need from staff teams/groups and creating a draft for them to use in writing a grant application”.

Importantly, the service requires significant input from users, including many details about the grantseeker, project, goals, dates and proposed success measures.

Global fundraising consultants Grenzebach Glier and Associates, based in Chicago, predict AI-generated proposals will “flood the grant market”, partly because it’s now easier than ever to use AI grantwriters to “produce countless proposals”.

In Canada, the not-for-profit Project Include ran a snap assessment of ChatGPT’s performance in helping the organisation better structure grant applications and in checking whether applications met assessment criteria.

Executive director Afifa Saleem suggested the AI was imperfect but could be used to “guide your process” when it came to writing. She rated the system 7½/10 for its ability to evaluate whether grant applications addressed funders’ criteria.

In the US, grantwriting consultant and trainer Dr Krista Kurlinkus is fine-tuning the use of ChatGPT and boasted on her website that she had written a top-notch grant proposal “in just one hour”.

She was able to guide the system using prompts to generate 13 pages of a “quality grants proposal that I would feel confident in submitting to any foundation”.

Yet Dr Kurlinkus is one of a chorus of professionals who believe ChatGPT is merely a tool for grantwriting, not a replacement.

Grantwriter Kylie Cirak

Australian not-for-profit grantwriter Kylie Cirak, writing for SmartyGrants in 2018, said that funders often rejected applications for reasons unrelated to the worthiness of their cause.

“Applicants are regularly punished for not understanding your language and your needs. They are also punished for not using proper grammar, for poor spelling, for badly constructed sentences. And don't tell me you look beyond this, because 90% of us don't – we can't help but judge the way people write, the way they speak, the way they present.”

For many users, ChatGPT may be useful in reworking applications to avoid those problems.

Funding Centre grants database manager Stefanie Ball said ChatGPT was a useful research assistant, nominating the following question as an example:

“Provide a detailed summary (no more than 1000 words) of the X Foundation, including organisations it has funded in the past, the average funding amount granted, and any priority areas.”

But she warned that answers to queries of this kind must be rigorously fact-checked, with ChatGPT known for inventing spurious “facts” in its answers. The bot also generated a lot of superfluous content, she said.

“I recently assessed more than 100 applications, and several stood out – not in a good way – for using vague and non-specific language, raising suspicions they had been generated by ChatGPT. Those applications were clearly less personalised and specific to that organisation and their projects,” she said.

She predicted grantmakers would become increasingly ChatGPT-savvy, using tools such as ZeroGPT to detect AI-generated text.

She urged applicants to back their applications with hard data and case studies, whether they used AI or not.

“No one knows your organisation and its mission like you do, not even a highly capable tool such as ChatGPT,” she said.

Ms Ball said the Funding Centre provided a less sophisticated, but possibly more useful, “answers bank” for grantseekers, comprising examples of the kinds of questions they were likely to face, and answers they were likely to need.

Why humans still need to steer

Dr Paola Oliva-Altamirano

It’s a common refrain among experienced grantmakers and grantwriters that in the field of funding, “people are the answer, technology is just the tool.”

Our Community chief data scientist Dr Paola Oliva-Altamirano said humans would always be needed for decision-making.

She cited the organisation’s development of a new grants classification tool, CLASSIEfier, an algorithm-based system that automatically assigns categories defined by CLASSIE, the social sector “dictionary”.

CLASSIEfier is now part of the SmartyGrants grantmaking platform, but it is carefully monitored by people to ensure its accuracy and to avoid bias.

Dr Oliva-Altamirano explained that while AI models could be up to 98% accurate in “best case scenarios”, their operation was still “a probability game”.

She cited the example of an earlier iteration of CLASSIEfier that generated some results that “reinforced harmful racial stereotypes”.

That set off alarm bells and a rethink. The bot was redesigned to use a different kind of word-matching algorithm, with the data science team continuing to monitor the system to correct for bias where needed, she said.
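The article does not describe how CLASSIEfier’s word matching actually works. As a purely illustrative sketch, though, such an approach might score a grant description against keyword lists attached to each category and leave low-confidence results for a person to review. The categories and keywords below are invented for the example.

    # Purely illustrative word-matching classifier; not CLASSIEfier's actual
    # algorithm. Categories and keywords are invented for the example.
    KEYWORDS = {
        "Education": {"school", "literacy", "students", "tutoring"},
        "Environment": {"habitat", "river", "emissions", "revegetation"},
        "Health": {"clinic", "wellbeing", "nutrition", "screening"},
    }

    def classify(text, threshold=2):
        """Return categories whose keyword overlap meets the threshold.

        Anything below the threshold is flagged for human review, reflecting
        the point that people still validate and correct the results.
        """
        words = set(text.lower().split())
        scores = {cat: len(words & kws) for cat, kws in KEYWORDS.items()}
        matched = [cat for cat, score in scores.items() if score >= threshold]
        return matched or ["needs human review"]

    print(classify("A literacy and tutoring program for primary school students"))
    # -> ['Education']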

She stressed that humans would always be needed to validate and refine results generated by AI, particularly in cases where the AI results could carry accidental racial or gender bias, or could lead to poor decisions.

“Machines are really useful and can perform better than humans in automating repetitive tasks, but if those predictions are going to be used for decision making, it is necessary to have a human review of the results,” Dr Oliva-Altamirano said.

SmartyGrants continues to develop tech tools to assist grantmakers, such as the Outcomes Engine, and has another in the works dubbed “Tessa the Assessor”. It is hoped that Tessa and her components will streamline the assessment of grant applications.

Already the project has uncovered insights about what works and what doesn’t when it comes to automating assessments, and it has highlighted the ethical considerations of turning decisions over to a bot.

SmartyGrants executive director Kathy Richardson said while some may shy away from the idea of handing control to a robot, “automation can have its advantages; in fact, it can promote fairness.”

“A machine does not get tired and will follow consistent rules – the same cannot always be said of human assessors,” she said.

What ChatGPT ‘thinks’ about being asked to write those grants

Funnily enough, ChatGPT pretty much agrees with our experts. We asked it, “Could I use ChatGPT to write a not-for-profit's next grant application?”, and it told us that while ChatGPT can provide general information and guidance on grant applications for not-for-profits, it is not a substitute for professional grant writers who have experience in crafting compelling proposals.

“Grant applications require a thorough understanding of the specific organisation and its goals, and professional grant writers can tailor applications to specific audiences, increasing the likelihood of success,” the bot said.

“ChatGPT can be a useful tool in researching and generating ideas, but it is important to have a professional review and edit the application before submission to ensure it meets all requirements and effectively communicates the organisation's mission and goals.”

A bit clunky, but nothing a bit of editing couldn’t fix, right?

More information

SmartyGrants users: Tell us about your experience of ChatGPT on the SmartyGrants forum

Answers Bank: Sample answers and questions for grant applications

Our data science activities: Explore our projects here
