Four reasons why 50% of applications didn’t succeed in getting funding - and 11 tips to make sure your next one does.

In the past month we have assessed 228 grant applications for the Catalyst & The National Lottery Community Fund COVID-19 Digital Response fund. We have been privileged to get close to the challenges that charities are facing during this crisis, and have seen the incredible work that they are doing in adapting and responding to the needs of the communities they serve.

We have read about the challenges brought about by digital exclusion, online safeguarding and the urgent need to upskill staff and volunteers across multiple platforms.

We are witnessing a monumental acceleration in the digital capacity and capability of the third sector. Charities are right at the forefront of accessible, inclusive, safe and ethical tech and it’s a critical priority for the sector right now. 

But we can’t fund every charity - although we wish we could. Of the 228 we have assessed so far, 114 have been successful (103 Discovery grantees and 11 Development grantees). The aim of this article is to explain openly why some organisations did not succeed, and to give every applicant the best chance of success next time.

How we assessed applications

Each application went through four stages over a two-week assessment period (from application closing date to decision of award).

  1. Due diligence checks
  2. Independent assessment scoring by at least two assessors
  3. Assessment panel shortlisting
  4. Decision making

Here are four of the main reasons why applications were turned down. 

Reason 1: They didn’t pass our eligibility and due diligence checks

44 applicants failed to pass due diligence checks. During these checks we looked at the most recently published accounts (either at The Charity Commission or Companies House) to make sure they were up to date. Then we looked at their projected budget (to see how they planned to spend the grant) and checked all of the requisite supporting documents (e.g. insurance and policies). We also assessed their commitment to the programme, based on whether they had staff capacity and support from different levels of the organisation.

The main reasons applicants failed due diligence checks were:

  • Their budget didn’t stack up - This was usually because the money was all allocated to capital items, leaving no budget for digital support or staff time. We provided a budget template but some applicants chose to use their own budget formats.
  • They could not provide recent accounts - If applicants couldn’t provide accounts from the current financial year then we were unable to accurately assess their status. 
  • Insufficient reserves - We have to think about an organisation’s ability to complete the programme, so if applicants couldn’t show that they had reserves to cover three months or more of operating costs then that could be a cause for concern. 
  • Key documents weren’t attached - If the specific documents we requested (such as public liability insurance, bank statement, list of signatories and recent management accounts) were not attached then we were unable to move forward with the application. 
Tips on overcoming these issues in future applications:
  • Check the eligibility requirements in the FAQs and read through the questions before you start to see what supporting documents you will need (especially in sections such as ‘Financial Position’ and ‘Key Policies’)
  • If templates are provided (for example for a budget), use them rather than your own. That way you will give the assessors the information they need.

Reason 2: Lack of evidence that applicants understand the problem to be solved

As an assessor you can only make a decision based on the evidence put before you. If that evidence is presented clearly then making that decision becomes much easier. This is especially important when decisions are made by consensus during a three-hour panel meeting.

You want to feel confident in that meeting. And the most important criterion is whether the application shows that the applicant understands the problem to be solved.

The main reasons applicants received low scores during assessment were:

  • They failed to show how they had challenged their own assumptions
  • They didn’t show evidence of having learned about or validated the problem for their users (or to put it another way, they didn’t provide any evidence that the thing they wanted to change was a problem their service users needed to have solved).
  • They didn’t provide supporting evidence of having gone through a discovery process. This could have taken the form of anonymised insights generated through user interviews, a summary of their user research methodology and outcomes, evidence of the numbers and diversity of users they spoke to, and / or insights showing the needs and behaviours of users.

Tips on overcoming these issues in future applications:

  • If you’ve talked about user research, then make sure you attach (anonymised) evidence of it to the application
  • Try to carry out a combination of qualitative and quantitative research
  • Show the journey you’ve taken: your assumptions, how you challenged those assumptions and how that led to what you’ve learnt about your users’ needs and behaviours. A free online Design Hop can help you with all of these steps.
  • Show that you understand the needs and behaviours of your users by using clear and concise user needs statements - e.g. ‘As a staff member at a local centre I need to securely share case notes with my team’

Reason 3: Presenting a solution without defining why it’s the right solution

When an applicant presents us with a solution that they want to develop (such as a mobile app, a chatbot, an e-learning platform, or a digital knowledge hub), we want to feel assured that this is the right solution to the problem. The last thing we want is for a charity to end up with a piece of tech that isn’t sustainable in the long term and doesn’t fix the problem. 

Assuming the applicant is really clear on the problem they want to solve (see Reason 2), we now want to hear why they’re confident that the solution they are proposing will solve it. We’re looking to understand:

  • What other solutions have been tried or piloted? 
  • Who else has already tried to solve it? 
  • Why have previous solutions failed? 
  • What makes this solution different? 
  • Are there existing solutions or platforms that can be built on or reused?

Tips on overcoming these issues in future applications:

  • You don’t have to be specific about the technology, software or platform you want to use. But you do need to be specific about what the outcomes of the solution should be (e.g., “The solution is an online learning platform that enables adults with learning difficulties to access our online training” rather than “We need 10 licences to the E-Learning4U platform”)
  • Show that you have looked at all of the options. Talk to other charities in the same boat as you, see who’s already tried to solve this problem and how. The Catalyst Shared Digital Guides might be a good starting point.
  • If there’s a specific platform you need to use (though remember, that level of specificity isn’t essential at this stage), then make sure you can demonstrate why you believe this is the best solution. For example “We’ve piloted three of our shortlisted platforms following a survey of 10 charities in our sector. E-Learning4U met our must-have priorities, tested positively with six of our target users and can integrate easily with our CRM system”
  • You could always try a free call with a digital expert via the Digital Candle service to give you a steer.

Reason 4: Applicants didn’t say how the work would benefit their network or the wider sector

As a funder, we want to be able to offer the greatest value for money. And for us, optimum value lies in something that benefits people, society and / or the sector more broadly. 

The prospect of one solution being shared and re-used multiple times by a wider group excites us. We’re not the only funder who feels this way. 

For example, imagine you’ve identified a need to upskill your volunteers so that they can offer online support to your beneficiaries. You’ve done your user research, piloted a number of different online platforms and tested some different training methods and materials. 

Now you want to develop a suite of e-learning courses for your volunteers, but you need some specialist digital support to set it up. Imagine then if you had a network of 10 similar charities who were also looking to upskill their volunteers, and with whom you were committed to sharing your solution. Imagine how a funder would feel knowing that you were planning to share with this extended network. Your commitment to sharing your solution is likely to increase the value of your application to any funder.

Tip: Arrange a short session with a few of your peers to talk about your challenge. Find out if others are having the same problem. You could try a Coffee Connections call as a starting point.

Don’t give up 

We know it’s not nice to be turned down for funding. And we appreciate how much time and effort goes into making applications. But if you’re one of the charities that wasn’t funded this time around, please don’t be disheartened. We will do our best to link you to other support, and will make sure that you hear about future funding opportunities through our newsletter, social channels and website.
