
Thinking of funding digital? Read this comparison of two early stage digital programmes funded through the Catalyst network in 2020.

In 2020 Catalyst ran two programmes for early stage digital projects. These programmes shared a vision, but approached it differently. Either of these approaches could be a viable way for your trust or foundation to start funding digital projects. Both are replicable.

This article compares those two approaches. It's not an in-depth analysis of their relative merits, but it does draw on their (differently styled) evaluations. 

Explore and the Discovery Learning Programme (DLP)

One’s a verb and one’s a noun. But they both aimed to help charities:

  • Explore a problem through a digital design lens
  • Build their digital capacity: for strategy, for using tools and for following design-led processes

Similarities

Let’s start with the similarities. Both programmes:

  • included a £5k grant
  • guided charities through a structured programme of weekly activities 
  • included lots of user research activities 

They also both helped participants:

  • identify the right problem to solve
  • develop some early ideas rooted in research evidence
  • finish with a big project retro

Differences of approach

But that's where the similarities end. Here are the differences.

Similar challenges and outcomes

Even though they took different approaches, the programmes created similar outcomes.

Better understanding of digital service design methods

By programme end, participants understood digital design and digital service delivery better. Explore participants felt 65% more confident than they had at the start. DLP participants reported similar gains in confidence.

Both sets of participants said that their organisation also felt more confident. 

Challenges in user research

Rapid user research pushes charities to speed up their approach to consultation and surveys. Both programmes asked participants to:

  • run user research interviews with 5-6 beneficiaries
  • synthesise their insights
  • repeat if necessary

Many participants struggled to recruit enough users in time. They said advance notice of this need would have helped.

Pivoting because of new insights

Many DLP participants changed the focus of their solution, or their target audience, when user research shifted their understanding of their users and their problems. Similarly, 76% of Explore participants changed their solution because of user research insights.

Organisational change

Both programmes changed how participants approached digital. Participants updated their digital strategy and adopted one or more of the better digital principles.

But the biggest impact was in how participants began applying user-centred thinking and service design methods to other problems. They even applied them to non-digital areas of their organisation. We wrote about this.

Other impacts

Our review found four other shared impacts:

  • increased confidence and ability to influence other digital funding applications
  • better connections with the peers they shared their learning journey with
  • peer-to-peer benefits from helping each other with their challenges and ideas
  • difficulty managing other tasks and activities alongside the programme, even though DLP was full time and Explore part time

Different feedback

There were also some differences in feedback. Not a lot, but some.

Time and intensity

Explore’s first cohort ran during the summer holidays. This affected staff participation and made it harder to engage other stakeholders. 

But more significant was the feedback on programme length. Explore participants felt six weeks for user research and six weeks for prototyping was a good length. It enabled deep engagement, time to embed learning and the chance to generate ideas and test them.

DLP participants felt differently about their programme. They struggled to find the time to reflect on what they were learning. But they also appreciated the DLP’s intensity. They liked how it encouraged them to put aside some of their usual responsibilities and focus on digital. They said 6-8 weeks would be a better programme length.

Impact on digital agencies 

The DLP was delivered by digital agencies experienced in working with civil society organisations. They were inspired by working with so many charities at once. They also benefited from exchanging ideas and practices with other digital partners, something they don’t usually get the chance to do.

Explore was delivered by staff from the Centre for Acceleration of Social Technology, Catalyst’s incubator.

Why the different impact?

Because the two evaluations used different methods, it's difficult to compare their impact. This means that:

  • we can’t say that one programme is better than the other
  • some differences in their degree of impact are still difficult to evaluate
  • we can’t be sure of all the reasons for differences in impact

But we do expect to be able to report more as Catalyst grows its evaluative capacity and standardises some of its ways of measuring impact. 

Read more

Three's a charm: Catalyst's discovery programme shows how funders can help charities with digital.

Seven things all charities should learn from those on the Explore programme

103 charities receive National Lottery ‘Discovery’ funding

48 organisations join the Explore programme

Photo by Ivan Bandura on Unsplash



