Research Consultation Exercise Results

We at Charity Futures are delighted to share the results of the research consultation exercise which we have been carrying out in conjunction with Giving Evidence.

Our two studies were about:

  1. Demand: This asked UK charities and donors (of all types) what they want more research about. It was an open consultation process run over 15 months, through focus groups, surveys and a workshop, which invited any charity or donor to suggest questions for research, and then invited any charity or donor to vote to prioritise the list. It resulted in a prioritised list of 24 questions (listed here). It adapted a method developed in medical / health research for consulting patients and their carers about their priorities for medical research. Download the ‘demand’ findings here, and more details are here.
  2. Supply: This investigated what research already exists about UK charities and philanthropy; what topics it does and does not cover, and what methods it uses. It used systematic review methods, and was led by The Evidence for Policy and Practice Information and Co-ordinating Centre (EPPI Centre) at University College London, precisely because they are experts in systematic reviews but are outside the charity / philanthropy sectors. Download the ‘review version’ of the ‘supply’ findings here, and more details are here.

Combined, the two sets of results form a ‘gap analysis’ and show major areas where more research would be valued by charities and donors, who are among its intended users.

There were some surprises, both in the issues and questions that arose and in those that did not.

The biggest finding was probably how disengaged charities and donors are from academic research. Half of our survey respondents said that they use academic journal articles ‘never’ or ‘hardly ever’. We heard many views such as “I find that most of the research is very academic and doesn’t reflect the reality of charities” and “It [academic literature] is cleverly written and that language is putting me off right away”.

We found that the existing literature is small. This is hardly astonishing, and perhaps a function of funding. We included: (i) studies conducted anywhere which include data about UK charities and/or giving, and (ii) studies by the UK specialist centres, both since 2006. We found 184 relevant studies in total. That includes 109 journal articles and 83 academic studies produced by the UK academic centres, plus some ‘grey literature’.

The big gaps were around:

  • Measuring / understanding / communicating / increasing impact. These topics dominated the demand list, but we found few existing studies about them.
  • Robust studies of what works. To be clear, our scope did not include research about interventions (e.g., which programmes reduce rough sleeping) but did include research about management practices, e.g., whether / when / how charities can best collaborate, which ways of engaging beneficiaries improve outcomes, which modes of governance improve outcomes / reduce costs, and which financial management approaches reduce costs / increase efficiency. We found many studies interested in these kinds of topics, but very few that used a method capable of providing a rigorous answer: we found only seven randomised controlled trials (the ‘gold standard’ method for a single study), none of which was produced by a UK specialist centre, and only seven systematic reviews, which are the best way to use all the existing studies to answer a question.
  • Given the interest in ‘what works’ (in both demand and supply), we were surprised that most existing studies are observational studies describing charity and philanthropy (87 studies).
  • There were some requests for things that already exist, such as specific evaluations of specific organisations, and for research methods, e.g., to identify the impact of interventions in people’s complex lives. This implies an opportunity for training and wider sharing of material and methods that already exist.
  • There is masses of material in the broader academic literature which could be useful to charities and donors – e.g., about interventions and management practices – but which they clearly need help to find and use. This is a training and support activity (Giving Evidence has done some relevant work on it before, and would happily work on it more.)
  • The literature on charities and philanthropy is thinly spread and badly coded. The standard way to find studies on a particular topic is via the ‘keywords’ with which authors and publishers tag their studies. Of the 110 journal studies that we found, 47 used no keywords at all (which makes it very hard for anybody to find them), and the other 63 studies used 232 keywords, of which 204 were used only once. This makes it hard to find the right material. It also suggests that the literature is thinly spread; this matters because several studies of the same thing are often needed to get a clear answer.
  • Most existing research about charities and philanthropy looks at phenomena / behaviour unique to these sectors, e.g., fundraising and managing volunteers. It could be powerful and relatively easy to ‘translate’ some research and findings from other disciplines and the wider management literature, for use by charities and donors.
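For readers who work with bibliographic data, the keyword-tagging problem described above can be illustrated with a minimal sketch. The study records below are invented for illustration only (they are not the actual dataset from the review); the counting logic is what matters:

```python
from collections import Counter

# Hypothetical study records: each entry is the list of author-supplied
# keywords for one study. (Invented examples, not the review's 110 studies.)
studies = [
    [],                                    # a study with no keywords at all
    ["philanthropy", "fundraising"],
    ["volunteering"],
    ["philanthropy", "governance"],
    ["donor behaviour"],
]

# Studies that cannot be found via keyword search at all.
untagged = sum(1 for kws in studies if not kws)

# How often each keyword is used across the whole corpus.
usage = Counter(k for kws in studies for k in kws)

# Keywords used only once: each points to a single, isolated study.
singletons = [k for k, n in usage.items() if n == 1]

print(f"{untagged} of {len(studies)} studies have no keywords")
print(f"{len(usage)} distinct keywords; {len(singletons)} used only once")
```

Run on the real corpus, the same tallies would reproduce the figures quoted above (47 untagged studies; 232 keywords, 204 of them singletons), and a long tail of singleton keywords is exactly what makes topic-based searching unreliable.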

Findings of the demand study

Impact dominates the list. The top 24 questions are dominated by questions about how to measure impact (i.e., research methods), how to use impact measurement data to improve effectiveness, and how to communicate better about impact with external audiences, notably funders. Impact accounts for all of the top seven questions; eight of the top 10 questions; and 14 of the top 24. We disallowed questions about fundraising, simply because researchers have not overlooked that topic.

A priority question was ‘Which interventions are most effective (or least effective) and why, within a charity sector (i.e., for a specific problem, or specific context)?’

There was also interest in the effectiveness of funders: e.g., “How do grant-makers currently assess their effectiveness? What ways of giving can improve grant effectiveness?” and “How effective are the approaches used by funders to monitor and evaluate charities?”

Beneficiaries feature directly in three questions. One might have expected more.

Surprisingly, the list omitted anything directly about:

  • HR management: nothing about recruitment, structuring compensation, retention or managing performance, whether relating to staff, volunteers or trustees.
  • Finance: nothing related to (for example) whether and when to take on debt or how to manage it, managing the allocation of risk in contracts, social impact bonds, calculating the full cost of work, negotiating contracts, or whether / when to walk away from contracts (e.g., if the funding is inadequate). This omission is particularly remarkable given how many charities are involved in commissioning processes, which involve determining prices for services and negotiating contracts.
  • Ethics of research, e.g., around human subjects. This is surprising in view of requirements of the new GDPR, and that most charities’ monitoring and evaluation is research on people.
  • Understanding the location, nature and cause of need.
  • Environmental issues.
  • Influencing policy.

Surprisingly few questions concerned running the charity / internal issues: most of the list is about communicating with outsiders, notably funders. Governance occupied only two of the 24 questions.

There was nothing about whether / when to collaborate. Collaboration appears on the list of 24 questions only in (i) a question about what aids and hinders it (which is a good question, but not about whether to do it) and (ii) a question about collaborating to demonstrate impact.


Findings of the supply study

We looked at (1) all academic journals globally for studies published since 2006 which contain data about UK charities or philanthropy (a date chosen to coincide with the Charities Act 2006); and (2) the research produced by various UK academic centres specialising in charity / philanthropy[i] since 2006.

Unsurprisingly, the literature is small: we found 184 relevant studies in total. That includes 109 journal articles, and 83 academic studies produced by the UK academic centres, plus some ‘grey literature’.

We found that most studies look at issues peculiar to the charity sector, such as volunteering and fundraising. Few looked at the broader literature (e.g., around management, economics, diversity, behavioural insights) and how it could be applied to charities or giving.

The largest clusters were around:

  • What works and why (37 studies). These mainly addressed donor behaviour (28 studies), e.g., mechanisms of charitable giving, and fundraising activity (13), e.g., different fundraising activities such as direct mailing. Governance studies (9) investigated, for instance, the impact of government contracts. Communications studies (9) included assessments of marketing strategies.
  • Donor behaviour (28 studies), funding and income (19 studies) and fundraising (13 studies).
  • Distribution / scope of charity and philanthropic activities (12 studies).

There were some surprising gaps, including gaps relative to the demand that we found:

  • We found few studies about impact measurement methods or involving beneficiaries (either potential or actual beneficiaries).
  • We found no studies of ‘societal outcomes’, e.g., whether / when / how charities’ campaigning and lobbying efforts succeed. This is a surprise given their historical importance and the fights for charities to retain the ability to campaign and lobby.
  • Given the interest in the literature about effectiveness (‘what works and why’), we were surprised to find only seven randomised controlled trials (the ‘gold standard’ for a single study to establish the effect of something), none of which was produced by a UK specialist centre. We found only seven systematic reviews, which are the best way to use all the existing studies to answer a question such as ‘what works?’.

[i] We looked at the academic centres specialising in charity / philanthropy at the following universities: St Andrews, Birmingham, Plymouth, Southampton, Sheffield Hallam, the Open University, Newcastle, Kent, Cass / CGAP, and the Marshall Institute at LSE.


