Is giving up or down and what is the best way to tell? nfpSynergy responds to the CAF's UK Giving Report 2012


Introduction 

The recent CAF/NCVO UK Giving report has announced that giving by individuals in the UK dropped by 20% between 2010/11 and 2011/12. If true, this would indicate a catastrophic decline in the generosity of the UK public and a major challenge for UK charities.
 
If their data were about local and central government grants and contracts, we don’t think anybody would question their findings. There are plenty of charities who have announced a fall of 20% in income from government, not least NCVO! But we have yet to hear of a fundraising charity who says their donated income from individuals has declined by 20% in recent times, let alone in a single year.
 
The report from CAF/NCVO doesn’t give any examples or case studies. The one organisation mentioned in their materials is the Children’s Society, whose overall income only decreased by 4% (following a 6% decline in voluntary income). There are plenty of charities whose recent results show an increase in giving or at least maintained levels (Macmillan, RNLI, RSPCA, to name but a few).
 

What other evidence do we have about giving levels?

If individual giving was down, we might expect to see the level of Gift Aid claims dropping. The figures from HMRC for charity claims say that in 2010/11, £3.8 billion was donated by individuals under Gift Aid and in 2011/12 £3.9 billion - a modest increase of around 3%. While there has been a continued drive by charities to encourage donors to give via Gift Aid, it seems highly unlikely that this would be enough to compensate for a 20% overall decline in value or anything like it. So no indication of a drop there. 
 
Similarly we might expect the Charity Commission data to show a drop. The Charity Commission does not report on voluntary income, but it has reported that overall income increased from £55.871bn in December 2011 to £58.578bn in September 2012. Given what we know about statutory income levels dropping, it is hard to believe that both individual giving and statutory income have fallen while overall income has gone on rising. More anecdotally, Children in Need showed an increase this year over last (raising a record £26.7 million), as did Comic Relief in 2011 (another record £108.4 million).
 

What does nfpSynergy data say?

At nfpSynergy, we also track giving levels in our surveys, polling 1,000 people six times a year. Our results for 2011/12 show that 74% of the public said they had given to charity in the last three months – an insignificant decline from 75% in 2010/11 and well in line with the long-term average.
 
So we are seeing participation holding up – what about donor value? Again, the results for 2011/12 showed no significant change: a decline of all of 6p in the average claimed donation over the previous three months, from £50 in 2010/11 to £49.94 in 2011/12 (though this does mask massive volatility within the year's data points).
 
If our results (on a sample of 5,000 – 6,000 people) are showing no change in giving while NCVO/CAF are seeing a 20% decline, who is right?
 

 

|                                              | CAF/NCVO 2010/11 | CAF/NCVO 2011/12 | nfpSynergy 2010/11 | nfpSynergy 2011/12 |
|----------------------------------------------|------------------|------------------|--------------------|--------------------|
| Proportion of population reporting donating* | 58%              | 55%              | 75%                | 74%                |
| Average annual donation value**              | £372             | £324             | £199.99            | £199.76            |
| Total donations                              | £11.0 bn         | £9.3 bn          | £7.64 bn           | £7.61 bn           |

*We ask about giving in the previous three months, whereas the NCVO/CAF survey asks about the previous month

** Our research is online and the CAF/NCVO research is face to face. Our research has at least 6 data points of 1,000 people per year, while NCVO/CAF has 3 data points of 1,000 per year
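
For readers who want to see the mechanics, below is a minimal sketch (in Python) of how a national total is extrapolated from these two survey measures: adult population × proportion who report donating × average annual donation. The adult population figure is our assumption for illustration only; neither survey's exact population base is stated here, so the sketch only roughly reproduces the published totals.

```python
# Minimal sketch: national giving total extrapolated from two survey measures.
# total = adult population x proportion reporting a donation x average annual donation
# The population base below is an assumed figure for illustration, so the results
# only roughly match the totals in the table above.

UK_ADULTS = 51_000_000  # assumed UK adult population (illustrative)

def estimated_total_giving(participation_rate: float, avg_annual_donation: float) -> float:
    """Extrapolate a national total from two survey-based measures."""
    return UK_ADULTS * participation_rate * avg_annual_donation

# Participation and average annual donation figures from the table above
surveys = {
    "CAF/NCVO 2010/11":   (0.58, 372.00),
    "CAF/NCVO 2011/12":   (0.55, 324.00),
    "nfpSynergy 2010/11": (0.75, 199.99),
    "nfpSynergy 2011/12": (0.74, 199.76),
}

for label, (rate, avg) in surveys.items():
    print(f"{label}: ~£{estimated_total_giving(rate, avg) / 1e9:.1f} bn")
```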

Whose data is right: NCVO/CAF's or nfpSynergy's?

If we are honest, we think neither set of data is right. The massive fluctuations in giving levels don't correlate with any report we have seen, either from individual charity accounts or from survey data from charity professionals about how their own organisations are faring. Karl Wilding's blog makes all the right noises about why their data should be accurate: consistent methodology, representative sample of respondents, multiple data points and so on. We can make all the same arguments about our measurements. The problem is that our data and theirs just don't tie up. Nor does what charities are saying dovetail with either set of public data.
 
The issue is further complicated by the method NCVO/CAF use to measure total giving – multiplying two survey measures together. Each measure is subject to normal statistical variance, which naturally carries a margin of error. When the two figures are multiplied together, their errors compound, meaning huge swings in the headline total can result from changes that are small or not even statistically significant.
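
To illustrate the point, here is a rough Python simulation, not a reconstruction of either survey's actual methodology. In it, nothing about underlying giving changes from wave to wave, yet the extrapolated total can still swing by billions simply because the sampling error in the two multiplied estimates compounds. Every parameter below is an illustrative assumption.

```python
# Rough simulation: each wave estimates (a) the proportion who give and (b) the average
# amount given, then multiplies them up to a national total. Both estimates carry
# ordinary sampling error, and the product compounds it. All parameters are
# illustrative assumptions, not figures from either survey.
import random

random.seed(1)

SAMPLE_SIZE = 1_000            # respondents per wave (both surveys use samples of ~1,000)
TRUE_PARTICIPATION = 0.57      # assumed underlying proportion who donate
ADULT_POPULATION = 51_000_000  # assumed UK adult population
LOG_MU, LOG_SIGMA = 5.0, 1.2   # lognormal donation amounts (skewed), mean roughly £300

def simulate_one_wave() -> float:
    """Simulate one survey wave and extrapolate a national total from it."""
    donors = sum(random.random() < TRUE_PARTICIPATION for _ in range(SAMPLE_SIZE))
    avg_donation = sum(random.lognormvariate(LOG_MU, LOG_SIGMA) for _ in range(donors)) / donors
    return ADULT_POPULATION * (donors / SAMPLE_SIZE) * avg_donation

totals = sorted(simulate_one_wave() for _ in range(2_000))
low, high = totals[50], totals[-51]  # central ~95% of simulated totals
print(f"With nothing actually changing, simulated totals still range from "
      f"~£{low / 1e9:.1f} bn to ~£{high / 1e9:.1f} bn")
```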
 

Asking the public to self-declare their giving habits has substantive flaws

There are two main problems with asking people to self-declare what they have given:
  • Human memory is poor. Imagine we ask you to tell us what you have given in the last month or three months. Can you add it all up in your head? Are you confident that you can remember it? If you give by direct debit you might feel more confident, but adding up every pound or penny dropped into a collecting tin is much harder.
  • Nice people give more. Giving to charity is a good thing to do. It makes you a nice person. So when somebody asks you how much you give, most people have a tendency to overstate how generous they are. Researchers call this 'social desirability bias' – the tendency of respondents to say things that make them appear to do socially desirable things. The same kind of bias creeps in when asking people how much they drink or how much sex they have. They have good reasons to want to describe the world as they would like it to be, rather than as it actually is.
Just occasionally we can actually measure the gap between what people say they give and what they actually give. A survey of the public published in October 2011 suggested that the DEC East Africa appeal had raised £1 billion, whereas it had actually raised only around £100 million. Our own research implies that just over 1.5 million people give through their payroll, when the actual number is about half that.
 
Our belief is that asking people about their giving patterns measures not what they actually give, but their perceptions of what they could or should be giving. The volatility in both our data and that of NCVO/CAF is an indicator of people's economic worries and woes, not of what they give. So when giving levels drop in our data, it tells us that people are worrying more about what they can afford to give and about their own economic circumstances.
 
It is not good news to see giving levels go down in these surveys, even if they are just a barometer. However, to extrapolate from these polls to a figure for giving across the whole UK is something we just don't believe can be justified.
 

We should measure giving by creating a panel of charities

My belief is that the only way to measure giving is as directly as the retail industry measures how much people spend in the shops: by counting what goes through the tills. The advantage of this type of mechanism is that it overcomes the problems of both memory and human bias.
 
The difficulty for the charity sector in using this kind of measure is that currently the only cast-iron 'through the tills' source is annual accounts, and these arrive with a lag of 6 to 18 months after the financial year ends. In contrast, the retail industry knows how it did at Christmas by the middle of January.
 
The best way to overcome this problem would be to create a panel of charities who had the financial systems to allow them to report on their giving figures on a quarterly basis. The panel would need to represent different sizes and types of charity and so on. It is well within the ingenuity and determination of the sector to create a measure for giving that is both completely robust and timely.
 
This would have the advantage of giving all charities a better idea of how their fundraising is doing compared to the market. Crucially, it would also unite people behind taking action on research data and fundraising performance that everyone has confidence in.
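
For illustration only, here is one possible shape such a panel measure could take, sketched in Python: each participating charity reports its donated income for the quarter, and a like-for-like, size-weighted year-on-year change is calculated across the panel. The size bands, weights and figures below are all hypothetical assumptions, not a description of any existing system.

```python
# Hypothetical sketch of a quarterly charity panel: each charity reports donated income
# for the quarter and for the same quarter a year earlier, and a size-weighted,
# like-for-like year-on-year change is calculated. All bands, weights and figures
# are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class QuarterlyReturn:
    charity: str
    size_band: str            # e.g. "small", "medium", "large"
    income_this_year: float   # donated income for the quarter, GBP
    income_last_year: float   # donated income for the same quarter a year earlier, GBP

# Assumed share of total sector giving accounted for by each size band
BAND_WEIGHTS = {"small": 0.2, "medium": 0.3, "large": 0.5}

def panel_change(returns: list[QuarterlyReturn]) -> float:
    """Size-weighted, like-for-like year-on-year change in donated income."""
    change = 0.0
    for band, weight in BAND_WEIGHTS.items():
        band_returns = [r for r in returns if r.size_band == band]
        this_year = sum(r.income_this_year for r in band_returns)
        last_year = sum(r.income_last_year for r in band_returns)
        change += weight * (this_year / last_year - 1)
    return change

# Hypothetical quarterly returns from a three-charity panel
panel = [
    QuarterlyReturn("Charity A", "large", 12_500_000, 12_100_000),
    QuarterlyReturn("Charity B", "medium", 900_000, 950_000),
    QuarterlyReturn("Charity C", "small", 85_000, 80_000),
]
print(f"Panel estimate of year-on-year change in giving: {panel_change(panel):+.1%}")
```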
 

Joe Saxton and Cian Murphy

This blog features as a news story in Third Sector magazine here and Civil Society here.

Giving us your support? Have CAF got it right? Or have we all got it wrong? Leave us a comment below.

 

 

Submitted by Stephen Pidgeon (not verified) on 28 Nov 2012

Permalink

Well done nfpSynergy, at last some sensible thoughts on numbers that differ so wildly, they should not have been used for a major publicity push. Thank you very much.

Submitted by Daryl Upsall (not verified) on 29 Nov 2012

Permalink

Congratulations Joe and Cian and the team at nfpSynergy for giving a fresh perspective on the CAF report on UK giving, which seemed too dramatic to be true. Even here in Spain giving is broadly holding up and for some charities, namely the INGO/UN fundraising investors and those delivering services to those affected by the crisis, income is still growing significantly.

Submitted by Catherine Clark (not verified) on 29 Nov 2012

Permalink

The minute I heard these numbers at the Action Planning conference in London on November 14, I knew they had to be wrong. They are now being spread everywhere, causing panic among board members, smugness among politicians, and glee among fundraising consultants. I have just had a back-and-forth with the Wiltshire Charities Information Bureau, who are refusing to withdraw the report from their newsletter on the grounds that a) it was published by CAF and b) it's creating 'conversation' in the sector!

This is sort of like saying oranges cause cancer - once you disprove it, a large number of people still 'remember' that oranges cause cancer.

I'm not at all happy that such an abysmally incorrect report could have been so casually published.

Submitted by Peter Maple (not verified) on 29 Nov 2012

Permalink

This is very helpful and timely. I've always doubted the CAF giving percentages, though the absolute amounts they calculate usually tie better to the NCVO sector analysis of income.

My research (qualitative, not quantitative) also indicated that regular givers are certainly not giving less, so I really think that there must be some serious sampling errors in their report.

Submitted by Karl Wilding (not verified) on 30 Nov 2012

Permalink

Hi All
I'm one of the people responsible for producing the CAF/NCVO report on giving, as well as the NCVO research mentioned by Peter Maple that takes data from charities' annual accounts, published at http://data.ncvo-vol.org.uk. In 2014 we'll be able to compare charity accounts with the survey and, as I say in my blog on the report, I hope we will turn out to be wrong.

Joe and Cian make some good points about the need for different approaches. But to say donors' reports of what they give don't relate to what they give is too simplistic. Evidence suggests they exaggerate - but if they do, they do so consistently, and in relation to what they give. Search for Renee Bekkers' work on this issue.

We also know that small changes in methodology lead to different estimates (search for Peter Halfpenny's work on why and how). The UK Giving survey is stable in what it reports: much more so than other surveys. It performs well in relation to the *key* tests of reliability and validity. And the survey method hasn't changed: so if there are errors (such as the bias of people not being truthful) then there is no reason to believe that the bias changes. The amounts may be wrong - but the trend is likely to be right.

I am happy to discuss our findings and methods with anybody. You can email me at karl dot wilding at ncvo-vol.org.uk
Cheers
Karl
