Research intended to increase public trust in the gambling regulator may just have the opposite effect – and could yet have a distortive impact on the outcome of the Gambling Act Review.
The Gambling Commission’s pilot gambling prevalence survey released this week reports a ‘problem gambling’ rate among adults of 1.3% on the Problem Gambling Severity Index (PGSI). This is far above the 0.375% found by the Health Survey for England in 2018 and the 0.2% for 2021/22 from the Commission’s own Quarterly Telephone Survey.
The Commission was careful to note at the time of releasing the data that the finding “requires further attention in the experimental phase of the project” and that “the percentage of 1.3% should not be used as an estimate of problem gambling at this stage”.
Dan Waugh of Regulus Partners says that the results are both unsurprising and unlikely to be particularly reliable.
“The Commission was explicit at the outset of the exercise that the switch from tried and trusted household interviews to an online survey would result in selection bias towards those ‘more likely to be engaged online’ and that this would ‘skew the data’.
“At the same time, it warned that the shift from Health Survey to gambling-specific survey was likely to attract a disproportionate number of both gamblers and ‘problem gamblers’.”
He adds that the new pilot survey reports “implausible levels of participation in gambling” given that a majority of licensed operators spent the first half of the year under survey in Government-imposed lockdown.
“Overall, the pilot survey provided a national gambling participation rate for 2021 of 63% – or 17% higher than in 2018, when no such lockdown restrictions existed. It suggests for example that 2.5 million people played games of bingo in bingo clubs or that 1.5 million people played table games in casinos. Such estimates are clearly unrealistic and don’t match up against hard data.”
Waugh notes that there are several other reasons to exercise caution where the pilot survey is concerned.
These include its very small sample (1,078 – or around 15% of the size of the most recent Health Survey for England) and a response rate of just 21% (compared with 54% for the Health Survey in 2018) – both of which limit the extent to which respondents might be considered representative.
“Where people are approached to take part in the Health Survey, the orthodox response is to accept; where people were approached to take part in the pilot survey, the orthodox response was to refuse, despite the financial inducement offered,” says Waugh.
The timing of the release, coming as it does just ahead of the government’s white paper on gambling, now rumoured to be slated for late June, is particularly sensitive.
The GB Gambling Commission has stated that the NHS Health Survey remains the ‘gold standard’ when it comes to assessing rates of problem gambling and that the prevalence rates reported in the pilot survey “should not be used as an estimate of problem gambling”.
But such is the fevered atmosphere surrounding the public debate on gambling in Great Britain right now that there is a risk these warnings may be ignored.
Part of the problem, Waugh observes, is that the publication of prevalence data has become highly politicised, with different interest groups making competing claims in order to support extant agendas: “In general, the industry wants to show low prevalence while public health activists – rather morbidly perhaps – appear to crave high prevalence.
“This may be due in part to the fact that the Commission has threatened punitive action if rates of ‘problem gambling’ are not reduced. It is not difficult to understand the consternation of operators if the regulator suddenly decides to come up with new figures to measure this rather simplistic objective.”
“Surveys, however, can only ever provide imperfect estimates – one cannot diagnose a gambling disorder or any other mental health disorder by survey. The Gambling Commission and the NHS use surveys to make guesses of prevalence – but some guesses are better informed than others.”
The Health Survey – described by Dr Heather Wardle of the University of Glasgow as “one of the finest survey vehicles we have in this country” – provides the best estimates currently available to us.
The squabbling over prevalence rates is an unedifying waste of resources, says Waugh, and presents a test of the Commission’s mettle: “The regulator has said that it wishes to ‘increase public trust and confidence in its statistics’. It therefore has a duty to ensure accurate reporting of the information it releases. If it fails to do so then a project designed to strengthen public trust may in fact have the opposite result.”
Waugh warns that the potential impact on public trust goes much further than gambling. “If, as some people appear to believe, this small-sample pilot survey shows that the Health Survey underestimated rates of ‘problem gambling’, we are forced to ask: what else might the NHS have got wrong?
“Suggestions that the Health Survey might be quite so inaccurate are likely to call into question just how gilt-edged it is as an instrument of health research in this country.”
Scott Longley has been a journalist since the early noughties covering personal finance, sport and gambling. He has worked for a number of publications including Investment Week, Bloomberg Money, Football First, eGaming Review and Gambling Compliance.