Bang for your bucks

What is research worth? The time lag before it bears economic fruit and the difficulty of gauging its social effects mean it cannot be easily accounted for in terms of profit and loss. But, as Zoe Corbyn reports, that has not stopped the authorities looking for answers

April 30, 2009

Ask any researcher to justify why taxpayers' money should be spent on their work, and inevitably the economic benefits are mentioned. It is important to fund research not just to create knowledge for knowledge's sake, but because there are tangible benefits to the bottom line. A scientific discovery could be licensed or spun off into a company, creating jobs in the process. Research could lead to innovations or policy changes that save taxpayers millions of pounds.

Simply put, research matters to the economy. The argument - famously advocated by Vannevar Bush, scientific adviser to presidents Franklin D. Roosevelt and Harry S. Truman, in his report Science, the Endless Frontier (1945) - appears, at least in part, to have been bought by the politicians: after all, the fact that the UK's science budget has doubled since 1997 is not because the Government simply likes research.

Ministers have recently announced their desire to align academic research with the UK's industrial strengths because of the perceived economic benefits. And certainly the new US Administration views research as a lever to be pulled to deliver the world from the economic crisis it now faces: witness President Barack Obama's $21.5 billion (£15 billion) boost for science funding as part of his economic stimulus package for the US.

"The bottom line is that the downturn is no time to slow down our investment in science," Gordon Brown said at a lecture in Oxford last month, where he promised to shift the economy from its dependence on financial services towards science and technology. "We will not allow science to become a victim of the recession, but rather focus on developing it as a key element of our path to recovery."

The extent to which the Prime Minister's words hold true was revealed last week in the 2009 Budget. While, to the chagrin of academics, there has been no new money for research, the ring-fence around the science budget will be protected. However, the Government is demanding that the research councils undertake to make savings, with those funds reinvested "to support key areas of economic potential".

Budgets aside, the fact that research is being sold as some kind of economic saviour deserves analysis. If science and research are to receive special dispensation when the public purse strings are tightened, what level of economic return can taxpayers reasonably expect for their investment? How much is research worth to the economy?

Surely it is not "beyond the wit" of Government to have agreed metrics for measuring economic impact, observes Phil Willis, Liberal Democrat MP for Harrogate and Knaresborough and chair of the Innovation, Universities, Science and Skills Select Committee.

They are fair questions, yet the answers are as elusive as ever, and likely to remain so for the foreseeable future. Measuring the economic return on investment from scientific research is a difficult, if not impossible, task.

For starters, there is no consensus about what constitutes impact, let alone how much impact research is having right now. This is an issue that the Campaign for Science and Engineering in the UK (CaSE) is considering.

"Measuring is extremely difficult," Nick Dusic, director of CaSE, says. "Justifying investment in the research base versus trying to change it to increase its impact get muddled up. We need to clarify what we are trying to do in these different areas."

Bob Bushaway is a senior research fellow at the Centre for Higher Education Management and Policy at the University of Southampton. "Spaghetti" is the word he uses to sum up the tangled relationship between research funding and economic benefits.

"Everyone agrees that we should measure it, but no one has the answer to how we do so because there is no definitive answer," he says.

Ben Martin, professor of science and technology policy studies at the Institute of Science and Technology Policy Research (SPRU) at the University of Sussex, agrees. He describes a suite of SPRU studies carried out for the Government at various points over the past ten years that reviewed the wealth of literature in the field.

The conclusion, he says, is that while there is evidence that publicly funded research generates "substantial economic returns", trying to put a number on it is a "pretty fruitless task", particularly if quality-of-life or public-policy benefits are also being fitted with pound signs. "If you are looking for a simple answer, you are probably barking up the wrong tree," he says.

He cites pioneering work by the late Edwin Mansfield, an American economist at the University of Pennsylvania, who in 1991 produced a paper boldly calculating the rate of economic return for total US academic research over the preceding 15 years. His conclusion - that it generated a return on investment of 28 per cent, so that for every $1 invested, an income stream of $0.28 per year was generated in perpetuity, with investments paying out after seven years - led to substantial increases in public funding for research and development across the Atlantic.
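
To see what a figure such as Mansfield's implies in cash terms, the back-of-the-envelope sketch below discounts a $0.28-a-year benefit stream that starts after a seven-year lag. It is a reader's illustration only, not Mansfield's method: the 3.5 per cent discount rate and the 200-year stand-in for "perpetuity" are assumptions, not numbers from his paper.

```python
# A minimal sketch (not Mansfield's actual model) of how a headline
# "28 per cent return" can be read: $1 spent today buys a $0.28-a-year
# benefit stream that begins after a lag. The discount rate and the
# 200-year horizon standing in for "perpetuity" are illustrative
# assumptions, not figures from the paper.

def present_value(annual_benefit: float,
                  lag_years: int,
                  discount_rate: float,
                  horizon_years: int = 200) -> float:
    """Discounted value today of a flat benefit stream that begins
    after `lag_years` and runs to `horizon_years`."""
    return sum(annual_benefit / (1 + discount_rate) ** t
               for t in range(lag_years, horizon_years + 1))

if __name__ == "__main__":
    cost = 1.00                                    # $1 of research spending today
    benefit = present_value(annual_benefit=0.28,   # Mansfield's 28 per cent
                            lag_years=7,           # pays out after seven years
                            discount_rate=0.035)   # assumed discount rate
    print(f"Present value of benefits: ${benefit:.2f} per $1 spent")
    print(f"Net present value:         ${benefit - cost:.2f}")
```

On those illustrative assumptions, each $1 of spending buys a benefit stream worth roughly $6.50 in today's money, which helps explain why the headline number proved so persuasive.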

But, Martin notes, Mansfield's "too good to be true" answer rested on "heroic assumptions" and "flawed methodology" - it was based on interviews with 76 industrial research and development managers who were pushed by the economist to give answers.

It also led to an interesting problem for the SPRU when it came to advising the UK Government. "Did we give the Government the 'Mansfield answer' so it would then go away happy and pour money into research? That was something our integrity would not allow us to do," Martin says.

Instead, the institute spelt out the complications, which are manifold. One of the most difficult is that the economic impact of research is felt in myriad ways, both direct and indirect. Bringing these strands together is a difficult task.

The route that gets the most political attention, although it is by no means the most important, is academics making discoveries that are then converted into products or innovations that generate jobs and wealth. Think of indicators such as the number of university spin-off firms and their income, or patents and licences.

But by all accounts, the path to economic gain that is the most important is "people transfer". Highly trained graduates schooled by university researchers enter the labour market and provide the highly skilled workforce that British companies need to compete - in essence, knowledge transfer on legs.

But there are other less obvious modes, too. For example, the constant push by publicly funded researchers for new instrumentation and equipment can drive business innovation, and researchers themselves are invariably part of extensive global networks of expertise that businesses can plug into.

Another big complication is the long time lag that can occur between the completion of research and its impact. Examples here are legion. Discoveries such as antibiotics and nuclear magnetic resonance have had huge economic impact, but both were serendipitous and took many years to hit pay dirt.

In many cases, the impact of research being funded now will not be felt for decades. Projects funded by the Thatcher Government are the ones most likely to deliver their payoffs now, Dusic notes. This complication makes life particularly tricky when politicians demand rapid economic gains, although it is worth remembering that research activity employs a chain of people now, and capital projects such as building laboratories will be beneficial immediately and in the future.

Other challenges in determining the economic benefits of research include isolating the contribution research makes to the success of an innovation: a spin-off may fail as much because of a botched marketing campaign as because of the innovation itself. The fact that scientific knowledge is cumulative can also make it hard to pin down contributions.

Martin's conclusion is that you can capture economic impact, but you need a "vast array" of indicators - about 65, according to the SPRU. "It would be hopelessly complicated and expensive," he says.

He also notes the problem of measures changing behaviour. For a 1999-2000 survey of the return on universities' business activities, the Government made the number of spin-offs a key indicator of the success of knowledge-transfer activities. Numbers increased nearly threefold, rising from 70 to about 200 the next year.

"That did not mean that we suddenly became three times better at technology transfer and generating economic impact," Martin says. "What it meant was that once it became a performance indicator, everybody started maximising their score."

So what is the research councils' approach to evaluating the economic impact of the work they fund? What is their methodology for demonstrating the contribution research makes to the economy?

Bushaway argues that they are latecomers to the evaluation game. The councils used to argue that since academics do the research, it would be unfair to hold them responsible for delivering impact, with that job falling to the universities.

But Research Councils UK's new motto, "excellence with impact", suggests that times have changed. Philip Esler, outgoing chief executive of the Arts and Humanities Research Council and RCUK's knowledge-transfer champion, says that since the councils distribute £3 billion of public money a year, they "certainly" see it as their job to ensure that the money is having an effect and to evaluate its impact.

But they have learnt from recent unsuccessful attempts to boil impact down to a single number.

Following a recommendation in the 2006 Warry report - the councils' action plan for increasing the economic impact of research - they commissioned a study aimed at setting a baseline measure for gross economic impact across each of their fields, only to reject the approach after being advised that it was too difficult. As a result, they now favour a more nuanced approach.

Esler says there is no single figure to quantify the economic impact of research council funding, nor are the councils trying to find it or being pushed by the Government to deliver one. "At present, the methodology is not in a fit state to produce one number in any way that is particularly robust," he says.

Instead, the councils have adopted a broad approach to describing the impact of research. As well as looking at economic effects, they also consider social benefits such as improved public-policy and quality-of-life outcomes.

To do so, they undertake case studies featuring numbers and narratives that show the value of particular pieces of work to the economy and society (see box below). They have also initiated a series of broader studies to analyse aspects of their portfolios.

Current examples include a project examining how council-funded PhD students contribute to economic innovation and growth by charting what happens to them when they leave university. Another seeks to calculate the proportion of spin-off value attributable to research council funding.

"Rather than saying 'Spin-offs in 2008 yielded x million pounds', we will be able to say 'Spin-offs in 2008 yielded x million pounds, and y percentage was attributable to research council funding'," Esler says.

He argues that the ultimate aim is to achieve an "ever more persuasive" evidence base for the impact of research council funding. But scaling up the results of a few case studies to represent a whole portfolio has methodological problems, particularly when the direct link between funding and outcome is not always clear. It is also impossible to put monetary values on all effects, Esler adds.

"How do you weigh up ?1 billion of spin-off increases against a potential policy outcome of 100,000 fewer people becoming obese?" he asks. "There are some things that are not capable of being turned into a pound sign.

"Our metrics will hit things that we have not hit before in ways that carry conviction, and the narratives will be more persuasive, but no one is pushing us for a single number. We are not trying to get it, and no one is saying we should."

Jonathan Kestenbaum, chief executive of the National Endowment for Science, Technology and the Arts, praises the stance the research councils have taken and believes it is finding wider favour.

"They are encouraging us to think about a blended approach to understanding impact," he explains. "To reduce everything to just one number would not accurately capture the impact of research ... But that is not to say that everything is so intangible that you can't measure anything. That is not true either."

Yet while the research councils might have evolved a more convincing approach than Mansfield's, there is a major problem with it, too. Which research council is going to shoot itself in the foot with the Treasury by choosing case studies that portray insignificant economic effects?

"The research councils have poster displays at all their town meetings and events showing how their money works, but no one believes them," Bushaway observes. Martin argues that a broader range of case studies - including failures as well as successes - is needed.

Yet a more nuanced approach does appear to be catching on. "The discussion (around economic impact) now is becoming a more qualitative rather than quantitative one," John Denham, the Universities Secretary, said in a recent wide-ranging interview with Times Higher Education. "I think there has been a time in Government where the debate has been framed in terms of 'surely if we did less fundamental research we could do more applied research and therefore we could get more economic impact'.

"We are determinedly saying that we are very interested in economic relevance, and have to be, but it is a much more sophisticated and nuanced argument now."

But not everyone in the academy agrees with even the basic premise of such measurements. Andrew Oswald, professor of economics at the University of Warwick, is known for his work on measuring happiness. He spends his time looking at indicators such as how well people sleep and fluctuations in suicide rates to determine average levels of wellbeing.

He rejects the notion of trying to evaluate the economic impact of research. He draws the line not because it can't be done, but because it is "utterly the wrong way" to evaluate activity.

"Research projects that attempt to understand our world should be evaluated by the quality of their contribution to human knowledge," he argues. "We don't say Shakespeare was a great writer because there are jobs in Stratford now or because people are employed in the printing of his books. Evaluating research on monetary terms is fundamentally the wrong criterion, and it is just madness to think that way."

While the research councils might argue that their definition of economic impact - widened to include policy and quality-of-life effects - deals with this point, it is also increasingly clear that the hard economic benefits that research can impart are likely to carry most weight with the Government.

Earlier this year, Denham and Lord Drayson, the Minister for Science, announced that they wanted to align academic research funding with the UK's industrial strengths. The plan is being followed through in the Budget and will involve attempts to identify which research areas will be of the biggest economic value in the future - something Esler regards as intensely problematic.

"I am not aware of evidence that would say which part of the research base would produce the greatest returns economically," he says.

"Will it come from STEM (science, technology, engineering and mathematics)? Will it come from economics research that helps the financial sector operate in a more effective way? Will it come from arts and humanities research that helps the creative industries lead a recovery?

"I don't think anyone knows and I don't think there is an evidentiary basis to allow you to determine between those various possibilities."

As the UK tries to claw its way out of recession and pressure grows to show how the cash spent on research delivers and to find ways to increase its economic impact, it will be important to remember the nuances of the debate - and convey them to taxpayers who both provide the funding and reap its rewards.

PROFITS OF MEDICINE, CULTURE AND UNBOUND CREATIVITY

There is no single figure for what research is worth to the British economy. However, one recent study that has received plenty of interest offers the first quantitative estimate of the economic benefits to the UK from its public investment in some medical research.

The study - commissioned by the Medical Research Council, the Wellcome Trust and the Academy of Medical Sciences and published in November 2008 - calculated the returns from research into cardiovascular disease and mental health by gauging health gains in terms of extra years of life converted into monetary value.

It indicated that every pound invested in cardiovascular research by taxpayers and charities produced a return on investment of 39 per cent - that is, a stream of benefits equivalent to earning 39 pence each year for ever.

The time lag between research and the eventual health benefits was estimated at 17 years.

Figures for mental health research were similar, with a 37 per cent return and a time lag of 12 years, based on research undertaken between 1975 and 1992.
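
The same toy discounting as in the earlier sketch gives a feel for how much a 17-year lag erodes a 39p-a-year stream. Again, the 3.5 per cent discount rate and the long finite horizon are illustrative assumptions, not parameters taken from the study.

```python
# Toy discounting of the cardiovascular figure: 39p a year per £1 invested,
# starting only after a 17-year lag. The discount rate and horizon are
# assumed for illustration; they are not taken from the MRC/Wellcome study.
def present_value(annual_benefit, lag_years, discount_rate=0.035, horizon_years=200):
    return sum(annual_benefit / (1 + discount_rate) ** t
               for t in range(lag_years, horizon_years + 1))

print(f"£1 of cardiovascular research: £{present_value(0.39, 17):.2f} of benefits in today's money")
print(f"The same stream with no lag:   £{present_value(0.39, 1):.2f}")
```

The gap between the two figures is the price of the lag: the later the benefits arrive, the less each year of them is worth today.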

"We have all learnt a huge amount from (it)," says Philip Esler, the research councils' knowledge transfer champion.

Others, however, are more sceptical. "I accept that there is a good rate of return, but I don't believe any figures," says Ben Martin, professor of science and technology policy studies at the Institute of Science and Technology Policy Research at the University of Sussex.

The case studies used by the research councils also offer a handle on the impact projects can have.

One, from the Natural Environment Research Council, estimates that the economic benefits of NERC-funded researchers' discovery of the hole in the ozone layer in 1985 will be worth £8 million-£42 million over 25 years. The figure is based on the reduced healthcare costs of fewer cases of skin cancer.

Another example, from the AHRC, estimates the economic benefits of an exhibition at London's Victoria and Albert Museum, funded in part by an AHRC research grant of £850,000. The event, "At home in Renaissance Italy", was held in 2006. It was estimated to be worth £2.85 million to London's economy and an additional £1.33 million to the UK as a whole. The figure was calculated based on what those who saw the exhibition spent when they came to London (excluding ticket sales).

And then there are the annual statistics offered by research councils to central government relating to achievements in knowledge transfer and "economic impact baselines".

However, there is no standardisation in place for either set of indicators, so it is difficult to draw meaningful conclusions.

The UK's progress on knowledge transfer is also monitored by the annual Higher Education-Business and Community Interaction Survey, although it measures university rather than research council success.

Interestingly, a report by the Russell Group of research-intensive universities published in November 2008 found that blue-skies research is worth more to the economy than its applied counterpart.

It looked at the commercialisation of 82 projects by the group and found that the average return from blue-skies work was worth more than twice that derived from commercialising applied projects.
