Accuracy of RAE 2001
22. With such a spectacular increase in RAE ratings, it is
legitimate to ask whether the improvement is a true reflection
of the state of UK academic research and its performance over
the last five years. The evidence we have received suggests that
most in the science and education communities agree with HEFCE's
assertion that it is indeed largely a reflection of reality.
The Funding Councils' international benchmarking exercise supports
the view that there has been improvement. The ratings of all 5
and 5* departments were validated by 290 overseas experts. All
but nine agreed with the judgements of the panels,
although the EPSRC argues that "the involvement of international
expertise is limited so the thoroughness of the international
calibration could be questioned".
The improvement in research quality is also supported by citation
analysis: the UK's share of the world's 1% most cited papers increased
from 11% to 18% over the assessment period (1996-2001) and the
average citation rate of UK papers has increased by 12%.
23. There are sceptics. In an article in The Guardian on
15 January 2002, Professor Susan Bassnett, a pro-vice-chancellor
at the University of Warwick, wrote "What that 55% represents
[the proportion of researchers in 5 or 5* departments] ... is
a morass of fiddling, finagling and horse trading. Nobody who
works in a university in the UK in 2002 seriously believes that
research is improving". Professor Richard Green of the University
of Hull told us "the results are starting to lack credibility".
The biological science societies state that "some or all
of this improved RAE score is undoubtedly due to increased familiarity
with RAE exercises and the ability of university departments to
play the RAE game".
We are aware of a mismatch between the formal institutional view
and that expressed by individuals in private.
24. A number of concerns have been expressed. First, there
is concern about the non-inclusion of researchers. In 1992,
many new universities with little background in research entered
the RAE for the first time. HEFCE changed the rules to allow departments
to choose which researchers to submit. This was reasonable, given
that the ratings system recognises the proportion of quality research
in a given department. Without this flexibility, departments with
a small research capacity would have been penalised, and in any case
universities would no doubt have split their departments to isolate
the research element, achieving the same effect. There is greater concern that researchers who
are active are omitted for tactical reasons. We are aware of one
department that received a 5* in 2001 despite having submitted
less than 60% of its staff.
We recognise that few top-rated departments went to such extremes,
but there have to be question marks about a mechanism that aims
to measure UK research quality yet provides an incentive not to
include some of the research conducted.
Funding should reflect the actual amount and quality of research
across the whole department, not just that of staff deemed research-active.
Universities should have no incentive to omit any researchers.
25. Second, there is concern that by moving researchers between
UoAs or splitting and merging departments universities can improve
ratings without any improvement in quality. HEFCE said universities
"are free to divide the staff employed in any one department
between several UoAs, or to submit staff employed in more than
one department to a single UoA".
Universities may, of course, use these techniques to improve their
ratings, and no doubt some will achieve this without any actual
improvement in research quality. Very strong departments could
be used to protect weaker researchers and maximise the ratings
of other departments. To achieve 5* status, a department needs
to achieve levels of international excellence in more than half
of the research activity submitted and attainable levels of national
excellence in the remainder. There is thus no additional benefit
to a 5* department in achieving international excellence in, say,
three quarters or almost all of its research activity. By 'diluting'
its best researchers with a few less successful ones,
a strong department can increase its income by boosting its active
staff numbers while maintaining its rating.
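The arithmetic of this incentive can be illustrated with a small sketch. The threshold and the volume-driven funding rule below are simplified assumptions for illustration, not HEFCE's actual funding formula:

```python
# Simplified sketch of the 'dilution' incentive described above.
# Assumptions (not HEFCE's real rules): a department is rated 5* when
# more than half of its submitted staff are internationally excellent,
# and QR funding scales linearly with submitted volume at that rating.

def is_five_star(excellent: int, submitted: int) -> bool:
    """More than half the submitted activity at international excellence."""
    return excellent > submitted / 2

def qr_income(excellent: int, submitted: int, rate_per_head: float = 1.0) -> float:
    """Hypothetical volume-driven grant, paid only while the 5* threshold holds."""
    return submitted * rate_per_head if is_five_star(excellent, submitted) else 0.0

# A department of 20 internationally excellent researchers...
pure = qr_income(excellent=20, submitted=20)
# ...can add 15 weaker researchers and keep its 5* rating (20/35 > 1/2),
# raising its volume-driven income with no rise in research quality.
diluted = qr_income(excellent=20, submitted=35)
assert diluted > pure
```

On these assumptions, income rises with every extra head submitted until the excellent fraction falls to one half, which is precisely the incentive the paragraph describes.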
26. Similarly, HEFCE says "Institutions with a number of
poorly-rated departments often seek to increase quality by
merging less successful departments with those that were highly-rated,
in the expectation that exposure to the new culture and management
will raise the overall level of the merged entity".
Mergers can have this positive effect. But if a university merges
a very strong department, rated 5*, with a good 4-rated department,
with some selective staff omissions, the new department could
still get 5*. As a result, more staff are working in a 5* department
and the university receives a larger QR grant, but not necessarily
with any real increase in research quality.
27. Third, there is concern that transfers between institutions
can distort the RAE results. Transfers are not a bad thing
as they may bring dynamism to scientific research. We note that
academic movement is lower in the UK than in the United States,
for example, and that while the timing of staff movements has
been affected, overall movement has not increased as a result
of the RAE. Mr Bekhradnia,
HEFCE's Director of Policy, suggested that transfers would be
neutral since, if one department gained, another lost. This is
not quite true. Category A* staff (those who transferred in the
year preceding the RAE) need submit only two items of research,
rather than the four normally expected. Thus if a researcher has
produced only two high-quality outputs in the assessment period,
his or her contribution to the original department's rating would be modest.
If that researcher moves to another university, however, his or
her new department need submit only two outputs. Since both of
these are of high quality, someone who could be considered a
'4-rated' researcher suddenly looks like a '5*-rated' one by virtue
of having moved institution. Again, there would be no increase in
research quality, but a possible increase in ratings and funding.
28. Fourth, there are concerns about the way the panels operated
and their membership. There is a lack of clarity about how
panel members and chairmen are chosen, and concerns about whether,
as academics judging other academics, they are truly objective.
We questioned HEFCE about the representation from industry. They
told us that on science and engineering panels, non-academic membership
was between 20 and 26%.
We solicited comments from the Chemical Industries Association
and the Association of the British Pharmaceutical Industry.
The former had identified no problem. The ABPI felt that the volume
of work may have discouraged participation from industry and that
in the future it would suit industry representatives to act as
observers on the panels rather than full members. HEFCE's own
figures bear this out.
Of the initial invitations to industry to serve on panels, 29
of 102 were declined (28%) compared to 52 out of 653 for other
groups (8%). The EPSRC believes that there are still too few active
industrialists on the panels.
29. There are also concerns about the size of the panels and the
number of outputs they had to consider. In its guidance to panel
members, HEFCE says "you should not feel that you are required
to collect, review or examine all research outputs listed".
It is easy to understand why. In Chemistry, for example, 11 panel
members had to sift through over 5,000 submitted outputs, many
of which must have been outside their area of expertise. The criteria
by which sub-panels are set up are not clear. For example, we have
been told that plans for a sub-panel in Development Studies were
dropped at the eleventh hour.
We recognise that panels could call on help in fields where they
were weak, but there is still the suspicion that place of publication
was given greater weight than the papers' content. It has been
suggested to us that the RAE could make better use of the review
process employed by the Research Councils at the end of project
grants. The burden of work on panel members was considerable,
and they seem to have been given inadequate support from HEFCE,
for example, having to obtain their own copies of submitted papers.
We recommend that, in any future RAE, HEFCE provide panel members
with more effective administrative support. Ensuring the validity
of the ratings is money well spent.
30. On request, HEFCE supplied us with data that showed that those
panels that received the lowest number of submissions tended to
award higher ratings.
Such UoAs will not necessarily get more money since in England
this is determined by the research volume in each Unit. While
HEFCE rightly points out that a relationship is no proof of cause,
it does give rise to doubts about the objectivity and reliability
of the ratings.
31. With the above reservations, we accept the widespread view
that the RAE ratings reflect an improvement in UK higher education
research. There are areas of concern, however, and it is in
no-one's interest to give a false impression of the state of UK
research. HEFCE seemed rather complacent about these concerns.
Mr Bekhradnia said "we should celebrate it, not try to denigrate
it". Unlike HEFCE, at least Mrs Hodge was prepared to admit that
gamesmanship had had an effect. In her evidence she said "Is there an
element of the clever old researchers in the HE sector learning
how to manage the system that they themselves have put in place?
There is probably a slight element but I do not want to overplay
that". We think
the Minister has got it about right.