UK Higher Education Research Yearbook

Maurice B. Line (Harrogate, UK)

Journal of Documentation

ISSN: 0022-0418

Article publication date: 1 December 2003


Citation

Line, M.B. (2003), "UK Higher Education Research Yearbook", Journal of Documentation, Vol. 59 No. 6, pp. 739-742. https://doi.org/10.1108/00220410310506367

Publisher: Emerald Group Publishing Limited

Copyright © 2003, MCB UP Limited


This is the kind of publication one is rather surprised to find did not exist already. It is an extensive compilation of data relating to all UK institutions of higher education that have some research activity, including colleges, some of them small and highly specialised (Writtle College is one of several hitherto unknown to me). All the data are considered to have significance as indicators of research performance. The volume, though produced as a commercial operation, has semi‐official seals of approval in the form of short prefatory notes (described on the Contents page as “articles”) by Sir Alan Wilson, vice‐chancellor of the University of Leeds, and Bahram Bekhradnia, director of the Higher Education Research Institute.

There are three main sections:

  1. Introductory material (18 pages);

  2. “Institution profiles” (three‐fifths of the volume); and

  3. “Tables of research performance indicators”.

The wording is misleading; the indicators appear also in the profiles. It might have been better if in section 2 the institutions had been ordered by sector, rather than in five alphabetical sequences (England, Wales, Northern Ireland, Scotland and specialised). Many smaller specialised institutions are not included here. In any case, De Montfort belongs under D rather than M. Section 3 presents data by indicator – institutional income, etc. – followed by institution.

It is impossible to use or understand the several hundreds of pages of statistics without reading the introductory notes carefully. For example, figures on performance by several criteria are compared with other institutions in the same sector group, not with all other institutions. There are five sector groups:

  1. pre‐1960 institutions;

  2. 1960‐1990 and peer foundations;

  3. post‐1990, including polytechnics;

  4. institutions with specialised missions; and

  5. HE colleges and other institutions.

The fourth sector is subdivided into colleges in the arts sector, which are compared with sector 2, and “institutions that carry a reputation for research excellence in a niche area”, which are compared with sector 1; these include LSE and SOAS. Sector 5 includes several institutions of the University of London. Institutions in sectors 4 and 5 are given one page each in the institution profiles, the rest two pages. Cranfield University, which is highly specialised, is in sector 2, with Reading, Strathclyde and Warwick. This means that, for example, the University of Sheffield cannot be compared with Sheffield Hallam University, or the Institute of Historical Research with King's College London (though it can with Armagh Observatory). No grouping would meet all requirements, but some of the oddities here could surely have been avoided.

Subjects are grouped into eight areas:

  1. clinical medicine and dentistry;

  2. subjects allied to medicine;

  3. biological sciences;

  4. physical and mathematical sciences;

  5. engineering and technology;

  6. social and related sciences;

  7. humanities and languages; and

  8. visual and performing arts.

Here again some allocations are questionable, e.g. should computer science really be in engineering and technology rather than physical and mathematical sciences? Ideally that should depend on whether it is largely theoretical or applied, but it would have been impossible to make such distinctions in such a compilation. Librarianship is in social and related sciences; it is not stated where information science is. (One might argue that divinity is by definition not a humanity, but it could hardly have a section to itself.)

As for the all‐important indicators themselves, they are: academic and research staff; research income; Research Council studentship awards; PhDs graduating; research journal publications on the Thomson ISI database, as totals and per FTE research staff; and citation impact of these publications. All these figures are broken down by subject area. The tables also include (over all subjects): institutional income; research income (total); Research Council income (total and as a percentage of total research income); and industrial contract income.

The figures in the tables in section 2 are the averages for a five‐year period in each dataset, whereas in section 3 separate data are given for each of the five years. Summary tables give the overall ranking within its sector of each institution in each subject area, calculated as averages of ranks. There are also “footprints”, which show graphically how the institution's research income, research students and publication impact compare with the sector's; some (e.g. Sheffield) show a very close fit, some are very different. On publication impact, some institutions “over‐perform” (e.g. University College London), some “under‐perform” (e.g. Ulster).

In spite of all the information provided, it is often not detailed enough. For example, the only subject at Bath that is “allied to medicine” is pharmacology; there must be many more cases of one‐subject sectors. Anyone using this volume seriously needs to have at his/her side a list of institutions with all the subjects they cover.

I do not see that any figures but those for publications and their impact can be regarded as indicators of anything useful; and figures for publications too can be called into question. Most of the other “indicators” are inputs. It is interesting to compare publication impact with, for example, research income; this suggests in some cases that the research funders are not getting value for money. But, as we all know, publication impact has its limitations. Perhaps there are no really satisfactory indicators of research performance, and all that can be done is what is done here, to use several together in the hope that combined they mean something.

That said, the data are presented as well and clearly as they could be. As stated above, close attention to the introductory matter is essential, but once familiar with the presentation, any user should have no difficulty in using the volume.

Who is going to use this work, and what for? As the Foreword says, “people in research are asking for more information about research”. There is plenty of information here in the form of data, but it is doubtful if it adds much to our knowledge. Bodies that seem to perform well can, and almost certainly will, use the volume for self‐promotion. Those that perform badly will probably look for faults in the indicators or in the way the data are presented before subjecting themselves to critical scrutiny. Funding bodies may find interesting information (e.g. by relating publication impacts to their grants, as suggested above), and can presumably be relied upon to handle it with great care. The HEFCE and its sister bodies will doubtless delve into it frequently for specific information. It is in fact quite hard to predict how and when a reference work will prove useful; some of my own reference works, acquired because I saw them as useful, have hardly been touched, while others have received usage that I would never have predicted.

Whatever the usefulness of this work, it is bound to decline quite rapidly. Improvements can be made in the subsequent editions that are promised, and indeed suggestions are invited. Future updates will presumably appear annually, since the title says it is a “yearbook”; the date should clearly appear as part of the title. Obviously not a work for purchase by individuals, but, for all its weaknesses, an essential one for all institutions of higher education.
