World famous in Warsaw – or New York? The importance of location in journal ranking services

Online Information Review

ISSN: 1468-4527

Article publication date: 23 November 2012


Citation

Gorman, G.E. (2012), "World famous in Warsaw – or New York? The importance of location in journal ranking services", Online Information Review, Vol. 36 No. 6. https://doi.org/10.1108/oir.2012.26436faa.001

Publisher: Emerald Group Publishing Limited

Copyright © 2012, Emerald Group Publishing Limited


World famous in Warsaw – or New York? The importance of location in journal ranking services

Article Type: Editorial From: Online Information Review, Volume 36, Issue 6

Editorial

This journal was recently advised by its publisher that the Polish Ministry of Science and Higher Education had ranked Online Information Review as an "A" journal (see Index Copernicus, 2012). For far longer OIR has been ranked by the Institute for Scientific Information (ISI) – now Thomson Reuters – in its Journal Citation Reports (JCR) as a Tier 2 journal (Thomson Reuters, 2012). These differing rankings in the two systems raise the question: who is correct here? Is it our friends in Warsaw or New York? Does the home base of a ranking system make any difference?

Part of the answer can be found in the commentary on the Australian Business Deans Council Journal Quality List (Australian Business Deans Council, 2010), which states in part that there are shortcomings in available international lists:

"A review of international journal lists was undertaken and it was agreed that an Australian Business Deans list was required as there were shortcomings in the available international lists. These shortcomings included regional biases, insufficient coverage of Australian journals, too heavy an emphasis on some criteria that worked against specific disciplines, and lack of consensus of a definitive list."

For our purposes two of these shortcomings deserve highlighting:

  1. regional biases; and

  2. insufficient coverage of a country's own journals.

If one looks at the two JCR subject areas in which OIR is listed (Computer Science and Information Science), there appears to be a Northern Hemisphere and Anglophone bias in the listings. Within disciplines (information science, for example), the focus falls on journals that address more technical topics, often within the quantitative paradigm. Further, there is a genuine dearth of journals from Australia, Canada, New Zealand and elsewhere in JCR.

These shortcomings can (or must?) be debilitating for research development in very many countries. This in a nutshell suggests an answer to our question: yes, location does make a difference, or at least allows for a geographical bias in listings. For anyone outside North America it should be blindingly clear that the highly respected JCR has a particular regional bias unsuited to most other parts of the world. “Regional bias” might mean that a ranking system such as JCR is parochial in content, preferring North American journals to those from other regions. It is equally reasonable to suggest that within regions there may be preferences for particular fields within disciplines, or for particular research paradigms (quantitative research, for example, rather than qualitative research).

The Australian Business Deans Council (ABDC) suggests as much in their justification for their own list, and something similar may have motivated the Polish Ministry of Science and Higher Education in turn to create its own list, perhaps reflecting the needs and outputs of the Polish research community.

There are also many other jurisdictions that devise their own rankings of key journals. Australia’s near neighbour, New Zealand, felt that the ABDC listing was not well suited to the New Zealand research environment, so individual universities were left to make their own in-house lists – probably localisation gone to the extreme. At one New Zealand institution, Victoria University of Wellington (VUW), its School of Economics and Finance (SEF) states clearly that “the ABDC provides only one set of journal rankings available and respected internationally. As with all rankings there is an element of subjectivity involved in the ranking process” (Victoria University of Wellington, 2012).

In recognition of this subjectivity, SEF bravely states that "SEF celebrates publications of excellence of its staff in the variety of journals and books staff publish research in. […] Information on the impact of SEF staff can be derived from the information provided through Scopus and through Publish or Perish". We say "bravely" because it is pretty clear that the New Zealand research evaluation system is strongly, but not exclusively, wedded to JCR as the principal indicator of research quality.

It is commendable that countries like Australia, New Zealand, Poland and others have taken dissatisfaction with the research evaluation status quo into their own hands to produce journal lists and other evaluation criteria to suit their particular geographic and disciplinary requirements.

This is preferable to the approach of those jurisdictions outside North America that slavishly follow "alien" quality evaluation criteria. A good example of this is the University of Malaya, where everything is measured against JCR. Here is a tertiary system that prides itself on being attuned to the needs of Malaysian society and industry (as determined by government), and which produces quality research in these areas, yet its research outputs are little known outside Malaysia and, to some extent, Singapore. This is partly because, in my view, such indigenous research is deemed of less relevance by international journals. One consequence is that whole fields of research can be ignored – Sharia law and Islamic banking, for example.

This possibility of localised, contextualised listings would allow researchers and research communities to "define the metrics that matter to them" (Henning and Gunn, 2012). Of course there will be an element of subjectivity in any such development, but at least it will be your or my subjectivity, as distinct from the subjectivity of a group of people in Warsaw or New York.

G.E. Gorman
Editor, Online Information Review

References

Australian Business Deans Council (2010), “Australian Business Deans Council Journal Quality List”, available at: www.abdc.edu.au/3.43.0.0.1.0.htm

Henning, V. and Gunn, W. (2012), “Impact factor: researchers should define the metrics that matter to them”, Guardian Professional, available at: www.guardian.co.uk/higher-education-network/blog/2012/sep/06/mendeley-altmetrics-open-access-publishing?CMP

Index Copernicus International (2012), available at: www.indexcopernicus.com

Thomson Reuters (2012), “The Thomson Reuters Impact Factor”, available at: http://thomsonreuters.com/products_services/science/free/essays/impact_factor/

Victoria University of Wellington (2012), “Research rankings”, School of Economics and Finance, available at: www.victoria.ac.nz/sef/researchrankings

Further Reading

Excellence in Research for Australia (2012), “ERA 2012 Journals List”, available at: www.arc.gov.au/era/era_2012/era_journal_list.htm
