Scientific evaluation as a governance technique is conducted through different instruments, which have both intended and unintended effects. One aspect of evaluation is the measurement of research quality through the performance of scientific publications, for example, how often they are cited. The design of such performance indicators is a core task of bibliometrics as a discipline.
There is evidence that citation-based performance indicators may have side effects on citation behaviour. These effects have to be considered by the bibliometrics community. On the one hand, they have to be considered with regard to indicator design, which aims at achieving validity of measurement. On the other hand, and perhaps more importantly, they have to be considered with regard to indicator use and its effects on science and society.
An analogous behavioural adaptation can be found in the development of search engine optimization (SEO). Search engine rankings share one core principle with citation-based indicators: relevance (quality) is understood to be measurable through incoming links (citations) to a website (publication). The discourse on SEO, and on which strategies are to be regarded as white hat or black hat SEO, led to a more or less stable set of 'allowed' activities, approved by the search engine monopolist Google.
Citation-based performance indicators are likewise the target of optimization activities. One such activity, believed to be undertaken by scientific journals, is the establishment of 'citation cartels': groups of journals that agree to cite each other mutually in order to boost their indicators. This form of strategic citation is widely regarded as morally corrupt. Beyond this specific type, there is an ongoing debate about which citation strategies are to be regarded as scientific misconduct and therefore as threatening the 'fairness' of performance indicators.
In our talk, we will outline the discourse on strategic citation with examples that voice concerns, label certain strategies as unethical, or demand the detection and punishment of questionable behaviour. We especially point out that the demand to curb strategic citation is often addressed to the publication database provider Thomson Reuters. Proceeding from this observation, we open up a new perspective on power structures in the science system.