The literature on bibliometrics and scientometrics has long been concerned with how the actual output and impact of scholars can be ascertained. Because a situation in which scholars were judged mainly on subjective criteria was deemed undesirable, scientometrics has developed new strategies based on objective criteria, such as the number of articles published in ISI-rated journals, the number of citations to an author in ISI-rated journals, the number of citations on Google Scholar, the H-index, the crown indicator, and so on. Recently our faculty introduced Personal Metis, a tracking system for mapping one's output, which has to be maintained by the researchers themselves. Because Personal Metis is connected to RePub, researchers can keep their output up to date and upload their papers so as to make them widely available through the internet. On the one hand, it is welcome that researchers themselves have the autonomy to be responsible for their research output; on the other hand, it makes one wonder about the checks on this system. For instance, are all publications which a researcher claims to be A-rated papers (top-quality, internationally peer-reviewed papers) counted as such, or are there checks and balances to prevent employees from submitting even the most obscure papers as top-quality papers?

Article rating, Approved by the Nederlandse Vereniging van Repositories, H-index, Impact factor, Social change, social processes and social conflicts, Sociology
TJOS - Society for Obscure Sociology
hdl.handle.net/1765/18241
Centre for Rotterdam Cultural Sociology (CROCUS)
Accepted manuscript, to be published on pp. 112–116
Department of Sociology

Achterberg, P.H.J. (2010). Reassessing the validity of research assessments: A social experiment. Centre for Rotterdam Cultural Sociology (CROCUS). TJOS - Society for Obscure Sociology. Retrieved from http://hdl.handle.net/1765/18241