The Impact Factor and Its Discontents: a reading list on the controversies and shortcomings of the Journal Impact Factor. How to use Harzing’s ‘Publish or Perish’ software to assess citations: a step-by-step guide. In a recent blog post on the need for a digital census of academic research, Patrick Dunleavy argued that the ‘Publish or Perish’ software, developed by Professor Anne-Wil Harzing of the University of Melbourne and based on Google Scholar data, offers an exceptionally easy way for academics to record details of their publications and citation counts.
An academic with a reasonably distinctive name should be able to compile this report in less than half an hour. Here we present a simple ‘how-to’ guide to using the software. Step 1: Download the software for free from www.harzing.com/pop.htm; the program installs onto your PC. Step 2: Launch the application from your desktop. Step 3: This step has three elements: (a) enter the author name you want as initial and surname (in this case we used “J Tinkler”); (b) tick the boxes at the top right for the disciplines you want covered; (c) click the Lookup button at the top right. Twitter and traditional bibliometrics are separate but complementary aspects of research impact. In a recent study, Haustein and colleagues found a weak correlation between the number of times a paper is tweeted about and its subsequent citations. But the study also found that papers from 2012 were tweeted about ten times more than papers from 2010.
Emily Darling discusses the results and finds that while altmetrics may do a poor job of predicting the traditional success of scholarly articles, it is becoming increasingly apparent that research can contribute to both scientific and social outcomes. Do open access articles in economics have a citation advantage? Posted: July 9th, 2014 | Author: Sven | Filed under: found on the net, journals | Tags: academic publishing, open access | 1 Comment » The economists Klaus Wohlrabe and Daniel Birkmaier (both of the Ifo Institute for Economic Research in Munich) have published a new working paper in which they analyse the effect of open access publishing in economics on citations.
Time to discard the metric that decides how science is rated. Scientists, like other professionals, need ways to evaluate themselves and their colleagues.
These evaluations are necessary for better everyday management: hiring, promotions, awarding grants and so on. One evaluation metric has come to dominate these decisions, and it is doing more harm than good. This metric, called the journal impact factor (or just impact factor) and released annually, counts the average number of times a particular journal’s articles are cited by other scientists in subsequent publications over a certain period of time. The upshot is that it creates a hierarchy among journals, and scientists vie to get their research published in a journal with a higher impact factor, in the hope of advancing their careers. The trouble is that the impact factor of the journals in which researchers publish is a poor surrogate for measuring an individual researcher’s accomplishments.
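The calculation described above (average citations to a journal's recent articles over a fixed window) can be sketched in a few lines. This follows the common two-year window; the function name and the numbers are hypothetical, chosen only to illustrate the arithmetic:

```python
# Sketch of the two-year Journal Impact Factor arithmetic described above.
# All numbers are hypothetical; this is not an official JCR calculation.

def impact_factor(citations_this_year: int, citable_items_prev_two_years: int) -> float:
    """JIF for year Y = citations received in Y to items published in Y-1 and Y-2,
    divided by the number of citable items published in Y-1 and Y-2."""
    return citations_this_year / citable_items_prev_two_years

# Hypothetical journal: 1200 citations in 2014 to its 2012-2013 articles,
# of which there were 400 citable items.
print(impact_factor(1200, 400))  # → 3.0
```

Note what the average hides: citation counts within a journal are highly skewed, so a few heavily cited papers can lift the impact factor for every article published alongside them, which is exactly why it is a poor surrogate for an individual's accomplishments.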
We can also take heart from real progress in several areas.
Bibliometrics & the scientific output of research units - D... The library at the heart of universities’ bibliometric activity: good idea or bad? During the 2000s, Swedish libraries felt the need to redefine their missions and roles in relation to the community they served. That is what we learn from the article "How implementation of bibliometric practice affects the role of academic libraries" by Fredrick Aström (Lund University Libraries) and Joacim Hansson (Linnaeus University), published in the Journal of Librarianship and Information Science but available in Open Access from the Lund University repository.
Four reasons to stop caring so much about the h-index. The h-index attempts to measure the productivity and impact of the published work of a scholar.
But reducing scholarly work to a single number in this way has significant limitations. Welcome to the Ranking Web of Repositories. Fabrica. Scimago Journal & Country Rank. Research evaluation indicators: from the impact factor to the h-index. Evaluating the quality of scientific research is essential in a society where innovation and technical progress depend in part on academic research.
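The h-index mentioned above has a simple definition: the largest h such that the scholar has h papers with at least h citations each. A minimal sketch of that computation (the function name and sample citation counts are invented for illustration):

```python
# Compute the h-index: the largest h such that h papers have >= h citations each.

def h_index(citations: list[int]) -> int:
    h = 0
    # Sort citation counts from highest to lowest, then walk down the list:
    # the i-th paper (1-indexed) contributes to h only if it has >= i citations.
    for i, c in enumerate(sorted(citations, reverse=True), start=1):
        if c >= i:
            h = i
        else:
            break
    return h

# Hypothetical scholar with five papers:
print(h_index([10, 8, 5, 4, 3]))  # → 4 (four papers have at least 4 citations)
```

The example also illustrates one of the "four reasons" above: the scholar's 10-citation paper and 4-citation paper count identically toward the index, so a single number flattens very different publication records.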
Anne-Wil Harzing, University of Melbourne Web: www.harzing.com Email: email@example.com © Copyright 2007-2008 Anne-Wil Harzing. All rights reserved. Eighth version, 20 December 2008. Introduction Instead of the Thomson ISI Web of Science, Publish or Perish uses Google Scholar data to calculate its various statistics. Publish or Perish - Anne-Wil Harzing. Are you applying for tenure, promotion or a new job?
Do you want to include evidence of the impact of your research? Is your work cited in journals which are not ISI listed? Then you might want to try Publish or Perish, designed to help individual academics to present their case for research impact to its best advantage. Version: 4.17.0 (18 June 2015) Journal Citation Reports. Journal Citation Reports® offers a systematic, objective means to critically evaluate the world's leading journals, with quantifiable, statistical information based on citation data.
By compiling articles' cited references, JCR helps to measure research influence and impact at the journal and category levels, and shows the relationship between citing and cited journals. Now offered on the InCites platform, JCR allows you to access and explore the underlying data that informs JCR metrics from article keywords to citation thresholds.
This expanded capability lets you conduct analysis and comparisons of citation relationships across journals and categories over time. You can even localize your analysis to understand publishing practices within your organization. The recognized authority for evaluating journals, JCR presents quantitative data that supports a systematic, objective review of the world’s leading journals.