The stakes of open science: a look back at the sixth "open access" days (1/4) #OAweek – Le Carreau de la BULAC. On 12, 13 and 14 October, the Couperin consortium held the sixth edition of its "Open Access Days".
Entitled "Open science on the move" ("La science ouverte en marche"), these days aimed to take stock of the progress of open access to scientific information in France, on the eve of the forthcoming digital law (Loi sur le numérique), but above all to propose new avenues for developing open science. Particularly stimulating, they brought together more than 30 speakers from the various professions involved in the advent of open science: researchers, librarians, IT specialists, policymakers and more. They made it possible to discover new initiatives, to question the positions of institutions and policymakers, to compare the different "open access" models, and to assess the impact of the ongoing transformations on researchers' working and evaluation methods.
Pendlebury White Paper. Is there a problem with academic integrity? For many academics today, research is not about pushing intellectual boundaries.
It is not about investigating a fascinating issue so much as it is about churning out publications, demonstrating impact and generating revenue in order to meet the performance targets upon which institutional reputation and individual careers depend. The temptation to cut corners is immense. Tricks include getting your name on a paper that you contributed little towards, or “salami-slicing” the same research across several publications. More seriously, some researchers falsify – misrepresent – their data, or even fabricate them entirely. Some universities tacitly encourage such behaviour and the boundary between academic integrity and malpractice is becoming blurred. The absence of shared understandings and the risks to career and reputation make the nature and extent of academic misconduct a delicate issue to investigate.
Amplifr. Schedule and analyse your social media posts – Les outils de la veille. JournalBase. Comparing international scientific databases in the humanities and social sciences (SHS). We thank Marc Guichard (CNRS-INIST) for his intervention with Thomson Reuters, which allowed us to obtain the indexes as Excel files; Elsevier, which makes the general Scopus index available as an Excel file on its website; and Sonia Christon (documentalist, INTD intern) and Gilles Liévin (IT specialist) for their collaboration.
Metrics and measurement - American Press Institute. The American Press Institute offers publishers software tools and guidance to help them measure their content in new ways and form data-driven strategies.
Read more about our Metrics for News program here, and see other recent insights about metrics and measurement below. Creating a successful metrics strategy comes down to one thing, Alexa Roman writes: predicting how much money you're going to make. Or, if money isn't the goal, replace "money" with your own currency of what matters most. Roman outlines five steps for executing a metrics strategy focused on money (or some other form of value). […] Media organizations tended to give high priority to Brexit coverage early on, but Chartbeat's data on the thousands of Brexit stories shows that this didn't translate into attention from readers until much later.
Bookmarklet – Altmetric. How to reach a wider audience for your research. In today’s age of knowledge abundance, the scholarly community is turning its attention to the use of social media channels and other online platforms.
Scholars have been increasingly integrating these tools into their everyday work, creating enormous potential to capture the digital traces of their research. Not surprisingly, then, in recent years academics have shown a growing interest in non-traditional ways of evaluating their scholarly "impact". These altmetrics (short for alternative metrics) allow researchers to gauge the impact and reach of their research on the social web, beyond traditional citation counting. Here we offer practical advice on how to make the most of the opportunities provided by altmetrics. Much of this advice overlaps with other tips on how to measure your research impact, but only because tracking and connecting with your audience cuts both ways: it must be able to find you, and you must be able to find it. CRIS2016 Final Programme (with poster list and session chair hyperlinks). Benchmarking with SciVal in Scholarly Communication and Research Services.
Benchmarking is the process of evaluating the performance of one entity in relation to other, similar entities using standard measures. In the academic sphere, when we benchmark, we are evaluating an individual researcher's or a group of researchers' scholarly performance using bibliometric measures.
Bibliometric measures are traditionally citation-based, measuring the circulation of an idea through formal communication outlets (journals) by tracking how often, where, and by whom a work is referenced. Though anchored in print, citation-based metrics are an established and familiar way to determine how well a work is received within its discipline. Bibliometrics. Publish or Perish. Anne-Wil Harzing (updated Sun 16 Oct 2016). Donations. The development of the Publish or Perish software is a volunteer effort that has been ongoing since 2006.
Download and use of Publish or Perish is and will remain free (gratis), but donations toward the costs of hosting, bandwidth, and software development are appreciated. Your donation helps to support the further development of Publish or Perish for new data sources and additional features.
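Among the citation-based measures that tools like Publish or Perish compute from raw citation counts, the best known is the h-index: the largest number h such that h of an author's papers each have at least h citations. A minimal sketch of that calculation (the function name and the sample citation counts below are our own, purely illustrative):

```python
def h_index(citations):
    """Return the largest h such that h papers have at least h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank  # at least `rank` papers have >= `rank` citations
        else:
            break
    return h

# Hypothetical citation counts for one author's papers:
papers = [25, 8, 5, 3, 3, 2, 0]
print(h_index(papers))  # 3 — three papers have at least 3 citations each
```

Because the metric depends only on a sorted list of counts, the same sketch applies whether the counts come from Google Scholar (as in Publish or Perish), Scopus, or Web of Science; what differs between those sources is coverage, not the formula.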