
Impact


MYSCIENCENEWS. Assessing the quality of scientific research is essential in a society where innovation and technical progress depend in part on academic research. Indicators such as the impact factor, the Eigenfactor, and the h-index play a useful role in this process. But they cannot be the sole evaluation criterion, and they are very often misused. It is therefore important to understand how they are calculated and to know their limits and the alternatives. Impact factor, h-index: what are they? How should research be evaluated? There is a ranking of scientific journals based on the journals' impact factor (IF). The Eigenfactor is a similar indicator that quantifies a journal's influence over five years.
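The h-index mentioned above has a simple operational definition: a researcher has index h if h of their papers each have at least h citations. A minimal sketch in Python, with made-up citation counts for illustration:

```python
def h_index(citations):
    """Largest h such that the author has h papers with
    at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank  # the paper at this rank still clears the bar
        else:
            break
    return h

# Five papers cited 10, 8, 5, 4 and 3 times: four papers have
# at least 4 citations, but not five with at least 5, so h = 4.
print(h_index([10, 8, 5, 4, 3]))  # 4
```

The same counting logic underlies variants such as the g-index; only the threshold rule changes.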

Some specialists criticize these indicators for the narrowness of the parameters deemed significant for a journal's importance, and implementing such a definition proves complex. Other indicators exist. Learn more: SciCombinator. Scholars are quickly moving toward a universe of web-native communication. Jason Priem, Judit Bar-Ilan, Stefanie Haustein, Isabella Peters, Hadas Shema, and Jens Terliesner get a sense of how established the academic presence is online, and how an individual academic's online profile stands up to traditional measurements of numbers of publications and citations.

Traditionally, scholarly impact and visibility have been measured by counting publications and citations in the scholarly literature. However, scholars are increasingly also visible on the Web, establishing presences in a growing variety of social ecosystems. Examining this broader set of altmetrics could establish a more comprehensive image of influence, uncovering authors' weight in the informal, invisible college: their "scientific 'street cred'" [pdf] (Cronin, 2001). We also delved deeper by looking at the publications of our sampled scholars. Only 28 per cent of articles were bookmarked in CiteULike, suggesting that Mendeley is cementing its dominance in the online reference-manager space. [1202.2461] How the Scientific Community Reacts to Newly Submitted Preprints: Article Downloads, Twitter Mentions, and Citations.

As Scholarship Goes Digital, Academics Seek New Ways to Measure Their Impact - Technology. By Jennifer Howard. In academe, the game of how to win friends and influence people is serious business. Administrators and grant makers want proof that a researcher's work has life beyond the library or the lab. But the current system of measuring scholarly influence doesn't reflect the way many researchers work in an environment driven more and more by the social Web.

Research that used to take months or years to reach readers can now find them almost instantly via blogs and Twitter. That kind of activity escapes traditional metrics like the impact factor, which indicates how often a journal is cited, not how its articles are actually being consumed by readers. An approach called altmetrics, short for alternative metrics, aims to measure Web-driven scholarly interactions, such as how often research is tweeted, blogged about, or bookmarked.

Metrics Remixed: The Times They Are a Webby | InTechWeb Blog. Altmetrics correlations: we need to think about an n-dimensional impact space (taken from a slide presentation by Jason Priem). "See the mind at work and see the mind in work…" (from the REMIXTHEBOOK review). "Until quite recently the complete justification for even the most complex scientific facts could be understood by a single person," Michael Nielsen reminds us, before describing science today as graspable only beyond individual understanding.

If one person can no longer understand the ways of science, then, perhaps, "a group of people" may "collectively claim to understand all the separate pieces that go into the discovery, and how those pieces fit together." How do we correlate the members of this group? Are some members closer to each other? How can they collectively claim what they collectively communicate, and what pieces hold them together? This is what the altmetrics group tries to explore, and what it teaches the machine to seek. Communicating is Doing the Science. To Each His Flavor. Time to Listen In.

Tracking Scholarly Influence Beyond the Impact Factor - Wired Campus. “A very blunt instrument” is how Peter Binfield of the Public Library of Science describes the impact factor. It’s handy for librarians and others who make decisions about which journals to buy but not so dandy for evaluating specific papers and researchers.

Mr. Binfield is the publisher of the journal PLoS One and the PLoS community journals, such as PLoS Computational Biology. PLoS works on an open-access model; the impact factor doesn't reign supreme there as it does at so many subscription-based operations. Instead, the publisher emphasizes a variety of article-level metrics: usage statistics and citations, certainly, but also how often an article is blogged about or bookmarked and what readers and media outlets are saying about it. The approach is part of a broader trend toward altmetrics, alternative ways of measuring scholarly influence.
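As a rough illustration of what aggregating "article-level metrics" involves, here is a sketch that tallies per-source event counts for each article. The DOI and the counts are invented for the example; the real PLoS service exposes its own API and data model, not this structure:

```python
# Sketch: roll up raw metric events (citations, tweets, bookmarks)
# into per-article, per-source totals. All data here is hypothetical.
from collections import defaultdict

def article_metrics(events):
    """events: iterable of (doi, source, count) tuples."""
    totals = defaultdict(lambda: defaultdict(int))
    for doi, source, count in events:
        totals[doi][source] += count
    return totals

events = [
    ("10.1371/journal.pone.0000000", "citations", 12),
    ("10.1371/journal.pone.0000000", "tweets", 25),
    ("10.1371/journal.pone.0000000", "tweets", 15),
    ("10.1371/journal.pone.0000000", "bookmarks", 7),
]

m = article_metrics(events)
print(dict(m["10.1371/journal.pone.0000000"]))
# e.g. {'citations': 12, 'tweets': 40, 'bookmarks': 7}
```

The point of the "metrics tab" model is that each source stays visible separately, rather than being collapsed into one journal-level number.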

Go to any PLoS article online and you will find a "metrics" tab at the top of the screen.

The effect of open access and downloads ('hits') on citation impact: a bibliography of studies. Last updated 25 June 2013; first posted 15 September 2004.

If you have any additions, corrections or comments, please tweet @stevehit #oaimpact or email Steve Hitchcock. What others say about this bibliography: "a brilliant source of articles" (on the impact of Open Access material), Christine Stohn, Ex Libris Initiatives (3 September 2012); "a major bibliography on this debate", Alan M Johnson, Charting a Course for a Successful Research Career: A Guide for Early Career Researchers, 2nd edition (April 2011), ch. 9, Where to Publish, p. 50.

Altmetrics

1205.5611.pdf. Science Intelligence and InfoPros Blog » Social Media are important for the scientific impact of Academic Papers. The demise of the Impact Factor: the strength of the relationship between citation rates and IF is down to levels last seen 40 years ago. Jobs, grants, prestige and career advancement are all partially based on an admittedly flawed concept: the journal Impact Factor. Impact factors have been becoming increasingly meaningless since 1991, writes George Lozano, who finds that the variance of papers' citation rates around their journals' IF has been rising steadily. Thomson Reuters assigns most journals a yearly Impact Factor (IF), defined as the mean citation rate during that year of the papers published in that journal during the previous two years.
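Under the definition above, the IF is just a ratio, which is easy to see with a worked example (the numbers are hypothetical):

```python
def impact_factor(citations_this_year, citable_items_prev_two_years):
    """IF for year Y: citations received in Y to items published in
    Y-1 and Y-2, divided by the number of citable items published
    in Y-1 and Y-2."""
    return citations_this_year / citable_items_prev_two_years

# Hypothetical journal: 120 citable items over the prior two years,
# cited 300 times this year.
print(impact_factor(300, 120))  # 2.5
```

Because it is a mean over a skewed citation distribution, a handful of highly cited papers can dominate the number, which is one root of the criticisms quoted in this section.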

The IF has been repeatedly criticized for many well-known and openly acknowledged reasons. Yet editors continue to try to increase their journals' IFs, and researchers continue to try to publish their work in the journals with the highest IF, which creates the perception of a mutually reinforcing measure of quality. Impact factors were developed in the mid-20th century to help American university libraries with their journal purchasing decisions. Total-impact: uncover the invisible impact of research. Peer review: why each research community should define the metrics that matter to them | Higher Education Network | Guardian Professional. One of the challenges faced by research funders, both public and private, is how to maximise the amount of work being done on important problems without institutionalising any particular dogma that may suppress novel ideas. The most common arrangement is to fund good researchers while refraining from being overly prescriptive about outcomes; in turn, the way to identify good researchers has been to look at the publications that follow the research they fund.

In 1955, Eugene Garfield, the founder of the Institute for Scientific Information (now part of Thomson Reuters), introduced a means of identifying influential journals according to the number of times work appearing in them was cited. This process results in a number called the impact factor (IF), and it is built on the assumption that those whose work has been the most influential will be the most cited. One problem is the widespread misuse of the IF to compare individual people. Victor Henning is CEO and co-founder of Mendeley.