
Altmetrics


Altmetrics for Librarians: Pros and Cons. While digital libraries, institutional repositories, journals, and databases provide an open opportunity to download scholarly research, “alternative metrics,” or altmetrics, supply usage statistics that can be very useful in gauging an article’s popularity and its readership potential.[1] “We may be witnessing a tipping point in collaboration, faster access, and new opportunities.”[2] Altmetrics allow librarians to provide their users with statistics about academic articles more quickly.

Altmetrics can tell us how many times an article, website, software package, or blog has been viewed, downloaded, reused, shared, and cited.[3] It has been noted that there is a correlation between the number of online views and downloads of an article and the number of times that article will be cited in future research.[4] Among the pros of altmetrics: they are designed to be easy to use, and altmetric tools, both open source and proprietary, provide economic incentives for use and greater data granularity.

Plum Analytics | Metrics. Plum Analytics is building the next generation of research metrics for scholarly research. Metrics are captured and correlated at the group / collection level (e.g., lab, department, museum, journal, etc.). We categorize metrics into five separate types: Usage, Captures, Mentions, Social Media, and Citations.

Examples of each type are:

- Usage: downloads, views, book holdings, ILL, document delivery
- Captures: favorites, bookmarks, saves, readers, groups, watchers
- Mentions: blog posts, news stories, Wikipedia articles, comments, reviews
- Social media: tweets, +1's, likes, shares, ratings
- Citations: PubMed, Scopus, patents

We gather metrics around what we call artifacts.
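As a rough sketch of how these five metric types might be represented and rolled up at the group level (the function, field names, identifiers, and counts below are hypothetical illustrations, not Plum Analytics’ actual API or data):

```python
from collections import defaultdict

# The five metric types described above.
METRIC_TYPES = {"usage", "captures", "mentions", "social_media", "citations"}

def aggregate_metrics(artifacts):
    """Sum each metric type across a group of artifacts
    (e.g. a lab's, department's, or journal's research outputs)."""
    totals = defaultdict(int)
    for artifact in artifacts:
        for metric_type, count in artifact["metrics"].items():
            if metric_type not in METRIC_TYPES:
                raise ValueError(f"unknown metric type: {metric_type}")
            totals[metric_type] += count
    return dict(totals)

# Two hypothetical artifacts: a journal article and a dataset.
# Note the dataset still counts, even though it is not a paper.
article = {"id": "doi:10.1234/example",
           "metrics": {"usage": 420, "captures": 35, "mentions": 4,
                       "social_media": 61, "citations": 12}}
dataset = {"id": "hdl:1839/example-data",
           "metrics": {"usage": 180, "captures": 9, "citations": 3}}

print(aggregate_metrics([article, dataset]))
# → {'usage': 600, 'captures': 44, 'mentions': 4, 'social_media': 61, 'citations': 15}
```

The point of the sketch is that artifacts need not report every metric type, and aggregation happens per type rather than as one flattened score.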

Artifacts are more than just the journal articles that a researcher authors. Artifacts are any research output that is available online. We aggregate artifact- and author-level metrics into a researcher graph.

A manifesto – altmetrics.org.

31 Flavors of Research Impact through #altmetrics. The impact of a research paper has a flavour. It might be champagne: a titillating discussion piece of the week. Or maybe it is a dark-chocolate mainstay of the field. Strawberry: a great methods contribution. Licorice: controversial. Bubblegum: a hit in the classrooms. Low-fat vanilla: not very creamy, but it fills a need. There probably aren’t 31 clear flavours of research impact, but to use the metaphor at all we have to be able to tell the flavours apart. We need more dimensions to distinguish the flavour clusters from each other. Unfortunately, we can’t accurately derive the meaning of these activities just by thinking about them.
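One empirical way to tell flavours apart is to cluster papers by their metric profiles instead of reasoning from the armchair. The sketch below runs a minimal k-means over hypothetical (bookmark saves, tweets) counts; the data, the two dimensions, and the choice of k-means are all illustrative assumptions, not the authors’ actual analysis.

```python
def kmeans(points, k, iterations=20):
    """Minimal k-means over equal-length tuples of numbers."""
    centroids = list(points[:k])  # simple deterministic initialisation
    for _ in range(iterations):
        clusters = [[] for _ in range(k)]
        for p in points:
            # Assign each point to its nearest centroid (squared distance).
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2
                                      for a, b in zip(p, centroids[c])))
            clusters[i].append(p)
        # Recompute each centroid as the mean of its cluster.
        centroids = [
            tuple(sum(dim) / len(cluster) for dim in zip(*cluster))
            if cluster else centroids[i]
            for i, cluster in enumerate(clusters)
        ]
    return centroids, clusters

# Hypothetical (saves, tweets) profiles: bookmarked mainstays of a field
# versus social-media hits — two different "flavours" of impact.
papers = [(40, 2), (35, 1), (50, 3), (2, 80), (1, 95), (3, 70)]
centroids, clusters = kmeans(papers, k=2)
```

With more dimensions (downloads, Mendeley readers, citations, blog mentions), the same idea separates more flavours, which is exactly the “more dimensions” the paragraph above asks for.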

Flavours are important for research outputs other than just papers, too. Below is a concrete example of impact flavour, based on analysis that Jason Priem (@jasonpriem), Brad Hemminger, and I are in the midst of writing up for the soon-to-be-launched altmetrics Collection at PLoS ONE. Here is a taste of the clusters we found.

Altmetrics – Alternative Metrics for Articles (think: impact 2.0) « Kresge Physical Sciences Library. March 30, 2012, by Jane Quigley. Altmetrics are metrics that attempt to capture the impact of scholarly publications as reflected in non-traditional media: social media like blogs, Twitter, and Mendeley. Traditional works of published scholarship (articles, journals, and scholarly monographs) have citation metrics, such as impact factors, that reflect their impact in specific, carefully defined venues: the number of times cited by other published articles, for example.

Increasingly, however, published works of scholarship are causing ripples in social media (scholarly blogs, Twitter, and the like) that can be tracked and quantified: altmetrics. Moreover, altmetrics can be applied to non-traditional works of scholarship as well as traditional ones: datasets in repositories, software, or slide sets and other curricular materials.

Altmetrics and the Future of Science | Wired Science. Last week, I went to NetSci, the big yearly network science conference, hosted at Northwestern University this year, and I had a great time. But in addition to enjoying the conference itself, this year I got to attend two events that bookended my time there. Early in the week, I went to one of the satellite workshops, entitled Networks: the Science of Science and Innovation, and at the end of the week I went to the second annual Altmetrics workshop.

And they were both great! Both touched on similar topics: how to quantify and measure scientific progress. The first workshop explored the science of science, also known as scientometrics, and networks were used to wonderful effect: the network of relationships between advisers and students (and their accompanying institutions) allows us to determine how universities should be ranked.
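The idea of ranking institutions from an adviser–student network can be sketched with a toy PageRank. Everything here is a stand-in: the university names, the hiring edges, and the choice of PageRank itself (the workshop research may well use a different ranking method). Under the assumption made here, an edge points from the hiring institution to the doctoral institution, so a university accumulates rank by placing its graduates.

```python
def pagerank(graph, damping=0.85, iterations=100):
    """Toy PageRank over a directed graph {node: [nodes it links to]}."""
    n = len(graph)
    rank = {node: 1.0 / n for node in graph}
    for _ in range(iterations):
        new = {node: (1 - damping) / n for node in graph}
        for node, targets in graph.items():
            if not targets:
                continue
            # A node passes its rank, split evenly, to everything it links to.
            share = damping * rank[node] / len(targets)
            for t in targets:
                new[t] += share
        rank = new
    return rank

# Hypothetical hiring data: an edge hirer -> doctoral institution,
# i.e. the hirer "endorses" the university whose PhD it hired.
hires = {
    "U1": ["U2"],
    "U2": ["U1"],
    "U3": ["U1", "U2"],
    "U4": ["U1"],
}
rank = pagerank(hires)
ordering = sorted(rank, key=rank.get, reverse=True)
# U1 places graduates everywhere, so it comes out on top.
```

The design choice worth noting: rank flows toward institutions whose PhDs are hired widely, which is why the edges are oriented from hirer to doctoral institution rather than the other way around.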

Many other research results were detailed, from other ways to examine academic genealogy to how to determine what your next research project should be.

Jason Priem. Purdue altmetrics talk.

Altmetrics — Replacing the Impact Factor Is Not the Only Point. There are other important value metrics beyond the strength of a journal. This might come as a shock to some STEM publishers, who have flourished or floundered based on the performance of the impact-factor rankings published each June. While the value of the journal as a container is an important metric, and one that needs to continue, the rapidly evolving alternative metrics (altmetrics) movement is concerned with more than replacing traditional journal assessment metrics. Like much of life these days, a key focus of our community has been on those qualities that are measured to ensure one “passes the test.” The coin of the realm in scholarly communications has been citations, in particular journal citations, and that is the primary metric against which scholarly output has been measured.

Another “coin” for scholarly monographs has been the imprimatur of the publisher of one’s book.