ImpactStory

Last Update

  • 3 March 2016

Introduction

See also Altmetrics | Author impact metrics | E-Science | FigShare | Open data | Scopus vs. Web of Science | Webometrics

ImpactStory (formerly called Total-Impact) is a research-output measurement tool with the motto: share the full story of your research impact. The tool helps academics (and students) aggregate their total scholarly impact by gathering engagement data from social media tools and other websites (see the ImpactStory blog). Specifically, ImpactStory aims to track the buzz and popularity generated by articles, blogposts, datasets, slides, preprints, videos and other scholarly outputs via bookmarking services, reference managers such as Mendeley and Zotero, slide-sharing tools such as Slideshare, and other knowledge-sharing tools. ImpactStory is the brainchild of Heather Piwowar and Jason Priem, two young American academics who have written about the importance of alternative metrics and measuring scholarly production in the age of social media. According to their altmetrics manifesto, "...scholars are moving their everyday work to the web. Online reference managers Zotero and Mendeley each claim to store over 40 million articles (making them substantially larger than PubMed); [in addition], as many as a third of scholars are on Twitter, and a growing number tend scholarly blogs...." Hence Piwowar and Priem's research into metrics at the level of individual knowledge objects and micro-interactions on the web.

Metrics measured at the level of an individual article or web interaction are one form of altmetrics (or "total metrics"). Generally speaking, altmetrics refers to usage metrics such as views or mentions on social media (social metrics). The Journal of Medical Internet Research (JMIR) started to publish article-level metrics some years ago, including views and tweets ("tweetations"). Tweets were later found to predict highly-cited articles, leading Eysenbach to propose a "twimpact factor", i.e., the number of tweets within the first 7 days of publication, and a "twindex", the rank percentile of an article's twimpact factor compared to similar articles within the same journal (Eysenbach, 2011).
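
As a rough worked illustration (not Eysenbach's actual implementation), the twimpact factor and twindex described above can be computed as follows; the tweet timestamps, counts and journal data here are hypothetical.

```
from bisect import bisect_right

def twimpact_factor(tweet_times, publication_time, window_days=7):
    """Number of tweets within the first `window_days` after publication (times in seconds)."""
    window_end = publication_time + window_days * 24 * 3600
    return sum(1 for t in tweet_times if publication_time <= t <= window_end)

def twindex(article_twimpact, journal_twimpacts):
    """Rank percentile (0-100) of an article's twimpact factor among
    articles from the same journal."""
    ranked = sorted(journal_twimpacts)
    return 100.0 * bisect_right(ranked, article_twimpact) / len(ranked)

# Hypothetical data: twimpact factors of ten articles in one journal,
# with the article of interest tweeted 25 times in its first week.
journal = [2, 5, 7, 12, 25, 40, 3, 0, 9, 25]
print(twindex(25, journal))  # 90.0: the article sits in the 90th percentile
```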

Rationale

  • Research data may need to be harvested from a variety of places, such as repositories affiliated with institutions, funding organizations, government departments, professional associations and elsewhere. Locating and using this multidisciplinary data may require additional tools and ways to find and manipulate it. For a more complete view of scholarly output, look at the number of comments in social media spaces; this gives more confidence that the total impact of research outputs within your institution, beyond published content, is accounted for.

Pros & cons

  • ImpactStory provides a useful (if somewhat controversial) alternative to traditional impact factors (see the Swets blog post, Altmetrics for Librarians and Institutions)
  • As digital scholarly publishing models evolve, Google Scholar, the Web of Science and Scopus will increasingly provide only a partial view of total impact
  • As a newer web application, ImpactStory aggregates impact data from many alternative sources and displays it in a single report; in other words, it seeks to reveal how many citations, site visits, clicks or downloads your knowledge objects have received
  • ImpactStory does not provide a single number to describe total impact; it provides a report of a scholar's impact based on overall engagement with the knowledge objects the scholar has created, be they blogposts or full research reports with accompanying datasets (see the sketch after this list)
  • While Mendeley and Zotero are increasingly used by academics in research, a high percentage still use tools such as EndNote Web and RefWorks
  • Prominent names in bibliometrics and webometrics have spoken about the need to develop new alternative metrics for scholarly impact; however, others have questioned the value of blogposts, tweets and mentions on social media
  • Social media tools can provide real-time clues about what is being discussed on the web, but does that convey scholarly impact, or is it merely information that aids in filtering other information?
  • As with bibliometrics and the work of Eugene Garfield from the 1960s, academic librarians must determine the quality of information sources (and their value in the scholarly enterprise)
  • In altmetrics, we must question the value of counting micro-interactions on the social web, which are mostly superficial "thought balloons", at a time when many academics themselves (especially in the sciences) do not use social media; this would change if we could determine whether a tweeted article or blogpost was actually clicked on, opened and read
  • In other words, the quality of engagement is as important as the metrics (see 'Altmetrics': quality of engagement matters as much as retweets).
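
To make the idea of a per-source report (rather than a single composite number) concrete, below is a minimal sketch; the outputs, source names and counts are invented, and ImpactStory's real data model and API are not shown.

```
from collections import defaultdict

# Hypothetical engagement events: (scholarly output, source, count).
# A real aggregator would pull these from provider APIs; here they are hard-coded.
events = [
    ("doi:10.1234/example-article", "mendeley_readers", 57),
    ("doi:10.1234/example-article", "tweets", 25),
    ("doi:10.1234/example-article", "citations", 4),
    ("slides:intro-to-altmetrics", "slide_views", 812),
    ("slides:intro-to-altmetrics", "downloads", 33),
    ("dataset:example-0001", "views", 140),
]

def build_report(events):
    """Group engagement counts by output and by source; no composite score is computed."""
    report = defaultdict(dict)
    for output, source, count in events:
        report[output][source] = report[output].get(source, 0) + count
    return dict(report)

for output, metrics in build_report(events).items():
    print(output)
    for source, count in sorted(metrics.items()):
        print(f"  {source}: {count}")
```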

Similar & related projects

  • Altmetric (Altmetric Explorer) aims to track activity around scholarly literature; the Altmetric score is a quantitative measure of the quality and quantity of attention given to scholarly articles
  • Bookmetrix gives an overview of the reach, usage and readership of your book or chapter through various book-level and chapter-level metrics
  • CitedIn is a web tool for exploring where a scholar's papers are cited or mentioned on the web (in blogs, databases and Wikipedia); it can also track resources that cite a given PubMed identifier
  • CrowdoMeter is a web service that displays tweets linking to scientific articles, and allows users to add semantic information using a subset of the Citation Typing Ontology (CiTO)
  • FigShare is a web-based repository designed to make research outputs shareable, citable (with a DOI) and viewable in the browser; Figshare's platform is easy to use and helps researchers get credit for their research (both positive and negative results)
  • Hiptype is a platform for data-driven publishing
  • Newsflo for universities is ideal for tracking the media profiles of university academics and departments
  • Open Researcher & Contributor ID (ORCID) aims to solve name ambiguity in scholarly communications by creating a registry of persistent unique identifiers for individual researchers and an open and transparent linking mechanism between ORCID, other ID schemes, and research objects such as publications, grants, and patents
  • PaperCritic offers researchers a way to monitor feedback about their scientific work, and a way to review the work of others in an open, transparent environment
  • Plum Analytics gathers metrics across five categories (usage, mentions, captures, social media and citations) for the research outputs it calls artifacts, which encompass more than just the journal articles that a researcher authors
  • ReaderMeter is a tool built on Mendeley data that allows searching for authors and obtaining reader-based metrics (HR-index, GR-index) derived from the number of readers of their papers (a minimal sketch of a reader-based h-index follows this list)
  • Webometric Analyst analyzes the web impact of documents or web sites and creates network diagrams of collections of web sites, as well as networks and time-series analyses of social web sites (e.g., YouTube, Twitter) and some specialist web sites (e.g., Google Books, Mendeley).
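
The HR-index mentioned in the ReaderMeter entry above is usually described as an h-index computed over Mendeley reader counts rather than citations; the sketch below illustrates that idea with invented reader counts and is not ReaderMeter's actual code.

```
def reader_h_index(reader_counts):
    """Largest h such that h of an author's papers each have at least h readers."""
    counts = sorted(reader_counts, reverse=True)
    h = 0
    for rank, readers in enumerate(counts, start=1):
        if readers >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical Mendeley reader counts for one author's papers.
print(reader_h_index([120, 48, 30, 11, 9, 3, 0]))  # -> 5
```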

References

According to Acharya A et al. Rise of the rest: the growing impact of non-elite journals. arXiv, 16 October 2014, articles in non-elite journals (traditionally, those that have not been cited much) have started to be cited more over the last ten years, a trend the authors attribute to Google Scholar.
