Communicating your research online? ImpactStory tells you how well you’re doing.

So you’re now a confirmed Research 2.0 practitioner. When you’re not entering your latest thoughts in your research blog, you’re tweeting them. You take part in wikis, your ResearchGate profile is up to date, your papers are accessible through a self-archiving repository, you use Mendeley or CiteULike, and you’ve started publishing in open access journals.

Congratulations, you’re redefining the way research is communicated! Surely, with so much effort put into communicating your research to a wide audience, your work will have a higher impact. Right? But with only the good old citation count as the standard metric for impact, how do you know how impactful your research really is in the Research 2.0 era?

Measuring the impact of research is not an easy task: several parameters have to be taken into consideration, and alternative metrics would complement existing tools. Image taken from http://altmetrics.org/manifesto/

This question has been asked and discussed quite a bit over the last few years in articles, conferences and workshops. It has become obvious that a new way of measuring research impact should be developed, one that takes into account the new online ecosystem for researchers. These alternative metrics have been named “article-level metrics” or “altmetrics” (alternative metrics).

 

ImpactStory is a perfect illustration of the current effort to develop such new ways of evaluating research impact. The service provides a global view of your research impact, combining both traditional and non-traditional metrics. In addition to standard citation counts, ImpactStory reports how many users have bookmarked your articles in online reference managers, how many times your research has been tweeted about, and how often it has been mentioned in blog posts or on social networks.
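
To make this concrete, here is a minimal sketch of the kind of aggregation such a service performs. The field names, numbers and the summarize function below are hypothetical illustrations, not ImpactStory’s actual API or schema.

    # A minimal, hypothetical sketch of aggregating per-article metrics from
    # several providers. Field names and counts are made up for illustration;
    # this is not ImpactStory's actual schema or API.
    from collections import defaultdict

    # Raw counts as they might come back from different metric providers
    raw_metrics = [
        {"doi": "10.1234/example.1", "source": "crossref", "metric": "citations", "count": 12},
        {"doi": "10.1234/example.1", "source": "mendeley", "metric": "readers", "count": 85},
        {"doi": "10.1234/example.1", "source": "twitter", "metric": "tweets", "count": 23},
        {"doi": "10.1234/example.1", "source": "blogs", "metric": "mentions", "count": 3},
    ]

    def summarize(metrics):
        """Group raw counts by article so traditional and alternative
        metrics can be shown side by side."""
        summary = defaultdict(dict)
        for m in metrics:
            doi, name = m["doi"], m["metric"]
            summary[doi][name] = summary[doi].get(name, 0) + m["count"]
        return dict(summary)

    print(summarize(raw_metrics))
    # {'10.1234/example.1': {'citations': 12, 'readers': 85, 'tweets': 23, 'mentions': 3}}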

This collection of metrics, put together in an intelligent fashion, has the potential to emerge as a real alternative or complement to journal citation counts and impact factors. This type of metric is also more responsive than traditional citation counts and could serve as an early predictor of the citations an article will collect months or years later. It is also a necessary effort: if researchers are asked to share more and to be more open and pedagogical towards the general public, the incentives and rewards must follow, and that requires a true metric of broader impact.
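
As a purely illustrative example of “putting metrics together”, and not ImpactStory’s actual method, a naive composite indicator could look like the weighted sum below; the weights are arbitrary assumptions.

    # Naive, purely illustrative composite indicator. The weights are arbitrary
    # assumptions; ImpactStory does not necessarily combine metrics this way.
    def composite_score(metrics, weights=None):
        """Weighted sum of per-article metric counts."""
        weights = weights or {"citations": 1.0, "readers": 0.2, "tweets": 0.1, "mentions": 0.5}
        return sum(weights.get(name, 0.0) * count for name, count in metrics.items())

    print(composite_score({"citations": 12, "readers": 85, "tweets": 23, "mentions": 3}))
    # 12*1.0 + 85*0.2 + 23*0.1 + 3*0.5 = 32.8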

Of course, we are not there yet, and ImpactStory is an experiment that needs your feedback. As developing methods, with the technologies that come with them, altmetrics have attracted criticism. And indeed, altmetrics in their current form are somewhat flawed: the current methods are far from absolute or quantitative, so any comparison between articles or researchers is premature.

ImpactStory was developed by two academics who study and promote alternative metrics for academic research impact: Heather Piwowar, a postdoctoral fellow at Duke University and the University of British Columbia studying “research data availability and data reuse”, and Jason Priem, a PhD student in information science at the University of North Carolina at Chapel Hill. Jason is credited with coining the term altmetrics and is an author of the altmetrics manifesto.

Other initiatives with a mission similar to ImpactStory’s are also out there. You can check these out:

  • http://article-level-metrics.plos.org/
  • http://altmetric.com/help.php
  • http://sciencecard.org/

 
