Metrics

Since December 2017, Internet Policy Review has been showing Metrics next to all peer-reviewed articles. You can view these by clicking the “Metrics” drop-down to the right of the article text.

The Internet Policy Review team understands that metrics, while generally useful, can also be a contentious issue in the academic community. Therefore, in an effort to be as transparent as possible, this page documents the meaning, setup, and limitations of the journal’s metrics implementation.

What they mean

While providers of metrics often bill them as markers of scholarly impact, Internet Policy Review considers metrics primarily as a means of providing additional context. Spikes in an article’s views may draw attention to its relevance to contemporaneous events or debates. Furthermore, looking beyond the raw numbers – done by clicking on the name of the individual source in the metrics box – reveals, for example, who posted the article or in which contexts it appears online. This kind of information adds a supplementary dimension to reading the text, allowing readers to expand their research with the connections that developed around the article after its publication. From the perspective of the editorial team, the true value of metrics for academia – and for the journal’s wider readership – lies in this aspect.

Limitations

Like virtually all metrics, Internet Policy Review’s are incomplete. Beyond the usual reasons endemic to metrics solutions in general, this particular implementation has some shortcomings due to its newness, its independence from corporate services, and its small scale. The table below lists these limitations.

In the area of citations, this appears especially clear. Unfortunately, in the current landscape of academic publishing, a small number of actors control this data and charge exorbitant fees for its access. This dynamic produces further centralization and consolidation, which means that small and independently-published Open Access journals – and their communities – suffer. While some alternative services are available, they require transferring DOI (Digital Object Identifier) registration, a costly endeavor that, again, limits access.

Thankfully, this is an area that the Open Access community is working on; the I4OC’s Open Citations Corpus may soon address this problem.

How they work

Technical Setup

From a technical standpoint, the metrics setup consists of two main components. The open source, PLoS-developed article-level metrics software Lagotto (version 4.3) aggregates data from online platforms and updates it every 24 hours. The website then displays this data by way of a slightly modified version of the ALM Viz visualisation software (extracted from the Lagotto project).
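For readers curious about how such an aggregation can be queried, the sketch below shows one way a Lagotto instance’s REST API might be polled for a single article’s source counts. It is a minimal illustration only: the host name, API key, and example DOI are placeholders, and the exact endpoint and response fields depend on the Lagotto version deployed.

```python
# Minimal sketch: querying a Lagotto (v5-style) API for one article's metrics.
# Host, API key, and DOI below are placeholders, not the journal's real setup.
import requests

LAGOTTO_HOST = "https://alm.example.org"  # hypothetical Lagotto installation
API_KEY = "YOUR_API_KEY"                  # v5-style endpoints expect an api_key parameter


def fetch_article_metrics(doi):
    """Return the aggregated source counts a Lagotto instance holds for one DOI."""
    response = requests.get(
        LAGOTTO_HOST + "/api/v5/articles",
        params={"ids": doi, "type": "doi", "info": "summary", "api_key": API_KEY},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()


# Example call with a placeholder DOI; v5-style summaries group counts into
# categories such as viewed, saved, discussed, and cited.
metrics = fetch_article_metrics("10.14763/EXAMPLE")
for article in metrics.get("data", []):
    print(article.get("title"), article.get("viewed"), article.get("discussed"))
```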


Page view data comes directly from the journal’s installation of the open source analytics software Piwik and likewise updates every 24 hours.
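As an illustration of how such page view figures might be retrieved, the hedged sketch below calls Piwik’s Reporting API for the statistics of a single article URL. The host, site ID, and authentication token are placeholder values, and the metric names in the response can differ between Piwik versions.

```python
# Minimal sketch: reading page statistics for one URL from a Piwik Reporting API.
# Host, site id, and token are placeholders, not the journal's real configuration.
import requests

PIWIK_HOST = "https://stats.example.org"  # hypothetical Piwik installation
SITE_ID = 1
TOKEN_AUTH = "YOUR_TOKEN"


def fetch_page_stats(page_url, period="year", date="today"):
    """Ask the Piwik Reporting API for action metrics on a single page URL."""
    response = requests.get(
        PIWIK_HOST + "/index.php",
        params={
            "module": "API",
            "method": "Actions.getPageUrl",
            "pageUrl": page_url,
            "idSite": SITE_ID,
            "period": period,
            "date": date,
            "format": "JSON",
            "token_auth": TOKEN_AUTH,
        },
        timeout=30,
    )
    response.raise_for_status()
    return response.json()


# Example call with a placeholder article URL; depending on the Piwik version,
# rows expose counts such as nb_hits (page views) and nb_visits (often read
# as unique page views).
stats = fetch_page_stats("https://policyreview.info/articles/EXAMPLE")
print(stats)
```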


If you are further interested in the technical functioning of the setup – or would like to create a similar constellation for another journal – take a look at Internet Policy Review’s GitHub page.


Sources

At the moment, Internet Policy Review shows metrics compiled from the following services:

Metrics Sources

Source        | Counts                                                          | Limitations
Twitter       | Tweets (including retweets)                                     | Earliest data are from September 2017
Reddit        | Posts of an article URL                                         | Posts of only the DOI are not counted
Wordpress.com | Mentions of an article URL in posts published on Wordpress.com  | Posts of only the DOI are not counted
Wikipedia     | Citations, each language counting separately                    | –
CiteULike     | Saves                                                           | –
Piwik         | Unique page views                                               | Data only reaches back to late June 2016; data from late November 2016 to late January 2017 are incomplete

Internet Policy Review is working to fix some of these issues in the coming months. The journal’s metrics implementation is an ongoing project, subject to gradual refinement, and this page will be updated accordingly.

Feedback & contact

In the last few years, the altmetrics movement has sparked a conversation around the obscure and often unfair processes used to judge academic impact, such as the Journal Impact Factor. On the one hand, the emergence of alternative forms of measurement and quantification, enabled by the digitisation of academia and social networking, has led to a useful diversification of measurement practices. On the other hand, implementations of altmetrics have brought their own obscurity and possibilities of manipulation, and place additional pressure on researchers to market their work on social media – especially as these metrics influence hiring and promotion practices.


In this context, the Internet Policy Review team hopes that its metrics implementation benefits everyone in the journal’s community. If you have questions, concerns, or feedback, please write to editor@hiig.de.