Tag: h-index
Time for an update to a previous post. For the past few years, I have been using an automated process to track citations to my lab’s work on Google Scholar (details of how to set this up are at the end of this post). Due to the nature of how Google Scholar tracks citations, it […]
Rollercoaster III: yet more on Google Scholar
In a previous post I made a little R script to crunch Google Scholar data for a given scientist. The graphics were done in base R and looked a bit ropey. I thought I’d give the code a spring clean – it’s available here. The script is called ggScholar.R (rather than gScholar.R). Feel free to […]
Rollercoaster II: more on Google Scholar citations
I’ve previously written about Google Scholar: its usefulness and its instability. I just read a post by Jon Tennant on how to harvest Google Scholar data in R, and I thought I would use his code as the basis to generate some nice plots from Google Scholar data. A script for R is below […]
You Know My Name (Look Up The Number)
What is your h-index on Twitter? This thought crossed my mind yesterday when I saw a tweet that was tagged #academicinsults It occurred to me that a Twitter account is a kind of micro-publishing platform. So what would “publication metrics” look like for Twitter? Twitter makes analytics available, so they can easily be crunched. The main […]
Strange Things – update
My post on the strange data underlying the new impact factor for eLife was read by many people. Thanks for the interest and for the comments and discussion that followed. I thought I should follow up on some of the issues raised in the post. To recap: eLife received a 2013 Impact Factor despite only publishing […]
Strange Things
I noticed something strange about the 2013 Impact Factor data for eLife. Before I get onto the problem, I feel I need to point out that I dislike Impact Factors and think that their influence on science is corrosive. I am a DORA signatory and I try to uphold those principles. I admit that, in the […]
“Yeah” Is What We Had
When it comes to measuring the impact of our science, citations are pretty much all we have. Not only that, but they say only one thing – yeah – with no context. How can we enrich citation data? Much has been written about how and why and whether or not we should use metrics for […]
Sure To Fall
What does the life cycle of a scientific paper look like? It stands to reason that after a paper is published, people download and read the paper and then if it generates sufficient interest, it will begin to be cited. At some point these citations will peak and the interest will die away as the work […]
Blast Off!
This post is about metrics and specifically the H-index. It will probably be the first of several on this topic. I was re-reading a blog post by Alex Bateman on his affection for the H-index as a tool for evaluating up-and-coming scientists. He describes Jorge Hirsch’s H-index, its limitations and its utility quite nicely, so I […]
Give, Give, Give Me More, More, More
A recent opinion piece published in eLife bemoaned the way that citations are used to judge academics, because we are not even certain of the veracity of this information. The main complaint was that Google Scholar – a service that aggregates citations to articles using a computer program – may be less than reliable. There are three main sources […]