
Sanford-Burnham Medical Research Institute is a non-profit medical research institute with locations in La Jolla, California, and in Florida. Sanford-Burnham is one of the seven National Cancer Institute (NCI)-designated basic research cancer centers in the United States. It has over 80 full-time faculty, each with their own lab, and over 900 scientific staff members working on research. In the last five years it has received over 170 NIH grants, representing over $90 million.

Jennifer Vigil, Librarian at Sanford-Burnham, said that the opportunity to trial PlumX™ came at the same time that many people were asking questions about their grants, journal citation reports, and metrics about their publications. According to Vigil, they seized the trial opportunity because it would give them access to metrics that were not available through any of their traditional tools.

Since implementing PlumX, Vigil reports having visibility into new ways people are interacting with their research, for example Facebook Likes & Shares and articles captured in Mendeley. Below is an article from Sanford-Burnham's PlumX that illustrates what Vigil is talking about. This article was only published in 2014, so it has not had time to gather citations. Yet, with PlumX, you can tell that there is already usage and social media attention, including 48 people who have captured this article in Mendeley. These captures can be good indicators of future citations.


Sanford-Burnham is organized around fourteen programs, so they set up their PlumX dashboard by program to see the metrics for each one. Some example programs are:

  • Bioinformatics and Structural Biology
  • Cardiovascular Pathobiology
  • Cell Death & Survival Networks
  • etc.

Within each program there are themes, so results can be narrowed down even further. For example, within the Bioinformatics and Structural Biology program there are two themes: Structural Bioinformatics/Systems Biology and System Biology of Microbes & Microbiome. They can see metrics for the program as a whole, for each of these themes, and for the researchers and their research output within the program.

Below is the first page of the Sanford-Burnham PlumX dashboard where you can see a list of programs. This page also displays the research output and their metrics for the entire organization.


"It’s been fun," says Vigil about implementing PlumX, dealing with the structure and the faculty, and seeing metrics emerge as more and more PMIDs were put into the system.

Taylor & Francis recently published their annual Open Access Survey. This is a survey they conducted of the authors they published during 2012. 


What interested us about the survey were the responses concerning usage statistics.


Sixty percent of the respondents said that usage and download statistics will become important for assessing the value of research over the next ten years; only 12% said that they would not be important. While citations, at 81%, still rank as the most important way of assessing the value of research in the coming decade, it is clear that more and more authors are seeing the importance of the new ways people interact with and use research.

At Plum™ Analytics, we see this survey as validation of our view of altmetrics, or rather ALLmetrics, since the definition of altmetrics does not include usage, and most altmetrics providers do not track it.

When we developed PlumX™, there was growing attention to the role social media was playing in scholarly communications. We knew that Twitter, Facebook, and other platforms were becoming a big part of research promotion, and consequently important to measure. But we also knew that there were many other metrics that matter for understanding the impact of research. We set out to gather as many of those metrics as we could and to categorize them in a meaningful way for clearer understanding. We wrote about this categorization in a previous blog post.

One of those categories is Usage.

PlumX categorizes a lot of activity as Usage, including Downloads, Views, Holdings, and Video Plays.

Recently, we added usage statistics from EBSCO Databases, EBSCO eBooks, and EBSCO Discovery Service. You can read about this in more detail on our blog. By including this breadth of usage across publishers, PlumX gives you a good proxy for the usage of articles and other research output.

Plum Analytics also makes it easy to show usage to your authors in your open access repositories.

If you embed the Plum™ Print widget into your repository, you can show your authors usage and other altmetric information. You can read more about the Plum Print in this blog post.

We can also include the usage statistics of the repository itself. That is exactly what some of our customers are doing. See this blog for more details and see this example below.


We are excited that we can help authors of Open Access articles assess the value of their research.

One of the things we looked forward to when considering our acquisition by EBSCO was more collaboration between our companies. So, it is exciting to announce that PlumX™ now includes usage metrics for articles and books from EBSCO Databases, EBSCO Discovery Service, and EBSCO eBooks.


This marks the first time that the wealth of information about actual usage per article or book, such as abstract views and downloads, can be measured across publishers. While we recognize that this does not represent all usage, it is a good proxy, and it gives you more usable information about articles and books.

One of our customers, Alain Dussert, Director of Library Services at Pacifica Graduate Institute, liked what he saw and commented, 

Wow, bringing in EBSCO usage stats is an impressive development. The benefits of being able to see EBSCO ‘hits’ by author is really going to bring tremendous amounts of relevant scholarly data to PlumX. That is a big, and I mean, big, development. I don’t think anyone has ever done that before, to be honest.

Below is an image of a PlumX Article page. You can also view it live here.

You can see that it includes EBSCO usage. Prior to adding this usage, this article indicated seven Mendeley readers. The Mendeley captures are an important statistic and indicate good interaction with this article. Now, with the addition of the EBSCO usage, you get a fuller picture of the level of interest in this article. 


Here is a closer view of the article level metrics:


Below is a list of the specific usage metrics we get from EBSCO Databases, EBSCO Discovery Service and EBSCO eBooks.

  • Abstract Views - a view of the abstract
  • HTML Views - a view of the HTML version
  • PDF Views - a view of the PDF version
  • ePub Downloads - a download of an ebook in ePub format
  • Clicks - a click to request the item through EBSCO Smart Links or via Custom Links (such as ILL, OPAC, OpenURL, etc.) when the full text is not part of EBSCO
  • Plays - a play of audiobook or video content
  • Supporting Data Views - a view of supporting data, figures, or related images
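As a rough illustration of how per-item usage events like these could be rolled up into the counts shown on an article page (the event names below mirror the list above, but the data structure is hypothetical, not the actual PlumX schema):

```python
from collections import Counter

# Hypothetical stream of raw usage events for a single article.
events = [
    "Abstract Views", "Abstract Views", "PDF Views",
    "HTML Views", "PDF Views", "Clicks", "Abstract Views",
]

def summarize_usage(events):
    """Tally usage events by type and compute an overall total."""
    counts = Counter(events)
    return {"by_type": dict(counts), "total": sum(counts.values())}

summary = summarize_usage(events)
# summary["total"] is 7; summary["by_type"]["Abstract Views"] is 3
```

The per-type breakdown is kept alongside the total, since (as discussed elsewhere on this blog) collapsing everything into one number would hide which kinds of interaction are driving it.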

Here is an article page with metrics in all five categories (Usage, Captures, Mentions, Social Media, and Citations), including EBSCO Usage. See it live here.


You can read more in our recent press release here.

It is our goal to give as complete a picture of research output - from articles and books to videos and web pages, and everything in between - as we can. We are very interested in working with publishers and others directly to include more usage data. Please email us if you are interested in discussing this further.

Autism Speaks is set to use PlumX™ to track the impact of the research they fund.

Autism Speaks is the world’s leading autism science and advocacy organization, dedicated to funding research into the causes, prevention, treatments and a cure for autism; increasing awareness of autism spectrum disorders; and advocating for the needs of individuals with autism and their families. 


We are thrilled that this organization, which has spent over $200 million to fund autism research since its founding, is using PlumX to track and evaluate this research.

Autism Speaks will use PlumX to track the metrics about the research outputs that their funding has helped create. Specifically, using PlumX, Autism Speaks will gather and analyze metrics at many levels and views. These include:

  • Grant ID
  • Researcher
  • Institution
  • Research Topic
  • Grant Type
  • Geography

By doing this, they will have powerful and actionable information about the research they fund. For the first time Autism Speaks will be able to tell the stories of impact behind what they are doing and have a better idea about the ROI of their funding. Furthermore, by opening this data, donors will see the impact of the funds they’ve donated, researchers can get credit for what they’ve done, and mentees will gain visibility.

We are really excited about working with Autism Speaks to bring new metrics to their research grants, metrics that can benefit the whole research community by enhancing the way we look at research impact and scholarly communication.

To read the press release go here.

The key to quickly navigating complex data sets is to turn them into elegant, simple visualizations that do the hard work for you. We think our new Plum™ Print does exactly that. With one glance, it's an easy way to see the relative impact of each of the five categories of metrics we track - usage, captures, mentions, social media, and citations (see this post for more information on these categories). A simple mouseover on the Plum Print quickly shows the metrics behind the visualization.


We have always included data visualizations as part of PlumX™. In fact, one of our first data visualizations was the sunburst used to visualize a researcher's output, which we wrote about in a previous post.

The Plum Print is a natural evolution of that effort, designed to work at the artifact level, and carries forward the work we’ve done categorizing metrics. You can now see at a glance the impact of a given article, presentation, or any other research artifact.

In developing the Plum Print we learned that boiling down all of the metric information to a single number was like boiling down a home to just its purchase price. A $500,000 urban Victorian built in 1880 is different from the $500,000 brand new suburban home on a cul-de-sac. Sure, both are worth the same, but that number does not even come close to telling the stories of both homes. 

The Plum Print visually changes based upon the metrics in each category. Each circle of the Plum Print represents one of the five categories of metrics - Usage, Captures, Mentions, Social Media, or Citations. The size of each circle lets you quickly gauge the relative number of metrics, and indeed whether there are any metrics at all in a given category.
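As an illustration only (the actual Plum Print sizing rules are not described in this post), one plausible way to turn counts of very different magnitudes into comparable circle sizes is a logarithmic scale, with an empty category drawn at zero size so its absence stays visible:

```python
import math

# Hypothetical metric counts for one artifact, one per category.
counts = {"Usage": 1200, "Captures": 48, "Mentions": 3,
          "Social Media": 15, "Citations": 0}

def circle_radius(count, base_px=4, scale_px=6):
    """Map a raw count to a radius in pixels; zero counts stay size zero."""
    if count <= 0:
        return 0
    return base_px + scale_px * math.log10(count + 1)

radii = {cat: round(circle_radius(n), 1) for cat, n in counts.items()}
```

With a log scale, 1,200 downloads does not drown out 48 captures; the circles stay within the same visual range while still preserving their ordering.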

In addition to creating the Plum Print, we updated our artifact page to make it easier to see impact. Here is an example of the new artifact page.


You still have the valuable and necessary information about the artifact, e.g. title, authors, journal, etc. Now, the metrics are easier to see and the impact is easier to grasp.

To see the evolution of PlumX for yourself, visit the demonstration site. If you’d like to see your own researchers in PlumX, write us at to request a trial.

To read our press release go here.

As a self-professed data junkie, I always thoroughly enjoy the yearly Internet Trends presentation from Mary Meeker from Kleiner Perkins Caufield & Byers.

One slide that jumped out talks about the very small percentage of data that is “tagged” and the even smaller amount of data that is “analyzed” in the Digital Universe.


(See all the slides here.)

These percentages are consistent with what we have seen at Plum Analytics when looking at the data exhaust surrounding scholarly research output. Although we cannot yet gather metrics for ALL output, we are increasingly able to quantify the engagement around an ever-growing set.

Working with our partners and customers, we continue to make progress on tagging and building analytics and dashboards for curated subsets of the world’s scholarly data. 

Drilling into the statistic that only 1% of the data in the digital universe is analyzed, it is illustrative to think about just how fast this universe of data is expanding. Below is a real-time infographic that shows this expansion well.


Note that the above image is just a sample of the data at; it is worth clicking through to see the live infographic.

At Plum Analytics, we’re excited to think about the opportunities of harvesting the firehose of data surrounding scholarly output, and utilizing it to tell the great stories of the impact that researchers are having.

We’re excited to announce that PlumX is now Shibboleth-enabled. Institutions and large enterprises can now authenticate to PlumX the way they are used to for other subscription services. In addition, Shibboleth-enabled authentication provides the identification necessary to safeguard researchers’ profiles. By using this authentication method, PlumX knows who is using it and allows researchers to update their own profiles, but not others’. Likewise, other authorized users, such as a liaison librarian, can update specific researcher profiles while others are prevented from doing so.

From the beginning, we wanted to build products that met the needs of institutions. Supporting Shibboleth in PlumX is one of the ways we are fulfilling this goal.

Click here to see the press release.


In my 11th grade biology class, my lab partner and I dissected a fetal pig. I remember the aha moment when I realized what the diaphragm was. I had taken clarinet lessons for many years, and my teacher would always tell me to “breathe more with your belly, fill up your diaphragm.” I never quite understood what he was talking about until, right before my eyes, I could see that thin layer of muscle that ran across the abdomen of the animal. And then, click, I understood.

With the advent of 3D printing, it’s amazing to think of all of the other things we can now see, and understand, in the blink of an eye.  For example, I saw this tweet recently:


With the work that we are doing at Plum Analytics to visualize the impact of research, we are faced with a similar challenge. How do we push forward the bounds of what is visible? We sometimes compare the difference between what we measure with PlumX and what altmetrics measure to visible light versus the full electromagnetic spectrum.



For centuries, all that people could see and measure was the visible light spectrum. As instrumentation progressed and better measurement tools were developed, we became able to measure more.

For decades, the only tools instrumented to measure the impact of research came from a time when journals were printed. Measuring the quality of the journal an article was printed in, and the number of times other journal articles cited that work, was all you could see and measure.

Now, with researchers and lay people interacting with research online, a whole new era of measurement is possible. We can discover who is interacting with the work, how much, across what channels, and more. This enables us to better tell the stories behind the research being performed.

We love hearing your stories of how having access to this sort of data has enabled you to see what you could only imagine.  Please drop us a line at

When we started working with all of the metrics we could gather from the data exhaust created when people interact with research, we quickly realized three things:

  1. Not all metrics are created equal; a download is not the same as a tweet.
  2. Synthesizing all of the metric data into a single number dilutes the meaning.
  3. Categorizing the metrics into buckets gives you useful information.

For example, we have seen that people “capturing” work to save it for later is often an early indicator of later citations. Since citation counts lag, this is a great way to find work that other researchers are finding valuable. But we don’t want to “bury” those captures inside some grand number - you would lose this valuable information.

After a lot of experimentation and working with early customers, we categorized metrics into these useful categories:


Here is a list of examples of what we put into each category:

  • Usage - downloads, views, book holdings
  • Captures - favorites, bookmarks, saves, readers, groups, watchers
  • Mentions - blog posts, news stories, Wikipedia articles, comments, reviews
  • Social Media - tweets, +1’s, likes, shares
  • Citations - PubMed, Scopus, patents
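A minimal sketch of this bucketing idea follows. The source names and mapping here are illustrative only, drawn from the examples above, and are not the actual PlumX configuration:

```python
# Map individual metric sources to the five categories.
CATEGORY_OF = {
    "downloads": "Usage", "views": "Usage", "holdings": "Usage",
    "bookmarks": "Captures", "readers": "Captures",
    "blog_posts": "Mentions", "wikipedia": "Mentions",
    "tweets": "Social Media", "likes": "Social Media",
    "scopus_citations": "Citations",
}

def bucket_metrics(raw):
    """Group raw per-source counts into per-category totals."""
    totals = {}
    for source, count in raw.items():
        category = CATEGORY_OF.get(source, "Uncategorized")
        totals[category] = totals.get(category, 0) + count
    return totals

example = bucket_metrics({"downloads": 120, "views": 300,
                          "readers": 48, "tweets": 12})
# example -> {"Usage": 420, "Captures": 48, "Social Media": 12}
```

Keeping the totals per category, rather than summing everything into one number, preserves the distinction the list above draws: a download and a tweet contribute to different buckets, and each bucket tells its own story.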

You can see PlumX in action and see more on how these categories work with real research at the PlumX Demo Site.


Mike Buschman is in the UK for a few weeks attending and presenting at conferences. Here is where he will be.


Mike will be in the EBSCO booth at the London Book Fair, 8-10 April. If you are attending, please stop by to see Mike and the latest from PlumX, our research impact dashboard.


Mike will also be presenting at UKSG, taking place from 14-16 April at the Harrogate International Centre. The information for his presentations is:

  • Altmetrics in Practice 
  • Tuesday 15 April at 14.00
  • Wednesday 16 April at 11.00
  • Room 3 in the Queen’s Suite

The abstract:

Citation counts have long been the standard measure of academic research usage and impact. Specifically, published articles in prominent journals citing other published articles in other prominent journals equate to prestige and tenure. Metrics can now be harvested and applied to research around usage, captures, mentions, and social media, in addition to citations, giving a much more comprehensive and holistic view of impact. These new metrics are also much more timely than citation metrics and can keep pace with new formats much faster than the entrenched, legacy practices. The session will highlight some practical ways institutions are using these new metrics today and what the future holds.

If you are attending UKSG, we hope you can go to one of Mike’s presentations. If you are there, please say hi to him.