Paul Meehan is an integral part of the Journal Usage Statistics Portal (JUSP) team. He is responsible for all areas of the service, including data processing, database administration, web development, support and training. He works at the 'coal face' of COUNTER usage data, so we were delighted when he agreed to be interviewed for our COUNTER blog.
Paul explained that JUSP was established at Jisc in 2010 with the intention of being a "one-stop shop" where libraries could log in and access their COUNTER journal usage statistics from a range of vendors. "Over time JUSP has grown to encompass book, database, platform and other statistics, and is now used by close to 300 institutions in the UK and overseas; these include FE and HE sites, Research Councils, non-academic sites and others. We currently collect data on a monthly basis from approximately 90 publishers and suppliers."
Paul has worked on JUSP since early 2010, effectively since the service moved into live production. His role is very varied, though currently his main focus is on the transition to COUNTER Release 5 (R5) and the testing, validation and gathering of R5 stats from vendors who have made the transition. He also continues to collect Release 4 (R4) data from the remaining vendors who haven't yet made the switch. In total this equates to around 40,000 reports per month, all of which go through numerous quality control checks, both manual and automated. A large part of this work is liaising with publishers and institutions when issues arise; these can range from account errors, to problems with the report metadata, to more systematic or technical issues on the vendor side.
Paul also plays a key role in user support and training, webinars and database administration, maintains the website, and works as part of a small team developing new JUSP functionality and services on an ongoing basis. Paul says, "It's rare for my job to be the same two days in a row! I also work on other Jisc services or projects as required and have been part of Jisc and one of its predecessors, Mimas, since 1998."
The change to Release 5
"R5 has presented a major change both in the way that data are presented and in the metrics contained therein," explained Paul. "At its core, Release 5 data is presented through a number of master reports, each of which can be 'sliced and diced' to produce standard views of the data, as well as affording the opportunity for the development of custom reports that services such as JUSP can offer. This differs from Release 4, where the previous standard only offered reports at a more granular level, such as the JR1 report for journals or the BR2 for books, whereas the Title Master Report (TR) in Release 5 contains both book and journal data. Effectively there are fewer, but larger and more featured, reports to collect."
“In terms of metrics, Release 5 is quite different, and it would take a lot of words to describe all the differences. Thankfully both COUNTER and JUSP have provided lots of documentation to assist in understanding the range of metrics included, and these can be found on the relevant websites e.g. https://jusp.jisc.ac.uk/guides/r5-overview.html and the excellent “Friendly Guides” at https://www.projectcounter.org/friendly-guides-release-5/.”
Paul went on to explain how this change opens up new opportunities for librarians when analysing their usage data. "COUNTER designed R5 to offer librarians more powerful, yet simplified, metrics to better enable them to make use of the usage statistics; these inform purchasing, renewal and subscription decisions. Such decisions account for a huge part of a library's budget, so the ability to view comparable metrics across multiple vendors in a consistent and coherent way is extremely important. Release 5 also distinguishes between 'unique' and 'total' item and title requests and investigations, giving librarians and users more powerful ways to analyse and understand how their users or customers are making use of their subscribed publications. Release 5 also provides standard views of metrics such as accesses denied, and reports that either include or exclude Gold Open Access usage. The reports have very much been designed to give libraries in particular an easier way to answer frequent questions about usage, turnaways or access denials, and use of paid versus open access content."
Challenges with compliance
This sounds exciting, but I also wanted to know about the challenges that come with this change. Paul was clear: "Currently the biggest challenge is in working with publishers and suppliers to provide data compliant with the standard." He went on to explain, "There have been a lot of challenges for vendors in adopting the new standard, though it's also true that the standard itself has been subject to updates and changes since its initial release. From a JUSP perspective, we can't work with data until the reports are compliant and pass all our checks, so we have done a lot of work with publishers to help them work through issues in their reports. As of late January, around half of the publishers we work with in JUSP are supplying Release 5 data, though this is still subject to data restatements and some issues we have recently spotted. With that said, we are continuing to add Release 4 data into JUSP while vendors make their R5 data compliant. The other main challenge 'in the real world' is the volume of data collected every month, and ensuring it is all processed, checked and added to JUSP in a timely manner!"
Paul said, "We recognise that adopting a new standard has its challenges. One of the biggest ways that vendors could make the process easier from our perspective is to fully validate their reports before making them live. There have been many examples where reports have been made live, but we have discovered critical errors when coming to add them to JUSP. COUNTER's Validation Tool at https://stats.redi-bw.de/counter-r5-validation-preview/ is a well-featured tool that picks up a wide range of issues. If publishers made regular use of this tool to validate reports, both before making data live and on an ongoing, monthly basis to pick up any problems that might appear post-release, it would be an enormous help. It's impossible for a service like JUSP to manually run tens of thousands of reports through an online tool each month, so any help from the publisher side would be much appreciated and would save us a lot of work in liaising with publishers over errors they could discover themselves!"
I asked Paul whether, once JUSP has the COUNTER data, they are seeing any errors. "Yes. This is probably unavoidable, given that issues can still crop up even in R4 reports from vendors we have been harvesting from for a decade! The R5 reports have provided a wide array of challenges, though this of course is because there have been changes to the standard in 2019, and it's inevitable that there will be problems throughout this transition period. The Validation Tool, mentioned previously, has been a great help to us in picking up errors introduced into reports retrospectively, and use of it by vendors would eliminate many of the issues we encounter. Errors can be anything from missing metrics, to totals not making sense, to certain fields remaining unpopulated. When we run into these issues, we work with the vendors to try and resolve them before we pass the data into our live system for users and librarians to work with. We are used to restating data in JUSP!"
Finally, I asked Paul if there are any steps publishers and vendors can take to overcome some of these challenges. "Yes," he said. "As previously explained, use of the Validation Tool would solve a huge percentage of these issues; it would not only help JUSP but also save a lot of time in back-and-forth communication between ourselves, the publishers, and the librarians who may also spot issues."