Category: Events

UK ORCID members meeting and launch of Jisc ORCID consortium at Imperial College London, 28th September 2015

On Monday 28th September representatives of over 50 UK universities, ORCID, Jisc, GuildHE, RCUK and CRIS vendors met at Imperial College London for the first UK ORCID members meeting, and to launch the Jisc ORCID consortium. ORCID provides a persistent identifier that links researchers to their professional activities and outputs – throughout their career, even if they change name or employer. The unique iD ensures that authors receive credit for their work and allows institutions to automate information exchange with other organisations such as funders, thereby increasing data quality, saving academics time and institutions money.
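
For the technically minded: an ORCID iD is a 16-character identifier whose final character is an ISO 7064 MOD 11-2 check digit, so an iD can be sanity-checked locally before it is stored in an institutional system. A minimal sketch in Python, using the sample iD from ORCID's own documentation:

```python
def orcid_checksum_ok(orcid: str) -> bool:
    """Validate the ISO 7064 MOD 11-2 check digit of an ORCID iD."""
    digits = orcid.replace("-", "")
    if len(digits) != 16:
        return False
    total = 0
    for ch in digits[:-1]:
        total = (total + int(ch)) * 2
    remainder = total % 11
    result = (12 - remainder) % 11
    # A result of 10 is written as the letter X in the final position
    check = "X" if result == 10 else str(result)
    return digits[-1] == check

# Sample iD from the ORCID documentation
print(orcid_checksum_ok("0000-0002-1825-0097"))  # True
```

Note that this only confirms the string is well-formed; whether the iD actually belongs to a given researcher still has to be established through the authenticated ORCID sign-in flow.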

In 2014, Imperial College London was one of the first universities in the UK to make ORCID available to researchers, working with the Jisc-ARMA-ORCID pilot. We have since actively engaged with ORCID and the community to increase uptake and improve systems integration. The UK ORCID meeting was designed to bring together different strands of these discussions, and to facilitate a broad discussion about the next steps for ORCID in the UK. Following the pilot programme, Jisc has negotiated an ORCID consortium through which universities can benefit from premium ORCID membership at significantly reduced cost. The meeting was the official launch event for the consortium. Over the last two years ORCID, a relatively new initiative, has gained a lot of momentum, not just in the UK:

  • over 1.65m researchers registered globally
  • ORCID iDs associated with over 4.3m DOIs
  • over 300 member organisations
  • 3 national consortia agreements signed (Italy, UK and Denmark) with more in progress

In 2011, Jisc set up a “researcher identifier” task and finish group that included funders, libraries, IT directors, research managers and organisations like HESA. This group eventually recommended ORCID as a solution for the UK. Since then, ORCID has seen increasing support from research organisations and funders. Recently, both the Wellcome Trust and NIHR have mandated the use of ORCID for grant applications. RCUK’s Overview of Systems Interoperability Project resulted in a strong endorsement for ORCID, as did HEFCE’s Report of the Independent Review of the Role of Metrics in Research Assessment and Management.

Neil Jacobs from Jisc speaking at UK ORCID members meeting

The UK ORCID meeting was not primarily about funders and their mandates, though; it was a discussion between the ORCID member organisations and the Jisc consortium about how we as a community want to move forward. Specifically, the meeting had four aims:

  • to raise awareness and understanding of ORCID and the Jisc consortium offer and benefits
  • to bring together the UK ORCID community and establish how we want to work together
  • to discuss community expectations for system and platform providers, funders and publishers
  • to inform the Jisc technical and community support offering

Audience at UK ORCID members meeting

The aim of the morning session was to raise awareness and create a shared understanding of ORCID. It started with presentations from ORCID and Jisc, followed by four university case studies from the pilot programme (Kent, Imperial, Oxford and York) and a Q&A panel. After lunch we discussed community requirements, and ways to work together to achieve these. Four thematic areas were discussed in breakout groups, organised around a community document in which participants, and others who could not attend in person, had listed their issues and expectations in advance of the meeting. This approach helped focus the discussions and led to a broad agreement on key issues.

Below is my summary of the key community requirements:

CRIS and repository platforms:

  • actively prompt users to link their ORCID iD
  • facilitate iD creation by pre-populating ORCID profiles with institutional affiliation and other relevant information
  • harvest metadata for outputs associated with an iD from other systems
  • allow users to push output metadata into the ORCID registry
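
As a sketch of the kind of harvesting envisaged above, output metadata attached to an iD can be read from ORCID's public API without authentication. The version path and response handling below follow the current public API documentation and are my assumptions for illustration, not part of the requirements agreed at the meeting:

```python
import json
import urllib.request

PUBLIC_API = "https://pub.orcid.org/v3.0"

def works_url(orcid_id: str) -> str:
    """Build the public-API URL for the works attached to an iD."""
    return f"{PUBLIC_API}/{orcid_id}/works"

def fetch_works(orcid_id: str) -> dict:
    """Fetch the public works summary for an iD as JSON (requires network access)."""
    req = urllib.request.Request(
        works_url(orcid_id),
        headers={"Accept": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Sample iD from the ORCID documentation
print(works_url("0000-0002-1825-0097"))
```

A CRIS would typically call something like `fetch_works` on a schedule, deduplicate against its own records, and offer new items to the researcher for confirmation.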

Publishers:

  • collect ORCID iDs for all authors, not just the corresponding author
  • make iDs of all authors available with output metadata
  • mint DOIs on acceptance and link to authors’ iDs
  • make the author accepted manuscript available on acceptance, with an iD

Funders:

  • fully integrate ORCID into their workflows and systems
  • move towards mandating ORCID

This is only a high-level summary of a much richer discussion. Some of the detail that I have conveniently skipped over will no doubt lead to further discussions later, but I found it remarkable how broad the consensus was – across more than 50 universities with very different approaches, requirements and cultures. There is still a lot of work to be done before we can reap all of the benefits that ORCID can enable, but the members meeting showed that universities are keen to work with Jisc and ORCID to make progress.

Universities across the UK are now actively considering how to roll out ORCID, and there was much interest in lessons learned and emerging best practice. A UK ORCID mailing list is currently being set up, and Jisc and ORCID are looking into ways to capture and share information through the new consortium. Jisc is currently hiring staff to support the consortium and help members implement ORCID. I am looking forward to follow-on discussions with Jisc, ORCID and the community about the next steps.

 

Presentations (in order of appearance):

 

UKSG – Untying the knots and joining the dots – 20th November 2014

This year’s UKSG one-day conference focused on how researchers are being supported in the changing scholarly communications landscape. The day brought together academics, librarians, publishers and funders to discuss how we can work together to meet open access requirements as painlessly as possible. What follows is a summary of the event; the whole day was filmed, so you can catch up on the talks at the UKSG website.


The day began with Ben Johnson from HEFCE, who told the story of how open access came to the attention of the UK government when David Willetts was unable to access the journal articles required to write his book. From Willetts to the Finch Report to the new REF policy, universities are now being pushed into action to ensure publications are made open access and the impact of research is demonstrated. HEFCE and other UK funders are making it clear that if research is to have an impact on policy, people within government need access to it.

Simon Hubbard from the University of Manchester spoke next about the complicated process of making a paper open access, reporting on research to your funder and storing your research data in the appropriate place. Even for a researcher with an active interest in open access publishing, the burden of bureaucracy can be off-putting, especially when it feels like entering the same information over and over again into different systems. Finally, Simon had a few recommendations to improve the open access workflow: remove academics from the process as they only slow things down; better and more unified systems; and a simpler message from funders and publishers.

A final highlight of the morning came from Ian Carter at the University of Sussex, who spoke from the perspective of university management and strategic planning. Ian started by summarising the pressures that researchers find themselves under, from conducting “world-class” research, to providing value for money to students paying much higher fees than ever before, to compliance with varying funder policies. To achieve all of this there must be behavioural change from researchers, for example making their work more accessible through open access, and additional support from institutions to ensure these goals align with their overall strategy. Dissemination, communication and impact were identified as some of the most important aims for both researchers and institutions.

The second half of the day opened with the librarian’s perspective from Martin Wolf at the University of Liverpool; he believes librarians have a better understanding of the overall picture and of how the different stakeholders interact. Librarians often find themselves interpreting both funders’ policies and publishers’ open access options for researchers. However, in addition to this advocacy work, librarians seem to be getting increasingly stuck on detail, for example the minutiae of a publisher’s copyright policy, and can be too risk averse when it comes to promoting open access. Comments from publishers after this session implied that early career researchers are still asking very basic questions about open access, so there is a lot of work to be done.

The last few sessions were lightning talks from providers of altmetrics tools: Digital Science, Kudos and Plum Analytics. These are just three of the many new products designed to capitalise on the impact agenda, all aiming to help researchers increase and measure the impact of their publications.

Overall, the day was very useful and demonstrated the various perspectives on research and publication, including changing expectations from all stakeholders involved in the process. It’s clear that while the post-REF2014 policy has been a disruptive force, change was already beginning in the areas of open access, alternative metrics and demonstrating the impact of research.

You can find a summary of Tweets from the day here, collected by Ann Brew, our Maths and Physics librarian.

 

Lucy Lambe
Ann Brew
Philippa Hatch
Michael Gainsford

Open Access Button

Last night saw the launch of the Open Access Button to coincide with worldwide Open Access week. The team behind the Open Access Button aim to help researchers, students and the general public access research papers that are behind paywalls and beyond their means.

The idea came from two medical students who were frustrated at not being able to access all the research they wanted to read, having found that the average cost to read a paywalled article was $30. Although the team has expanded to include partnerships with Cottage Labs, Jisc and more, a large number of students still donate their time to the project. Work began on the Button last year with a beta project that saw 5,000 people record almost 10,000 encounters with paywalls or denied access.

The new version of the Open Access Button is a plug-in for your browser that works as a button you click any time you cannot access an article due to a paywall. The system registers information about the article and your location to create a map of researchers who need access to information.

Open Access Button Paywall Map
Image credit: Open Access Button CC-BY-SA

The Open Access Button will try to find a free-to-access version of the article, for example a pre-print deposited in an institutional or subject repository. If an alternative version cannot be found, the Button will email the author to let them know that someone wants to access their research but cannot, and suggest they deposit a copy in a repository.

Upon clicking the button, users are asked to enter a few sentences about why they want to read the article and what they could do if the research was available open access. The creators hope to use this information for open access advocacy, and to create stories that connect researchers, their work and readers around the world.

Keep up to date with the project on Twitter @OA_Button

1:AM London Altmetrics Conference 25-26 September 2014

Held at the Wellcome Collection in London and organised by Altmetric.com and the Wellcome Trust, this was the very first conference to focus solely on alternative metrics and their use by funders, universities and researchers.

The first day began with an introduction from seven different altmetrics providers to their products. Although similar, each one does something slightly different in how it measures and presents its metrics.

Below is a summary of the event, with a more comprehensive blog available from the organisers here.

Altmetrics, by AJ Cann (https://www.flickr.com/photos/ajc1/6795008004). Licensed CC BY-SA 2.0

How are people using altmetrics now?

During this session we heard from a range of stakeholders, including representatives from the Jisc funded project IRUS, a university-publisher collaborative project, and an academic who studies altmetrics as part of his research.

IRUS is using article level metrics to answer the question: are people using university repositories? The answer is yes, and IRUS can help repository managers to benchmark their repository contents and use. IRUS allows an institution to check the quality of its metadata, and also provides COUNTER compliant statistics that can be trusted.

Snowball Metrics is a university-driven and Elsevier-facilitated project that has produced a number of “recipes” designed to help universities use altmetrics for benchmarking. This takes metrics beyond the individual paper or researcher, and allows the university to assess a department as a whole. However, altmetrics alone are not good enough to judge scholarly quality.

Finally Mike Thelwall, based at the University of Wolverhampton, presented his research group’s findings. Mike has been investigating how altmetrics relate to citation scores and overall has found a positive but weak correlation. Twitter seems to lead to more publicity for a paper, but doesn’t necessarily lead to more citations; Mendeley’s read count has a much stronger correlation with citations.

What’s going on in the communication of research?

This session gave us a great opportunity to hear from two active researchers on how they communicate their research to an academic audience and beyond. What was apparent was that Renée Hlozek, a postdoctoral researcher, had a lot more time to spend not only on actual research, but also on creative ways to communicate her research to a wider audience. For example, she is active on Twitter, blogs and is a current TED Senior Fellow.

As a professor, Bjorn Brembs spends more time on teaching and university administration. This means he struggles to find time to promote his research more widely, for example on social media. This is just one example of the importance of context when it comes to interpreting altmetrics: a researcher could find their work attracting very different altmetric scores depending on the stage of their career.

Impact assessment in the funding sector: the role of altmetrics

This session first heard from James Wilsdon, who is chairing the steering group on the role of metrics in research assessment for HEFCE. The group called for evidence from publishers, researchers and other stakeholders and received over 150 responses. There are loud voices both for and against altmetrics, and the group’s full report would be published on the HEFCE website in early 2015.

Representatives from three different funders then spoke, including the Wellcome Trust, Science Foundation Ireland and the Association of Medical Research Charities. All three identified the need for researchers to show evidence of engagement with a wider audience and providing greater value for money. Altmetrics have the potential to give funders a lot more information about the research they fund by highlighting attention to articles before they are cited. However, Ruth Freeman from Science Foundation Ireland warned against using altmetrics in isolation, and Adam Dinsmore from Wellcome agreed that the altmetrics “score” is less important than the conversations happening online.

Altmetrics and publishers

The publishers who spoke identified what they saw as the two primary uses for altmetrics in publishing. First, they allow the author to track how popular their work is; second, altmetrics can help with discoverability. Both PLoS and Springer are planning to use altmetrics to create cross-journal highlights for specific subject areas, for example Neurostars from Springer.

The open access publisher PLoS was the first publisher to introduce article-level metrics. Jennifer Lin explained that PLoS plans to do more to reveal the stories behind the numbers. To do this they need to advocate for improvements to article metadata, and they see ORCID as something that will help disambiguate author information.

Workshops

During the final session of the conference, we attempted to reach some final conclusions and also to think about what developments we would like to see in the future. There were three main points:

  1. The need for standardisation was identified – there are a number of different organisations that are collecting and measuring alternative metrics. Some standardisation is necessary to ensure the results are comparable and trustworthy.
  2. A lot of data is being collected, but there is much room for improvement in how it is interpreted and used. The use of altmetrics by funders, the REF and others should be as transparent as possible.
  3. In all cases, the use of altmetrics should include a consideration of context, and should be used in creating a story of the impact that is followed from the lab to publication to policy implementation.

Altmetrics at Imperial

Symplectic and Spiral both feature altmetrics from Altmetric.com, displayed as a colourful “donut”. You can see an example in Spiral here. Clicking on the icon will take you to the Altmetric page for that article, where you can explore the Tweets and blogs that have mentioned it.
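
For anyone who wants the numbers behind the donut in their own scripts, Altmetric also expose a free public API keyed on DOI. A minimal sketch; the endpoint shape follows Altmetric's public v1 API documentation, and the DOI below is a hypothetical placeholder rather than a real article:

```python
import json
import urllib.request

ALTMETRIC_API = "https://api.altmetric.com/v1"

def altmetric_url(doi: str) -> str:
    """Build the Altmetric public-API URL for a DOI."""
    return f"{ALTMETRIC_API}/doi/{doi}"

def fetch_attention(doi: str) -> dict:
    """Fetch the attention data Altmetric hold for a DOI (requires network;
    the API returns 404 when no attention has been recorded)."""
    with urllib.request.urlopen(altmetric_url(doi)) as resp:
        return json.load(resp)

# Hypothetical DOI, for illustration only
print(altmetric_url("10.1234/example"))
```

The returned JSON includes the counts shown in the donut; as several speakers at the 1:AM conference stressed, the raw score matters less than the conversations behind it.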