Author: Lucy Lambe

New Workflow: “Be REF compliant”

The library has released a new workflow on making your publications REF compliant. Authors can now deposit their journal articles and conference proceedings in Spiral, via Symplectic, on acceptance. At the same time, an application can be made for APC funding to pay open access fees.

 

New REF workflow

Find out more about how the library can support you in making your work open access at our new web pages, or contact us: openaccess@imperial.ac.uk.

UKSG – Untying the knots and joining the dots – 20th November 2014

This year's UKSG one-day conference focused on how researchers are being supported in the changing scholarly communications landscape. The day brought together academics, librarians, publishers and funders to discuss how we can work together to meet open access requirements as painlessly as possible. What follows is a summary of the event; the whole day was also filmed, so you can catch up on the talks at the UKSG website.


The day began with Ben Johnson from HEFCE, who told the story of how open access came to the attention of the UK government when David Willetts was unable to access the journal articles he needed to write his book. From Willetts to the Finch Report to the new REF policy, universities are now being pushed into action to ensure that publications are made open access and that the impact of research is demonstrated. HEFCE and other UK funders are making it clear that if research is to have an impact on policy, people within government need access to it.

Simon Hubbard from the University of Manchester spoke next about the complicated process of making a paper open access, reporting on research to your funder and storing research data in the appropriate place. Even for a researcher with an active interest in open access publishing, the burden of bureaucracy can be off-putting, especially when it feels like entering the same information over and over again into different systems. Finally, Simon had a few recommendations to improve the open access workflow: take academics out of the process, as they only slow things down; better and more unified systems; and a simpler message from funders and publishers.

A final highlight of the morning came from Ian Carter at the University of Sussex, who spoke from the perspective of university management and strategic planning. Ian started by summarising the pressures that researchers find themselves under, from conducting “world-class” research, to providing value for money to students paying much higher fees than ever before, to compliance with varying funder policies. To achieve all of this there must be behavioural change from researchers, for example making their work more accessible through open access, and additional support from institutions to ensure these goals align with their overall strategy. Dissemination, communication and impact were identified as some of the most important aims for both researchers and institutions.

The second half of the day saw the librarian's perspective from Martin Wolf at the University of Liverpool; he believes librarians have a better understanding of the overall picture and of how the different stakeholders interact. Librarians often find themselves interpreting both funders' policies and publishers' open access options for researchers. However, in addition to this advocacy work, librarians seem to be getting increasingly stuck in the detail and are too risk averse when it comes to promoting open access, for example over the minutiae of a publisher's copyright policy. Comments from publishers after this session implied that early career researchers are asking very basic questions about open access, so there is still a lot of work to be done.

The last few sessions were lightning talks from providers of altmetrics tools: Digital Science, Kudos and Plum Analytics. These are just three of the many new products designed to capitalise on the impact agenda, and they aim to help researchers increase and measure the impact of their publications.

Overall, the day was very useful and demonstrated the various perspectives on research and publication, including the changing expectations of all the stakeholders involved in the process. It's clear that while the post-2014 REF open access policy has been a disruptive force, change was already beginning in the areas of open access, alternative metrics and demonstrating the impact of research.

You can find a summary of tweets from the day here, collected by Ann Brew, our Maths and Physics librarian.

 

Lucy Lambe
Ann Brew
Philippa Hatch
Michael Gainsford

Open Access Button

Last night saw the launch of the Open Access Button, timed to coincide with worldwide Open Access Week. The team behind the Open Access Button aim to help researchers, students and the general public access research papers that are behind paywalls and beyond their means.

The idea came from two medical students who were frustrated at not being able to access all the research they wanted to read, and who found that the average cost to read a paywalled article was $30. Although the team has expanded to include partnerships with Cottage Labs, Jisc and more, a large number of students still donate their time to the project. Work on the Button began last year with a beta project that saw 5,000 people report hitting almost 10,000 paywalls or being denied access.

The new version of the Open Access Button is a browser plug-in that adds a button you click any time a paywall stops you from accessing an article. The system registers information about the article and your location to create a map of researchers who need access to information.
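To make that concrete, here is a minimal sketch, in TypeScript, of the kind of record the Button might build when you click it. The type name and fields below are illustrative assumptions for this post, not the project's actual data model.

  // Hypothetical shape of the data recorded per click; field names are
  // illustrative assumptions, not the real Open Access Button schema.
  interface PaywallReport {
    url: string;                              // the paywalled page the reader was blocked at
    doi?: string;                             // article identifier, if one can be found on the page
    title?: string;                           // article title scraped from the page metadata
    location?: { lat: number; lon: number };  // approximate reader location, used for the map
    story?: string;                           // the reader's note on why they need the paper
    reportedAt: string;                       // ISO 8601 timestamp of the click
  }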

Open Access Button Paywall Map
Image credit: Open Access Button CC-BY-SA

The Open Access Button will try to find a freely accessible version of the article, for example a pre-print deposited in an institutional or subject repository. If an alternative version cannot be found, the Button will email the author to let them know that someone wants to access their research but can't, and suggest that they deposit a copy in a repository.
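In outline, that find-or-notify flow could look something like the sketch below, reusing the PaywallReport type above. Every helper here (findFreeCopy, registerOnMap, emailAuthor) is a hypothetical stand-in for illustration, not the Button's real code.

  // Sketch of the Button's workflow on a click; all helpers are hypothetical.
  async function handlePaywallClick(report: PaywallReport): Promise<void> {
    // 1. Look for a free copy, e.g. a pre-print in an institutional or subject repository.
    const freeCopy = await findFreeCopy(report.doi ?? report.url);
    if (freeCopy) {
      window.open(freeCopy, "_blank"); // send the reader straight to the open version
      return;
    }
    // 2. No open copy found: record the blocked request for the map and nudge the author.
    await registerOnMap(report);
    await emailAuthor(report,
      "A reader could not access your article; please consider depositing a copy in a repository.");
  }

  // Hypothetical helpers, declared here only so the sketch type-checks.
  declare function findFreeCopy(idOrUrl: string): Promise<string | null>;
  declare function registerOnMap(report: PaywallReport): Promise<void>;
  declare function emailAuthor(report: PaywallReport, message: string): Promise<void>;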

Upon clicking the button, users are asked to enter a few sentences about why they want to read the article and what they could do if the research was available open access. The creators hope to use this information for open access advocacy, and to create stories that connect researchers, their work and readers around the world.

Keep up to date with the project on Twitter @OA_Button

1:AM London Altmetrics Conference 25-26 September 2014

Held at the Wellcome Collection in London and organised by Altmetric.com and the Wellcome Trust, this was the very first conference to focus solely on alternative metrics and their use by funders, universities and researchers.

The first day began with introductions from seven different altmetrics providers to their products. Although broadly similar, each provider differs slightly in which metrics it measures and how it presents them.

Below is a summary of the event, with a more comprehensive blog available from the organisers here.

Altmetrics, by AJ Cann (https://www.flickr.com/photos/ajc1/6795008004). Licensed CC BY-SA 2.0

How are people using altmetrics now?

During this session we heard from a range of stakeholders, including representatives from the Jisc-funded project IRUS, a university-publisher collaborative project, and an academic who studies altmetrics as part of his research.

IRUS is using article-level metrics to answer the question: are people using university repositories? The answer is yes, and IRUS can help repository managers to benchmark their repository contents and use. IRUS allows an institution to check the quality of its metadata, and also provides COUNTER-compliant statistics that can be trusted.

Snowball Metrics is a university-driven, Elsevier-facilitated project that has produced a number of "recipes" designed to help universities use altmetrics for benchmarking. This takes metrics beyond the individual paper or researcher, and allows a university to assess a department as a whole. However, altmetrics alone are not sufficient to judge scholarly quality.

Finally, Mike Thelwall, based at the University of Wolverhampton, presented his research group's findings. Mike has been investigating how altmetrics relate to citation scores and has found an overall positive but weak correlation. Twitter seems to lead to more publicity for a paper, but doesn't necessarily lead to more citations; Mendeley's reader count has a much stronger correlation with citations.

What’s going on in the communication of research?

This session gave us a great opportunity to hear from two active researchers on how they communicate their research to an academic audience and beyond. What was apparent was that Renée Hlozek, a postdoctoral researcher, had much more time to spend not only on actual research, but also on creative ways of communicating her research to a wider audience. For example, she is active on Twitter, writes a blog and is a current TED Senior Fellow.

As a professor, Bjorn Brembs spends more time on teaching and university administration. This means he struggles to find time to promote his research more widely, for example on social media. This is just one example of the importance of context when interpreting altmetrics: a researcher's work could attract very different altmetric scores depending on the stage of their career.

Impact assessment in the funding sector: the role of altmetrics

This session first heard from James Wilsdon, who is chairing HEFCE's steering group on the role of metrics in research assessment. The group called for evidence from publishers, researchers and other stakeholders and received over 150 responses. There are loud voices both for and against altmetrics, and the full response is due to be published on the HEFCE website in early 2015.

Representatives from three different funders then spoke: the Wellcome Trust, Science Foundation Ireland and the Association of Medical Research Charities. All three identified the need for researchers to show evidence of engagement with a wider audience and to provide greater value for money. Altmetrics have the potential to give funders much more information about the research they fund by highlighting attention to articles before they are cited. However, Ruth Freeman from Science Foundation Ireland warned against using altmetrics in isolation, and Adam Dinsmore from Wellcome agreed that the altmetrics "score" is less important than the conversations happening online.

Altmetrics and publishers

The publishers who spoke identified what they saw as the two primary uses for altmetrics in publishing. First, they allow the author to track how popular their work is; second, altmetrics can help with discoverability. Both PLoS and Springer are planning to use altmetrics to create cross-journal highlights for specific subject areas, for example Neurostars from Springer.

The open access publisher PLoS was the first to introduce article-level metrics. Jennifer Lin explained that PLoS plans to do more to reveal the stories behind the numbers. To do this they need to advocate for improvements to article metadata, and they see ORCID as something that will help disambiguate author information.

Workshops

During the final session of the conference, we attempted to reach some final conclusions and also to think about what developments we would like to see in the future. There were three main points:

  1. The need for standardisation was identified: a number of different organisations are collecting and measuring alternative metrics, and some standardisation is necessary to ensure the results are comparable and trustworthy.
  2. A lot of data is being collected, but there are a lot of improvements to be made in the interpretation and use of the data. The use of altmetrics by funders, REF, etc. should be as transparent as possible.
  3. In all cases, the use of altmetrics should include a consideration of context, and altmetrics should be used to create a story of impact that can be followed from the lab, to publication, to policy implementation.

Altmetrics at Imperial

Symplectic and Spiral both feature altmetrics from Altmetric.com, displayed as a colourful “donut”. You can see an example in Spiral here. Clicking on the icon will take you to the Altmetric page for that article, where you can explore the Tweets and blogs that have mentioned it.
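If you would like to show the same donut on your own page, Altmetric offers an embeddable badge. The TypeScript sketch below shows one way a page script might insert it; the script URL and data attributes follow Altmetric's public badge embed as we understand it, so check Altmetric's current embed documentation before relying on them, and the DOI in the comment is only a placeholder.

  // Sketch: adding an Altmetric donut badge to a page for a given DOI.
  function addAltmetricDonut(container: HTMLElement, doi: string): void {
    // Placeholder element that Altmetric's embed script finds and replaces with the donut.
    const badge = document.createElement("div");
    badge.className = "altmetric-embed";
    badge.dataset.badgeType = "donut"; // rendered as data-badge-type="donut"
    badge.dataset.doi = doi;           // e.g. "10.1000/xyz123" (placeholder DOI)
    container.appendChild(badge);

    // Altmetric's loader script scans the page and draws the badges.
    const script = document.createElement("script");
    script.src = "https://d1bxh8uas1mnw7.cloudfront.net/assets/embed.js";
    document.head.appendChild(script);
  }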