In research, this question often does not come up until it is time to publish: who did what? The answer is essential, as it determines how authorship is distributed and ultimately how credit for the work is attributed. But very often this information is not communicated, and although first authors are generally the doers and last authors the managers, there is a sea of unknowns between the two. This makes judging achievements based on authorship incredibly unreliable. PLOS journals and others already require precise descriptions of how authors contributed to the work, but terminology varies across journals, which prevents any real use of the information to assign credit.
In an attempt to solve this issue, the Wellcome Trust (Liz Allen) and Digital Science (Amy Brand) launched a new project called CRediT (Contributor Roles Taxonomy) last June. CRediT now proposes a standard taxonomy of 14 defined roles such as “conceptualization”, “resources”, “supervision”, and “writing – review & editing”. You can view them all here.
The CRediT project is now asking everyone to provide feedback on the taxonomy. If researchers show their interest in such a standard by helping to define it, there is a better chance that journals will adopt it, and that, eventually, actual credit and career advancement will be based on this system. So don’t hesitate to speak your mind and spread the word.
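One appeal of a fixed taxonomy is that contributions become machine-readable and checkable. Here is a minimal sketch of what that could look like; the author names are invented, and only the 14 role names themselves come from CRediT:

```python
# The 14 roles of the CRediT taxonomy (spelled here with plain hyphens).
CREDIT_ROLES = {
    "Conceptualization", "Data curation", "Formal analysis",
    "Funding acquisition", "Investigation", "Methodology",
    "Project administration", "Resources", "Software", "Supervision",
    "Validation", "Visualization", "Writing - original draft",
    "Writing - review & editing",
}

# Hypothetical author list for an imaginary paper.
contributions = {
    "A. First-Author": {"Conceptualization", "Investigation",
                        "Writing - original draft"},
    "B. Last-Author": {"Supervision", "Funding acquisition",
                       "Writing - review & editing"},
}

def validate(contributions):
    """Check that every declared role belongs to the taxonomy."""
    for author, roles in contributions.items():
        unknown = roles - CREDIT_ROLES
        if unknown:
            raise ValueError(f"{author}: unknown roles {unknown}")
    return True

print(validate(contributions))  # True when every role is standard
```

Because the role names are drawn from a closed list, a journal (or funder) could reject a submission whose contribution statement uses free-text roles, which is exactly what varied terminology currently prevents.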
I am probably not going to surprise you much by saying that the peer-review process for research articles could be improved. As I see it, peer review should be an opportunity for researchers to improve their work. It should help detect flaws in the methodology and advise on how to improve the manuscript itself. But sadly, for many, peer review feels like an unproductive waste of time, with sometimes endless waiting periods while the manuscript is in the hands of editors or slow reviewers. Even more frustrating is when the feedback consists of minimalistic reviews that bring little to the work. But things are slowly changing. Many journals now advertise their quick turnarounds and their networks of competent, responsive reviewers. Still, with thousands of research journals in operation, it can be hard for researchers to know which journals are doing a good job and which are not.
SciRev is a platform that allows researchers to publish reviews of their experience with the peer-review process of specific journals. It displays facts such as the duration of the first review round, total handling time, number of reviewers, quality of the reviews, and an overall rating. A very useful “motivation” section also lets users explain a bit about their experience with the journal, which usually gives interesting insights into the peer-review process. SciRev also allows you to compare several journals side by side.
Review of journals by users.
The reviews seem well balanced between positive and negative. SciRev also gathers very valuable data that could help grasp the problems in the publishing sector. Based on user data, it provides interesting statistics about the review process within various scientific fields (see below).
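The field-level figures SciRev reports are, in essence, aggregates over user-submitted reviews. A toy sketch of how such per-field averages could be computed (the records below are made up, not SciRev data):

```python
from collections import defaultdict

# Made-up review records: (field, duration of first review round in weeks).
reviews = [
    ("Biology", 10), ("Biology", 14), ("Physics", 6),
    ("Physics", 8), ("Economics", 20),
]

# Group durations by field, then average each group.
by_field = defaultdict(list)
for field, weeks in reviews:
    by_field[field].append(weeks)

averages = {field: sum(w) / len(w) for field, w in by_field.items()}
print(averages)  # {'Biology': 12.0, 'Physics': 7.0, 'Economics': 20.0}
```

The interesting part is less the arithmetic than the fact that enough self-reported reviews exist to make such averages meaningful per field.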
Rejection time and duration of the first round of review, averaged across all scientific fields
This initiative, along with others that help researchers choose the right journals to publish in, will likely play a role in improving the publishing system. Indeed, the costs that universities bear for publication and journal subscriptions should come with matching service. For the price, we should expect open access, innovative ways of communicating research, and an efficient peer-review process. Such journal-rating platforms could become key to identifying the journals that are ahead of the game.
JournalGuide, the service that helps you find the most appropriate journal for your research (blog post), has just released an interesting feature. In an effort to fight predatory journals, it now grants a Verified label to all journals that meet its quality criteria. The criteria used to grant Verified status are explained in detail in a white paper and a JournalGuide blog post.
The new Verified status appears on the right, along with other information about the selected journal (see red square).
Of note, JournalGuide chose to remain inclusive, keeping the journals that could not be verified in its database. Those not designated as Verified are not necessarily fraudulent; they simply could not be verified using the chosen criteria. If you wish to publish in an unverified journal, the best method is to ask around: if your colleagues had a good experience, chances are you will too.
A quick note to say I’m adding ScienceOpen to the list of online tools for researchers.
The platform describes itself as a “Research and Publishing Network”. It is an open access publisher that invites members of the ScienceOpen network to review submitted papers. After publication, ScienceOpen opens the article up for further comment to all researchers who have an ORCID and at least five publications. At the moment it references over 1,000,000 open access articles from various open access databases. ScienceOpen also includes a networking component for sharing papers, opening discussion groups, and collaborating with other researchers.
ScienceOpen is a start-up company based in Berlin (Germany) and Boston (USA). Learn more about ScienceOpen here.
ResearchGate recently announced that it now encourages researchers to share data through its platform, hoping to get more unpublished information out in the open to fuel scientific discussion. Such information includes:
- Datasets and raw data
- Negative results
- Figures and media files
- Unpublished articles
This new service comes in addition to a set of other services that already allow researchers to share data and unpublished information. ResearchGate, with its 2+ million users, will probably quickly become one of the main platforms for publishing such information. By steadily releasing new services, ResearchGate seems to be taking the lead as a social platform for scientific exchange. However, published data should be easily searchable, citable, and not locked into proprietary formats. This is not the case as of today, and I would be curious to learn more about efforts currently underway to address these issues.
Open access journals clearly have a lot going for them. One advantage over traditional journals is accessibility: no login, pay-per-view, or university IP needed to view the articles. Just Google it! This also means more readers. And with such large audiences, open access journals are naturally tempted to build social exchange platforms to promote and add value to their collections of published articles.
Several open access publishers now offer features very similar to those of dedicated social networking sites. Two examples:
- Frontiers. They offer a personalized profile, networking functions, job and events sections, internal messaging, and the possibility to start your own science blog.
- PLoS lets readers comment on articles, shows valuable metrics such as how many times an article was viewed, and tracks mentions of the paper on social networks (Twitter/Facebook).
Social networking sites take a complementary approach: they focus on developing the social platform while gathering as many references and papers as possible from external sources. So the Elseviers and Springers may rest assured; scientific articles are still the centerpiece of research. And with the exception of a few initiatives, the communication format remains what it was 50 years ago. However, the new online services offer alternative models for how papers are written, published, exchanged, and discussed.
With open access publishers introducing more social features, and social networks strengthening their publication databases and accessibility, I would bet on a great merger of services in the next few years. Open access publishing, reference management, article sharing, networking, and discussion boards could be gathered within single tools. This could also be achieved through better interoperability between existing services.
There are literally thousands of peer-reviewed journals out there. Finding the one best suited to your research can be tough, especially if you are not a native English speaker.
Edanz, an English editing service for scientific publishing based in Japan, now offers a free online journal selection tool. It analyzes your abstract and suggests a list of suitable journals ranked by matching score.
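Edanz does not disclose its matching algorithm, but the general idea of scoring journals against an abstract can be illustrated with a toy keyword-overlap ranking. Everything below is invented for illustration (journal names, scopes, and the scoring rule); it is not how the Edanz tool actually works:

```python
# Toy journal-matching sketch: rank journals by how many abstract words
# overlap with each journal's scope keywords. Illustration only.
journals = {
    "Journal of Cell Dynamics": {"cell", "mitosis", "cytoskeleton"},
    "Plant Signaling Letters": {"plant", "signaling", "auxin"},
    "Neuro Methods": {"neuron", "imaging", "electrophysiology"},
}

def rank_journals(abstract, journals):
    """Return (journal, score) pairs, highest overlap first."""
    words = set(abstract.lower().split())
    scores = {name: len(words & scope) for name, scope in journals.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

abstract = "We study cytoskeleton remodeling during mitosis in cell cultures"
for name, score in rank_journals(abstract, journals):
    print(name, score)  # "Journal of Cell Dynamics" ranks first with 3
```

A real tool would use far richer signals (weighted terms, full publication histories, field classifications), but the output shape is the same: a ranked list with a matching score per journal.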
I tried it with one of my manuscripts in preparation. The top three journals suggested by the tool matched my own targets. It looks like this could help even experienced researchers find the optimal journal for their communications.