PubChase has been busy these past few months. In addition to their literature-recommendation search engine on desktop and mobile, they regularly invite researchers to write essays on the story behind their latest publications. They have now launched a new initiative that could have a strong impact by helping young scientists navigate the sometimes troubled waters of academic life.
The new career advice forum is a semi-crowdsourced list of science career-related Q&A. PubChase has assembled a rather impressive panel of mentors who answer questions that anyone can ask. The motivation for the forum (as explained in this blog post) is to provide free, high-quality mentoring for scientists and researchers. Too many young researchers face difficulties with their supervisors, in part because academic scientific leaders often lack proper management training. PIs are also incredibly busy, and giving career advice is often not at the top of their list of priorities.
List of questions and answers on the PubChase career advice forum
The forum gives you advice on what you should do if, for the past three years, your advisor has been asking for one “last experiment” before you publish. Or how you should react if your PI has simultaneously handed the same project to you and another student in the lab. At the moment, one question is answered every week, eventually building up a repository of career advice accessible to all. PubChase Career Advice Forum
Publons is another great alternative or complement to the traditional peer review process. Like others, this service is an answer to the slow and rather opaque peer-review process, in which the fate of a manuscript is at the mercy of an anonymous pair of experts. The idea is that publishing research results should not be the limiting step. Papers should be published, then reviewed and commented on by the readers. This sort of system would give researchers direct, rapid and interactive feedback on their work.
Andrew Preston and Daniel Johnston described in their founding article the publon, a facetious particle that is to academic research what the electron is to charge. Peter Koveski first described it as “[…] the elementary particle of scientific publication. It has long been known that publons are mutually repulsive. The chances of finding more than one publon in a paper are negligible. Even more intriguing is the apparent ability of the same publon to manifest itself at widely separated instants in time. One reason why this has not emerged until now seems to be that a publon can manifest itself with different words and terminology … defeating observations with even the most powerful database scanners.”
As you might have guessed, Publons is focused on physics manuscripts. It allows researchers to comment on and review papers published on the pre-print repository arXiv and in a list of top physics journals (Applied Physics Letters, Nature, PRL…).
Users can review, discuss and rate papers, and can also create a profile page gathering their contributions as well as their own publications. Once more, Publons’ success will largely depend on the size of the community it can attract. So have a look, and spread the word!
Believe it or not, crowdsourcing in scientific research has been around for a while. Surveys have long been a great way for researchers to obtain large amounts of information about human behavior. With the introduction of online surveys, researchers have never had easier access to participants. But along with the anonymity of the internet come risks of fake answers or repeated participation that can skew a study’s outcome.
This might be changing with Socialsci, a startup located in Cambridge (Massachusetts, USA) offering online survey services tailored for researchers. In addition to fairly standard survey-generation services, Socialsci guarantees academic-adapted prices (= low?) and high-quality participants. Participant quality is assured by an internal quality control, in which the answers participants give to over a thousand different questions are monitored for consistency. That way, users who make up answers will see their rating downgraded, making their participation in other surveys less likely.
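To make the idea concrete, here is a minimal sketch of that kind of consistency check. Everything here — the function names, the data layout and the scoring rule — is an illustrative assumption of mine, not Socialsci’s actual system: a participant who answers the same control question differently across surveys gets their reliability rating downgraded.

```python
# Hypothetical sketch of an answer-consistency check, NOT Socialsci's real code.
# A participant's rating is the fraction of their control questions that were
# answered identically every time they appeared.

from collections import defaultdict

def reliability_ratings(responses):
    """responses: list of (participant, question_id, answer) tuples.
    Returns a rating in [0, 1] per participant."""
    # Collect the set of distinct answers each participant gave per question.
    answers = defaultdict(lambda: defaultdict(set))
    for participant, question, answer in responses:
        answers[participant][question].add(answer)

    ratings = {}
    for participant, per_question in answers.items():
        # A question is "consistent" if only one distinct answer was ever given.
        consistent = sum(1 for seen in per_question.values() if len(seen) == 1)
        ratings[participant] = consistent / len(per_question)
    return ratings

responses = [
    ("alice", "age_range", "25-34"),
    ("alice", "age_range", "25-34"),   # consistent across surveys
    ("bob", "age_range", "25-34"),
    ("bob", "age_range", "55-64"),     # contradiction -> downgraded
]
print(reliability_ratings(responses))  # {'alice': 1.0, 'bob': 0.0}
```

Any real implementation would of course use many more control questions and a gentler rating scale, but the principle — repeated questions as a lie detector — is the same.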
Here’s a little video found on their website that sums this up far better than I did: https://www.socialsci.com/. I’ve added this in the “Using the crowd” section of the “Online tools for researchers” list.
Kaggle is not exactly a newcomer, but it is an excellent example of how the web 2.0 can boost science and help solve scientific problems.
Kaggle harnesses the power of crowdsourcing to solve problems in need of data modeling. Predictive models are everywhere; they help predict various phenomena, from customer behavior to bird migration. However, there is no general rule for designing such models, and they often end up being optimized by trial and error. So the field seems well suited to the massive amounts of work-hours crowdsourcing can provide.
Kaggle asks participants to develop predictive models to help solve problems submitted by companies (GE, Allstate, Merck, Ford…) and other organisations (universities, governmental organisations…). Tens to hundreds of different models can then be compared, and the best is chosen as the winner.
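The comparison step can be sketched very simply: every submission is scored against the same held-out answers, and the lowest-error entry wins. The metric, names and numbers below are my own illustrative assumptions, not Kaggle’s actual scoring code.

```python
# Hypothetical sketch of a Kaggle-style leaderboard: score each submitted
# model's predictions against held-out ground truth, rank by error.

def rmse(predictions, actuals):
    """Root mean squared error, a common competition metric."""
    n = len(actuals)
    return (sum((p - a) ** 2 for p, a in zip(predictions, actuals)) / n) ** 0.5

def leaderboard(submissions, actuals):
    """submissions: {team_name: predictions}.
    Returns (team, score) pairs, best (lowest RMSE) first."""
    scores = {team: rmse(preds, actuals) for team, preds in submissions.items()}
    return sorted(scores.items(), key=lambda item: item[1])

held_out = [3.0, 5.0, 7.0]          # ground truth the teams never see
submissions = {
    "team_a": [2.9, 5.2, 6.8],      # close predictions
    "team_b": [3.0, 5.0, 9.0],      # one large miss
}
print(leaderboard(submissions, held_out)[0][0])  # winner: team_a
```

The crucial design point is that all competitors are scored on the same hidden data, which is what makes hundreds of independently built models directly comparable.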
Turning work into a game is a common strategy to motivate participation, but it is interesting to see how far Kaggle pushes the sport analogy. The terms “player”, “competition” and “winner” are used throughout. And a winner there is, with the creator of the best-performing model usually rewarded with hundreds of thousands, if not millions, of dollars.
Founded in 2010, the company has successfully raised millions of dollars, and major companies, convinced by a series of successful projects, are coming onboard with their own data to be modeled. A great example of how to put brilliant minds (with some free time on their hands) to work collaboratively!