Keep an eye on your lab with TetraScience

It feels very cliché to say we live in a connected world. But it’s true, isn’t it? We hold our beloved smartphones all day long. Some of us wear bracelets that track our physical activity. Objects in our homes are connected, allowing us to remotely control air conditioning systems, lights, and window blinds. This constant flow of data is supposed to make our lives more efficient, or help us gain insights into our health. The technology gives us an arguably unprecedented feeling of control, and the data-driven lifestyle is gaining traction among the general public (us scientists included). But could we say as much for what we do in our laboratories?

Surprisingly, our lab instruments are commonly left alone without any supervision, in the hope that they will function “as normal” throughout their lifetimes. But of course that is wishful thinking. Most laboratories are like any other building. Temperature changes, humidity changes, vibrations come and go. People turn off equipment by mistake, or turn a knob in one direction or the other. All of it can happen without the experimentalist even noticing.

In these circumstances, it is difficult to truly understand the environmental conditions in which experiments are performed. Reproducibility can suffer from this lack of control. When protocols report experiments performed at “room temperature”, what does that truly mean? The “room temperature” in the un-airconditioned laboratory in southern France (Montpellier) where I spent my PhD years was certainly different from the one in my current Northern European research institute.

TetraScience is one of a handful of companies bringing a new layer of information to modern experimentation. Powered by cloud-based software, TetraScience collects and stores data from scientific instruments so that you know what conditions your lab is running in. For instance, TetraScience will stream data from freezers and incubators directly to your mobile device, indicating vital parameters such as temperature, humidity, and CO2 levels, and will notify you when something goes wrong.

TetraScience helps you get a concrete idea of how their technology can help researchers through a few case studies. For instance, one happy TetraScience user has been Mathieu Gonidec, a chemist in the Whitesides lab at the Harvard Department of Chemistry.

 “When running experiments, especially those stretching over long periods of time, an error can derail your timeline. Even worse, you are often unsure of what exactly went wrong”, Mathieu says.


In one instance, Mathieu was running a series of experiments in which something seemed not to be right. What he discovered from looking at the historical temperature log was that the temperature was fluctuating in swings of 20-30 degrees from the set temperature, causing the experiment to fail.
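A historical log like Mathieu’s can be screened for such excursions in a few lines of code. The sketch below is not TetraScience’s implementation, just a minimal illustration; the log values, setpoint, and threshold are made up:

```python
# Minimal sketch (not TetraScience's code): flag readings in a temperature
# log that deviate from the setpoint by more than a chosen threshold.
def find_excursions(log, setpoint, threshold):
    """Return (timestamp, reading) pairs that stray too far from the setpoint."""
    return [(t, temp) for t, temp in log if abs(temp - setpoint) > threshold]

# Hypothetical log: (minutes elapsed, temperature in degrees C)
log = [(0, 60.2), (10, 61.0), (20, 85.5), (30, 59.8), (40, 33.1)]
print(find_excursions(log, setpoint=60.0, threshold=10.0))
# The 85.5 and 33.1 readings are exactly the kind of 20-30 degree
# swings that can silently derail an experiment.
```

A monitoring service would of course run a check like this continuously on streaming data rather than after the fact.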


TetraScience allowed Mathieu to identify the issue, resolve the problem immediately, and move on to the next step of his research.

Here’s another example. Jon Barnes, a synthetic chemist in the Johnson Research Group at the MIT Department of Chemistry, also told his TetraScience story.

He and his lab seek to develop new methodologies for the construction and modification of complex material libraries. For years, Jon had been frustrated by the lack of control over simple reaction parameters, such as monitoring temperature, turning off a hot plate, or activating a syringe pump from a remote location. Day-to-day experiments often required constant in-person monitoring, which was both inefficient and frustrating.


TetraScience’s real-time monitoring data has granted Jon’s lab the peace of mind to start experiments at the end of the day, knowing that they will be immediately alerted if anything goes wrong so they can come back to the lab and take corrective action.

If you have tried their services, I would love to hear about your experience. Feel free to comment below!

Nine more digital tools added to the list

Time for a quick update of the list of digital tools for researchers. A couple of tools have been removed, since they are no longer online. But many more have been added. Enjoy!

Data management related tools in the broad sense of the term

  • Dat Data – Open source, decentralized data tool for distributing datasets small and large.
  • Riffyn – Cloud software for visual, collaborative, reproducible innovation.
  • Castor EDC – User friendly and affordable online data collection for medical research.
  • PCR Drive – Free platform that supports researchers in all their PCR-related processes.
  • Ovation – Simplifies your scientific life from sample tracking for startup labs to data management.
  • eLabJournal – GLP-compliant electronic lab notebook and lab management tool.

A collaborative writing tool.

And a couple outreach platforms.

  • Speakezee – Bringing speakers and audiences together.
  • Science Simplified –  A science communication portal aiming to aggregate all academic public releases and serve as a direct communication channel with the general public.

Digital science industry, who are you?

Looking at the digital science industry develop and mature over the past few years has been truly fascinating. There is tremendous excitement from entrepreneurs about the enormous transformative potential of translating new web technologies to the scientific research world. And there are also many frustrations and challenges that come with such major changes in the research and innovation ecosystems.

Although this website and others try to capture these changes and trends, my feeling is that there is not yet a clear picture of what the digital science industry looks like. What types of organizations is this emerging community formed of? How well are they doing? How do its actors see the future? Is this even a community? And how can Connected Researchers and others alike help?

With the help of LabWorm.com, I have created a short survey that aims to fill that gap. Better data about the digital science industry will help entrepreneurs get a better sense of this emerging community. It will also make it possible for funders and policy makers to better support the digital science industry. Feel free to fill it in if you define yourself as a scientific tool developer.

Survey — https://thomascrouzier.typeform.com/to/jMMfil —

All results will be anonymized and published on this blog and the LabWorm blog.

Community-driven discovery of scientific tools with LabWorm

There is now a vast ecosystem of digital resources for researchers at our disposal. Digital tools for researchers, science blogs, and databases are a few examples of how the digital revolution can help researchers be more efficient on a daily basis. With this wealth of resources, it can be easy for researchers to get lost and miss out on important trends. For instance, which blog can help me understand a problem I have been pondering for days, or which tool can help me write papers with my colleagues across the globe?

Sorting out the digital resources for researchers is of course not an easy task. Here at Connected Researchers, we manually curate a list of tools, selecting only those specifically designed with researchers in mind. Another approach is to crowdsource this effort, relying on a community to both find and sort interesting resources. Reddit and others have proven the power of such an approach.

LabWorm is a platform to discover and share digital resources for researchers. The platform is built around a community of tool users (researchers) and tool developers. Each tool or resource has its own profile page, with a short description and links to the website and other related resources. There is also a space for discussion, where you can give feedback about your experience or ask questions to other users and the tool developers. LabWorm allows you to up-vote the tools you like, create collections that you can share with your colleagues, and discuss the tools directly on the site.

There are several ways to discover new tools on LabWorm.

  1. Every week, LabWorm highlights the 5 top-voted tools, helping you discover new tools that your colleagues find useful.
  2. Similar tools are indicated on tool profile pages. For instance, at the bottom of Authorea’s profile page (a collaborative writing platform), two other tools appear: Overleaf, one of Authorea’s direct competitors, and Paperpile, the reference manager for Google Docs.
  3. Personalized recommendations. Based on the tools in your collection and on your upvotes, LabWorm finds other tools you might be interested in.
  4. Browse by categories and use the search bar. LabWorm sorts the tools by categories and associates tags to each tool, which allows you to search by keywords.
  5. Browse the collections of others. Other LabWorm users have collected their favorite tools and are sharing their collections with you.
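As a toy illustration of how recommendations like those in point 3 can work (this is not LabWorm’s actual algorithm, and the tool names are just examples), one can suggest tools that co-occur in other users’ collections alongside tools you already have:

```python
# Toy sketch (not LabWorm's algorithm): recommend tools that co-occur in
# other users' collections with tools you have already collected/upvoted.
from collections import Counter

def recommend(my_tools, other_collections, top_n=2):
    scores = Counter()
    for collection in other_collections:
        if set(collection) & set(my_tools):      # shares a tool with me
            for tool in collection:
                if tool not in my_tools:
                    scores[tool] += 1            # one co-occurrence "vote"
    return [tool for tool, _ in scores.most_common(top_n)]

mine = ["Authorea", "Paperpile"]
others = [["Authorea", "Overleaf"],
          ["Paperpile", "Overleaf", "Zotero"],
          ["Zotero", "Mendeley"]]
print(recommend(mine, others))  # Overleaf co-occurs twice, Zotero once
```

Real recommender systems weight these votes by user similarity and popularity, but the co-occurrence idea is the same.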


LabWorm’s social-oriented features and crowdsourcing approach have the potential to quickly curate a large number of researchers’ favorite tools. It could come as a great complement to sites and blogs such as Connected Researchers, which are focused on raising awareness around digital tools for researchers by listing them and placing them in the context of the everyday life of a researcher.

Electronic lab notebooks and the future of science discussed at Labfolder workshop


I was fortunate to attend a Labfolder workshop on the 2nd and 3rd of June 2016 in Berlin (Germany). This was an opportunity to discuss user experiences of Labfolder’s electronic lab notebook (eLN), but also to talk more generally about digital science tools and their integration into the researcher’s workflow. I thought I would share what I learned during that session.

The session started with a presentation of the smartLAB initiative from the Institute of Technical Chemistry at Leibniz University Hannover (Germany). There, a research group is developing the lab of the future, on both the hardware and software integration sides. They have a few fascinating videos that show very concretely what a fully digitalized laboratory could soon look like.

A prototype of their concept, developed with several partners including Labfolder, was already presented to the public earlier this year. Dr. Patrick Lindner represented the project, and mostly talked about smartLAB’s efforts to connect laboratory instruments to the internet and about their collaboration with Labfolder to directly feed the data back to an eLN.

Then, Dr. Alexander Grossman discussed the ScienceOpen platform that he launched in 2014. ScienceOpen now aggregates over 15 million articles, with the possibility of post-publication peer review (commenting) and article rating. But ScienceOpen is also set up as a publishing platform. Researchers can prepare manuscripts directly on the platform, then release their drafts as publications when they are ready. The article then relies on post-publication peer review for its quality control, receiving only an editorial check before publication. Perhaps the future of scientific publication?

Prof. Ulrich Dirnagl, director of the Department of Experimental Neurology at the Charité medical university in Berlin (Germany), gave an impressive talk about their efforts to bring data management into the 21st century. They realized that electronic lab notebooks are essential to improving the sustainability and findability of data, and the reproducibility of experiments. He pointed to an article he published at the beginning of 2016 that constitutes a handbook for introducing eLNs into academic life science laboratories. He starts the article with a striking image of two lab notebook entries that look very similar despite the more than one hundred years of history separating them. His message: surely we can do better today.

Image from Dirnagl, U. & Przesdzing, I. A pocket guide to electronic laboratory notebooks in the academic life sciences. F1000Res. 5, 2 (2016).


Prof. Dirnagl also explained how he led efforts to equip his department with an ISO 9001-certified quality management system. First, this means they had to think about a system to manage the quality of the work conducted in the department (which is already beyond what any lab I have worked in has ever done). Then, they had to make sure this system would meet the kind of strict requirements ISO norms usually entail. A courageous initiative, since ISO norms are almost never found in academic laboratories, which are more accustomed to improvisation than standardization. Although it required changes of habit, Prof. Dirnagl explained that the personnel was overall enthusiastic about the changes and that the laboratory is now certified. Prof. Dirnagl is now assessing the impact of the certification on the quality of research and reflecting on alternative quality-control standards that could be better adapted to the academic setting.

Of course, industry routinely adopts such standards because of strict regulations and the strong marketing impact ISO certifications can have. Dr. Sam Moré is director of a nanotechnology company called DendroPharm that develops nano-drug delivery vehicles for veterinary applications. Dr. Moré explained that their quality management system is also ISO 9001-certified and described how the use of an eLN was essential in that process.

Finally, on day 2, Joram Schimmeyer, a PhD student at the Max Planck Institute of Molecular Plant Physiology in Potsdam (Germany), presented his digital research workflow. He explained how nearly all of his work is now in digital form and how an eLN fits perfectly into that workflow.

In addition to these amazing talks, the workshop was an opportunity to talk about the future of the electronic lab notebook and how it fits into the future of science as a whole. These were interesting discussions that I leave for another post.

BioBright brings the internet of things to the lab.

A digital revolution is transforming scientific research into a more open, more interconnected, more global, and more data-driven endeavor. Many of these changes are driven by new digital infrastructure.

But science is also done in the laboratory and in the field. Experimentalists need to prepare solutions, calibrate complex instruments, and make measurements on samples. This more down-to-earth aspect of research has gotten a bit less attention from open science and digital science enthusiasts. However, new approaches and new tools may improve the way we do research in the lab. A handful of digital science companies are already thinking about how digitalization and connection to the internet can improve the way we use scientific instruments. For instance, Transcriptic and Emerald Cloud Lab have installed armies of robots in their Silicon Valley warehouses (or at least that is how I imagine it) that are awaiting your orders to perform experiments. The results are then delivered directly to your computer screen.

BioBright, a startup out of MIT and Harvard University, wants to connect our lab instruments. The idea is that connecting sensors to your instruments, even the simplest ones, would give you more control and a better understanding of the exact conditions in which an experiment was done.

Practically, BioBright is working on a collection of sensors and software solutions that can be associated with the most common lab instruments. These extra pieces of data could provide the experimentalist with precious details about the environment in which an experiment was done, making it easier to troubleshoot or reproduce the experiment. BioBright has already mentioned connecting thermometers, but other sensors such as hygrometers, motion sensors, and light sensors could also be useful. Eventually, these measurements could be automatically associated with the data generated by the instrument, then transmitted and archived in an electronic lab notebook.
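To make the idea concrete, here is a hypothetical sketch of what bundling an instrument reading with simultaneous sensor data might look like; the field names and values are entirely invented, not BioBright’s format:

```python
import json
from datetime import datetime, timezone

# Hypothetical sketch (invented field names, not BioBright's format):
# bundle an instrument measurement with the environmental sensor readings
# taken at the same time, so the record can be archived in an eLN.
def annotate_measurement(value, unit, sensors):
    return {
        "measurement": {"value": value, "unit": unit},
        "environment": sensors,  # e.g. temperature, humidity, light level
        "recorded_at": datetime.now(timezone.utc).isoformat(),
    }

record = annotate_measurement(
    0.523, "absorbance",
    {"temperature_c": 21.4, "humidity_pct": 43.0, "light_lux": 310},
)
archived = json.dumps(record)  # a serialized record, ready to archive
```

The point is simply that the environmental context travels with the measurement, so a puzzling result can be traced back to, say, an unusually warm afternoon.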

BioBright is one of the first to bring the internet of things (or the internet of instruments, as this Techcrunch article puts it) to research laboratories. It has taken years for web 2.0 technologies to reach researchers. But perhaps BioBright and other related initiatives, such as TetraScience, are early signs that innovative connected scientific instruments will be developed alongside the recent and very trendy connected home technologies (and not 10 years later).

New Impactstory: fresh and free!

Impactstory tracks the online impact of your research. It looks through news outlets, social media mentions, and more to quantify the reach of your research output. Impactstory is one of the first startups founded around the idea that a new set of metrics is needed to properly evaluate scientific research and researchers. The digitalization of research and scholarly communication is an amazing opportunity to harness very large quantities of quantifiable data, which can give completely new insights into the impact of research. Many now talk about altmetrics, a term originally coined on Twitter by Jason Priem, co-founder of Impactstory. These new metrics are still young and will need a few rounds of trial and error to find out what information, and what representations of that information, are the most meaningful. But regardless, altmetrics are bound to become essential for the future of research evaluation.

The new profile page has a very fresh and clear look. Login is now only through ORCID, the unique identifier system for researchers. Within seconds, Impactstory recovers your published articles and generates an overview of your mentions, which gives you numbers on your online reach. But Impactstory also tries to give perspective to these numbers through what they call achievements. These are badges focused on

  • the buzz your research is creating (volume of online discussion),
  • the engagement your research is getting, which looks at the details of who is mentioning you, and on what platform,
  • and your research’s openness, which looks at how easy it is for readers to access your work.

For many of these badges, Impactstory also tells you how well you are doing compared to other researchers. One particularly interesting badge is about software reuse. There, Impactstory has integrated a tool that they recently released called Depsy. Depsy specializes in evaluating the impact of research software, going beyond formal citations to understand how research software is being reused and to give proper credit to its contributors. This will deserve a post of its own in the future.

Hopefully, these sets of metrics and others alike will become a standard part of performance reviews, grant applications, and tenure packages in the very near future. You can already share your profile by pointing directly to your public Impactstory URL. But new features will come shortly to make it easier to share and showcase the story of your online impact.

Visualizing DNA sequences made easier with new add-on

Many online tools help researchers analyze and manipulate genetic data. Usually, the DNA sequence is first looked up in specialized databases, then copy-pasted into various forms. These tools have been incredibly useful to researchers, but they are not visual, not collaborative, and often very specialized. A number of online platforms now bring together sets of bioinformatic tools for genomic analysis and design. These cloud-based services make it easy to save and share data and results with collaborators. They are also directly connected to large public databases, which makes it easier to import the data you would like to work on. A few are already listed on the list of digital tools for researchers.

  • GenePattern – Genomic analysis platform that provides access to hundreds of genomics tools.
  • GenomeCompiler – Genetic design platform allowing researchers to manipulate and design everything from single genes to entire genomes.
  • InSilico DB – Genomics made possible for biologists without programming.
  • And many others are not listed here.

These services have also made the visual experience more pleasant and allow you to directly interact with the sequences you are working with. This new way to handle and share genomic data has now been taken a step further by GenomeCompiler, which recently launched a new service called Plasmid Viewer. This free add-on can be embedded into websites that host DNA sequence repositories. This is done rather easily by pointing to a GenBank file URL. The viewer then interprets the file and displays the DNA sequence as an interactive linear or circular representation along with annotations.
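As a rough illustration of the kind of input the viewer consumes, here is a minimal hand-rolled parse of a GenBank LOCUS line, which carries the record name, sequence length, and topology. This is not GenomeCompiler’s own code, and real files are better handled by a dedicated library such as Biopython:

```python
# Rough illustration (not GenomeCompiler's code): pull the record name,
# length, and topology from the LOCUS line of a GenBank file. Real files
# are better handled with a dedicated parser such as Biopython's SeqIO.
def parse_locus(genbank_text):
    for line in genbank_text.splitlines():
        if line.startswith("LOCUS"):
            fields = line.split()
            name, length = fields[1], int(fields[2])  # e.g. "pUC19  2686 bp"
            topology = "circular" if "circular" in fields else "linear"
            return {"name": name, "length_bp": length, "topology": topology}
    raise ValueError("no LOCUS line found")

# Example LOCUS line for the pUC19 cloning vector (2686 bp, circular)
record = parse_locus("LOCUS       pUC19        2686 bp    DNA     circular SYN")
print(record)
```

The topology field is what tells a viewer whether to draw the sequence as a circle (a plasmid) or as a linear map.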


Screenshot from genomecompiler.com displaying their new plasmid viewer add-on

This new tool should help researchers share their genomic data in a more visual and meaningful way, for instance on group websites or scientific blogs. One example of how it can be put to use is a group website that lists the vectors the group uses for cloning and relies on the plugin for their visualization. You will also find a demo on the GenomeCompiler website.

Going beyond impact factor to evaluate researchers with Profeza

For many reasons, journal impact factor and number of publications are not good metrics for assessing the quality of a researcher’s work. But regardless of their increasingly bad reputation, these metrics are nearly invariably used to make decisions about recruiting researchers, promoting them, and funding their projects. The obvious reason why nothing has changed over the years is that there is no other easy way to judge the quality of a researcher and his or her work.

We ask a lot of researchers. They must be great at scientific reasoning and have bright insights, but also be able to properly communicate with their teams, with the scientific community, with the general public, and with industrial partners. They also need to be able to network and work within teams, to manage projects and people, to teach, and to write skillfully in a language that is often not their own. It is easy to see that we would need a multitude of alternative metrics to properly evaluate the various aspects of the day-to-day work of researchers.

Profeza is a young startup that would like to provide decision makers with a better overview of the work of researchers. It has launched a social journal that allows researchers to showcase the diverse aspects of their work by sharing the rationale behind experimental designs, the failed hypotheses, as well as raw data, repeat data, and supporting data that would otherwise often go unpublished. For Profeza, each scientific article is only the tip of the iceberg, standing on an immense amount of work.

Profeza’s interface is simple and clear. First, find the publications you authored through Profeza’s search engine. Profeza currently uses the Pubmed database and is thus better optimized for researchers in the biomedical fields. Then, in three steps, you are prompted to add information to the publication:

1. Select the publication you wish to add information to.

2. Describe your contribution to the paper and invite other authors who may not be in the author list but should get recognition for their involvement in the work.


3. Add information. You can add text and files containing details about the rationale of the design, failed hypotheses, raw data, and repeat and supporting data. This is a great way to help others in your field by telling them about your failures or negative results.


The end result is a personalized page for each article containing the additional data and information. The page gives a better picture of the work that went into the publication and provides insight into the short-term impact of the article by displaying altmetric data.

I think Profeza is addressing a real problem head-on. Its success will of course depend on the willingness of researchers to spend time formatting and entering the information and datasets. But if institutions are willing to play along, then the incentives would be in place and a more adapted evaluation system could emerge. These are still the early days. Profeza was founded in 2014 and expects to roll out new functionalities in the near future.

Also check out this well-crafted video from Profeza, which gives a nice background on journal impact factors and the problems associated with them.

SJFinder gets a serious update

A group of scientists and technologists from Stanford University (US) and KU Leuven (Belgium) launched SJFinder in 2013 to help researchers find the right journal to publish in based on the title and abstract of their manuscript. Since then, a number of new functionalities have been added to the site. The idea is to offer researchers a collection of tools that give them more control over their networking and communication.


Rate journals on SJFinder

Beyond the journal suggestions, SJFinder now also allows you to rate journals based on your reading and submission experiences. In an ideal world, journals would be chosen for submission not based on impact factor, but 1) on the traditional readership of the journal (if any) and 2) on the quality of the service provided by the publisher (i.e. smoothness of the peer review process, delay from submission to online availability, metrics on articles, promotion of articles).

SJFinder also helps you discover the literature, like other tools out there. In this case, the simplicity of the user interface is especially appealing. Simply click on the fields that most interest you, and SJFinder will generate a list of the latest papers in those fields. You can also subscribe by journal, but I personally think there is something nice about exploring the literature by field rather than by journal. Perhaps it is because it makes me less journal-biased and more likely to stumble upon interesting works and concepts.


Find labs by exploring a map on SJFinder

To help you find new collaborations and showcase your work, SJFinder has rolled out two other functionalities. First, an interactive map helps you find research labs anywhere in the world. You can browse the labs by research field, location, or keyword, and explore the map. You can also easily add your own lab to the directory. The benefit of having a worldwide database of labs displayed on a map is pretty clear to me. I would use it to find new local collaborations. Sometimes a hallway is enough to separate groups that would otherwise collaborate wonderfully. Or to find laboratories that I could easily visit while at a conference.

And second, SJFinder launched a drag-and-drop website builder to let you build a website for your lab. This makes it possible for the many researchers with limited time and capital to create a website and showcase their work. It might sound almost old fashioned, but in my mind a website is a must for any research group. This, along with other similar tools, is a great way to get started building your online presence.