Perrine Paul-Gilloteaux, bio-image analyst, CNRS research engineer and project manager of our Bio-Image Informatics node, received the 2017 CNRS Crystal Prize last month, in recognition of her contributions to French research.
A perfect occasion to highlight her career and her work with France BioImaging. What is eC-CLEM? How can our field deal with the massive amount of data produced? What future developments can we expect in the realm of bio-image informatics? Read the interview below to find out more.
Could you introduce yourself briefly?
My name is Perrine Paul-Gilloteaux, and I’m a CNRS research engineer. I have a background in electrical engineering, signal and image processing, and did my PhD on augmented reality for neurosurgery through the surgical microscope. I started working in bio-image analysis for microscopy in Ireland, and joined the Curie Institute at the PICT-IBiSA facility in 2010. I moved to Nantes in 2015, and now work in a biomedical research institute. I am also the project manager of the France BioImaging node Bio-Image Informatics IPDM (Image Processing & Data Management), and I work closely with the national coordination on data management.
I define myself as a bio-image analyst, meaning that I do my best to bridge the gap between microscopy, image analysis and biology. In practice, I’m involved in data management, data processing and data analysis projects, which I either provide as services on facilities or pursue as research topics.
How long have you been involved with FBI, and what main projects have you carried out with us?
I’ve been involved with FBI since its inception. I started within the transversal IPDM working group, where we first assessed the state of our management systems and worked on the interoperability of our databases. I managed the setup of the Curie Image Database, supported by France BioImaging and based on the OpenImadis system. In 2015, I was appointed project manager of the IPDM node, led by Jean-Christophe Olivo-Marin and Charles Kervrann. An important part of my mission is to work with the national coordination on data management within FBI. For this, we started by surveying the resources and management systems on each site. Data management is now a central question, and FBI collaborates with other infrastructures at the European level, EuroBioImaging and ELIXIR, but also at the national level with other national infrastructures in biology that use microscopy, and with the French Institute of Bioinformatics (the French node of ELIXIR).
You have developed a software called eC-CLEM. Could you explain what it consists of?
For this project, I’ve worked closely with Xavier Heiligenstein (Curie Institute, FBI working group on multimodal imaging). eC-CLEM (for easy Cell Correlative Light to Electron Microscopy) is a software package designed to support correlative microscopy. Its purpose is to help fuse information obtained from different microscopy modalities on the same sample (for example electron microscopy, light microscopy, atomic force microscopy, etc.). The software registers, i.e. aligns in the same coordinate system, multidimensional images with large differences in scale and resolution, either from manual user input or automatically when possible. In addition, it provides a statistical estimate of the alignment error, and detects deformations that the sample may have undergone. I’ve developed a set of algorithms implemented as plugins for the Icy platform. [note: Perrine has published a paper about eC-CLEM in Nature Methods] During the development of this set of tools, I was greatly helped by the Icy coding parties (hackathons) organized at Pasteur with the support of FBI, and I would encourage developers to attend such events, as there are always new things to be learnt.
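To make the registration idea concrete: the simplest form of landmark-based alignment finds the rigid transform (rotation plus translation) that best maps fiducial points marked in one modality onto the corresponding points in the other, and then reports the residual error. The sketch below is not the eC-CLEM implementation, just a minimal illustration of this principle using the classical Kabsch least-squares solution; the function names are mine.

```python
import numpy as np

def rigid_register(src, dst):
    """Estimate the rigid transform (rotation R, translation t) that best
    maps paired landmark points src -> dst in the least-squares sense
    (Kabsch algorithm). src and dst are (N, D) arrays of N points."""
    src_c = src - src.mean(axis=0)           # center both point clouds
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c                      # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    D = np.diag([1.0] * (src.shape[1] - 1) + [d])
    R = Vt.T @ D @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

def registration_error(src, dst, R, t):
    """Root-mean-square residual distance after alignment; a crude
    stand-in for the statistical error estimate eC-CLEM provides."""
    mapped = src @ R.T + t
    return float(np.sqrt(np.mean(np.sum((mapped - dst) ** 2, axis=1))))
```

In correlative microscopy the points would typically be fiducial beads visible in both modalities; a large residual error, or one that varies systematically across the field, hints at sample deformation that a rigid model cannot capture.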
Bio-image informatics has taken center stage lately, as more and more people realize how crucial image processing is for research. Could you expand a bit on that?
It is entirely true, and this is the reason why FBI has had a transversal node dedicated to that activity since its creation. I also co-founded a network of bio-image analysts (NEUBIAS) for that exact purpose. The size and number of datasets to be processed, the wide range of questions to be answered from imaging, and the interplay between acquisition and processing in generating imaging and analysis data have led to a level of complexity that requires expert tools but also expert people. Bio-image informatics is a field of research in itself, and it is now recognized as such. It bridges the gap between image processing research and imaging-based biology research. This can also be seen in recent Nobel Prizes in Chemistry: in cryo-electron microscopy as in super-resolution light microscopy, both developments relied on image processing as an essential component.
What, in your view, will be the next big steps and developments in the realm of image processing and data management?
The novelty in our field is two-sided: on one side, data are exploding in size and number; on the other, machine learning, and in particular deep learning, benefits from progress in hardware and opens the way to major advances in analysis, in particular in feature recognition (segmentation and tracking).
Regarding data management, solving the big issues requires involving the whole imaging community, but also seeking expertise from other fields facing the same problems. Technical solutions, on the software side (management systems such as OpenImadis, OMERO and BioEmergences within France BioImaging nodes) and on the hardware side (optimized hardware systems, optimized data-transfer protocols), are on the way, but they will not be useful if biologists do not put effort into data curation and data selection.
To this day, even with machine learning, tuning a piece of software or a protocol to a particular problem and a particular dataset requires a lot of effort: setting up and training the network in the case of machine learning, combining algorithms into workflows adapted to a specific question, or developing better-performing algorithms. This means that we need well-trained experts able to master both the image processing aspects and the biological questions behind them.
On what will your FBI working group focus in 2018? What can we expect from you? (in terms of new developments, priorities, events etc.)
The priority is definitely to deal with the explosion of data we are facing. In addition to the directions outlined in the previous question (software and hardware solutions), one direction taken by IPDM on data management is the definition of data quality. For this multi-faceted topic, we have already started to set up tools that measure the quality of the data produced, in terms of resolution for example, building on the metrology expertise of our facility members; we want to demonstrate these in 2018. We will also give concrete form to the collaborations between FBI and the other national infrastructures by running tests, for example on the speed of data transfers between nodes, so that by the end of 2018 every user of the FBI nodes can easily access and process her/his data from anywhere. A technical catalogue of software and hardware resources is under construction, to allow FBI nodes and beyond to benefit wisely from the tools and networks created by FBI. In the first semester of 2018, we will organize an event to discuss and define the changes that deep learning could bring to bio-image informatics (further information to come soon; please refer to the FBI site).
http://www.cnrs.fr/fr/recherche/prix/cristal.htm (in French)
Paul-Gilloteaux, Perrine, Heiligenstein, Xavier, et al. “eC-CLEM: flexible multidimensional registration software for correlative microscopies.” Nature Methods, vol. 14, no. 2, 2017, pp. 102–103, doi:10.1038/nmeth.4170. https://www.nature.com/articles/nmeth.4170