NEUBIAS is a COST Action that brings together life scientists, microscopists, bioimage analysts and image-analysis developers from 36 European countries, three neighboring countries, and Australia, Singapore and the USA (www.neubias.org).
NEUBIAS is a forum for exchanging the newest findings, applications, and cutting-edge developments in bioimage analysis, machine learning, data mining, and storage. European bioimage analysts, an emergent group within the bioimage analysis community, organize this event, bringing together an international, interdisciplinary community of scientists in the life and computer sciences.
Andreas Girod and Aymeric Fouquier d’Hérouël will host the conference in Luxembourg, which will include a Training School for Early Career Investigators, a Training School for Bioimage Analysts, and a Taggathon to continue building the NEUBIAS online resources for the bioimage analysis community. The Bioimage Analysis Symposium itself will run from the 6th to the 8th of February 2018 and will include a new satellite workshop, open to bioimage analysts, on the afternoon of the 5th of February.
The symposium will feature keynote lectures from Susan Cox, Kevin Eliceiri, and Ivo Sbalzarini, along with talks from 14 other invited speakers; further contributed talks will be selected from submitted abstracts. The NEUBIAS symposium will also feature its signature sessions: the Call for Help or “image clinics” session (C4H), the Open Source Software Lounge (OsSL), and the Panel Discussions, as well as company workshops and digital posters.
CORBEL, EMBL, German BioImaging and NEUBIAS are delighted to announce a joint blended learning course on Machine Learning for Image Analysis.
The course will be a mix of intensive learning, extensive hands-on work and community networking. Participants will review the fundamentals of machine learning in three up-front webinars, complemented by online tutorials.
The webinars will take place on the 2nd, 9th and 16th of October 2018, 12:00–14:00 CEST; a recorded alternative can be provided.
Next, they will apply their knowledge on site (EMBL Heidelberg, 29th–31st of October), in small interactive groups (the workshop has 16 available seats and about 8 trainers/lecturers), to both reference datasets and their own data.
After the on-site workshop, two optional advanced training webinars, complemented by online tutorials, will be given on the 9th and 16th of November 2018. These will focus on data simulation, transfer learning and boosting.
Perrine Paul-Gilloteaux, bio-image analyst, CNRS research engineer and project manager of our Bio-image Informatics node, received the 2017 CNRS Crystal Prize last month, in recognition of her contributions to French research.
A perfect occasion to highlight her career and her work with France BioImaging. What is eC-CLEM? How can our field deal with the massive amount of data produced? What future developments can we expect in the realm of bio-image informatics? Read the interview below to find out more.
Could you introduce yourself briefly?
My name is Perrine Paul-Gilloteaux, and I’m a CNRS Research Engineer. I have a background in electrical engineering and signal and image processing, and did my PhD on augmented reality for neurosurgery through the surgical microscope. I started working in bio-image analysis for microscopy in Ireland, and joined the Curie Institute on the PICT IBISA facility in 2010. I moved to Nantes in 2015, and now work in a biomedical research institute. I am also the project manager of the France BioImaging node Bio-Image Informatics IPDM (Image Processing & Data Management), and work closely with the national coordination on the data management aspect.
I define myself as a bio-image analyst, meaning that I do my best to bridge the gap between microscopy, image analysis and biology. This means that I’m involved in data management, data processing and data analysis projects, which I provide as services in facilities or work on as research topics.
How long have you been involved with FBI, and what main projects have you carried out with us?
I’ve been involved in FBI since its inception. I started working within the transversal IPDM working group, where we first assessed the state of our management systems and worked on the interoperability of our databases. I managed the setup of the Curie Image Database, supported by France BioImaging and based on the OpenImadis system. In 2015, I was nominated project manager of the IPDM node, led by Jean-Christophe Olivo-Marin and Charles Kervrann. One important part of my mission is to work with the national coordination on the data management aspect of FBI. For this, we started by surveying the resources and management systems on site. This question of data management is now central, and FBI collaborates with other infrastructures at the European level, EuroBioImaging and ELIXIR, but also at the national level with other national infrastructures in biology that use microscopy, and with the French Institute of Bioinformatics (the French ELIXIR node).
You have developed a software called eC-CLEM. Could you explain what it consists of?
For this project, I’ve worked closely with Xavier Heiligenstein (Curie Institute, FBI working group on multimodal imaging). eC-CLEM (for Easy-Cell Correlative Light to Electron Microscopy) is a software package designed to support correlative microscopy. Its purpose is to help fuse information obtained on the same sample by different microscopy modalities (for example electron microscopy, light microscopy, atomic force microscopy, etc.). The software makes it possible to register, i.e. align in the same coordinate system, multidimensional images with large differences in scale and resolution, either with manual input from the user or automatically when possible. In addition, it provides an estimation of the alignment error, based on statistical methods, and detects deformations that the sample may have undergone. I’ve developed a set of algorithms implemented as plugins for the ICY platform. [note: Perrine has published a paper about eC-CLEM in Nature Methods] During the development of this set of tools, I was greatly helped by the ICY coding parties (hackathons) organized at Pasteur with the support of FBI, and I would encourage developers to attend such events, as there are always new things to learn.
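To give a concrete flavor of the registration task described above, here is a minimal Python sketch of landmark-based rigid registration using the Kabsch (least-squares) algorithm, together with a simple residual-based error estimate. This is purely illustrative and is not eC-CLEM’s actual implementation; the function name and toy data are invented for this example.

```python
import numpy as np

def rigid_register(src, dst):
    """Least-squares rigid registration (Kabsch algorithm): find rotation R
    and translation t minimizing sum ||R @ src_i + t - dst_i||^2.
    src, dst: (N, 2) arrays of paired landmark coordinates."""
    src_mean, dst_mean = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_mean).T @ (dst - dst_mean)   # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
    R = Vt.T @ np.diag([1.0, d]) @ U.T
    t = dst_mean - R @ src_mean
    return R, t

# toy data: fiducial points "clicked" in two imaging modalities,
# where the second view is rotated by 30 degrees and shifted
src = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
theta = np.deg2rad(30)
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
dst = src @ R_true.T + np.array([5.0, -2.0])

R, t = rigid_register(src, dst)
residuals = np.linalg.norm(src @ R.T + t - dst, axis=1)
rms_error = np.sqrt((residuals ** 2).mean())    # simple alignment-error estimate
```

The root-mean-square residual over the landmarks is the simplest version of the alignment-error estimate mentioned above; real tools such as eC-CLEM go further, with statistical error models and non-rigid deformation detection.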
Bio-image informatics has taken center stage lately, as more and more people realize how crucial image processing is for research. Could you expand a bit on that?
It is entirely true, and this is the reason why FBI has had a transversal node for that activity since its creation. I also co-founded a network of bio-image analysts (NEUBIAS) for exactly that purpose. The size and number of datasets to be processed, the wide range of questions to be answered from imaging, and the interplay between acquisition and processing in generating imaging and analysis data have led to a level of complexity that requires expert tools but also expert people. Bio-image informatics is a field of research in itself, and it is now recognized as such. It bridges the gap between image-processing research and imaging-based biology research. This can also be seen in the recent Nobel Prizes in Chemistry: both cryo-electron microscopy and super-resolution light microscopy relied on image processing as an essential part of their development.
What are going to be, according to you, the next big steps and developments in the realm of image processing and data management?
The novelty in our field is two-sided: on one side, data are exploding in size and number; on the other, machine learning, and deep learning in particular, benefits from progress in hardware and opens the way to major advances in analysis, especially in feature recognition (segmentation and tracking).
Regarding data management, the big issues to be solved require involving the whole imaging community, but also seeking expertise from other fields facing the same problems. Technical solutions, on the software side (management software such as OpenImadis, OMERO and BioEmergences in France BioImaging nodes) as well as the hardware side (optimized hardware systems, optimized data-transfer protocols), are on their way, but they will not be useful if biologists do not put effort into data curation and data selection.
To this day, even with machine learning, tuning a software tool or a protocol to a particular problem and a particular dataset requires a lot of effort: either to set up and train the network in the case of machine learning, to combine algorithms into workflows adapted to a specific question, or to develop better-performing algorithms. This means that we need well-trained experts able to master both the image-processing aspects and the biological questions behind them.
On what will your FBI working group focus in 2018? What can we expect from you? (in terms of new developments, priorities, events etc.)
The priority is definitely to deal with the explosion of data we are facing. In addition to the directions described in the previous answer (software and hardware solutions), one direction taken by IPDM on data management is the definition of data quality. For this multi-faceted topic, we have already started to set up tools to measure the quality of the data produced, in terms of resolution for example, based on the metrology expertise of our facility members; we want to demonstrate these in 2018. We will also concretize the collaborations between FBI and the other national infrastructures by running tests, for example on the speed of data transfers between nodes, so that by the end of 2018 each user of the FBI nodes can easily access and process her/his data from anywhere. A technical catalogue of software and hardware resources is under construction, to allow FBI nodes and beyond to benefit fully from the tools and networks created by FBI. In the first semester of 2018, we will organize an event to discuss and define the changes that deep learning could bring to bio-image informatics (further information to come soon; please refer to the FBI site).
Paul-Gilloteaux, Perrine, Xavier Heiligenstein, et al. “eC-CLEM: Flexible Multidimensional Registration Software for Correlative Microscopies.” Nature Methods, vol. 14, no. 2, 2017, pp. 102–103. doi:10.1038/nmeth.4170. https://www.nature.com/articles/nmeth.4170
NEUBIAS, the network for bioimage analysts, is organizing a second training school for facility staff. Training is open to all staff scientists, graduate students, post-docs and faculty who work in the context of bioimaging facilities and provide assistance and training to users in need of bioimage data analysis.
Extended deadline for abstract submission: May 26, 2017
NEUBIAS, the network for bioimage analysts, is organizing a second training school for early career researchers. This year, training is open to all researchers interested in acquiring knowledge about image analysis, with no prerequisites.
The course will be an introduction to image processing and analysis, with a focus on biologically relevant examples. Attendees will learn the fundamentals of image analysis, including basic macro programming in Fiji (ImageJ) for automated batch analysis of images and the use of different software solutions, and will be introduced to visualisation and explorative data analysis after extracting numerical data from images.
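The kind of automated measurement taught in such courses usually follows a simple pattern: threshold an image, label the connected objects, and extract numbers from them. As a rough illustration only, not course material, and written in Python with SciPy rather than the Fiji macro language the course teaches, here is a minimal sketch; the function name and toy image are invented for this example.

```python
import numpy as np
from scipy import ndimage

def analyse_image(img, threshold=None):
    """Count bright objects in a 2-D grayscale image and report their mean area.
    If no threshold is given, use the image mean as a crude automatic cutoff."""
    if threshold is None:
        threshold = img.mean()
    mask = img > threshold
    labels, n_objects = ndimage.label(mask)   # connected-component labeling
    areas = ndimage.sum(mask, labels, range(1, n_objects + 1))
    mean_area = float(np.mean(areas)) if n_objects else 0.0
    return n_objects, mean_area

# toy "image": two bright square objects on a dark background
img = np.zeros((64, 64))
img[5:15, 5:15] = 1.0    # 10x10 object, area 100
img[30:36, 40:46] = 1.0  # 6x6 object, area 36
n, mean_area = analyse_image(img)
print(n, mean_area)  # → 2 68.0
```

In a real batch workflow, a loop over an image folder would apply the same function to every file and write the per-image numbers to a table for downstream visualisation and exploration, which is exactly the automation that Fiji macros make accessible to beginners.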
The teachers are Simon Nørrelykke and Szymon Stoma (IDA-ScopeM, ETH, Zurich, Switzerland), and Chong Zhang (SIMBioSys, University Pompeu Fabra, Barcelona, Spain).
The course is suitable not only for complete beginners in image analysis, but also for those who want to extend their knowledge of basic principles and more specialised tools.