B°Ker offers a set of solutions to centralize data from laboratory equipment and to ensure monitoring and security of the devices. A promising company supported by Rennes Atalante.

Built on Internet of Things technologies, the system collects data from the laboratory's equipment and environment, making it possible to optimize the lab's operation. Researchers can thus monitor their equipment remotely and are alerted if an analysis fails or stops. This first solution is coupled with a real-time alert service and online access to the data.

Source: Rennes Atalante, "B°Ker sécurise le fonctionnement des laboratoires" (B°Ker secures laboratory operations)

Under the title "High Content – Less Mess, More Mesh", GEN (Genetic Engineering & Biotechnology News) published an article this month about the advances that have made it possible to generate vast datasets at decreasing cost, and about the need to compare results and use information collected from experiments on different cell types with different imaging systems.

“The biggest thing that needs to happen in the next few years is a more extensive interoperability of the information that is obtained from high-content screening and analysis,” insists Robert F. Murphy, Ph.D., the Ray and Stephanie Lane Professor of Computational Biology and professor of biological sciences, biomedical engineering, and machine learning at Carnegie Mellon University.

When descriptive features are used in the analysis of microscopy images, one of the challenges is to compare and integrate data across experiments, particularly when specific features captured using different experimental platforms may have different meanings for different investigators.

“One potential way to address this is to make the features interpretable,” suggests Dr. Murphy. “But that can be impossible if people use different microscopes, conditions, and objectives—and often different cells.”

Efforts to develop generative models of cellular organization and protein distribution from fluorescence microscopy images have been undertaken in Dr. Murphy’s laboratory. These efforts have led to the development of an open-source platform, the CellOrganizer project.

More details on GEN!

Phenotypic screening by High Content Screening monitors phenotypic changes. The interest of this technology is the wealth of biological information it captures (phenotypic and target modifications), and this cell by cell. This wealth of information produces a very large amount of data that must be analyzed globally (with PCA reduction, for example). The data analysis and statistics, in particular the well-known Z-score, therefore need to be revisited. Guyon et al. propose the Φ-score. What do you think about that?
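As a minimal illustration of the PCA reduction mentioned above, the sketch below projects a hypothetical per-cell feature matrix onto its top principal components using plain NumPy (all data and dimensions here are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical per-cell feature matrix: 500 cells x 20 descriptive features
# (e.g. intensity, texture, and shape measurements from image analysis).
X = rng.normal(size=(500, 20))

# Centre each feature, then use SVD to find the principal directions.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

# Keep the top 3 components: each cell is now described by 3 numbers
# instead of 20, which makes global analysis and plotting tractable.
n_components = 3
X_reduced = Xc @ Vt[:n_components].T

# Fraction of total variance captured by each component.
explained = (S ** 2) / (S ** 2).sum()
```

In a real screen the feature matrix would come from the image-analysis software rather than a random generator, and the number of components would be chosen from the explained-variance profile.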

“Using robust statistics and a variance model, we demonstrated that the Φ-score showed better sensitivity, selectivity and reproducibility compared to classical approaches. The improved performance of the Φ-score paves the way for cell-based screening of primary cells, which are often difficult to obtain from patients in sufficient numbers.”

Source: Guyon, L. et al. Φ-score: a cell-to-cell phenotypic scoring method for sensitive and selective hit discovery in cell-based assays. Sci. Rep. 5, 14221; doi: 10.1038/srep14221 (2015).
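The Φ-score itself is defined in Guyon et al.; as a rough sketch of why the robust statistics mentioned in the quote matter, the toy example below compares the classical Z-score with a median/MAD-based robust score on invented per-cell data, where a few outlier cells contaminate the control well:

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical per-cell readouts: a negative-control well contaminated by
# a handful of outlier cells, and a treated well with a modest true shift.
control = np.concatenate([rng.normal(0.0, 1.0, 300),
                          rng.normal(8.0, 1.0, 10)])   # 10 outlier cells
treated = rng.normal(1.5, 1.0, 300)

def z_score(sample, reference):
    """Classical score: distance from the reference mean in SD units."""
    return (sample.mean() - reference.mean()) / reference.std()

def robust_score(sample, reference):
    """Robust variant: median and MAD (scaled to match the SD under
    normality), so a few extreme cells barely move the score."""
    mad = 1.4826 * np.median(np.abs(reference - np.median(reference)))
    return (np.median(sample) - np.median(reference)) / mad

# The outliers inflate the control mean and SD, shrinking the classical
# score; the median/MAD version recovers the true shift much better.
```

This is only the generic classical-versus-robust contrast, not the Φ-score's cell-to-cell ranking procedure, for which the paper should be consulted.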

High-content screening systems produce a lot of data (pictures, metadata, results, etc.), and it is not always easy to integrate and analyze them.

In this webinar, James Adams (Senior Regional Marketing Specialist for PerkinElmer) will focus on capabilities to integrate data from multiple systems.

The ultimate goal of “Big Data” is to provide timely insight that is used to improve the effectiveness and efficiency of organizations. We live in a “data rich – information poor” world, where access to data is not a problem but access to actionable information is.
