
MIMBCD-UI/meta


About


The Medical Imaging Multimodality Breast Cancer Diagnosis User Interface (MIMBCD-UI) project involves the collaborative effort of three Portuguese Research Institutions: ISR, ITI and INESC-ID. The three laboratories are Associate Laboratories of IST from ULisboa - Portugal (EU). The project was created during the Master Thesis of Francisco Maria Calisto. The MIMBCD-UI project is the precursor of both the MIDA and BreastScreening projects. For further information, follow the public wiki of the project. We also provide a private wiki on the meta-private repository for team usage; you need to be a member of our team to access this restricted information.

Research

Since these are research projects, we ask for the support of our scientific community to help us along the way. With this project, we hope to provide valuable information regarding our research topics and theories. You can also follow and support our project on ResearchGate.

Citing Master Thesis

We kindly ask scientific works and studies that make use of this Master Thesis to cite it in their associated publications. Similarly, we ask open-source and closed-source works that make use of the Master Thesis to inform us about this use. You can cite our work using the following BibTeX entry:

@mastersthesis{calisto2017mimbcdui,
  doi = {10.13140/RG.2.2.15187.02084},
  url = {http://rgdoi.net/10.13140/RG.2.2.15187.02084},
  author = {Francisco Maria Calisto},
  title = {Medical Imaging Multimodality Breast Cancer Diagnosis User Interface},
  school = {Instituto Superior T\'{e}cnico},
  year = 2017,
  address = {Avenida Rovisco Pais 1, 1049-001 Lisboa - Portugal (EU)},
  month = 10,
  note = {A Medical Imaging Tool for a Multimodality use of Breast Cancer Diagnosis on an User Interface.}
}

Recent Publications

We have published several papers across various research conferences and journals. In this section, we provide a short list of our recent publications. To see a more complete list, please follow this link.


Francisco Maria Calisto, Carlos Santiago, Nuno Nunes, Jacinto C. Nascimento,

Introduction of Human-Centric AI Assistant to Aid Radiologists for Multimodal Breast Image Classification,

International Journal of Human-Computer Studies,

Volume 150, 2021, 102607, ISSN 1071-5819,

https://doi.org/10.1016/j.ijhcs.2021.102607.

(https://www.sciencedirect.com/science/article/pii/S1071581921000252)

Abstract: In this research, we take an HCI perspective on the opportunities provided by AI techniques in medical imaging, focusing on workflow efficiency and quality, preventing errors and variability of diagnosis in Breast Cancer. Starting from a holistic understanding of the clinical context, we developed BreastScreening to support Multimodality and integrate AI techniques (using a deep neural network to support automatic and reliable classification) in the medical diagnosis workflow. This was assessed by using a significant number of clinical settings and radiologists. Here we present: i) user study findings of 45 physicians comprising nine clinical institutions; ii) list of design recommendations for visualization to support breast screening radiomics; iii) evaluation results of a proof-of-concept BreastScreening prototype for two conditions Current (without AI assistant) and AI-Assisted; and iv) evidence from the impact of a Multimodality and AI-Assisted strategy in diagnosing and severity classification of lesions. The above strategies will allow us to conclude about the behaviour of clinicians when an AI module is present in a diagnostic system. This behaviour will have a direct impact in the clinicians workflow that is thoroughly addressed herein. Our results show a high level of acceptance of AI techniques from radiologists and point to a significant reduction of cognitive workload and improvement in diagnosis execution.

Keywords: Human-computer interaction; Artificial intelligence; Healthcare; Medical imaging; Breast cancer


Automated Analysis of Unregistered Multi-View Mammograms With Deep Learning

Gustavo Carneiro, Jacinto Nascimento and Andrew P. Bradley

Published in: IEEE Transactions on Medical Imaging ( Volume: 36, Issue: 11, Nov. 2017 )

Page(s): 2355 - 2365

Date of Publication: 12 September 2017

DOI: 10.1109/TMI.2017.2751523


Introduction

The project is being developed by the Associate Laboratory - Institute for Systems and Robotics (ISR-Lisboa). This project proposes the development of visualizations and interaction techniques for the detection of breast cancer using Multi-Modality medical imaging and textual information.

More specifically, this project deals with the use of a recently proposed Machine Learning (ML) method from the literature: Deep Convolutional Neural Networks (CNNs). These Deep Networks will incorporate information from several different modalities: Magnetic Resonance Imaging (MRI) volumes, UltraSound (US) images, MammoGraphic (MG) images (both CC and MLO views) and text. The proposed algorithm, called Multi-Modality CNN (MMCNN), will be able to process multimodal information in a unified and sustained manner. This methodology needs to "learn" what masses and calcifications are. Therefore, it is of chief importance to create and improve several visualization and interaction techniques to promote and generate data (i.e., our datasets) for this purpose.
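
To illustrate the fusion idea, the following is a minimal sketch, assuming PyTorch, of how per-modality CNN encoders could be combined by feature concatenation. The layer sizes, class names and the two-class head are hypothetical and do not represent the project's actual MMCNN.

import torch
import torch.nn as nn

class ModalityEncoder(nn.Module):
    """Small 2D CNN mapping one imaging modality to a feature vector."""
    def __init__(self, in_channels: int, feat_dim: int = 128):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(in_channels, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),          # global pooling -> (B, 64, 1, 1)
        )
        self.fc = nn.Linear(64, feat_dim)

    def forward(self, x):
        return self.fc(self.conv(x).flatten(1))

class MultiModalityCNN(nn.Module):
    """Fuses MG (CC and MLO views), US and MRI features by concatenation."""
    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.mg_cc = ModalityEncoder(1)
        self.mg_mlo = ModalityEncoder(1)
        self.us = ModalityEncoder(1)
        self.mri = ModalityEncoder(1)         # one MRI slice; volumes would need Conv3d
        self.classifier = nn.Linear(4 * 128, num_classes)

    def forward(self, mg_cc, mg_mlo, us, mri):
        feats = torch.cat(
            [self.mg_cc(mg_cc), self.mg_mlo(mg_mlo), self.us(us), self.mri(mri)],
            dim=1,
        )
        return self.classifier(feats)         # logits for mass/calcification classes

In practice, the MMCNN would also ingest the textual information mentioned above; this sketch shows the image branches only.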

Thus, it is necessary to collect the ground truth, i.e., the annotations of masses and calcifications provided by medical experts. To collect these annotations, several User Interfaces (UI) must be designed and developed, allowing the user (in this case, the medical specialist) to display various types of images (i.e., MG, US and MRI) and to interact with them, in particular by providing the annotations of masses and calcifications. For these reasons, cooperation with experts who provide these annotations is crucial for the development of this project.

Goals

Development of a UI for the diagnosis of breast cancer in Medical Imaging (MI) Multi-Modality. MIMBCD-UI aims for an intuitive, user-friendly interface that allows users to rapidly diagnose across several image modalities.

We intend to develop a UI for the monitoring and diagnosis of breast lesions in various medical imaging modalities. These imaging modalities contain plenty of useful information that is extracted and analyzed by clinicians for the analysis, detection and diagnosis of breast lesions.

The imaging modalities to include in the work are:

  • MG (including the views CC and MLO);
  • US;
  • MRI volumes;

With a protocol already signed with the Hospital Fernando Fonseca, this interface is intended to provide two features:

(i) - Build a database with Multi-Modality annotations of breast images.

Provide the user (doctor) with the ability to draw and write down masses and calcifications, as well as the corresponding BIRADS for each imaging modality. This annotation process can take place during the examination, making it possible to build a database of medical notes.
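
For illustration only, a single annotation record in such a database might take the following shape; the field names below are hypothetical and do not reflect the project's actual schema.

# Hypothetical shape of one annotation record (illustrative field names).
annotation = {
    "patient_id": "P0001",
    "modality": "MG",                 # one of "MG", "US", "MRI"
    "view": "CC",                     # MG only: "CC" or "MLO"
    "birads": 4,                      # BIRADS score given by the radiologist
    "findings": [
        {
            "type": "mass",           # "mass" or "calcification"
            "contour": [[120, 88], [131, 92], [128, 105]],  # drawn points (pixels)
        }
    ],
    "annotated_at": "2017-10-02T14:30:00Z",
}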

(ii) - Follow-up of the patient. This feature allows the doctor to automate a Multi-Modality inspection of the patient.

Based on the patient's identification (e.g., via a query on the patient ID), and for a given type of mammography imaging, the system must return all images of this patient over a period of time (e.g., two or more years) entered by the doctor, and show these (pre-recorded) images. This feature is critical for diagnosis because, through information visualization, it allows observing both the density of the calcifications and the morphological evolution of the masses over that time period.
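
As a sketch of how such a follow-up query could be served, assuming an Orthanc server (see Acknowledgements) running at a hypothetical local address and queried through its standard /tools/find REST endpoint:

# Sketch of a patient follow-up query against an Orthanc REST API.
# The server address is hypothetical; adjust it to the actual deployment.
import requests

ORTHANC = "http://localhost:8042"

def find_patient_studies(patient_id: str, date_range: str):
    """Return all studies of a patient within a DICOM date range,
    e.g. date_range="20150101-20170101" for a two-year follow-up."""
    query = {
        "Level": "Study",
        "Query": {"PatientID": patient_id, "StudyDate": date_range},
        "Expand": True,
    }
    response = requests.post(f"{ORTHANC}/tools/find", json=query)
    response.raise_for_status()
    return response.json()

# Example: retrieve two years of studies for one (hypothetical) patient.
# studies = find_patient_studies("P0001", "20150101-20170101")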

Requirements

Media

Advisorship

Our projects provide comprehensive academic advising and research opportunities. Together, the projects enable students, fellows and researchers to chart their own educational journey and make the most of their research experience.

Advisor

Professor Jacinto Peixoto do Nascimento (ISR/IST)

Co-Advisors

Professor Daniel Gonçalves (INESC-ID/IST)

Francisco Maria Calisto (ITI/IST)

Information

The following information shows our resources and acknowledgements across the project development. In this section, we link our datasets and important people, as well as related projects, to our research. MIMBCD-UI will serve as the base research for both the MIDA and BreastScreening projects.

Dataset Resources

During the development of this project we generated a combination of interesting datasets. To publish our related datasets we used the well-known Kaggle platform. To access our project's Profile Page, just follow the link.

Acknowledgements

A special thanks to Chris Hafey, the driving force behind CornerstoneJS, who also developed the cornerstoneDemo. Not forgetting the three supporters of the CornerstoneJS library, Aloïs Dreyfus, Danny Brown and Erik Ziegler. We would also like to give a special thanks to Erik Ziegler, who supported us through several issues along the way. Thanks also to Pedro Miraldo, Carlos Santiago, Bruno Cardoso and Bruno Oliveira for the technical help given. Finally, a great thank you to the whole Orthanc project team, and especially to Sébastien Jodogne.

Sponsors

FCT, FCCN, ULisboa, IST, HFF

Departments

DEI

Laboratories

SIPg, ISR, LARSyS, INESC-ID

Domain

EU, PT