Surgery is the mainstay of treatment for solid tumors. However, in up to 30% of cases, surgery is inadequate, either because tumor tissue is erroneously left behind or because the resection is too extensive and compromises vital structures such as nerves. In cancer surgery, surgeons therefore operate at a delicate balance between achieving radical tumor resection and preventing morbidity from overly extensive resection. Within this context, there is a long-standing but still unmet need for a precise surgical tool that informs the surgeon about the tissue type at the tip of the instrument and can thereby guide the surgical procedure.
To tackle these shortcomings, we propose an innovative approach to image-guided surgery that allows real-time intra-operative tissue recognition. To this end, we will combine the unique characteristics of ultrasound imaging (US) with the excellent tissue-sensing characteristics of diffuse reflectance spectroscopy (DRS). Both techniques have proven track records in the field of cancer diagnosis, but each has a critical limitation. DRS performs excellently at tissue diagnosis, distinguishing cancer from healthy tissue; however, because DRS is a point measurement that samples small tissue volumes close to the measurement probe, its depth sensitivity is limited, and so is its ability to look deeper into the surgical resection plane. Ultrasound, on the other hand, has more than sufficient sampling depth and resolution, but cannot resolve cancer directly from the imaged tissue architecture. Our approach is to strategically combine these two techniques, using the best of both worlds within one smart device.
- Ultrasound raw data analysis and development of new algorithms for beamforming, image reconstruction, layer segmentation and elastography
- Designing and developing a multimodal machine learning/deep learning technique for discriminating cancer from healthy tissue, using the diffuse reflectance spectrum, US data (raw or processed), and elasticity data as input features.
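The multimodal discrimination task above can be illustrated with a minimal sketch. The snippet below shows one common design choice, early fusion: per-modality feature vectors (DRS spectrum, US-derived features, elasticity values) are concatenated into a single input and fed to a classifier. All feature dimensions, the synthetic data, and the plain logistic-regression classifier are illustrative assumptions, not the project's actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical feature dimensions (assumptions, not project specifications)
n_samples, n_drs, n_us, n_elast = 200, 64, 32, 4

# Synthetic stand-in data: label 0 = healthy, 1 = tumor.
# Each modality gets a small class-dependent shift so the problem is learnable.
y = rng.integers(0, 2, n_samples)
drs = rng.normal(0.0, 1.0, (n_samples, n_drs)) + y[:, None] * 0.8
us = rng.normal(0.0, 1.0, (n_samples, n_us)) + y[:, None] * 0.5
elast = rng.normal(0.0, 1.0, (n_samples, n_elast)) + y[:, None] * 1.0

# Early fusion: concatenate per-modality features into one vector per sample
X = np.concatenate([drs, us, elast], axis=1)
# Standardize each feature so no modality dominates by scale
X = (X - X.mean(axis=0)) / (X.std(axis=0) + 1e-8)

# Simple logistic-regression classifier trained by gradient descent
w = np.zeros(X.shape[1])
b = 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # predicted tumor probability
    grad = p - y                              # gradient of the log-loss
    w -= 0.1 * (X.T @ grad) / n_samples
    b -= 0.1 * grad.mean()

acc = float(((p > 0.5) == y).mean())
print(f"training accuracy: {acc:.2f}")
```

In the actual project, the linear classifier would be replaced by a deep model, and fusion could also happen later (per-modality networks whose representations are merged), but the input structure, three concatenated modalities per tissue sample, stays the same.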
- Enthusiastic Master student in electrical engineering, biomedical engineering, computer science, or a related field
- Interest in machine learning and deep learning
- Understanding of basic machine learning concepts, image analysis, and signal processing
- Programming experience in MATLAB and Python
- A good team player with excellent communication skills
- A creative solution-finder
Duration: 9 months (BME or ME or MWT)
Start date: a.s.a.p.
Collaboration: Netherlands Cancer Institute (NKI)
Location: TU/e (Eindhoven) and NKI (Amsterdam)
Contact: For project details, please contact Dr. Behdad Dasht Bozorg, email: B.Dasht.Bozorg@nki.nl