1. What did you find out?
We propose a novel approach for improving needle detection in ultrasound images, particularly when the needle is barely visible. Experienced sonographers intuitively rely on temporal cues to relocate an inserted needle before continuing the procedure. Inspired by this perceptual strategy, we introduce a mechanism that deliberately vibrates the needle and develop a deep learning model, VibNet, to capture the subtle motion induced by these vibrations. Our method significantly improves needle detection by focusing on temporal vibration patterns rather than relying solely on static image features.
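To make the idea concrete, below is a minimal, hypothetical sketch in PyTorch of how a detector can exploit temporal vibration cues: a short clip of ultrasound frames is encoded frame by frame, a small temporal module then examines how each pixel's features change across the clip, and the output is a needle-likelihood map. The class name VibNetSketch, the layer sizes, and the clip length are illustrative assumptions and do not reproduce the architecture described in the paper.

```python
# Minimal conceptual sketch (not the authors' code): detect a needle from the
# temporal signature of its vibration across a short clip of ultrasound frames.
# All names and layer sizes (e.g., VibNetSketch, num_frames) are illustrative.
import torch
import torch.nn as nn


class VibNetSketch(nn.Module):
    def __init__(self, num_frames: int = 16, feat_ch: int = 32):
        super().__init__()
        # Per-frame spatial encoder, shared across the clip.
        self.frame_encoder = nn.Sequential(
            nn.Conv2d(1, feat_ch, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(feat_ch, feat_ch, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
        )
        # Temporal module: a 1D convolution over the time axis at every pixel,
        # so periodic intensity changes caused by the vibration stand out.
        self.temporal = nn.Conv1d(feat_ch, feat_ch, kernel_size=num_frames)
        # Decoder producing a per-pixel needle likelihood map.
        self.head = nn.Sequential(
            nn.Conv2d(feat_ch, feat_ch, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(feat_ch, 1, kernel_size=1),
        )

    def forward(self, clip: torch.Tensor) -> torch.Tensor:
        # clip: (B, T, H, W) grayscale ultrasound frames.
        b, t, h, w = clip.shape
        frames = clip.reshape(b * t, 1, h, w)
        feats = self.frame_encoder(frames)                   # (B*T, C, H, W)
        c = feats.shape[1]
        feats = feats.reshape(b, t, c, h, w).permute(0, 3, 4, 2, 1)   # (B, H, W, C, T)
        feats = feats.reshape(b * h * w, c, t)
        temporal = self.temporal(feats).squeeze(-1)           # (B*H*W, C)
        temporal = temporal.reshape(b, h, w, c).permute(0, 3, 1, 2)   # (B, C, H, W)
        return torch.sigmoid(self.head(temporal))             # (B, 1, H, W)


if __name__ == "__main__":
    model = VibNetSketch(num_frames=16)
    dummy_clip = torch.rand(2, 16, 128, 128)    # 2 clips of 16 frames each
    heatmap = model(dummy_clip)
    print(heatmap.shape)                         # torch.Size([2, 1, 128, 128])
```

The design choice mirrored here is that the detection signal comes from how pixel intensities change over time under the deliberate vibration, not from the appearance of any single frame.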
2. What challenges did you face during your research?
One major challenge was that needles are often hard to see in ultrasound images because of speckle noise. Designing a network that could exploit the tiny movements caused by the needle's vibration was also tricky. In addition, we had to make sure that the components of our deep learning model, such as those that process motion and shape, worked well together. Finally, we needed to ensure that the added vibration was safe and would not harm the tissue.
3. Is it planned to transfer these developments into practical and industrial applications?
Yes, we plan to use this method in real medical tools. Because it works well even when the needle is hard to see, it could support robot-assisted procedures. Since the vibration is safe and similar to what doctors already do by hand, it can be integrated into current medical equipment. We’re working on turning this into a solution that can be used in hospitals or clinics.
Paper:
VibNet: Vibration-Boosted Needle Detection in Ultrasound Images; Dianye Huang, Chenyang Li, Angelos Karlas, Xiangyu Chu, K. W. Samuel Au, Nassir Navab, and Zhongliang Jiang; IEEE Transactions on Medical Imaging; June 2025; https://ieeexplore.ieee.org/document/10902567
Article: Andreas Schmitz