Machine Intelligence Laboratory

Cambridge University Department of Engineering

Dr Graham Treece, Department of Engineering

F-GMT11-2: Simple ultrasound-guided prostate biopsy of MRI-based targets

Prostate cancer is very common in older men, and biopsy is the usual way to diagnose it. But the cancer can be very localised, so the needle has to go to the right place in the prostate, under ultrasound guidance. Ultrasound is used because it is a safe, live imaging technique, giving prostate images in 'axial' and 'sagittal' planes, though these are only 2D. However, the biopsy target isn't visible in ultrasound: instead, an MRI scan is performed before the procedure and the target marked out in that data. There are ways of matching this MRI-based target to live ultrasound data, but they involve complex tracking of the ultrasound probe and/or anaesthetising the patient. Is a simpler approach possible? First, can we use a sequence of 2D sagittal and axial ultrasound images to reconstruct a 3D ultrasound volume? Then register this volume to the MRI so we can see the target in the ultrasound data? Then match subsequent live ultrasound images to this data so we can overlay the target position?
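As a rough illustration of the first question above, the sketch below (Python with NumPy; the function name and all parameters are hypothetical, not part of any existing system) bins a fan of 2D sagittal frames into a 3D voxel grid, assuming the probe is rocked through a known total angle at a roughly constant rate so each frame's angle can be linearly interpolated. A real system would use the orthogonal axial view, or image-based tracking, to estimate the actual sweep geometry.

```python
import numpy as np

def reconstruct_fan_volume(frames, total_angle_deg, vol_shape, vox_size, px_size):
    """Bin a fan of 2D sagittal frames into a 3D voxel grid.

    Hypothetical sketch: assumes the probe was rocked through
    total_angle_deg at a constant rate, so frame angles can be
    linearly interpolated. Voxels are filled by nearest-neighbour
    binning and averaged where frames overlap.
    """
    n = len(frames)
    angles = np.deg2rad(np.linspace(-total_angle_deg / 2,
                                    total_angle_deg / 2, n))
    vol = np.zeros(vol_shape)
    cnt = np.zeros(vol_shape)
    h, w = frames[0].shape
    # Pixel coordinates within a frame: depth (rows) and lateral (cols).
    rows, cols = np.mgrid[0:h, 0:w]
    depth = rows * px_size             # distance from the probe face
    lateral = (cols - w / 2) * px_size
    for frame, a in zip(frames, angles):
        # Rotate each frame about the probe axis by its fan angle.
        x = lateral
        y = depth * np.cos(a)
        z = depth * np.sin(a)
        # Map world coordinates to voxel indices (volume centred on the fan).
        ix = np.round(x / vox_size + vol_shape[0] / 2).astype(int)
        iy = np.round(y / vox_size).astype(int)
        iz = np.round(z / vox_size + vol_shape[2] / 2).astype(int)
        ok = ((0 <= ix) & (ix < vol_shape[0]) &
              (0 <= iy) & (iy < vol_shape[1]) &
              (0 <= iz) & (iz < vol_shape[2]))
        # Unbuffered accumulation so repeated voxel hits all count.
        np.add.at(vol, (ix[ok], iy[ok], iz[ok]), frame[ok])
        np.add.at(cnt, (ix[ok], iy[ok], iz[ok]), 1)
    return np.where(cnt > 0, vol / np.maximum(cnt, 1), 0.0)
```

The constant-rate assumption is exactly what the axial view would help relax, by observing how far through the prostate each sagittal frame actually sits.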

Prostate cancer is very common, with over 80,000 prostate biopsies performed per year for diagnosis. But the procedure is difficult since you have to hit the right bit of the prostate with the needle: the cancer is very localised, and quite easy to miss. Biopsy is hence usually done with some guidance, starting with an MRI scan (in which the potentially cancerous regions can usually be seen) to determine the locations which need to be biopsied. Ultrasound images are then used at biopsy to work out where the patient's anatomy is compared to when the MRI scan was taken, since ultrasound gives live (but only 2D) images. If the patient is anaesthetised, multiple biopsies can then be performed using a mechanical constraint to determine where the needle goes.

The urologists at the hospital here (Vincent Gnanapragasam and David Thurtle) have been working on better biopsy procedures which do not require anaesthetics and are much less prone to complications. These can still be guided by ultrasound, in that the needle is visible in the live ultrasound image, but the MRI-based target cannot be seen in these images, so it is difficult to know where to biopsy. This project will investigate whether it is possible to develop a simple, live system which can show the MRI biopsy target on top of the live ultrasound image. There are several stages to this. First, the 2D ultrasound data has to be turned into a 3D data set: can this be done simply by manually fanning the ultrasound probe over the prostate, while also recording an image in an orthogonal plane to capture the extent of the sweep? Then this data has to be registered to the MRI data in order to work out where the target is. Finally, can we match subsequent live ultrasound images with the pre-recorded ultrasound data to work out where they are relative to this target?
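The registration stage is the one where intensities cannot be compared directly, since MRI and ultrasound depict tissue quite differently; a standard similarity measure for multi-modal registration is mutual information. The toy sketch below (hypothetical function names, 2D images, and a one-parameter shift search for brevity; any real registration would optimise a full rigid or deformable 3D transform) shows the core idea.

```python
import numpy as np

def mutual_information(a, b, bins=32):
    """Mutual information between two images, from their joint histogram.

    High when one image's intensities predict the other's, even if
    the intensity mappings differ, which is why it suits MRI-to-
    ultrasound registration better than direct intensity comparison.
    """
    hist, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    p = hist / hist.sum()
    px = p.sum(axis=1)
    py = p.sum(axis=0)
    nz = p > 0
    return float((p[nz] * np.log(p[nz] / (px[:, None] * py[None, :])[nz])).sum())

def register_shift(fixed, moving, max_shift=10):
    """Toy registration: find the integer vertical shift of `moving`
    that maximises mutual information with `fixed`. Hypothetical
    sketch with a single degree of freedom, searched exhaustively."""
    best_s, best_mi = 0, -np.inf
    for s in range(-max_shift, max_shift + 1):
        shifted = np.roll(moving, s, axis=0)
        mi = mutual_information(fixed, shifted)
        if mi > best_mi:
            best_s, best_mi = s, mi
    return best_s, best_mi
```

Because mutual information only asks that intensities correspond, the search still finds the right alignment when the moving image's intensities are, say, inverted relative to the fixed one.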

This is an algorithmic development / software project, so experience of writing software is essential. The aim is to design something simple and fast and see if the accuracy is sufficient.
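The final matching stage, locating a live 2D frame relative to the pre-recorded sweep, could start as simply as a brute-force similarity search against the stored frames. The sketch below (hypothetical names; normalised cross-correlation as the measure, since live and stored frames are the same modality) illustrates this; speed, and robustness to probe and patient motion, are exactly what the project would need to investigate.

```python
import numpy as np

def ncc(a, b):
    """Normalised cross-correlation between two equal-sized images."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def locate_live_frame(live, sweep_frames):
    """Return the index of the pre-recorded sweep frame most similar
    to a live 2D image, plus the similarity score.

    Hypothetical brute-force sketch: a real system would also search
    over in-plane shifts, and interpolate between sweep frames to get
    a continuous position estimate relative to the target.
    """
    scores = [ncc(live, f) for f in sweep_frames]
    best = int(np.argmax(scores))
    return best, scores[best]
```

Once the live frame's position in the sweep (and hence in the registered MRI) is known, the marked target can be projected onto it as an overlay.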
