'We listen to the sound of light.'

Prof. Michael Schäfers on Photoacoustic Imaging

Just recently the Cells-in-Motion Cluster of Excellence (CiM) obtained a new device for 'photoacoustic imaging'. Prof. Michael Schäfers from the team of CiM coordinators explains in an interview with Christina Heimken why this prototype is especially important.

Photoacoustics is a term which combines optics and acoustics. How does the device work?

In optical imaging, laser light shines into the tissue and transfers energy to certain molecules – for example, to special pigments that we have specifically introduced into the organism. These so-called fluorochromes then give off light which we can measure from outside. However, with this method we can't look very deeply into the tissue, because the light is scattered too strongly. Photoacoustics works in a similar way – though with one decisive difference: we exploit the fact that the energy deposited by the laser light makes the fluorochromes oscillate, thereby generating ultrasonic waves. That is the so-called photoacoustic effect. We can visualize these oscillations – in principle, just like with a normal ultrasound scanner. The advantage is that by combining light and ultrasound we can make deeper structures visible very precisely.
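As a rough illustration of the physics behind this (not part of the interview): the initial pressure generated when a short laser pulse is absorbed is commonly modeled by the relation p0 = Γ · μa · F, where Γ is the dimensionless Grüneisen parameter, μa the optical absorption coefficient, and F the local light fluence. A minimal sketch, with made-up example values:

```python
# Illustrative sketch only (not the scanner's actual model).
# Initial photoacoustic pressure: p0 = Gamma * mu_a * F
#   Gamma : Grueneisen parameter (dimensionless, ~0.2 for soft tissue)
#   mu_a  : optical absorption coefficient [1/cm]
#   F     : local light fluence [J/cm^2]

def initial_pressure(gamma: float, mu_a: float, fluence: float) -> float:
    """Initial pressure in J/cm^3 (numerically equal to 1e6 Pa per unit)."""
    return gamma * mu_a * fluence

# Example values: Gamma = 0.2, mu_a = 2.0 cm^-1, F = 10 mJ/cm^2
p0 = initial_pressure(0.2, 2.0, 0.010)   # -> 0.004 J/cm^3
p0_pascal = p0 * 1e6                     # 1 J/cm^3 = 1e6 Pa
print(f"initial pressure: {p0_pascal:.0f} Pa")
```

The kilopascal-scale result is the kind of weak pressure transient that the ultrasound detectors then pick up from outside.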

The photoacoustic effect has been in use since the 1970s – to analyze tissues, among other things. What is so new about photoacoustic imaging?

The approach itself isn't new. What is new, however, is the fact that the technology has now been linked to an ultrasound scanner. For the first time we can now look inside the living organism in real time – by 'listening' to the sound of the light.

In the cluster a large number of imaging processes are already being used. Why do the researchers now need yet another imaging device?

Imaging processes are not all the same. With electron microscopy, for example, we can show tiny structures at very high resolution. Using ultrasound we can make organs visible. With molecular imaging processes we are able to depict the function and activity of molecules in the organism – metabolic processes, for example. With photoacoustic imaging we can do both at the same time – make structures and processes in the body visible. What's really good is the fact that we can link up basic research and clinical applications. What we can currently observe in mice we shall soon be able to see in humans.


PhD student Mitchell Duffy working on the new photoacoustic scanner.
© CiM - Michael Kuhlmann

What disciplines are involved – and why?

One of the people we have working on the device at the moment is an American doctoral student. He has degrees in two subjects: biology and computer science. Both are very important, because on the one hand he can evaluate what he sees in the organism, and on the other hand it's a lot of work to combine the measured values into one single image – and that's where the computer science comes in. But for photoacoustic imaging we also need chemists who can "construct" for us the matching fluorochromes – or, to be more precise, absorbers. These are special fluorochromes which radiate as little light as possible, but all the more ultrasound. Incidentally, this interdisciplinary constellation which we have in the "Cells in Motion" Cluster is unique – not only in Germany, but also internationally. It means we have the best possible conditions for applying photoacoustics.
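The step of combining the measured values into a single image is, in its simplest textbook form, a delay-and-sum reconstruction: for each image pixel, every sensor's recorded trace is read out at the time of flight from that pixel, and the values are summed. A hypothetical minimal sketch – the scanner's actual algorithm is certainly more elaborate, and the sensor geometry and sampling parameters below are invented:

```python
# Hypothetical sketch of delay-and-sum image reconstruction
# (not the scanner's actual software).
import numpy as np

def delay_and_sum(signals, sensor_x, pixel_xy, c=1500.0, fs=40e6):
    """signals : (n_sensors, n_samples) recorded pressure traces.
    sensor_x  : (n_sensors,) sensor positions along a line [m].
    pixel_xy  : (x, z) image point [m].
    c         : assumed speed of sound in tissue [m/s].
    fs        : sampling rate [Hz].
    Returns the summed trace amplitude attributed to that pixel."""
    x, z = pixel_xy
    total = 0.0
    for trace, sx in zip(signals, sensor_x):
        dist = np.hypot(x - sx, z)        # pixel-to-sensor distance
        idx = int(round(dist / c * fs))   # time of flight -> sample index
        if idx < len(trace):
            total += trace[idx]
    return total
```

Evaluating this sum over a grid of pixels yields the image: signals from all sensors add up coherently at the true source position and cancel elsewhere.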

What could photoacoustics be used for in the clinic in future?

One example is malignant melanoma, known as black skin cancer. Naturally, its cells already contain pigments that we can make visible by means of photoacoustics – to detect metastases, for example. Other possible uses are in the thyroid gland or blood vessels.

When do you expect the device to be in the clinic?

We expect to be able to use our prototypes for the first trial examinations of people in just a few weeks. The next generation of photoacoustic scanners will probably be ready for clinical use in two or three years. Then, handling them will be much simpler.