Supporting therapies
For the foreseeable future, AI will not make doctors redundant. In the development of therapies it remains a tool that supports human experts, and the same holds for the implementation of therapy decisions. Here, too, artificial intelligence is used in various sub-areas; two examples are given below:
In radiation therapy, AI helps to localise the irradiation target so that as little surrounding tissue as possible is damaged. A new method developed in Berlin, which couples a radiation device with a computed tomography (CT) scanner, is used for this purpose. The AI identifies the current position of the organs in the body and the structure of the tumour in real time, and uses these findings to suggest how the irradiation should be adjusted.
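One way to picture the real-time coupling described above is as a gating check: the beam stays on only while the tracked tumour position remains within a tolerance of the planned position. The sketch below is a deliberate simplification (the positions and the tolerance value are invented for illustration; real systems rely on full treatment planning, not a single distance check):

```python
import math

def beam_on(tracked_pos, planned_pos, tolerance_mm=2.0):
    """Return True if the beam may stay on: the tracked tumour
    position is within tolerance of the planned position.
    (Simplified illustration, not a clinical algorithm.)"""
    return math.dist(tracked_pos, planned_pos) <= tolerance_mm

# Example: the target drifts 3 mm along one axis -> the beam pauses
print(beam_on((0.0, 0.0, 0.0), (0.0, 0.0, 3.0)))  # False
```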
In the surgical removal of tumours, AI can be embedded in robot-assisted surgical systems to enable more precise operations.
AI to support patients
But artificial intelligence can do more than just make doctors' work easier – it also helps patients to adjust to their treatments.
For example, AI-supported patient information systems run through therapy plans and present their likely benefits, risks and side effects over time. This gives patients a better basis of information for their decisions about therapies. AI-supported apps and digital measuring devices carried by patients can monitor their health in real time and transmit changes to the treating physicians.
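A very simple pattern such a monitoring device might follow is threshold-based flagging: readings outside an expected range are collected for transmission to the physician. The sketch below is illustrative only; the measurement type, values and thresholds are invented and are not clinical guidance:

```python
from dataclasses import dataclass

@dataclass
class Reading:
    timestamp: str
    heart_rate: int  # beats per minute

def readings_to_flag(readings, low=50, high=120):
    """Return readings outside the configured range, e.g. to be
    transmitted to the treating physician. Thresholds are invented
    for illustration, not clinical guidance."""
    return [r for r in readings if not (low <= r.heart_rate <= high)]

day = [Reading("08:00", 72), Reading("12:00", 130), Reading("20:00", 45)]
flagged = readings_to_flag(day)
print([r.timestamp for r in flagged])  # ['12:00', '20:00']
```

Real systems would of course use validated models and many signal types, but the underlying idea of forwarding only notable changes is the same.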
Problems with the use of AI
As already mentioned, AI models for analysing patient data always have performance limits, which depend on the one hand on the model used and on the other on the quality of the training data. Better models and better data should gradually reduce these deficits. However, there are also more fundamental problems.
For example, AI models are not transparent: people are usually unable to trace how an AI has arrived at an assessment. If a statement proves to be false, it is almost impossible to determine why, which makes it difficult for humans to learn from an AI's mistakes.
AI results are also not always reproducible. Anyone who has worked with language models like ChatGPT is familiar with the phenomenon: the same query is sometimes answered differently at different times or in different contexts. What does this say about the quality of the AI assessment?
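Much of this variability stems from sampling: instead of always choosing the most likely next word, language models draw from a temperature-scaled probability distribution, so the same query can yield different answers. A minimal sketch of that mechanism, with token names and logit values invented purely for illustration:

```python
import math
import random

def softmax(logits, temperature):
    """Convert logits to probabilities. Temperature 0 means greedy:
    all probability goes to the highest logit."""
    if temperature == 0:
        probs = [0.0] * len(logits)
        probs[max(range(len(logits)), key=logits.__getitem__)] = 1.0
        return probs
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def sample_token(tokens, logits, temperature, rng=random):
    """Draw one token from the temperature-scaled distribution."""
    return rng.choices(tokens, weights=softmax(logits, temperature), k=1)[0]

# Invented example: at temperature 0 the answer is always the same;
# at temperature > 0 repeated identical calls can return different tokens.
tokens = ["benign", "uncertain", "malignant"]
logits = [2.0, 1.0, 0.5]
print(sample_token(tokens, logits, temperature=0))  # always "benign"
```

Lowering the temperature sharpens the distribution toward the most likely token; raising it flattens the distribution and increases the variability between runs.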
Many doctors currently have only limited trust in the assessments of artificial intelligence, and that is probably a good thing. However, this does little to detract from the importance of AI. Just as a tool cannot build a house without the help of a qualified craftsman, the instrument of AI also requires an experienced team of doctors to help heal a patient.
Ultimately, research is still in its infancy. As astounding as it may seem, the full potential of artificial intelligence for combating the widespread disease of cancer is still far from being foreseeable.
Text: Thorsten Kleinschmidt