More and more patients are turning to generative artificial intelligence (AI) tools such as ChatGPT and other chatbots to help them understand the complexity of their cancer diagnosis and associated genomic testing, an interprofessional team of genetics professionals and educators wrote in an article for the Clinical Journal of Oncology Nursing. However, they cautioned that the tools’ limitations could have “serious consequences” for patients.

Using a case study of a patient named Becky, the authors showed how patients are finding their test results and other health information on their electronic health record portal before having a consultation with their care team. Some patients are asking chatbots to explain their results while they wait for their appointment. However, because the AI may be trained on outdated or unreliable sources, patients may receive information that is incorrect in general, or that does not apply to their particular health profile, which could lead them to make poorly informed decisions.

Implications for Patients With Cancer

The authors listed several reasons patients may wish to consult AI for cancer and other health information:

  • Translating information from their care team into their preferred language or to a level that matches their health literacy.
  • Asking questions they may feel uncomfortable raising with their care team.
  • Finding information and answers at any time of day.
  • Maintaining a chat history that they can review again as needed.

However, they also cautioned that studies show mixed results for AI output in medical contexts:

  • Some of its suggestions do not align with current guidelines.
  • Without an explicit prompt specifying a reading level, its responses are written at a college reading level and lack visual aids.
  • Its answers may increase the risk for misdiagnosis, poor treatment adherence, delayed diagnosis, and inappropriate self-medication.

The authors of one of the cited studies recommended that AI chatbots “need to be evaluated like a drug or medical device.”

Simply serving in your role as your patients’ trusted educator and guide can deter them from relying on chatbots to understand their cancer diagnosis and treatment, the authors wrote. Perhaps ironically, AI can help nurses with that: The authors suggested that healthcare professionals can use chatbots, with judicious guidance and oversight, to tailor patient education for different literacy levels or learning needs.

Additionally, the authors recommended that nurses:

  • Educate patients that information they obtain using AI may be inaccurate and that they should consult their care team before making any decisions.
  • Consider placing a warning on the electronic health record patient portal advising patients to seek their healthcare professionals’ guidance in interpreting all medical results.
  • Provide patients with adequate time to ask multiple questions during all visits.
  • Point patients to validated sources of health information (e.g., American Cancer Society, National Cancer Institute).
  • Stay current on clinical recommendations for genetic and genomic testing and its implications for diagnosis and treatment so they are prepared to answer patients’ questions.

“Patients are using AI more frequently to assist with understanding their diagnoses, including complicated information such as genetic and genomic testing results,” the authors concluded. “It is important that oncology nurses and other clinicians guide patients to validated information and diligently work to develop patient-friendly educational material for complex topics such as understanding genetic and genomic implications for cancer treatment.”

To read the full case study and learn more about patient use of AI, refer to the complete Clinical Journal of Oncology Nursing article.