More than 85% of oncology healthcare professionals say they need to be able to explain the clinical use of artificial intelligence (AI) to their patients, but fewer than half report being familiar with AI in health care or having received education on the technology, researchers reported in JAMA Network Open. In response, the National Cancer Institute’s (NCI’s) Center for Biomedical Informatics and Information Technology created a top-5 tip sheet for clinicians and researchers to use when evaluating AI tools for health care.

  1. Was it built on accurate and sufficient data that are consistent with your patient population? Healthcare professionals should consider the data sets used to train the AI tool and the standard data sets against which it was tested. NCI advised paying particular attention to the diversity of the sample population and whether it accurately represents the patients in your care.
  2. How rigorous was the testing? Specifically, NCI recommended ensuring that tools were tested in clinical trials, that the testing was fully blinded and prospective, and that the model was validated with data not used in its training.
  3. Can you trust its transparency? An AI tool’s functionality should be transparent and shared through open science, which allows users to report both consistent results (which add credibility) and inconsistent results (which give developers information to improve the tool).
  4. How will you use it in practice? After a tool has passed the first three tests, plan and test how you will incorporate it into your unit’s workflows. NCI recommended using a pilot project model to implement AI tools in practice.
  5. What is your plan to promote responsible use? NCI recognized the risk of bias and other ethical concerns with AI tools and encouraged healthcare providers to plan for human oversight. Questions to consider include “If you use AI for your practice and the model returns an error, who bears the responsibility?” and “If you choose a model that leads to a bias against a subpopulation of patients in your care, who is at fault?”

“Knowing the strengths and limitations of AI will help you deliver optimal patient care, while minimizing potential errors or biases,” NCI explained. “Likewise, regular training and collaboration with AI developers can help you ensure that your AI applications remain ethical and effective. Ultimately, it’s up to you to take responsibility for interpreting AI’s recommendations and making informed decisions to ensure quality care for your patients.”

Learn more about nursing’s role in AI in health care and how AI is influencing cancer care and oncology nursing practice on the Oncology Nursing Podcast.