Using AI to query cancer-related questions, are ChatGPT's answers reliable?
A recent study found that the AI chatbot ChatGPT answered common cancer myths and misconceptions correctly 97% of the time, but its sometimes indirect or potentially confusing language has raised concerns. The researchers advise patients to be cautious when using chatbots to obtain cancer information.
A research report published in the Journal of the National Cancer Institute examines the use of chatbots and AI to provide cancer-related information. The researchers found that these digital resources accurately debunked some common myths and misconceptions about cancer. The study was led by Skyler Johnson, MD, PhD, a physician-scientist at the Huntsman Cancer Institute and an assistant professor in the Department of Radiation Oncology at the University of Utah, and was designed to evaluate the reliability and accuracy of ChatGPT in providing cancer information.
Johnson and his team used common cancer myths and misconceptions published by the NCI as the questions posed to ChatGPT. They found that ChatGPT's answers were 97% accurate. However, this result comes with important caveats: a major concern for the research team is that some of ChatGPT's answers could be misunderstood or misinterpreted by patients.
"This could lead cancer patients to make poor decisions," Johnson said. The team therefore advises patients to be cautious about using chatbots to obtain cancer information.
In the study, reviewers were blinded, meaning they did not know whether an answer came from the chatbot or from the National Cancer Institute (NCI). While most answers were accurate, the reviewers found that some of ChatGPT's answers were indirect or vague, and in some cases even incorrect.
Johnson acknowledges that it is difficult for cancer patients and health care professionals to obtain accurate information. "Research into these sources is needed so that we can help cancer patients navigate the online information environment as they look for answers about their diagnoses," he said.
Misinformation can harm cancer patients. In a previous study published in the Journal of the National Cancer Institute, Johnson and his team found that misinformation is common on social media and can harm cancer patients.
The next step in this research is to evaluate how often patients use chatbots to obtain cancer information, what questions they ask, and whether AI chatbots can provide accurate answers to rare or unusual questions about cancer.