Current Research Status of Explainability in Artificial Intelligence and Evaluation of Its Application Effects in Medical Fields
Author: HE Xiaoxi, CAI Yunpeng

CLC Number: TP30

Fund Project: This work is supported by the National Natural Science Foundation of China (U22A2041)

Abstract:

Interpretability in artificial intelligence refers to the degree to which humans can understand and explain the decision-making process of machine learning models. Research in this field aims to improve the transparency of machine learning algorithms so that their decisions become more trustworthy and explainable. Interpretability is crucial in artificial intelligence systems, especially in sensitive, high-stakes domains such as healthcare, finance, and law. With interpretable models, people can better understand the reasoning behind a model's decisions and verify that those decisions are fair, robust, and ethical. As artificial intelligence continues to evolve, enhancing model interpretability is a key step toward trustworthy and sustainable AI. This article reviews the development history of explainable artificial intelligence and the technical characteristics of the main interpretability methods, with a particular focus on interpretability in the medical field. It then discusses in depth the limitations of current methods on medical imaging datasets and proposes possible directions for future exploration.
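To make the class of methods the abstract refers to concrete, the sketch below computes a vanilla-gradient saliency map, one common post-hoc interpretability technique for image classifiers. It is a minimal illustration only: the ResNet-18 backbone and the random input tensor are placeholder assumptions, not the specific models or medical imaging datasets evaluated in the article.

```python
# Minimal sketch of a vanilla-gradient saliency map (post-hoc interpretability).
# The untrained ResNet-18 and random input are placeholders for illustration.
import torch
import torchvision.models as models

model = models.resnet18(weights=None)  # placeholder: any trained image classifier
model.eval()

# Stand-in for a preprocessed medical image; we need gradients w.r.t. pixels.
image = torch.rand(1, 3, 224, 224, requires_grad=True)

logits = model(image)
target = logits.argmax(dim=1)  # class whose evidence we want to explain

# Back-propagate the target-class score to the input pixels.
logits[0, target].backward()

# Saliency map: per-pixel gradient magnitude, reduced over color channels.
saliency = image.grad.abs().max(dim=1).values  # shape: (1, 224, 224)
```

Pixels with large gradient magnitude are read as the most influential for the prediction; the abstract's discussion of limitations on medical imaging datasets concerns, in part, how faithful such attribution maps actually are.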

Citation

HE Xiaoxi, CAI Yunpeng. Current Research Status of Explainability in Artificial Intelligence and Evaluation of Its Application Effects in Medical Fields[J]. Journal of Integration Technology, 2024, 13(6): 76-89.

History
  • Received: March 12, 2024
  • Revised: March 12, 2024
  • Online: April 15, 2024