Editorial
March 11, 2020

Patient Perspectives on the Use of Artificial Intelligence

Carrie L. Kovarik, MD
Author Affiliations
  • Perelman School of Medicine, Department of Dermatology, University of Pennsylvania, Philadelphia
JAMA Dermatol. Published online March 11, 2020. doi:10.1001/jamadermatol.2019.5013

The practice of medicine is changing with recent advances in and implementation of artificial intelligence (AI). Examples of AI applications currently used in medicine include adverse drug reaction and interaction warnings when prescribing medications, patient reminder calls for appointments, decision-support tools used by clinicians, and robotic surgical systems.1 In dermatology, many AI algorithms are being developed to differentiate melanoma and other cutaneous cancers from benign skin lesions.2 Progress is being made quickly, and it is critical that this technology be designed and evaluated in a manner that enables the delivery of high-quality care that is sensitive to patient values and preferences.

Current discussions around the development of AI in dermatology have centered on the lack of diversity in the clinical data and images used to train algorithms and on the need for clinical validation of the technology.2 While addressing these technical challenges remains central to developing AI that positively affects patient care, there has been little exploration of what patients value with respect to AI in dermatology, even though the patient perspective has become increasingly incorporated into best clinical practices throughout the specialty. Although AI technology has not yet been widely implemented in dermatology, this is a pivotal time to assess patients’ views on the subject to understand their knowledge base, as well as their values, preferences, and concerns regarding AI. Given that some of this technology will have a significant effect on patients’ lives, it is important that patients have an opportunity to express their opinions before AI is widely implemented. Their viewpoints may help direct future efforts to educate patients about AI, improve consent processes, focus clinical implementation, and inform policy decisions.

In a qualitative study in this issue of JAMA Dermatology, Nelson et al3 interviewed 48 patients at Brigham and Women’s Hospital and the Dana-Farber Cancer Institute to explore how patients conceptualize AI and perceive its use for skin cancer screening. Patients were primarily white/non-Hispanic (94%), reported use of digital services for health (90%), held a graduate or professional degree (42%), and had a total household income of $150 000 or more (42%). The majority of those enrolled had a history of skin cancer, and, overall, they gave educated answers when questioned about the risks and benefits of AI for skin cancer screening. The most commonly perceived benefits of AI tools for skin cancer screening were increased diagnostic speed (60%) and increased health care access (60%). Major concerns in this patient population included anxiety over the loss of human interaction, the credibility of the AI system, and data privacy. Overall, the majority (94%) stressed the importance of symbiosis between the clinician and the AI system, yet most (75%) would still recommend these AI systems to friends or family. This study provides valuable insight into the views and preferences of a small cross-section of patients. However, attitudes toward AI span a broad spectrum, and more studies are needed to capture the diverse knowledge and opinions of all of our patients. In addition, patient attitudes and preferences with respect to AI in dermatology may change over time as patients become more familiar with these systems.

Other studies have shown that some patients may be more skeptical of AI in medicine. In a large survey of 2000 adults conducted by the Center for the Governance of AI,4 substantially more support for developing AI was expressed by college graduates (57%) than by those with a high school education or less (29%), by those with annual household incomes over $100 000 (59%) than by those with incomes less than $30 000 (33%), and by those with computer science or programming experience (58%) than by those without (31%). In a survey of 155 patients regarding their views of AI for diagnostics in radiology, trust in the AI system increased steadily with each higher level of respondent education.5

Vulnerable patients, including racial and ethnic minorities, the underinsured or uninsured, the economically disadvantaged, and those with chronic health conditions, may be at risk for improper consent for or use of AI. Patients who better understand privacy protection regulations and the privacy settings of digital devices may be less concerned about the loss of privacy than those who lack the knowledge and tools to manage these options. People who are healthy may have less personal health information and less need to benefit from shared information, so they may not see the value in sharing their personal health information for the development of AI. Patients with certain medical issues or difficulty obtaining medical coverage have expressed concern that their shared medical data could be used by insurance companies to deny them or their children coverage or to increase premiums.6 Patients who have numerous chronic medical conditions and seek new medical knowledge to improve their health may be more open to the risk of loss of privacy because they have a greater need for the benefits of data sharing.7 In these cases, patients may consent without fully understanding the risks of the AI technology.

Understanding patients’ general knowledge base around AI is important when discussing AI with patients or when educating the public. Each patient is unique, but understanding general themes in AI knowledge and views may be helpful when discussing a possible AI-assisted procedure or treatment with a patient. A 2017 population survey of 2200 adults showed that those who claimed to know “not much” or “nothing at all” about AI were much more likely to have an annual income of less than $50 000 (49%) or no college education (52%).8 In the Nelson et al study,3 only 21% of the patients had no concept of AI before the study. For informed consent to be valid, patients must be given appropriate information that allows them to make a voluntary choice, and this becomes complex in the case of AI.9 The consent discussion can be complicated by issues of algorithm transparency, as well as by patient and clinician lack of knowledge, fears, misinformation, or overconfidence.10 The more clinicians know about the AI systems they are using, and the more we know about patients’ perceptions of AI for their care, the better we can work as a team to make choices that benefit patient care.

Given the current issues in AI clinical implementation and consent, the European Union’s General Data Protection Regulation (GDPR) took effect in May 2018 and affects AI implementation in European health care systems. The GDPR requires explicit and informed consent before any collection of personal data, and the person providing the data can track what data are being collected and request their removal. In addition, patients have the right to an explanation of the AI application that is being used.1 The hope is that the GDPR, and similar policies just beginning to be implemented in the United States, such as the California Consumer Privacy Act, will promote patient trust and engagement.

Artificial intelligence in medicine is developing rapidly, and eventually this technology will be commonplace. In dermatology, we are still early in the AI product development cycle, when it is easier to establish development guidelines for the future; it is much more difficult to retrofit ethics and remove biases once products have come to market. This is also a critical time to survey all types of patients to better understand what information to provide for education and consent, as well as how to optimize patient-centric product development and clinical workflow. This is a complex topic, and patients from a variety of racial, ethnic, insurance, economic, health, and geographic backgrounds will need to be interviewed to understand the breadth of views. Nelson et al3 provide the first survey of one cross-section of the population, and, using a similar model, more data can be collected to obtain a fuller view of patient perspectives on AI.

In the Position Statement on Augmented Intelligence, the American Academy of Dermatology states that, for the patient and clinician, “there should be transparency and choice on how their medical information is gathered, utilized, and stored and when, what, and how augmented intelligence technologies are utilized in their care process. There should be clarity in the symbiotic and synergistic roles of augmented intelligence and human judgment so that it is clear to the patient and provider when and how this technology is utilized to augment human judgment and interpretation.”11 For this goal to be reached, current patient knowledge and perspectives in each community need to be understood.

Article Information

Corresponding Author: Carrie L. Kovarik, MD, Perelman School of Medicine, Department of Dermatology, University of Pennsylvania, 3600 Spruce St, 2 Maloney Building, Philadelphia, PA 19104 (carrie.kovarik@pennmedicine.upenn.edu).

Published Online: March 11, 2020. doi:10.1001/jamadermatol.2019.5013

Conflict of Interest Disclosures: Dr Kovarik reported being a member of the artificial intelligence task force for the American Academy of Dermatology.

References
1. He J, Baxter SL, Xu J, Xu J, Zhou X, Zhang K. The practical implementation of artificial intelligence technologies in medicine. Nat Med. 2019;25(1):30-36. doi:10.1038/s41591-018-0307-0
2. Thomsen K, Iversen L, Titlestad TL, Winther O. Systematic review of machine learning for diagnosis and prognosis in dermatology. J Dermatolog Treat. 2019;1-15. doi:10.1080/09546634.2019.1682500
3. Nelson CA, Pérez-Chada LM, Creadore A, et al. Patient perspectives on the use of artificial intelligence for skin cancer screening: a qualitative study [published online March 11, 2020]. JAMA Dermatol. doi:10.1001/jamadermatol.2019.5014
4. Zhang B, Dafoe A. Artificial Intelligence: American Attitudes and Trends. Oxford, UK: University of Oxford; 2019.
5. Ongena YP, Haan M, Yakar D, Kwee TC. Patients’ views on the implementation of artificial intelligence in radiology: development and validation of a standardized questionnaire. Eur Radiol. 2020;30(2):1033-1040. doi:10.1007/s00330-019-06486-0
6. Future Advocacy. Summary of ethical, social, and political challenges and suggestions for further research: a report with the Wellcome Trust. https://wellcome.ac.uk/sites/default/files/ai-in-health-ethical-social-political-challenges.pdf. Published April 2018. Accessed January 4, 2020.
7. Petersen C. Through patients’ eyes: regulation, technology, privacy, and the future. Yearb Med Inform. 2018;27(1):10-15. doi:10.1055/s-0038-1641193
8. Morning Consult. National tracking poll 170401. Survey report. https://perma.cc/TBJ9-CB5K. Updated October 22, 2018. Accessed January 21, 2020.
9. Appelbaum PS. Clinical practice: assessment of patients’ competence to consent to treatment. N Engl J Med. 2007;357(18):1834-1840. doi:10.1056/NEJMcp074045
10. Schiff D, Borenstein J. How should clinicians communicate with patients about the roles of artificially intelligent team members? AMA J Ethics. 2019;21(2):E138-E145. doi:10.1001/amajethics.2019.138
11. American Academy of Dermatology. American Academy of Dermatology position statement on augmented intelligence (AuI). https://server.aad.org/Forms/Policies/Uploads/PS/PS-Augmented%20Intelligence.pdf. Updated May 18, 2019. Accessed January 4, 2020.