The “Fine Art” of Medicine

by G. Richard Holt, MD, MSE, MPH, MABE, MSAM, D Bioethics • February 2, 2026

Clinical Scenario

You are seeing Reverend Smith today for a follow-up appointment after he completed radiation therapy for a unilateral T2 glottic cancer. He is a 71-year-old minister who still leads his church’s congregation, so preservation of vocal function weighed heavily in his decision to pursue primary radiation therapy. He is accompanied today by his wife of 50 years. During the greeting, you indicate to Reverend Smith that you will be using a virtual scribe for the visit if he approves. After you provide a detailed description of the artificial intelligence-based ambient scribe system, Reverend Smith gives his approval.

Following your evaluation, Reverend Smith inquires about his prognosis. You indicate that you obtained AI-generated statistics on his particular cancer and proceed to relate the evidence-based prognosis to him and Mrs. Smith. When he questions how those statistics apply specifically to him and his disease, and what his remaining life expectancy might be, you laughingly indicate that you are “not God,” so you cannot give more information than the population evidence you retrieved. As an early-career otolaryngologist, you personally believe that evidence is the most important factor in patient care.

Reverend Smith then indicates to you that Mrs. Smith was just diagnosed with stage III breast cancer, and that their individual cancer outcomes will have a profound impact on them and their family. You briefly extend your condolences to Mrs. Smith, then note that your schedule is very busy today and that you will now need to schedule Reverend Smith’s next appointment. You inform the Smiths that while the use of the virtual scribe has saved time for you, you have instructed the schedulers to fill the saved time with more patients, and you must stay on schedule.

Reverend and Mrs. Smith glance at each other, and with a sad gravity, Reverend Smith says, “Too bad you aren’t using this extra time to spend talking with your patients, Doctor.” This statement lingers expectantly in the air, awaiting a sincere response.

Discussion

Perhaps AI’s influence on the practice of medicine in general, and otolaryngology specifically, is currently at a seminal inflection point. As AI becomes increasingly embedded in the everyday practice of patient care, the enduring human elements of medicine (the “fine art”) remain faithful reminders of our purpose. Empathy, understanding, compassion, and our therapeutic presence enhance our altruistic approach to patient care. These fundamental virtues, however, face unprecedented challenges in the milieu of AI integration into nearly every facet of our specialty. This recognition raises the question of whether the use of innovative technology and the duty to provide compassionate patient care can effectively coexist. For otolaryngologists who practice relational medicine, the answer to this question is at once existential and an ethical imperative.

An overdependence on AI-enabled clinical tools can shift a physician’s attention away from the patient, who may be anxious, hopeful, or vulnerable. While AI clinical tools may supplement evidence gathering, they should not replace the physician’s capacity for discernment regarding the patient as a whole person. At the center of this evolving, technologically rich environment remains the fundamental tool that otolaryngologists must possess: a meaningful connection with the patient in the context of professional concern.

AI may augment diagnostic accuracy, but it cannot replace the physician’s internal moral compass, empathy, integrity, or professional judgment. Ethical practice requires an otolaryngologist to consciously cultivate the “fine art” of medicine, a complex blend of clinical knowledge, emotional intelligence, humility, and respect for a patient’s humanity. In the face of increasing reliance on AI tools, the otolaryngologist must explain the uncertainties of disease honestly, invite questions, acknowledge a patient’s deep concerns, communicate warmth during encounters, and actively listen to the patient, particularly when serious illnesses are at stake. These actions are not items on a drop-down menu; rather, they are central to trust and honesty. AI findings, and their application to the patient’s disease, must be interpreted through the lens of individualized patient care. Patients are more likely to follow recommendations, accept their own role in their healthcare, and trust the physician when they feel heard and seen as full human beings.

Otolaryngologists benefit as well as patients: empathetic engagement may improve diagnostic accuracy through active narrative listening, and it reaffirms the purpose of the medical profession. In the world of otolaryngology, where patient stories often involve chronic symptoms, anxiety, communication challenges, fear of the unknown, and life-altering surgeries, empathy and understanding are fundamental to the therapeutic mission.

Within otolaryngology, AI is providing advancements that promise efficiency, standardization, early detection, and possibly improved diagnostic accuracy. These advancements also introduce questions about privacy, explainability, bias, and overreliance on machine interpretations. The otolaryngologist must now negotiate an exam room where digital systems are active observers and contributors. Seen in that light, transparency must be discussed and patient concerns addressed, with the otolaryngologist’s role made explicit. The ethical implications are clear and significant.

For one, clinicians must ensure that AI does not become a barrier to human presence and understanding. When an ambient scribe transcribes every utterance during the encounter, or when an AI-produced treatment plan is used to direct a patient’s care without the application of experienced clinical insight, the opportunity for shared decision making and professional judgment may be diminished. When a patient receives an end-of-visit copy of a virtually scribed encounter, the language of a concerned and caring otolaryngologist will likely not be embedded in the report. Professional observations, such as patient demeanor, facial and body language, and furtive glances at a partner, may all be lost in the scribed report. As AI-generated language becomes increasingly sophisticated, the distinction between simulated empathy and authentic human understanding becomes critical. The virtual scribe will likely not capture the importance of the tone and meter of a clinician’s explanation of the impact a cancer diagnosis may have on the patient, or the clinician’s understanding of how patients respond to bad news.

Patients may harbor unspoken concerns about the encounter being recorded, particularly with respect to machine language output, privacy of information, and accuracy of the report, concerns that may be compounded by a sense of lost personal connection with the otolaryngologist. Anticipating these concerns can prompt a proactive discussion with the patient about the otolaryngologist’s ultimate responsibility to review, revise, and improve the scribed report.

The same technology that risks eroding compassion, empathy, and understanding, however, can also support them if deliberately guided by otolaryngologists who understand the importance of relationship-centered care. If virtually scribed reports improve efficiency without diminishing the accuracy of the record of the encounter, and if the otolaryngologist’s time available for patients is increased, the effect can be positive. The main concern is that additional patients will simply be inserted into the daily schedule, forfeiting the vital opportunity to spend more time with each patient, time that could improve communication and strengthen the patient–physician relationship.

In this scenario, the otolaryngologist (whom we will call Dr. Jones) has missed several opportunities to better understand and explore the impact of the disease on both Reverend and Mrs. Smith and to provide important empathy and understanding to them. As an early-career otolaryngologist, Dr. Jones is still acquiring the clinical skills to provide excellent evidence-based care within the overarching context of humanism. Dr. Jones’ undergraduate and graduate medical education heavily emphasized evidence-based practice and a growing reliance on AI-supported clinical care. Unfortunately, Dr. Jones was not particularly supported in developing the “fine art of medicine” during residency training and thus may seem distant and technical in dealing with patients. All is not lost, however, as Dr. Jones has acquired a senior otolaryngologist mentor in his new practice setting, one who is respected for her personhood as a physician and known to be very focused on virtues and ethics in patient care.

In the first missed opportunity, Dr. Jones was very technical in his explanation of potential outcomes when asked by Reverend Smith. In general, we use population-based data to practice sound evidence-based medicine, but such data should not be the only factor in discussing outcomes with a concerned cancer patient. The question was a perfect opportunity for Dr. Jones to further explore Reverend Smith’s views on quality of life, personal visions for his future, and perhaps how his wife’s cancer affects his own sense of family and their life moving ahead. Second, Dr. Jones appeared inconsiderate or clueless when he tried to joke about not being “God” when discussing outcomes with Reverend Smith: the wrong statement to the wrong patient at the wrong time.

The third missed opportunity was in Dr. Jones’ response to learning of Mrs. Smith’s cancer diagnosis. What a perfect moment for Dr. Jones to display compassion, empathy, and understanding, yet it went unfulfilled. Moments like these are exactly where the minutes saved by the virtual scribe’s efficiency are best spent. The Smiths, in the twilight of their lives, are facing double jeopardy from cancer, and if Dr. Jones had spent five additional minutes inquiring about their concerns and fears as they moved toward difficult times, so many important insights might have been gained. Additionally, the diagnosis of cancer, particularly breast cancer, is very worrisome owing to the extent of the treatment and the outcomes in an older woman. Although an otolaryngologist, Dr. Jones should still be capable of sufficient professional concern to demonstrate empathy and understanding. If AI is to improve healthcare for the patient, it should not dehumanize a clinician’s duty to care and be supportive. This is indeed the “fine art of medicine.”

Dr. Holt is professor emeritus and clinical professor in the department of otolaryngology–head and neck surgery at the University of Texas Health Science Center in San Antonio.

ENTtoday - https://www.enttoday.org/article/the-fine-art-of-medicine/
