In Defense of Human Doctors
When generative AI can provide the diagnosis and treatment plan, what will be the role of the doctor?
By Brewer Eberly
June 21, 2024
In my final year of medical school, I took a “Literature and Medicine” elective taught by a local English professor. One of my classmates wrote a short story imagining a future patient encounter with an artificial intelligence that had long since assimilated every known medical fact and clinical study. Through a series of sensitive prompts and specific questions, the algorithm offered the patient a perfect differential diagnosis.
Afterward, a cadre of tubes and probes collected samples from the patient, giving the AI the final clues needed to narrow its differential to a single pristine diagnosis, accompanied by a tailor-made pill matched precisely to the patient’s genome. This futuristic clinical intelligence had proven decades earlier in randomized controlled trials to be superior to human doctoring. The only contribution required from humans was that of surgical subspecialist technicians, who had long ago set down their instruments in favor of twisting tiny dials for a fleet of drone-like surgical nanobots.
Not exactly The Andromeda Strain – but the writing was tight, funny, and a welcome break from worrying about residency. We offered feedback, chuckled to each other nervously about our futures, and moved on.
My classmate’s speculations have come back to haunt me. Like C. S. Lewis, I delight in science fiction – the kind of writing that, as he put it, “gives an imaginative vehicle to speculations about the ultimate destiny of our species.” But as AI captures the imaginations of friends, family, and patients, I wonder about the ultimate destiny of my profession. (“Did you hear ChatGPT passed medical boards?”)
The implication is that patients, having long been admonished by doctors to stop consulting Dr. Google, might find new confidence in a second opinion from ChatGPT – or at least in a clinician who augments her clinical judgment with artificial intelligence.
The New England Journal of Medicine – one of academic medicine’s premier platforms – this year launched a new journal dedicated solely to applications of artificial intelligence in clinical practice. I hear advertisements almost weekly for clinical artificial intelligence tools like Nuance DAX, Freed, or Glass Health. Nuance DAX and Freed are ambient clinical intelligences that boast the ability to document patient encounters far better than a virtual or in-person scribe: “The exam room of the future has arrived where clinical documentation writes itself.” Glass AI, like ChatGPT, employs a large language model maintained by master clinicians to generate differential diagnoses and draft clinical plans – all based on diagnostic one-liners submitted by the clinician-user.
All doctors have a “peripheral brain” – their own personal database of clinical pearls, caveats, and frequently used schemas. Most clinicians use templates called “macros” or “smartphrases” to quickly generate plans and save time on documentation. I’ve been slowly building my peripheral brain since I was a medical student, updating it almost daily. But generative AI is well beyond this kind of pocket medicine.
It is true that the accuracy of Glass AI, the freedom of Freed, and the “nuance” of Nuance DAX depend on the quality of the content they are given (implying first principles that depend decidedly on human perception), but it seems inevitable that future iterations will be able to generate more consistently accurate medical documentation and counsel than I could ever hope to offer. I can imagine clinical artificial intelligence rising beyond accuracy to warmth – even wit. As a JAMA editorial put it in 1972: “What if in the course of an automated consultation the machine surprises us with a quotation from Voltaire: ‘Doctors are men who prescribe medicines of which they know little to cure diseases of which they know less in human beings of whom they know nothing.’” The age of William Osler, the “father of modern medicine” who could feasibly hold it all within his brilliant mind, has long come to an end. As my friend and fellow physician Benjamin Frush has written, the first “I don’t know” moment is a seminal step in the formation of any medical trainee. It’s important to clarify how much of the knowledge beyond that moment AI can actually supply.
I recognize that as AI becomes integrated into the warp and woof of health care, it is likely that patients will enjoy more efficient diagnoses, less medical error, and improved outcomes. At the same time, while I wrestle with what AI might mean for primary care, I’m reminded that the healing arts are less about prompt-driven intelligence than we might think. I worry that we doctors, enchanted by AI’s promise of more knowledge and better information, will lose a surprising feature of wisdom central to the patient-physician relationship – the intelligence of silence. As T. S. Eliot lamented in his 1934 play The Rock:
The endless cycle of idea and action,
Endless invention, endless experiment,
Brings knowledge of motion, but not of stillness;
Knowledge of speech, but not of silence;
Knowledge of words, and ignorance of the Word….
Where is the wisdom we have lost in knowledge?
Where is the knowledge we have lost in information?
When I was a senior resident physician in training, I remember showing off one of my smartphrases to the team I was leading, gushing about frailty scores and all the information I had assimilated into it. My attending graciously let me finish before wryly commenting, “Or you could just go look at the patient.”
My attending wasn’t dismissing the usefulness of preparation; she sensed that this kind of cookbook mentality was not only impractical but insidious. It suggested false confidence and false conclusions – like quoting a book one hasn’t read. At a deeper level it betrayed my inclination to catalogue information rather than care wisely for the patient before me. I was becoming more like a computer than a healer.
If medicine is purely the propagation of the most up-to-date intelligence, then the appropriately named “UpToDate” (arguably modern medicine’s leading clinical-decision support tool), in combination with ChatGPT or something like Glass AI, should be enough for most clinicians to generate a sufficient clinical plan.
But anyone who has experienced the art of medicine understands that the secret to healing is not just the right knowledge but the right application in response to the needs of the patient. This is the way of phronesis or practical wisdom – where clinical experience meets moral friendship. Physician Francis Peabody famously named this simple truth a century ago, just months before his death: “the secret of the care of the patient is in caring for the patient.”
By contrast, the presence of AI in a clinical encounter calls the clinician to give her computer ever more attention while the patient waits in the corner, wondering if the poor keyboard will be okay. Artificial intelligence can also distract, losing its value as more and more information is assimilated. In the signal-noise theory of clinical reasoning, for example, the pertinent “signal” or diagnostic clue becomes harder to recognize the more “noise” or extraneous information is added to a patient’s case. One of the doctors I work with calls this the difference between the old way of concentrating on the needs of the patient and the new pressure of shoehorning patients into the needs of abstract clinical indications.
Wendell Berry points to something similar in the witness of poet and pediatrician William Carlos Williams:
When Williams denounced ideas apart from things – disembodied ideas – he was speaking as a doctor who treated his patients as individual persons, as neighbors, rather than as “cases” or “types.”… The doctor who is a neighbor who is a poet is well placed to see how in the dematerializing (though materialist) modern world the individual person, place, or thing is forever disappearing into averages, statistics, and lists. But Williams does not wait with his compassion and imagination until the wagonload of abstract categories comes lumbering up.
In our eagerness for artificial intelligence’s wagonloads of fresh-cut facts, we risk eroding the kind of practices necessary to receive those facts in the first place – let alone consolidate the best prompts to feed to an AI.
My father is also a family physician. He once told me, “Don’t fill dead space with dead words.” He wasn’t dismissing small talk so much as calling me to be slow to speak, teaching me that not only are the right words often brief, they seem to come more easily after a period of silence. He had learned from his relationships with his own patients that dead space is not dead at all, but a kind of soil from which the right words might sprout. In the healing encounter, the right words arise out of the silence shared by two fellow creatures wayfaring together, with a kind of precision that finds purity precisely in the waiting, watching, and deep listening.
When the French artist Marie Michèle Poncet was hospitalized for many months, immobile for long stretches of time between rounds of physical therapy, she captured small moments of such attention, presence, and accompaniment between herself and the healers who cared for her. She titled one of her sketches, “My Nurse Jean Michel, without Chatter, Returns Life to Life.” It depicts her nurse, larger than life in the frame, tenderly bending down to pick up her patient. The strokes are thick, childlike, and vulnerable – reminiscent of Picasso. Along with her title, this small piece of art is a witness to the power of life-giving silence.
To borrow a phrase from Sharon Rose Christner, patients bring their stories “in many languages and silences.” Indeed, for Stoics and early Christian contemplatives, language and words are bound up in a mysterious pattern of stillness leading to understanding. The Word was understood as a kind of seed, planted in silence, from which the world could be apprehended in all its beauty, depth, and woundedness. As the philosopher Douglas Christie writes in The Blue Sapphire of the Mind: Notes on a Contemplative Ecology:
there is a hunger, a longing, an openness for a Word; a moment of stillness and attention and silence; a Word arising from the silence that provokes thought, reflection and response; and a descent, again, into a space of silence. The work of living into the Word, of absorbing it into one’s life, takes place in silence, stillness.
I love words. My profession loves words. Medicine is first an oral tradition, with a primacy in training on both learning the art of clear communication with patients and demonstrating clinical acumen to superiors through the verbal sparring tradition of “pimping,” which buzzes and trills with the impressive jargon of the medical profession.
And yet, as pediatrician Margaret Mohrmann writes, a physician’s “silent presence” is at the heart of clinical practice. It sets the foundation for clear words and artful communication. Silent presence “allows a true sharing of the burdens of knowledge and fear that pass between healer and sufferer.” Cardinal Robert Sarah calls it “the almost imperceptible start of decision” – a launchpad born of silence.
I notice it almost daily with my patients: in an age of chatter, silence has a particular intelligibility. I am continuously amazed at what patients will tell me after a period of silence – traumas they’ve never disclosed, vulnerable questions they’ve never asked – as if listening long, without a clicking keyboard to compete with, triggers the recognition of a moral friend with whom the patient’s hope of future healing might be shared. An opportunity for accompaniment arises out of silence that is not a waste of time but is altogether practical. What I do end up saying – and the clinical action we build together – becomes more apparent. The medicine becomes smarter than my smartphrases. More importantly, a clinical relationship has tempered into trust.
The place for such still and silent presence in clinical practice started to erode long before AI. Medical students already think that sitting in silence would be “a waste of time.” As for primary care clinicians, we’ve been contorted into algorithmic data-collecting robots for decades. In residency, I was pressed to see more and more patients in less and less time, which functioned in practice as seeing patients in five- to twenty-minute slots. There remains an assumption that it only takes that long – through prewritten templates, skillful interruptions, and tortured rituals of expectation-setting (“we only have time to discuss one problem today”) – for the necessary data to be strip-mined from the patient’s story.
Of course, it only takes one widower with dementia or one broken marriage or one scared toddler to see how inadequate and miserable this model of medicine is for both patient and clinician – at least, in primary care. I recognize my emergency and surgical colleagues labor within different temporal demands. But as neurologist Aaron Rothstein wrote when wondering if computers should replace physicians, “fifteen-minute visits are the exception to the kind of medicine most physicians need to practice.”
The data-driven vision of health care betrays an undergirding metaphysical assumption that bodies and patients are simply depositories of information themselves – searchable databases waiting passively for the right prompts. “The form in which ideas are expressed affects what those ideas will be,” the critic Neil Postman wrote of the typographic mind in Amusing Ourselves to Death. “The press was not merely a machine but a structure for discourse, which both rules out and insists upon certain kinds of content.” Because prompt-driven artificial intelligence participates in the typographic world, it is not merely a tool but a structure for discourse – a structure that insists upon a certain medical imagination of categorization and control while ruling out the kind of non-typographic intelligence gleaned from silent presence.
I can’t state this strongly enough: the relational work of silent presence in primary care is not a moral ornament on an encounter that can otherwise be reduced to the exchange of information between client and data-entry clerk. As pastors and counselors have known for decades, the art and quality of the therapeutic relationship often is the health care. As psychiatrist and theologian Warren Kinghorn and psychiatrist Abraham Nussbaum argue in Prescribing Together: A Relational Guide to Psychopharmacology, the effectiveness of even the most routine prescriptions improves in relationship, “accounting for 20% or more of outcome variance” that can’t be fully explained by a placebo effect. They write:
The prescribing clinician is not primarily an expert dispenser, but rather a seasoned accompanist and collaborator who walks with the patient through the terrain of his or her life, seeking both to understand the problem (for which diagnoses are useful heuristic guides) and to discern helpful ways forward.
A proportionally small amount of time spent with my patients is dedicated to the kind of content generation and heuristics served by artificial intelligence tools. The major diagnostic and therapeutic decisions are often determined up front. Like all physicians who bear the responsibility of lifelong learning, I too study UpToDate and consult my journals and peripheral brain. But the looking, accompaniment, candor, lament, coaching, and looking again are also the work of staying up to date. Such presence communicates to patients that they are not machines and neither am I.
I’m sure artificial intelligence will keep evolving toward a future like what my old medical school classmate imagined. Well-known pharmacies and insurance companies are already partnering toward that end. The best medical intelligence has a way of percolating downstream in due time, manifesting as a standard of care through position statements, decision trees, and the other fare of continuing medical education. Perhaps these are the fitting places to employ AI, rather than at the point of care. I don’t want to relive the folk tale of John Henry, in which a heroic Luddite bests a machine only to die of a broken heart.
At the same time, it is disturbing to imagine a future in which primary care clinicians become mere “prompt engineers” of diagnostic one-liners to submit to supervising clinical AIs. As psychiatrist Justin Key predicts in his science fiction short story “The Algorithm Will See You Now,” a breezy future of AI-augmented satisfaction already seems less likely than “a lot of algorithm-induced anxiety.” My cursor hovers over subscription links to Glass AI and Nuance DAX, betraying my fear of being surpassed or replaced. My medical assistant, a pre-medical student himself, is already keeping me on my toes by feeding questions to a clinical artificial intelligence between patients.
As a Christian as well as a physician, I agree with Pope Francis: “If we wish to follow Christ closely, we cannot choose an easy, quiet life.” Medicine is rarely quiet. It is certainly not easy. There is a difference between seeking a quiet life and choosing a life that places one among the disquieted. Primary care clinicians who long to practice silence may need to make the difficult choice of seeking practice models or forms of daily work that allow them to behold patients as creatures and not as machines or computers. As the Irish journalist Robert Lynd wrote, “In order to see birds it is necessary to become a part of the silence.” We might say that in order to see patients, it is necessary to become a part of the silence.
The aphorist Nicolás Gómez-Dávila said that “the two wings of intelligence are learning and love.” Primary care in the age of AI is at risk of becoming flightless, cursed with a beefy wing of intelligence that, powerful as it is, can only beat furiously against the wind while the wing of love drags along the ground, underused and atrophied.
It takes love to risk silence, because silence welcomes the beings before us to unfurl at their own pace. When it comes to the first moves of primary care, in which we are often welcoming patients to speak of their sufferings for the first time, silence may be the most intelligent form of communication we have. Henri Nouwen called this kind of listening the “spiritual hospitality by which you invite strangers to become friends, to get to know their inner selves more fully, and even to dare to be silent.”
I can only use my peripheral brain wisely if I have first dared to practice that intelligence. Maybe then I can become the kind of doctor who is a neighbor who is a poet, who does not wait with his compassion and imagination until the wagonload of smartphrases comes lumbering up. Maybe in daring to be silent I can become the kind of healer who, without chatter, returns life to life.
Rebecca Martin
Thank you for this powerful reflection, Brewer. I was struck by the image you cite at the end, of intelligence beating with just one wing of learning, and flailing impotently because its partner wing of love lies useless and atrophied. This piece also called to mind a recent moment in which I, now accustomed to the long spaces of silence that my own field of palliative medicine allows, became unsettled by the rapid-fire pace of a political interview broadcast on TV the other evening. That sense of disconnect – really almost of violence, as if the interviewer were treading far too heavily on the sacred ground of their guest’s soul – demonstrated the profoundly different formations wrought by our respective fields (i.e. the highly particular spaces of political media and palliative medicine). It was both jarring and reassuring to notice how my own innate sense of rhythm and pacing had been formed over years of attending to patients with the presence that palliative medicine affords. Thank you for this opportunity to pause and notice the value of silence.
John Schuster
Aside from the benefits to patients of personalized diagnosis and treatment through attentive care, I find after a long primary care internal medicine practice that the rewards I take away from successful care are not financial but based on the relationship achieved by “getting to know” the patient.
Michelle Vas
I loved your article. It is so profound. I find that it combines the essence of being human in relationships irrespective of whether you are a doctor or a patient; the concept of spiritual hospitality is truly divine, capturing the essence of true counselling, which is to me a calling for each and every human being.
Katie O
Thank you for this thoughtful and beautiful article. As someone currently searching for the right primary care doctor for my child and a health care worker myself I see daily the importance of listening and silence. For AI to create health it must be able to hear the whole person, not just signs and symptoms but fears and passions. Health is more than just biological smooth-working. Thank you again for a balanced article with people at the heart of health.
James Pile
More questions than answers here to be certain, but as a physician navigating the lengthening shadows of my late career, as well as the parent of a first-year medical student, I find myself pondering similar issues on a regular basis. I'm grateful to have been able to practice medicine when I did, but remain cautiously hopeful that my daughter and her contemporaries will have equally fulfilling, if vastly different, careers. As an aside, I'm regularly astonished by the beauty of the language in Plough's articles; this piece is no exception.
Ruth Ann Gattis
The thought of AI robots as doctors is very scary and very sad. It also makes me cross. I am an RN and worked for years in a small rural clinic. As a child, I got to know our family doctor as a friend, who listened to this silly little girl with her complaints. Years later, he was the counselor for our youth group. More years passed and I was working with him as an RN. From him I learned the importance of listening. Forward again, I had my own children. As he did their routine check-ups, he knew which questions to ask so I, as a mother, could ask those silly questions you’d never make an appointment for, but which brought peace of mind for just asking. He even told us young mothers, “The only silly question is the one not asked.” I always felt relaxed and seen as a mother who wanted to learn. He even had to tell me to come more often with questions, because my daughter had a lot of ear troubles. The very thought of this kind of caring, face-to-face medicine being factored out of an appointment is what makes me cross. We are human beings; we need the feeling of a caring person talking to us. This is already being lost in the overbooking of doctors’ time. If AI robot doctors take over, there will be an increase in loneliness and worse, I am sure.