On one of my last days in hospital, a medical student wandered into my room. She was by far the greenest staffer I had met; until then I’d been cared for by junior doctors, senior consultants and an army of experienced nurses and auxiliaries.
This student was just starting her training. She explained that she was learning to take medical histories. Although neither of us knew it, I’d been preparing for this moment for 25 years.
When I had first packed for the hospital, I’d optimistically brought with me the last bits of work I hadn’t cleared off my desk: a journal article to peer-review, a dissertation chapter to comment on. But I never imagined that being a historian would be directly helpful to my care. Much less had I realised all the other ways that thinking like a historian can be relevant in medicine.
Picture this: I am flat on my back in the empty pre-op room outside the operating theatre, chatting with the young nurse from the ward who will stay with me until they call me in for anaesthesia. By this time, we’ve become friendly.
We talk about her training in nursing and my work at a university. She tells me that her sister read history at university before becoming a midwife. I ask whether studying history has influenced her sister’s approach to midwifery. The nurse has been pacing, but she stops. “Absolutely. My sister accepts nothing, questions everything, and is never afraid to argue out a point. That’s her history training.”
The study of history is increasingly misunderstood: dismissed as obscurantist, or written off as the self-indulgent preserve of wealthy undergraduates. Luckily, the drive to recognise the value of history and the humanities is also growing. But these debates too often cast the humanities and sciences as opposites.
From my particular hospital bed, it seemed increasingly, blindingly clear how much humanities and sciences – in this case history and medicine – truly complemented each other. As Gretchen Busl wrote last year, training in the humanities teaches us “the language necessary to navigate a complex and rapidly shifting world”. For me, that world was the Victoria Hospital in Kirkcaldy.
I spent three weeks in hospital last year with a sudden, rare and aggressive infection. The long days dissolved into cycles of drug rounds, ward rounds and blood tests, and my options diminished until substantial surgery became the only feasible treatment. Throughout my time as an inpatient, the historian in me accepted nothing, and questioned everything, as I sought a coherent story that would explain the behaviour of my uncooperative body.
My nurse was adamant that her sister’s training in critical thinking – which is what we historians teach – made her a better advocate as a midwife. Similarly, my own training helped me to find the right questions to ask my doctors, the right words to describe my uncertainties, and the courage to discuss them (politely!).
But arguing and thinking critically are only part of what historians, like doctors, are trained to do.
We are also trained to do the best we can with ambiguity. When I sit down with a sheaf of 17th-century manuscripts, I have to surmise not only what their long-dead authors were thinking, but how they thought. I need to work out for myself not only why they wrote the words they did, but also what they didn’t say: what seemed obvious to them, what they wanted to hide, what parts of their language were coded or formulaic or deliberately vague.
I do this by immersing myself in the scraps of their world that have survived, and by weighing my own findings against those of my colleagues. I will never be able to pin down my authors to my satisfaction. The more we learn about any historical moment, the more it turns out to be contradictory and unresolvable.
With my own medical questions, that comfort with ambiguity turned out to be essential. The illness that had felled me resisted clear definitions and treatment. It cared no more about the models in medical textbooks than my manuscript authors care about my current grant proposals.
Ever stronger antibiotics didn’t cure me, and my surgery in turn revealed a far more pernicious, widespread infection than anybody had expected. My initial diagnosis had been insufficient. Two of the five or six consultants at my surgery still disagree: did I have one widespread infection, or two separate ones, coincidentally showing symptoms and needing treatment at the same time, perhaps preying on hidden pre-existing conditions? “Do you want the truth? The real truth?” one of the consultants challenged me. “I don’t know. We will never know.”
I think he meant to unnerve me. Shouldn’t surgery give us firm answers? Isn’t diagnosis the beginning of recovery? But I am a historian. And like any historian, I am trained to construct the most convincing interpretation possible from imperfect, incomplete, contradictory evidence, and to reassess whenever we learn more. I could handle the truth. Learning to think like a historian made me a better patient.
But that medical student, trying to become a good doctor, knew none of this about me. She asked for no biographical information, not even my name – just my age and nothing more. She took out a notebook and asked, “Why are you here?” So I described, first, the original diagnosis that had brought me in. Then, the symptoms and tests that had led up to the diagnosis. The treatment prescribed for me in hospital, and the complications that ensued. The new treatment and its prognosis. The unanswered questions left to me and the doctors.
Unconsciously, I applied exactly the sorts of divisions I use when I write up my own research: the hypothesis, the context and background, the evidence, the problems, the lasting import. When I finished, she was silent a moment, and then she said: “I must say, you are the best historian I have ever met.”
She was learning to think like a doctor, and she thanked me for helping her.
Emily Michelson is senior lecturer in history at the University of St Andrews.