The Small Matter of Implicit Bias

“No amount of money can reward the work and sacrifice of policemen, teachers and nurses. Their reward is in Heaven!”
– the late Michael Ghansah

I make the effort to see my patients before they are rolled back for surgery. I can easily say that 3 out of 4 times, when I walk up to the patients, the assembled family members assume I am an orderly who is taking the patient back to surgery. When I introduce myself as the anesthesiologist, the surprise or disappointment is always palpable.
Am I bothered by the reaction? Sure, who wouldn’t be in my situation? One just learns to live with it. The bigger question, though, is: what drives such an assumption?
Walk through most US hospitals. Most of the doctors are white, the janitors and orderlies black. The doctors are male, the nurses female. So most people see a black man and subconsciously think “Orderly”. I know several female colleagues who are addressed as “Nurse”!
It is not that one feels demoted by being seen as an orderly or a nurse. No! It is the implicit bias inherent in the assumption that is bothersome.

Implicit bias!
“….implicit stereotypes and implicit attitudes that are shaped by both history and cultural influences. Implicit biases encompass the myriad fears, feelings, perceptions, and stereotypes that lie deep within the subconscious; they act on those memory records and exist without an individual’s permission or acknowledgement. In fact, implicit bias can be completely contradictory to an individual’s stated beliefs—a form of conscious-unconscious divergence.”

Unlike explicit bias, implicit bias is unconscious: you may assume that every woman who works in the hospital is a nurse without ever realizing you hold that assumption. Explicit bias, by contrast, is known and accepted prejudice or even hatred for a race or a people – racism.

Since patients are not really in a position of power relative to me, their biases do not significantly impact my day. The situation is quite different when the roles are reversed and the one with the bias holds the power – say the physician one sees in the ER, a sentencing judge, or a cop with a gun.
In a 2012 study of how pediatricians treated patients in pain, white patients were more likely than black patients to be prescribed pain medication.
Researchers found that, after controlling for numerous factors like severity of the primary offense, number of prior offenses and use of force, individuals with the most prominent Afrocentric features received longer sentences than their less Afrocentrically featured counterparts. So if you had dark skin, a wide nose and full lips, you were toast!
Racial bias in policing has been in the news lately. Events like the killing of Michael Brown in Ferguson, MO, and Tamir Rice in Cleveland thrust the issue into the national conversation. The killing of Alton Sterling yesterday in Baton Rouge and of Philando Castile in Minnesota today has only escalated the tensions. Movements like Black Lives Matter, All Lives Matter and Blue Lives Matter have arisen to defend different sides of the debate.

One thing is for sure though. Implicit bias plays a huge role in law enforcement and the broader judicial system in this country.
Cops intervene disproportionately with blacks and Hispanics. They are arrested or ticketed, searched, stopped or even surveilled more. Blacks are also apt to have force used against them more. One reason is that there could be more crime among minority groups. The other reason is police bias and prejudice. Implicit bias.

Armed

In 1999, a 23-year-old Guinean immigrant, Amadou Diallo, was mistaken for a wanted serial rapist and shot by four New York City plainclothes cops. 41 shots were fired. 19 hit him. He died. The cops were indicted, tried and acquitted.
Shortly thereafter, researchers in Chicago and Denver started looking at the issue of implicit bias in the way white cops treated black suspects. They recruited subjects from the community as well as cops. The subjects were shown scenarios in which they had to decide to “Shoot” or “Don’t Shoot”. The scenarios contained armed and unarmed whites and blacks.
White subjects were quicker to shoot an armed man if he was black than if he was white. They were also quicker to decide NOT to shoot an unarmed white man than an unarmed black man. In other words, they were more prone to mistakenly shoot an unarmed black man and to mistakenly hold their fire on an armed white man.
Researchers also found a more pronounced neurophysiological threat response when subjects were faced with a Black suspect and that this correlated with how fast they pressed “Shoot”.
A later study in Denver found that, compared to people from the community, cops were less prone to this bias – that of shooting an unarmed black man over an armed white man. The researchers attributed this to the high-quality use-of-force training that several police departments had instituted.
Interestingly, a recent study in Spokane showed that in some cases, cops may use less force against blacks – possibly putting the cop in danger – due to the media and legal backlash.

The good news is that in medicine, as well as in law enforcement and the judicial system, the issue of implicit bias is now recognized, and several psychologists are doing great work developing training programs to reduce it. The bad news is that progress is slow.
Can implicit bias be totally eradicated? I do not think so; after all, we are human, and prejudice is as old as the human race itself. That surely does not excuse the killing of unarmed suspects, but recognizing the bias helps pinpoint where changes can be made.
Often the debate about policing and race is split along color lines, with most minorities cognizant of a problem and most whites thinking there is none. Well, to my white friends who think there is no problem, I have news for you – THERE IS A PROBLEM!
One should appreciate the work cops do – laying their lives on the line daily to protect us. It is an unenviable job in a society awash in guns. However, the disproportionate use of force against black suspects is an issue that will not go away unless it is addressed. It creates mistrust, costs lives and makes cops afraid to do their job for fear of media and legal backlash.

I’ll end with another experience. We were flying back home and made a connection in Atlanta. As we were boarding, I couldn’t help but notice the passport of the gentleman in front of me. The inscription was in Arabic. My heart started pounding. On board the plane, I sought him out and kept an eye on him all through the flight. When he headed for the bathroom, my fear went up a thousand notches. My relief when we landed was beyond description. When I decided to write this today, the memory of my reaction came back to me. It was the memory of my own implicit bias.

Role of Disease in Sub-Saharan Africa – Another Take

Sub-Saharan Africa (SSA) seems to be the crucible of disease. Most of our modern-day epidemics seem to emanate from this region – HIV and Ebola, to mention just two that have caused significant mortality.
Disease in SSA is however nothing new. The region has always had numerous infectious and vector-borne diseases.
I seek to argue that the prevalence of disease in SSA might have changed the course of its history.

Let’s go back several hundred years, to the 1490s. This is the period when Columbus landed in the Caribbean and initiated the massive migration of Europeans to the New World, as it was called. Through the activities of the migrant Europeans and the diseases they introduced, millions of Native Americans were wiped out.
Now, SSA was “found” around this same time period. It ultimately became the source of manual labor for the cotton and sugar cane plantations in the so-called New World. So why didn’t SSA see the same level of European migration that the Americas saw?

One argument is that black Africans were seen as an optimal manual labor force, and so their bodies were prized over their lands. Some have also argued that SSA was more densely populated than the American continent. Yet another is that the Africans mounted a much stronger resistance against the Europeans than the Native Americans did.
The argument I tend to favor is the role of disease, and specifically malaria. Malaria, a disease to which most indigenous Africans develop some form of immunity over time, is devastating for anyone contracting it for the first time. It killed quite a number of European settlers, and this dampened any desire for exploration of the continent. A glimpse of what could have been is seen in South Africa, a region with a climate and disease profile much kinder to the European settlers.

Malaria has been known as a disease since the time of Hippocrates. In ancient times it was attributed to bad air. The term itself was coined in Florence: the historian and chancellor of the city, Leonardo Bruni, used “mala aria” in his Historia Florentina around 1400:
Avuto i Fiorentini questo fortissimo castello e fornitolo di buone guardie, consigliavano fra loro medesimi fosse da fare. Erano alcuni a’ quali pareva sommamente utile e necessario a ridurre lo esercito, e massimamente essendo affaticato per la infermità e per la mala aria e per lungo e difficile campeggiare nel tempo dell’autunno e in luoghi infermi….
After the Florentines had taken this mighty stronghold and furnished it with good guards, they discussed among themselves how to proceed. To some of them it appeared most useful and necessary to reduce the army, all the more so as it was extremely stressed by disease, by the mala aria (bad air) and by the long and difficult campaigning in the autumn season and in unhealthy places…
The word was introduced into English in 1740 by Horace Walpole – “There is a horrid thing called the malaria, that comes to Rome every summer, and kills one” – and into the medical literature by John MacCulloch in 1827.

So Europeans knew of malaria, and they soon found out about other diseases that killed them in droves – dengue, yellow fever and the bugs that cause dysentery. Even David Livingstone, the explorer and missionary, died of malaria and dysentery. The cattle the Europeans tried to raise were also killed off by disease.
Unlike in the Americas, Australia, the Polynesian islands and parts of South Africa, where European diseases killed off the natives, in SSA the opposite occurred.
True exploration of the continent only started in the mid-1800s, shortly after quinine was found to be an effective treatment for malaria. And then the true face of European colonization showed itself.

For Native Americans and Africans from the sub-Saharan region, the “discovery” of their respective continents by the European explorers of the 15th century has spelled nothing but misery. For most, the misery still continues.
Unlike the Native Americans, most Africans still have control of their lands, even if they are still massively exploited by richer nations and their own corrupt leaders.
Even as disease continues to be a major factor in the lives of most people in SSA, let’s not forget that malaria might have been the one thing that saved us from extermination.

It can be chronic

One day, if I get the platform, this is an address I would love to give to the Ghanaian public. The topic will be the issue of Chronic Diseases.
It’ll probably be in the evening and I’ll probably start like this…

Good evening, ladies and gentlemen. Thanks for tuning in. Tonight, I want to talk to you about the issue of Chronic Diseases.
Everyone knows what a disease is and what it entails. It makes you sick and forces you to seek treatment. Some of you go to see a doctor. Others opt to see a traditional healer or herbalist. A few may do the unthinkable and just hope the disease goes away by itself. Whatever the measure taken, the hope is that the doctor, traditional healer, herbalist or time heals the body of the disease and brings back normalcy and good health. That is the general expectation.
However, what if the disease is such that no matter what the doctor or herbalist does, it does not go away or keeps coming back? There is a chance that the condition is not being treated well. However, there is also the possibility the disease is chronic.
Chronic diseases are long-term medical conditions that are generally progressive or persistent. They last longer than three months. Examples of chronic diseases are high blood pressure, diabetes, end-stage kidney disease, asthma, hepatitis C and cancer.

It used to be thought that the most important health issues in sub-Saharan Africa (SSA) were infectious diseases, e.g. malaria, tuberculosis and cholera. Infectious diseases are still a major health risk in SSA, but I contend that a few chronic conditions are also exacting a heavy toll on the populations of the region. I further seek to explain why the mindset of the population might affect how these chronic diseases are managed.

Of the examples of chronic diseases listed above, the two most common in Ghana seem to be high blood pressure (HTN) and diabetes.
In 2004, a study of the Greater Accra area found an urban prevalence of HTN of 32.9% and a rural prevalence of 24.1%. Similar studies in the Ashanti Region yielded prevalences of 33.4% and 27%, respectively. A review of about seven studies by Addo puts the rural prevalence at about 19% and the urban at about 55%.
Diabetes, on the other hand, has a prevalence of about 6 – 9%, that is, over 2 million people (a rough check of this figure follows below).
The mean age of diagnosis for both diabetes and high blood pressure seems to be in the mid-thirties.
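
As a rough check of that head count – a sketch only, and the population figure is my own assumption rather than a number from the studies cited above:

```python
# Rough arithmetic behind the "over 2 million" figure.
# Assumes a total Ghanaian population of roughly 28 million (my assumption).
population = 28_000_000
low, high = 0.06, 0.09  # the quoted diabetes prevalence range

print(f"{population * low:,.0f} to {population * high:,.0f} people")
# -> 1,680,000 to 2,520,000
```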

Now both high blood pressure and diabetes can lead to very serious healthcare problems down the line if not managed well.
High blood pressure can lead to stroke, congestive heart failure and to loss of kidney function and even death.
Diabetes can lead to loss of limbs, blindness, loss of kidney function and death.

So it is of utmost importance for these diseases to be managed well. That is where the problem of the mentality comes in.
If a disease is always seen as a condition that can be treated away (acute), then getting patients to accept the fact that a disease can be chronic presents a problem.
Patients may not believe the diagnosis and may seek a second opinion from another physician or a traditional healer or, even worse, do nothing about it. The fact that some diseases cannot be cured can be a bitter pill to swallow.

Then there is what I call “The Falsehood of Eternal Youth and Health”. Excuses like “I’m too young to have high blood pressure” or “too active to have heart disease” are common. Well, youth passes, even if slowly, and good health is not always assured. Besides, black people seem to have a propensity for certain ailments, one of them being high blood pressure.

The other issue is that chronic diseases demand a much higher level of patient involvement in managing the disease. Insulin must be injected, tablets taken daily for that high blood pressure, diets, exercise….it gets overwhelming and the onus is totally on the patient. Further, managing chronic diseases can be a financial as well as time drain. Consider having to undergo hemodialysis three times a week or the cost of medications to manage high blood pressure.
It is not at all surprising that faced with these prospects, some patients seek another way out – a cure from, say, a traditional healer or herbalist or just denial. In our Ghanaian culture, the rush to attribute a chronic ailment to a supernatural cause is ever present. That unfortunately leads down a path of figuring out the supernatural cause instead of treating the disease.

The Swiss psychiatrist Elisabeth Kübler-Ross describes in her 1969 book, On Death and Dying, five stages that patients go through when faced with the diagnosis of a severe illness. They are Denial, Anger, Bargaining, Depression and Acceptance. One is supposed to move from one stage to the next and finally end up accepting the diagnosis. It is the acceptance of the diagnosis that propels a patient to deal with it and live with it. If one never gets to the acceptance stage, the result is the early arrival of the end-stage effects of the disease.

This can be devastating if the patient is young, say, in their thirties.
If diseases like diabetes and high blood pressure, diagnosed when one is thirty, are mismanaged or neglected, then by the time one is in one’s forties the end effects of these diseases are apparent and lead to early morbidity and mortality.
These diseases need to be taken seriously and managed well from the minute a diagnosis is made, irrespective of age.

The great thing is that, with all the advances in modern medicine, chronic diseases can be managed so as to ensure a long life. It does, however, take active – very active – participation by the patient.
Diabetes and high blood pressure are not the only chronic diseases. Like I mentioned earlier, I used them since they are the most common. There are of course people dealing with diseases like HIV, Parkinson’s, cancer, arthritis, asthma, kidney failure and congestive heart disease. Whatever the disease is, the same principle applies – acceptance.

So, to summarize: not all diseases are acute, like malaria, and can be treated away. There is a class of diseases that persist and are chronic. Recognizing their severity and accepting them as manageable conditions is the first step toward avoiding their late-term effects.

Thank you for listening and take good care of yourselves. You deserve that. Have a good night.

Once Upon a Time

“Somewhere, something incredible is waiting to be known.” – Dr. Carl Sagan

In August of 1810, the English writer Frances Burney, then living in Paris with her husband, General Alexandre D’Arblay, and their son, Alexander, developed pain in her right breast. She saw several French doctors. Fearing she had breast cancer, they ultimately recommended a right mastectomy. The surgery was scheduled for September 30, 1811. In a period when surgery was done without anesthesia, she was given only two hours’ notice, so she wouldn’t be frightened off. The surgery was attended by “7 men in black, Dr. Larrey, M. Dubois, Dr. Moreau, Dr. Aumont, Dr. Ribe, & a pupil of Dr. Larrey, & another of M. Dubois”. She was given some wine cordial, after which she lay on the mattress designated as the operating surface. Her face was then covered with a transparent handkerchief. Below is part of her account of what happened:

“When the dreadful steel was plunged into the breast—cutting through veins—arteries—flesh—nerves—I needed no injunctions not to restrain my cries. I began a scream that lasted unintermittingly during the whole time of the incision—& I almost marvel that it rings not in my Ears still! So excruciating was the agony!”

Frances Burney survived the mastectomy and lived another 29 years, dying at the age of 87. After the operation, she couldn’t think or speak of it for several months and had terrible headaches, but she had the courage to write an account of it to her sister.

Surgery before the discovery of anesthesia was so barbaric that some surgeons got drunk together with the patient before the ordeal. Surgeons were known to enter the theater with a bottle of wine in each hand – one for the patient and one for the surgeon. John Hunter, an early Scottish surgeon, described surgery as ‘a humiliating spectacle of the futility of science’ and the surgeon as ‘a savage armed with a knife’! Techniques used to make surgery bearable for patients included alcohol, opium, knocking the patient out with a blow to the jaw, rubbing them with stinging nettles and hypnosis. Patients would vomit and aspirate, blood loss was massive and the screams were haunting.
Now surgery is a painless, safe affair thanks to all the advances in anesthesiology, surgery and pharmacology.

Medicine and surgery have come really far from the days of Hippocrates. As a matter of fact, they have come even farther in just the last hundred years! Beliefs have changed and practices have improved thanks to research. Whole diseases like smallpox have been eradicated, even if we have replacements like HIV.

Talking about beliefs: the ancient Greeks, based on the teachings of Plato and Hippocrates, believed a woman’s womb was a separate creature with a mind of its own. When a woman did not bear children or abstained from sex, her uterus, hungry for children, could dislodge and float freely about her body, causing shortness of breath, seizures and mania. Women were advised to marry young and have a ton of kids. For a womb that had already broken free, doctors would “fumigate” the patient’s head with sulfur and pitch while simultaneously rubbing scented oils between her thighs. Why? The womb would flee from the bad smells and move back into its rightful place!

Or take the practice of bloodletting. The practice was common in ancient Egypt and got carried over to the Greeks, with Hippocrates and Galen being huge proponents. It was believed then that the human body contained four “humors” – blood, phlegm, black bile and yellow bile – each associated with a particular organ: the heart, brain, spleen and gall bladder, respectively. Disease was thought to arise from an imbalance of the humors, often an over-abundance of blood, so “letting” blood out of the body brought the humors back into balance and healed the sick. Menstruation was seen as the body’s natural way of bloodletting to balance out the humors.
There are conditions today, like polycythemia vera, in which the body produces too many red blood cells. A treatment option is bloodletting (phlebotomy), and it is still done. Back then, however, bloodletting was done for any and all ailments!
Bloodletting was common practice till the 18th century in Europe and the 19th in the US. As late as 1942, there were medical textbooks with bloodletting as a therapeutic procedure for all sorts of conditions. Dr Benjamin Rush, one of the signatories of the Declaration of Independence, was a fierce proponent of the practice.
The most famous victim of bloodletting in the US is probably the first president, Gen. George Washington. He had fallen sick with a severe throat infection and was unable to swallow. On December 14, 1799, he asked his physicians to perform bloodletting on him. They let out 124 – 126 ounces (about 3.75 L) of blood over a ten-hour period. The president weighed about 230 lb (104.5 kg) and was 75 in tall. Since the blood volume of an adult male is about 70 ml/kg, his total blood volume was about 7.3 L. This means that, on the day he died, his physicians let out half of his blood volume. We know today that that amount of blood loss leads to profound hypotension, shock, organ hypoperfusion and death. No wonder the president appeared calm before his death. He was probably in shock!
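For the curious, the figures above can be re-derived with a quick back-of-the-envelope calculation. Here it is as a small Python sketch; it only restates the 70 ml/kg estimate and the reported weight and volume of blood removed:

```python
# Back-of-the-envelope check of the Washington bloodletting figures above.
weight_kg = 230 * 0.4536                 # ~104 kg, from the reported 230 lb
blood_volume_l = weight_kg * 70 / 1000   # 70 ml/kg -> ~7.3 L total blood volume

blood_let_l = 125 * 29.6 / 1000          # ~125 fl oz removed; 1 fl oz ~ 29.6 ml -> ~3.7 L
fraction = blood_let_l / blood_volume_l  # ~0.5, i.e. about half his blood volume

print(f"Blood volume ~{blood_volume_l:.1f} L, removed ~{blood_let_l:.1f} L ({fraction:.0%})")
```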
Luckily, there were men like Pierre Louis (1787-1872) and John Hughes Bennett (1812–1875) whose statistical analysis of medical data then helped put an end to the practice.

It’s not only beliefs and practices that have changed over time but also some of the drugs used. To mind comes mercury. A heavy metal, it is now known to be a very toxic substance. It comes in three forms – elemental mercury, inorganic salts and organic compounds. The organometallic compound methylmercury is the most poisonous. Nowadays, exposure to mercury comes mostly from whale and dolphin meat, certain fungicides and skin-lightening products.
Mercury causes irreversible damage in fetuses, infants and young children. It damages neurons and causes cerebral infarctions, producing Minamata disease (ataxia, visual-field loss, psychiatric disturbances, sensory loss and chronic paresthesias). It is also toxic to the kidneys. The phrase “mad as a hatter” comes from the 18th and 19th centuries, when mercury compounds used in the production of felt hats poisoned the workers. So we can agree that mercury is bad.
Well, the Persians, Greeks and Chinese thought it increased vitality. The Chinese emperor Qin Shi Huang died after ingesting mercury pills designed to make him immortal! From the 17th to the 19th century, a drug called “Blue Mass”, containing about 33% mercury, was used to treat syphilis, tuberculosis, constipation, toothache, parasitic infestations and the pains of childbirth. From about 1930 until 1999, Thimerosal, a mercury-containing preservative, was used in some vaccines. Thankfully, there is great awareness of the dangers of mercury now.

It is definitely a great time to be practicing medicine. Even as we hurtle along, ever adding newer drugs, treatment options and procedures to our armamentarium, let’s not forget where we as healers have come from.

Take Subjectivity Out

“Nothing that has value, real value, has no cost. Not freedom, not food, not shelter, not healthcare” – Dean Kamen

A young woman presented for a thoracotomy to remove a mass in her right chest. On my way to see her, her nurse accosted me and told me of the patient’s demands. She didn’t want to wake up in pain – fair enough – but she didn’t want an epidural either! It went on: no ribs were to be broken, she didn’t want a Foley catheter, and she wanted to be discharged the next day. I actually laughed out loud when the nurse told me this.
I walked up to the patient with a smile on my face that got wiped off by the chilly reception. I tried to explain to her that her demands were unrealistic. She wouldn’t hear of it. I called in the surgeon for reinforcement. We lost the battle. She walked out.
Now imagine there was a tool online that allowed patients to rank the quality of care they received at a hospital based on several questions about interacting with the doctors and nurses, like “Were they responsive to your needs?”
What do you imagine this young woman’s ranking would look like?
Oh yeah, there are already such online tools and it is interesting to read through them.
Which brings me to today’s question – “Using the responses from patients about their care in an inpatient setting, can one really extrapolate the quality of care received?”
My answer to that is a big NO!
In medicine (and probably in other professional fields too), there is always this big disconnect between the physicians who actually take care of patients and the experts who hardly take care of patients and so have time to make policy.
Irrespective of what so-called policy makers in some big institutions might preach, using patient satisfaction surveys as a window into the quality of care received in the inpatient setting is going down a very slippery slope of subjectivity.
Even surveys about patient satisfaction in the outpatient setting have been shown not to capture the real issue at hand but rather how long patients waited!
Healthcare is a huge sector. Certain sections can be evaluated using consumer feedback, e.g. nursing care, drug development, emergency care, public health and the control of chronic diseases. However, when it comes to inpatient care, do you honestly think you can get a patient to objectively tell you through a questionnaire how well a surgeon took out a tumor, replaced a heart valve or a knee, or treated pneumonia or a heart attack?
How many of us know the surgeon with the hands of Asclepius but the demeanor of Gunnery Sergeant Hartman from the 1987 movie “Full Metal Jacket”?
What is mostly obtained from these patient surveys is the quality of nursing care and the human-to-human interactions. They also capture complications and I’ll come to that later. The reviews however miss the meat – the quality of medical care or surgical interventions.
Now, is it necessary for the public to have an idea of the quality of care they’ll get from a hospital? YES! Healthcare is a service industry and I think it is important for patients to have such a tool.
Let’s say you need to have a hip replaced. Imagine you could go into a tool that showed you each orthopedic surgeon in town, how many hips they did a year, which age group, which ASA class, length of a procedure, number and nature of complications, incidence of transfusions, length of stay and cost. Wouldn’t that be a much better tool than patient surveys?
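To make the idea concrete, here is a minimal sketch of the kind of per-surgeon record such a hypothetical tool might expose. Every field name is my own illustration; it does not refer to any real registry, survey or API.

```python
from __future__ import annotations
from dataclasses import dataclass

@dataclass
class SurgeonOutcomes:
    """One surgeon's yearly outcomes for a given procedure (illustrative only)."""
    surgeon: str
    procedure: str                  # e.g. "total hip replacement"
    cases_per_year: int
    median_patient_age: int
    asa_classes_treated: list[str]  # e.g. ["I", "II", "III"]
    avg_procedure_minutes: int
    complication_rate: float        # fraction of cases with any complication
    transfusion_rate: float
    avg_length_of_stay_days: float
    avg_cost_usd: int

def candidates_for_high_risk(records: list[SurgeonOutcomes]) -> list[SurgeonOutcomes]:
    """Surgeons who routinely take ASA III/IV patients, lowest complication rate first."""
    high_risk = [r for r in records if {"III", "IV"} & set(r.asa_classes_treated)]
    return sorted(high_risk, key=lambda r: r.complication_rate)
```

An ASA III or IV patient could then filter for surgeons who routinely handle comparable cases – exactly the kind of comparison a satisfaction survey cannot offer.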
Now how would such a tool be set up and populated? Setting it up would be the least of any developer’s worries. Populating it is the problem. The only way to get that information is to make hospitals report outcomes for each physician who works at that hospital. That way, consumers can compare. No hospital in the US is going to do that! They collect it but they aren’t sharing it!
And so policy makers grasp at straws by designing surveys based on patient experiences that seek to eke out the quality of care.
However, I think hospitals should publicize this data. It will allow competition in the marketplace, weed out the bad practitioners and lower costs. It will allow patients to choose the best surgeon for their needs. An ASA Class III or IV patient can look for a surgeon who does mostly ASA Class III and IV patients.
It also prevents doctors and hospitals from having to deal with policies that make practicing more difficult but do not really improve quality of care. I think we in healthcare should be more proactive in measures that bring patient care to a pact between a doctor and a patient, excluding insurance companies and government.
So next time you fill out a patient satisfaction survey, ask yourself whether it really captured the quality of care. It probably did not, but these surveys can capture outcomes in terms of complications. The question then is, “Are complications an indicator of quality of care?” To that I’ll respond with yes and no. Complications can be a window into the quality of care, but they can also be very much patient-related. Going back to hip surgery: if a surgeon does 350 hips in a year and 50% of them get infected, there is a problem. However, if a patient goes home after valve replacement surgery with a mechanical valve, forgets to take his warfarin and the valve clots off, that is on the patient.
As a consumer of healthcare, I’d like to know what I’m getting when I walk in for a procedure. I wish there were a better way to tell than through subjective responses.

Protocolize It!

We live in the era of big data. With the introduction of electronic medical records, big data is also alive and well in medicine. Mining that data can help establish therapies that are most effective in the majority of patients. The mined data plus results from large scale prospective, randomized studies then result in recommendations and protocols that are supposed to improve patient outcomes.
A majority of physicians have historically looked at medicine as an art. Each physician had his or her way of treating ailments, often tailoring them to fit individual patients.
Medicine however is moving in a direction where the “Medicine as an Art” crowd is on the edge of extinction.
Who is right? Should the practice of medicine be based on protocols or should it be practiced as an art?

I’ll start off the discussion with two examples:
Close to a million Americans suffer from strokes each year and it’s the number 4 killer in the US.
For years, different hospitals and physicians have managed patients with strokes differently. Studies show that if patients having ischemic strokes are given intravenous tissue plasminogen activator (tPA) to bust the clot causing the stroke within 60 min of arriving at a hospital, their chances of survival go up significantly. However, a study in 2014 showed that less than 30% of ischemic stroke patients were being treated this way. On the other hand, hospitals that had established protocols to facilitate this recommendation lowered the incidence of death and disability from stroke.
Another area of concern is that of medical errors. The “To Err is Human” report sounded the alarm bell in 1999. That in part led to the institution of the Surgical Safety Checklist and the “Time Out” for all surgical procedures. A 2009 study in the NEJM showed a drop in deaths from 1.5% to 0.8% after institution of the checklist and “Time Out”. Inpatient complications dropped from 11% to 7%.

These two examples illustrate the fact that protocols based on science and solid evidence can positively affect outcomes.
Should this then be extrapolated to all of medicine? Should every decision we make be decided by protocols culled from studies and hard data?

Which brings me to the other side of the coin.
Say a study looks at a therapy for, say, prostate cancer in 1,000 men. If the therapy is effective in 86% of the men and it gets adopted, what happens to the 14% who do not benefit from it? Extrapolate that to a million men and that is 140,000 who do not benefit from the new therapy. A good protocol has to allow a physician to cater to this group.
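Spelled out, the arithmetic behind that concern is simply this (the numbers are the hypothetical ones from the example above):

```python
# The hypothetical prostate-cancer therapy from the example above.
study_size, response_rate = 1_000, 0.86

non_responders_in_study = round(study_size * (1 - response_rate))    # 140 men
non_responders_per_million = round(1_000_000 * (1 - response_rate))  # 140,000 men

print(non_responders_in_study, non_responders_per_million)
```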
Recent recommendations about two screening tests, one affecting women and one affecting men, have raised the ire of patients. The first is mammography to screen for breast cancer in women; the second is the prostate-specific antigen (PSA) test to screen for prostate cancer in men. In both cases, based on data, the opinion was that the tests led to an increase in false positive diagnoses of cancer. In other words, patients were thought to have cancer who did not. This led to further unnecessary testing and procedures. With the PSA, it is also thought that a lot of small prostate cancers are diagnosed which, left alone, would never grow to be a problem. Now imagine telling a patient:
“You have cancer but it is so small we are going to leave it alone. You will outlive it.”
Sure, in a calm and reassuring manner a doctor can try to make the patient understand, but how many can bear to live with that uncertainty? Then there is also the possibility that that small cluster of cells could get bigger… So why not get them out now?
Even if these cancers are small, isn’t it the smart thing to do to diagnose them and follow them? Doesn’t that make these screening tests then necessary? Doesn’t that give the patient a choice?

The point I am trying to make is that, in spite of all the data, there are these people called patients whom we are supposed to serve. They are ruled by emotion and are not always as rational as the data and evidence. Is it part of “doctoring” to do whatever is possible, short of causing harm, to reassure these patients?
So on one side are those who preach strict adherence to the evidence, and on the other those who want to tailor things to the needs of the patient and the habits of the physician.

Into this fray drops genomic medicine, an emerging discipline that bases therapy on a patient’s genome. It is a well-known fact that some drugs (e.g. Plavix) do not work in some patients because of a lack, or an excess, of certain enzymes. Before a particular therapy is initiated, the genetic make-up of the patient is determined. This approach is now used extensively in psychiatry to find effective therapy.
This shows that in spite of the data or evidence, there are still individual variations.

All these arguments may not matter because of the Affordable Care Act.
The Affordable Care Act has decimated private medical practice to the point where the majority of physicians are now employed by hospitals. The Act also rewards physicians whose practices are in line with the latest, most effective therapies and management modalities. Hospital administrators are then going to compel their physicians to practice in accordance with protocols that fit the best recommendations. In that sense, the autonomy of the physician may already be a thing of the past, and patients’ choices may be slowly narrowed to a few options.

All this makes me wonder what role the physician may play in medicine in the future. If every decision we make is based on a protocol, what will happen to the practice of medicine as we know it? Besides surgeons, are any other specialties even needed if all one needs is to follow a protocol? Protocols so simple that even a caveman can follow them? What are we then good for?

Disruption

“All the companies in the United States and Europe and Japan, they have experts, and the experts are surgeons and they said it is absolutely not possible. We would kill the patients on the table.” – Alain Cribier, Cardiologist, Innovator of the TAVR Procedure

On April 9, 2002, a 57-year old man presented at the Cardiology Clinic of L’Hôpital Charles Nicolle, in Rouen, France. He had severe aortic stenosis with a valve area of 0.6 sq cm (normal is 4 – 6 sq cm), a mean gradient of 30 mmHg (normal < 5 mm Hg), was in cardiogenic shock (systolic blood pressure was 80) and had a left ventricular ejection fraction of 14% (normal 55 – 60%). Due to all his co-morbidities – chronic pancreatitis, severe vascular disease, silicosis, history of lung cancer – he had been refused surgery at several centers.

A cardiologist at the hospital, Alain Cribier, had, in 1985, successfully used a balloon to open up stenotic aortic valves, a procedure termed balloon valvuloplasty.
That day in April of 2002, Dr Cribier performed balloon valvuloplasty on the aortic valve of the patient. He improved initially, with the gradient falling to 13 mm Hg and the valve area increasing to 1.07 sq cm. However, over the next week, the patient’s condition steadily declined. By about the sixth day after the valvuloplasty, his ejection fraction was 8 – 12% and, in spite of support with medications, his systolic blood pressure was only 70.
A week after the valvuloplasty, on April 16, 2002, Alain Cribier made a decision that made history and is disrupting cardiac care in a way not seen since the introduction of cardiac stents about ten years earlier.

After obtaining permission from the hospital’s Institutional Ethics Committee, he took the patient to the cardiac cath lab where, by accessing the femoral vein in the patient’s groin, he introduced an investigational heart valve to replace the stenotic aortic valve of the sick 57-year-old man. He didn’t open the patient’s chest, as was the norm. He did everything from outside the body! The patient’s condition improved significantly. He did, however, die four months later from unrelated issues.
Dr Alain Cribier had just performed the first percutaneous valve replacement in a human! The rest, as they say, is history.
In 2004, Edwards bought, for $125 million, the company Dr Cribier had started to make the valves and the equipment to deploy them.
That procedure, now called Transcatheter Aortic Valve Replacement (TAVR), has been approved by the FDA for patients with aortic stenosis (NOT regurgitation) who are terrible surgical candidates – patients not expected to survive surgery. It is presently performed in about 400 centers in the US. As I write, approval is pending for use in intermediate-risk patients, and it is quite possible that TAVR will replace the surgical approach in the next few years. In Europe, TAVR is already used 60% of the time for the treatment of aortic stenosis.

A disruptive technology is one that displaces an established technology and shakes up the industry or a ground-breaking product that creates a completely new industry. Disruptive technologies are innovations often seen in the Tech world – like what Netflix did to Blockbuster or what the iPad did to personal computing or what the DVD did to the VHS.
The percutaneous approach to replacing the aortic valve is one such disruptive innovation.

For years, the only way to replace a stenotic aortic valve was to open the patient’s sternum under general anesthesia, place him or her on cardiopulmonary bypass, open the aorta, excise the old valve and sew in a new one. That demanded a team made up of the surgeon, an anesthesiologist who also acted as the echocardiographer, and a perfusionist.
For a TAVR, the cardiologist or surgeon punctures the femoral artery in the groin, introduces a wire with a balloon at the tip into the aorta and feeds the new valve, in a collapsed state, over this wire all the way up to where the old valve sits. The new valve is then expanded with the balloon. It squishes the old valve against the wall of the aorta and, in the process, takes its place. In most centers now, only a cardiologist or a surgeon performs the procedure!

If this becomes the dominant way of doing aortic valve replacements, tell me it’s not disruptive.

Years of accepted surgical practice will all of a sudden be made almost archaic. Why almost? Well, there will always be conditions where TAVR might not work and surgery is the only way (e.g. endocarditis), or the patient may need other procedures. However, it is possible that the majority of the estimated 1.5 million patients with aortic stenosis will undergo percutaneous rather than surgical replacement.
Also, what this approach has done is spawn attempts to develop percutaneous approaches to replace or repair other heart valves. The Melody valve is already available to replace the pulmonary valve, a procedure often needed in children. There are also several percutaneous mitral valves in the works. The mitral clip is used to treat severe mitral regurgitation in the severely ill.

TAVR is not without its complications, but it provides an alternative way of delivering a needed service.
The history of cardiac care is marked by disruptive innovation. By the 1970’s, coronary artery bypass graft surgery (CABG) was the main way to treat coronary artery disease. Then Gruentzig developed percutaneous transluminal coronary angioplasty (PTCA) in the late 1970’s and brought an exciting new dimension to the treatment of coronary artery disease (CAD). (Gruentzig later migrated to the US and joined the faculty at Emory!) Then followed the development of stents. These developments were also very disruptive and dislodged the grip surgeons had on the treatment of coronary artery disease.
These very stents were what planted the idea of percutaneous valves in the mind of the Danish cardiologist Henning Rud Andersen of Aarhus University in Jutland. Even though Cribier was the first to perform a percutaneous implantation in a human, Andersen was the first to try it out in pigs.
And now we are seeing a future of percutaneous valves.

Clayton M. Christensen coined the term “disruptive innovation” in the 1990s, first in a 1995 Harvard Business Review article and then in his 1997 book “The Innovator’s Dilemma”. He defined it as:
“An innovation that creates a new market and value network and eventually disrupts an existing market and value network, displacing established market leaders and alliances.”
I argue that the TAVR is doing just that.

You are because of what you do

Human nature…..a very interesting thing indeed.
After working with people for a while, one starts noticing little things that tend to be intriguing and interesting.
Who we are is a function of a lot of things including our character, upbringing and environment. However, is there also the chance that we are what we do?
Let me explain by asking a question:
Would the stern demeanor a teacher has to assume around rowdy elementary school children every single day soon translate into a stern bearing?
In dealing with patients, I tend to notice certain tendencies that are peculiar to certain professions. Now, the following descriptions are purely observational and are not backed by any kind of science. These observations are also in no way a form of profiling, because they have no bearing on how I treat my patients. I just wonder if they somehow support the claim that you are what you do.
So let’s get started:
I have noticed that patients who are teachers still maintain that stern demeanor even when facing surgery and anesthesia. Facing a teacher, I always feel like I forgot my homework. They seem to have assumed control and I come across as being there to do their bidding. Even when they are asleep, they still look and exude that teacher look! It is an interesting dynamic.
Accountants offer a very straightforward kind of affect. Much like, “You are the doctor, I am the patient. You have the obligation to take care of me, so do your job already and stop the chit-chat!” If they are nervous, they never show it. It’s almost like business as usual. I have had a few auditors as patients as well and they seem to be even a level more intense.
With lawyers, one always has the feeling that they are circling the wagons, looking, sniffing, ready to pounce. Almost like, “A-ha! I’ll see you in court!” (Kojo Ace, I greet you!) Everything one says to them is weighed, compared and balanced against the scales of something unseen.
Veterans are an amazing bunch. (The few left from WW II and the Korean War are in a special class of awesome!) Describing them as stoic is an understatement. (Obie, greetings!) Most times, I expect them to say, “Doc, I don’t need anesthesia for that amputation. Give me a shot of bourbon and a bullet to bite on!” If one ever asked, I wouldn’t know where to get a bullet at the hospital. I wonder if the pharmacy stocks them.
Staying with the military, drill sergeants are a special breed. They cannot help just instilling fear wherever they are. At least in me they do. I always feel like I have to drop and give them fifty.
Cops, firemen and soldiers are sort of stoic to a level too, but not as much as veterans. They almost exude the feeling, “I need to be out there so please hurry up!”
Probably the most challenging group to take care of are patients who are in the medical field. Those who do not work in the perioperative setting are the most interesting. A short read-up on the kinds of anesthetics and practices out there 24 hours prior to surgery is often enough for these colleagues to dub themselves specialists in anesthetic care. Some demands are, well, interesting.
“Are you going to do a spinal for my thyroidectomy?”
“Huh?”
Most of us in medicine tend to be attentive to detail and want to be in charge. Well, we seem unable to let go in the perioperative phase either.
So what do my unscientific and probably biased observations really show? Do they support my initial claim that we are what we do? Probably not but these observations show that humans are a diverse bunch and hence the reactions to the same set of conditions will vary widely. Whether what someone does affects how they react may well be true but a real study may be needed to figure that out.
In the meantime, I’ll keep adding to my observational sample size by listening to my patients, calming their fears and giving them the best care I can.
Don’t we all?

Keep it Simple

Back in 1998, I heard a trauma surgeon talk about communicating with patients. His words have stayed with me all these years.
The gist of his message was:
Physicians, as a group, are highly educated. A lot of the patients we deal with do not understand medicine, surgery, anatomy or physiology like we do. If we need to explain a procedure, the need for it, or a disease process to a patient, we need to keep it simple.
Now that coming from a surgeon is deep!
It’s one of those things I’ve never forgotten. To keep it simple.
One can tell a patient:
“I am going to place a central line in your right internal jugular, float a pulmonary artery catheter and also place an arterial line in your left radial artery. You need that for your aortic valve replacement.”
Or one can say:
“To better take care of you during your operation, I need to place a larger IV in that vein in the right side of your neck. It helps us give you blood faster if you need it. Also, we feed a tube through it into your heart that helps us measure how much blood is being pumped in and out. You also need a better way of measuring your blood pressure. Feel your left wrist. Feel that pulse? That is an artery. I’ll put a small tube in there that will help measure your blood pressure better.”
Sure, the latter takes longer but you don’t have a patient who stares at you after you are done speaking like you just dropped from Pluto! We must all try to talk to patients in terms that are understandable to them. Terms that we take for granted may sound like Greek to most lay people. Even a term as simple as “colonoscopy” has befuddled some patients.
Some steps that can help me are:
I imagine explaining a procedure or even a disease process to one of my older uncles or aunties or to my kids. I break it down to a level they can understand.
I use diagrams that I sketch. I find that drawing out the anatomy, pointing out the structures and showing what is going to be done helps immensely. A lot of patients in Kentucky believe epidurals are the number one cause of paralysis in the world. A small drawing of the layers a needle goes through to reach the epidural space, and of its relationship to the spinal cord, goes a long way toward allaying those fears.
I encourage questions. If a patient can repeat what you said and base a question on that, your work is done.
Do not look at patients with disdain. It is not their fault that they do not understand what a myxomatous mitral valve is. I bet you do not know what Capital Structure Theory is either. A degree of empathy is needed to understand where patients are coming from. Without that empathy, it is difficult to relate to the patients and explain things to them at a level that is understandable.