With expertise come certain benefits: the ability to make a living, to make an impact, and to exercise influence. But expertise has limits, because knowledge is never absolute or complete. Do the benefits of expertise blind us to those limits? If so, doesn’t this create an inherent conflict of interest?
What defines an expert? Some come by their expertise through a combination of schooling, credentials, and experience. Whether it is the academic who has earned degrees, tenure, and awards, and has written hundreds of publications over the course of a career, or the skilled house painter who has learned his or her craft on the job over many years, this kind of expertise earns the holder the right to claim credibility.
The second way an expert is created involves granting credibility based solely on the so-called expert behaving in an authoritative manner. By this route, anyone skilled at creating an expert persona can rise to positions of authority and power by acting believably, with or without any basis in reality. Projecting absolute certainty, particularly when delivering a popular message, has been the path to power for many ambitious and potentially destructive leaders.
Leaving this second, false credibility model aside for the moment, let’s examine the temptations that plague the true expert. Given the limits of knowledge, the expert is often faced with questions for which the answers are unknown. Herein lies one of the most difficult and frequently ignored ethical dilemmas for the expert.
To analyze this dilemma, let’s first assess the value of the expert’s opinion. There is little ethical concern if an educated guess is likely to produce a correct solution. But in medicine, the chances are fairly high that we experts will guess in error when we simply opine. Were this not the case, we could sit under trees and puzzle out the cure for cancer. To my knowledge, that hasn’t happened. Worse than our inability to be truly certain of the accuracy of our opinions, getting it wrong may mean harming the patient when the guesswork is medical. Moreover, medical research is very difficult to do correctly. Much of the science on which we base our educated guesswork is observational, which can only tell us about correlation, not about cause and effect. (More on this in a future blog.) Suffice it to say that many of our interventions are informed by observational research, if not by our personal experience.
Now, in the face of a strong possibility of causing harm by guessing wrong, let’s compound the problem with the fact that the expert’s credibility, livelihood, ego gratification, work enjoyment, and so on all depend on being an expert. This means that the stakes can be quite high, and the cost quite severe, if the expert admits to not knowing. Herein lies the conflict of interest. Whether or not there is research with which to answer the question, the expert is potentially no longer an expert without a factual answer. So the expert makes a guess. We call that guess “expert opinion.”
In fairness, I believe the typical expert guesses for reasons far more complicated than her or his own ego or livelihood. Most doctors really do care whether their patients get better and would, in fact, prefer that patients benefit from their interventions rather than being harmed. And of course I am oversimplifying to make a point. Not all educated guesses or expert opinions are as poorly informed as I make them out to be. But many are. And when experts opine with absolute confidence in an information void, we may no longer have the expert who is credible. Now we may have someone who is simply acting credible.
This conflict of interest raises two issues. First, we experts need to teach ourselves to embrace uncertainty. Who better to opine, in the absence of absolute knowledge, than the humble expert who understands the potential for error? We need to be comfortable saying that we don’t know, and to understand that even when we do know, new practice and new science may render our knowledge obsolete and wrong. If we cannot accept these limits, and instead cling to our beliefs as fact with absolute certainty, we become immobile and inflexible when new facts emerge and, in the extreme, tyrannical and even fascist.
Second, resources providing access to up-to-date, unbiased scientific data must be made more readily available, so that we can better assess the limits of our knowledge.
This is not just a great idea. It is a moral imperative.