The internet has democratized our voices. That doesn’t make us all experts.
I am not an expert in infectious diseases.
I hadn’t read, heard, or spoken the word “coronavirus” until about four months ago.
Sure, like you, I’m voraciously consuming information about the Covid-19 pandemic. I’m trying to stay up to date on the models, the impact, and the guidelines from the CDC. I’m staying home, and doing what I can to support our country’s essential employees who risk their health every day.
But I don’t have an opinion to offer about when it’s safe to reopen our businesses, nor a prediction of what the fatality rate of this coronavirus will be. Because I’m not an expert — and I’m not closer to being one after reading four months’ worth of news and science coverage.
Dr. Anthony Fauci, one of the lead members of the White House Coronavirus Task Force, is an expert. He went to medical school at Cornell University, where he graduated first in his class in 1966. In 1980, Dr. Fauci was appointed the Chief of the Laboratory of Immunoregulation. He has served under six presidents, beginning with Ronald Reagan. He helped the United States combat a host of infectious diseases, including HIV, SARS, and Ebola. He has authored or contributed to more than one thousand scientific publications.
Of course, that doesn’t mean he’s going to be perfect. That doesn’t mean he’ll always be right. But it does mean that his opinion on matters related to the coronavirus pandemic counts vastly more than mine; it counts measurably more than the opinions of our world and business leaders; and it counts relatively more than the opinions of many other members of the medical community who haven’t specialized or made a career in studying infectious diseases.
But why do we find that difficult to accept?
Traditionally, experts have had a proportionally louder microphone than the rest of us, which they typically use to communicate messages, research, and insights. They have access to industry leaders across an abundance of fields, be it business, social, or political. Their peer-reviewed findings are published in scientific journals, where practitioners can translate them into new programs or courses of action. They’re tapped to provide commentary in the news that we read, watch, and listen to.
But the internet has delivered a wrecking ball to the traditional channels of communication and news consumption. It’s reshuffled the field, and afforded anyone with access to a camera or keypad the chance to shape public opinion and influence our worldview. And, yes, that’s certainly not all bad. It’s provided us with the opportunity to hear from a more diverse audience — the chance to read an assorted medley of arguments and stories.
But the internet didn’t level the playing field; it re-tilted it. By its very nature, its intoxicating cocktail of avatars and anonymity, it has amplified opinions and advice from people who may be articulate, engaging, and intelligent, but whose backgrounds we ultimately know very little about. And, sure, in the real world some charisma and intelligence can also get you quite far. But in most matters of specialization, in matters of science and consequence, eventually everyone will be screened and filtered. They’ll be asked to provide receipts. Not on the internet, though: a lawless arena where insufficiently contemplated arguments (offering hyperlinks to insufficiently constructed research) spread like wildfire.
And amid the harsh discord of digital opinions, we incrementally begin to trust the real experts less. We consume the talking points, the straw man arguments, the opinions of those ranging from field generalists to armchair intellectuals, and we regurgitate them as ironclad facts to our social circles. On and on it goes, until we slowly begin to feel like experts ourselves.
While the internet has intensified these sentiments, the science isn’t new.
Social psychologists David Dunning and Justin Kruger described what became known as the Dunning-Kruger effect in 1999: a cognitive bias whereby we tend to overestimate our knowledge or ability in a given area. Granted, this bias exists, to some degree, in all of us (in one study, participants who scored as high as the 80th percentile in a particular skill still overestimated their ability). We all think we’re above-average drivers, and that our logic or sense of humor is superior to everyone else’s. But the bias has been shown to be particularly severe in individuals with little to no skill in the measured subject. In the original study, participants who scored in the 12th percentile for a given skill estimated themselves to be, on average, in the 62nd percentile.
Said simply, we’re terrible at measuring how much we don’t know. In a New York Times article in 2010, Dunning went even further:
“People often come up with answers to problems that are OK, but are not the best solutions. The reason they don’t come up with [the best] solutions is that they are simply not aware of them . . . Unknown solutions haunt [them] without their knowledge. The average detective does not realize the clues he or she neglects. The mediocre doctor is not aware of the diagnostic possibilities or treatments never considered. The run-of-the-mill lawyer fails to recognize the winning legal argument that is out there.”
This is what is so dangerous about the viral nature of digital content. Right now, Covid-19 information is being disseminated to us by experts. But it’s also being disseminated to us by those who are somewhat less knowledgeable; and by those with no expertise in the field whatsoever; and by those with ulterior motives, be they economic, social, or political. Lost in this noise, it becomes difficult for us to tell the difference between fact and fiction. And perhaps more dangerously, it becomes difficult for us to separate the superior theories from the mediocre ones. So, like most things on the internet, we use social equity as the tiebreaker. The more shares, likes, and comments — the more we see the theory — the more valid it must be. Right?
Among the more threatening straw man arguments I’ve seen proliferating across the internet in recent weeks is one that is often attached to conspiracy theories and fringe ideas. It preys astutely on biases like the Dunning-Kruger effect, on our inability to accurately assess the gaps in our own knowledge.
The argument goes something like this: I might not be an expert, and neither are you, but here are the clues, now you can go form your own conclusions.
It’s an attractive argument to stand behind, because it nestles into our competency blind spots. It suggests that we can take those clues and form as compelling an argument as an accomplished trial lawyer. It suggests that we can make a diagnosis as accurately as a field expert or an acclaimed scientist. That all we needed, all along, was the clues themselves.
But, of course, we can’t. It’s akin to placing your toddler in front of the groceries and expecting to be served a Michelin-starred, four-course meal.
As the future of this pandemic and its sobering effects remains largely uncertain, let’s restore our faith in the experts. Let’s entrust them to formulate theories grounded in scientific evidence. Allow them to lay the foundation from which we can make informed decisions. In a society that increasingly expects each of us to have an opinion, this is an instance when, for nearly all of us, it’s prudent not to.