“How do you categorize and rank all this information into a hierarchy of honest, unbiased research and separate it from the ‘blood-letters’?”
Article by 17160 Stephen Kalyta
In the Middle Ages, medical intervention for acute illness included such measures as blood-letting. We would naturally view the act of removing blood in order to drain out the sickness as ridiculous. Today, we have access to empirical data, largely thanks to the microscope, to refute such an archaic practice. With their limited knowledge, the medical experts of the era attributed patient recovery to the practice; in reality, patients managed to survive despite it.
The Dunning-Kruger effect describes the cognitive bias by which people of limited competence overestimate their own expertise, a misconception embodied by self-proclaimed experts. They may wear lab coats, troll the internet touting their advice, or offer services in a prominent place like a poster wrapped around a telephone pole. These people rarely distinguish an objective, opposing opinion from a personal attack. If you dare to question their sources, a practice that in science is referred to as institutional disconfirmation, you may face ridicule or worse.
The scientific process of hypothesis testing is suspended in favor of fads, populism, or even online bullying. Consider that the current self-help publishing market is worth an estimated $13.2 billion, roughly $2 of revenue from every person on the planet. Is there really that much truth out there? How do you categorize and rank all this information into a hierarchy of honest, unbiased research and separate it from the “blood-letters”? How do we hold the misinformed accountable while still offering a path for evolving and responsible free-thinkers? The last thing we want in society is to prevent the next “Banting” (the Canadian who co-discovered insulin treatment) from coming forward with his or her discovery.
The reality is that we are individually responsible for our beliefs, regardless of the source. As military leaders, we have the added responsibility of following the chain of command while simultaneously subordinating our personal bias and adjusting our assumptions in the field. When our data sources are in conflict, we push the information up the chain to seek direction. The real-time intelligence of on-site observation marries with the collective intelligence of command and, ideally, avoids a black swan event. The final conclusion is not without bias, but at least it rests on an environmental database of more knowns than unknowns and relies on bona fide experts who are masters of their craft.