A recent comment on an essay posted by DB of DB's Medical Rants used a term I had not heard before, namely aleatoric uncertainty. But with the web one can quickly learn that it is merely another term for our old friend, random or stochastic uncertainty or variation.
One can talk about two types of uncertainty. Stochastic uncertainty is the random, or some would say inherent or irreducible, kind. The other is epistemic uncertainty, which is that related to incomplete or inadequate information.
At least some of what appears to be the former type of uncertainty can disappear as knowledge becomes more complete and adequate.
This intro gives me a chance to quote one of my favorite insights borrowed from others, namely the thought experiment described by Dr. Steve Goodman of Johns Hopkins, which I have talked about before.
Here is my paraphrased version of it, which is meant to elucidate the difference between the stochastic and the deterministic:
Mr. Jones is faced with the need for surgery. The particular procedure is generally accepted to pose a 15% risk of death. Let us magically produce 100 clones of Mr. Jones. When they all undergo surgery, what will happen? In the random process model (the stochastic interpretation), 15 will die, but we cannot tell beforehand who they will be. In the deterministic model, either all 100 will live or all 100 will die, depending on whether Mr. Jones and all his clones have or do not have some biochemical or physiological condition(s) that is/are in fact what causes the mortality risk of the procedure.
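The contrast between the two interpretations can be made concrete with a little simulation. This is only an illustrative sketch, not anything from Dr. Goodman; the function names and the single hidden-condition assumption in the deterministic model are my own simplifications.

```python
import random

random.seed(42)  # fixed seed so the illustration is reproducible

N_CLONES = 100
RISK = 0.15  # the procedure's generally accepted 15% mortality risk

def stochastic_model(n, p):
    """Random process model: each clone independently dies with probability p."""
    return sum(random.random() < p for _ in range(n))

def deterministic_model(n, p):
    """Deterministic model: the clones all share (or all lack) one hidden
    condition that actually causes death; p only measures our ignorance
    of which is the case. So either all n die or none do."""
    has_condition = random.random() < p
    return n if has_condition else 0

deaths_stochastic = stochastic_model(N_CLONES, RISK)
deaths_deterministic = deterministic_model(N_CLONES, RISK)

print(f"Stochastic model:    {deaths_stochastic} of {N_CLONES} clones die")
print(f"Deterministic model: {deaths_deterministic} of {N_CLONES} clones die")
```

Run repeatedly without the fixed seed, the stochastic model scatters around 15 deaths, while the deterministic model only ever yields 0 or 100; the same 15% figure describes two very different worlds.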
If we learn the reason for the death, some or all of the uncertainty vanishes. Before it was learned that about 1 out of 300 people have a single nucleotide polymorphism (leading to TPMT deficiency and toxic accumulation of 6-MP-type drugs), no one could predict which child with ALL treated with 6-MP would experience catastrophic bone marrow failure. Now, with at least this mechanism in our knowledge base, patients can be tested for TPMT deficiency, and those who are deficient can be safely treated with a lower dose. Knowledge of the mechanism shrinks the level of stochastic or random uncertainty. The more we know, the fewer are the circumstances that appear to be random. But even as we learn more "mechanisms," we still face the issue of how to apply aggregate data to the individual, even as we hear more and more about genetic perturbations that strongly influence, and sometimes apparently even determine, the response of various tumors to chemotherapy.
The British physician-author Kieran Sweeney writes about this age-old problem of "balancing the general with the particular" in his book "Complexity in Primary Care".
In this balancing act, a force that may push the clinician in the wrong direction is the lure of the "ecological fallacy". Sweeney writes:
With the advent of evidence-based medicine, clinicians were encouraged to interpolate from population data to individuals. In so doing, however, we were at the mercy of the ecological fallacy: assuming that any and all conclusions derived from population data could be applied to individuals in the data set.
Nowhere did (does) this appear more glaring than in the large number of drug-company-driven, medical-education-company-arranged "CMEoid" dinner experiences in which practitioners (docs and mid-levels alike) are exhorted to treat to goal, whether the goal be blood pressure or HbA1c or whatever else. Previously the big guns hyped to get the blood sugar down were the glitazones; now less so, and I see invitations coming in for drugs that attempt to favorably harness the incretin family. For an interesting and thoughtful take on the basis (or lack of same) for treating cholesterol to goal, see this recent essay by Dr. Howard Brody.