The Clinical Utility curve assists clinical decision-making

In medicine, clinicians frequently make decisions based on incomplete information, and they must balance the potential benefits and risks of any decision, such as a decision to perform a diagnostic procedure. For example, when a patient presents with chest pain, the cardiologist must decide whether to perform a coronary angiogram in order to detect coronary artery disease. The procedure is expensive and not without risk to the patient, and the cardiologist wants to avoid performing angiography on patients without coronary artery disease. In these situations, the cardiologist uses a prediction model to obtain an estimate of the probability that an angiogram will reveal disease. Prediction models can also be used to assess whether a new test performed before the procedure helps select patients to undergo the procedure. The Clinical Utility (CU) curve is a new way of presenting information from a prediction model that helps clinicians choose which patients should undergo a procedure.

Fig. 1. ROC curves (A) and CU curves (B) for the prediction of coronary artery disease in men (solid lines) and women (dashed lines) presenting with chest pain. NNCOC, number needed to capture one patient with coronary disease. The CU curve also shows the absolute risk (probability of disease) that corresponds to different sensitivities and NNCOC values. From Campbell BMC Res Notes 2016;9:219.

A prediction model is created by compiling information from a previous group of patients who all underwent the diagnostic procedure, and analysing how well the details about the patients, their clinical presentation and the investigations performed before the definitive procedure predict the presence of disease. The prediction is expressed as an absolute risk score, or percent probability that disease is present. In most situations the prediction model shows overlap between the risk scores of patients with and without disease. Patients with risk scores above a particular threshold are said to be captured, or chosen to undergo the procedure.
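The idea of overlapping risk scores and "capture" at a threshold can be sketched in a few lines of Python. The risk scores below are entirely hypothetical numbers, invented only to illustrate the concept; they are not from the paper.

```python
# Hypothetical risk scores (percent probability of disease) from a
# prediction model, for patients with and without disease. Note the
# overlap between the two groups.
with_disease = [35, 50, 62, 70, 78, 85, 91]      # diseased patients
without_disease = [10, 18, 25, 33, 48, 55, 64]   # non-diseased patients

threshold = 60  # patients with scores above this threshold are "captured"

captured_diseased = [s for s in with_disease if s > threshold]   # true positives
captured_healthy = [s for s in without_disease if s > threshold]  # false positives

print(len(captured_diseased))  # → 5
print(len(captured_healthy))   # → 1
```

Because the score distributions overlap, any threshold that captures most diseased patients also captures some patients without disease.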

The standard way to present information from a prediction model is the receiver operating characteristic (ROC) curve. The ROC curve is a graph with “sensitivity” on the y-axis and “1-specificity” on the x-axis over the range of possible threshold risk scores (Fig. 1). Sensitivity is the proportion of patients with disease captured at a given threshold risk score, whereas “1-specificity” is the proportion of patients without disease captured at that threshold. However, in addition to the sensitivity, the clinician wants to know the number of patients without disease captured at any particular threshold risk score (false positives, representing unnecessary procedures). This information is not immediately available from the ROC curve.
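The quantities plotted on an ROC curve can be computed directly from the counts of captured patients. The sketch below uses the same hypothetical risk scores as above (invented for illustration, not taken from the paper) and evaluates the two ROC coordinates at several candidate thresholds.

```python
# Sketch: sensitivity and 1-specificity at each candidate threshold,
# i.e. the points that make up an ROC curve. Data are hypothetical.
with_disease = [35, 50, 62, 70, 78, 85, 91]
without_disease = [10, 18, 25, 33, 48, 55, 64]

def roc_point(threshold):
    tp = sum(s > threshold for s in with_disease)     # true positives captured
    fp = sum(s > threshold for s in without_disease)  # false positives captured
    sensitivity = tp / len(with_disease)
    one_minus_specificity = fp / len(without_disease)
    return sensitivity, one_minus_specificity

for t in (20, 40, 60, 80):
    sens, fpr = roc_point(t)
    print(f"threshold {t}%: sensitivity={sens:.2f}, 1-specificity={fpr:.2f}")
```

Note that both axes are proportions: the curve shows what fraction of each group is captured, but not how many unnecessary procedures that fraction represents.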

The CU curve also has sensitivity on the y-axis, but it has the number needed to capture one case (NNCOC) on the x-axis over the range of possible threshold risk scores (Fig. 1). The NNCOC is the number of patients who need to undergo the procedure in order to detect one patient with disease: one (the true positive) plus the number of false positives captured for each true positive. Thus, the CU curve provides the information about true positives and false positives that the clinician needs to balance the risks and benefits when deciding the threshold risk score to use for selecting patients to undergo the procedure.
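The CU curve's x-axis quantity follows from the same counts: NNCOC is the total number of captured patients divided by the number of captured patients with disease. A minimal sketch, again using the hypothetical risk scores from above:

```python
# Sketch of the CU-curve quantities at each threshold: sensitivity and
# NNCOC = (true positives + false positives) / true positives, i.e. the
# number of procedures needed to capture one patient with disease.
# Data are hypothetical.
with_disease = [35, 50, 62, 70, 78, 85, 91]
without_disease = [10, 18, 25, 33, 48, 55, 64]

def cu_point(threshold):
    tp = sum(s > threshold for s in with_disease)
    fp = sum(s > threshold for s in without_disease)
    sensitivity = tp / len(with_disease)
    nncoc = (tp + fp) / tp if tp else float("inf")  # undefined if nothing captured
    return sensitivity, nncoc

for t in (20, 40, 60):
    sens, nncoc = cu_point(t)
    print(f"threshold {t}%: sensitivity={sens:.2f}, NNCOC={nncoc:.2f}")
```

Unlike 1-specificity, NNCOC expresses the false-positive burden per detected case, which maps directly onto the clinical trade-off: lowering the threshold raises sensitivity but also raises the number of procedures performed per patient with disease found.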

The figure shows the ROC and CU curves for the prediction of coronary artery disease in men and women presenting with chest pain. The CU curves demonstrate differences in the performance of the prediction model between men and women not shown by the ROC curves. For probabilities of coronary artery disease > 60%, more women than men without coronary artery disease will undergo coronary angiography.

Duncan J Campbell
St. Vincent’s Institute of Medical Research, Australia


Publication

Campbell DJ. The clinical utility curve: a proposal to improve the translation of information provided by prediction models to clinicians. BMC Res Notes. 2016 Apr 14;9:219.
