The Metrics

You can use the Confusion Matrix to compute metrics that address different evaluation needs.

Here's how to read the metrics.
| Metric | Definition | Formula |
| --- | --- | --- |
| Classification Rate | Proportion of targets correctly classified by the predictive model when applied to the validation data source. | (TP+TN)/N |
| Sensitivity | Proportion of actual positive targets that have been correctly predicted. | TP/(TP+FN) |
| Specificity | Proportion of actual negative targets that have been correctly predicted. | TN/(FP+TN) |
| Precision | Proportion of predicted positive targets that are actually positive. | TP/(TP+FP) |
| F1 score | Harmonic mean of Precision and Sensitivity (also called Recall); both are evenly weighted. | 2 / ((1/Precision) + (1/Sensitivity)) |
| Fall-out | Proportion of actual negative targets that have been incorrectly predicted as positive. | FP/(FP+TN), or 100% - Specificity |
Definitions:

N = Number of observations.

TP (True Positive) = Number of correctly predicted positive targets.

FN (False Negative) = Number of actual positive targets that have been predicted negative.

FP (False Positive) = Number of actual negative targets that have been predicted positive.

TN (True Negative) = Number of correctly predicted negative targets.
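
The formulas above translate directly into code. The following is a minimal Python sketch that computes all six metrics from the four confusion-matrix counts; the function name `classification_metrics`, the variable names, and the sample counts are illustrative only, not part of any particular product API.

```python
def classification_metrics(tp: int, fn: int, fp: int, tn: int) -> dict:
    """Compute the metrics defined above from confusion-matrix counts.

    Assumes every denominator is non-zero, i.e. the validation data
    contains both actual positives and actual negatives, and the model
    predicts at least one positive.
    """
    n = tp + fn + fp + tn                    # N = number of observations
    sensitivity = tp / (tp + fn)             # TP/(TP+FN)
    specificity = tn / (fp + tn)             # TN/(FP+TN)
    precision = tp / (tp + fp)               # TP/(TP+FP)
    return {
        "Classification Rate": (tp + tn) / n,
        "Sensitivity": sensitivity,
        "Specificity": specificity,
        "Precision": precision,
        # Harmonic mean of Precision and Sensitivity
        "F1 score": 2 / ((1 / precision) + (1 / sensitivity)),
        "Fall-out": fp / (fp + tn),          # equivalently 1 - Specificity
    }

# Illustrative counts: 80 true positives, 20 false negatives,
# 10 false positives, 90 true negatives (N = 200).
for name, value in classification_metrics(tp=80, fn=20, fp=10, tn=90).items():
    print(f"{name}: {value:.3f}")
```

With these sample counts, the Classification Rate is 170/200 = 0.85, Sensitivity is 80/100 = 0.80, and Fall-out is 10/100 = 0.10, which matches what the formulas in the table give by hand.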