metrics module¶
This module contains custom metric functions for evaluating model performance. It includes utilities to calculate accuracy and F1 score at a given decision threshold, and to find the threshold that optimizes each metric for classification.
- metrics.accuracy(th, true_labels, predicted_labels)[source]¶
Calculates the accuracy score for a specific threshold.
- Parameters:
th (float) – The decision threshold to apply.
true_labels (numpy.ndarray) – The ground truth binary labels.
predicted_labels (numpy.ndarray) – The continuous prediction probabilities.
- Returns:
The accuracy score.
- Return type:
float
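Since the implementation is not reproduced here, the following is a minimal sketch of the behavior this entry describes; the `>=` comparison at the threshold is an assumption:

```python
import numpy as np

def accuracy(th, true_labels, predicted_labels):
    # Binarize the probabilities at the threshold, then measure the
    # fraction of predictions that match the ground truth labels.
    preds = (predicted_labels >= th).astype(int)
    return float(np.mean(preds == true_labels))

y_true = np.array([0, 0, 1, 1])
y_prob = np.array([0.1, 0.4, 0.35, 0.8])
acc = accuracy(0.5, y_true, y_prob)  # only index 2 is misclassified -> 0.75
```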
- metrics.best_accuracy(true_labels, predicted_labels)[source]¶
Finds the probability threshold that maximizes accuracy on the given set.
- Parameters:
true_labels (numpy.ndarray) – The ground truth binary labels (0 or 1).
predicted_labels (numpy.ndarray) – The continuous prediction probabilities from the model.
- Returns:
A tuple containing the best accuracy achieved and the corresponding threshold.
- Return type:
tuple(float, float)
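A sketch of the threshold search this entry describes; the uniform grid of candidate thresholds is an assumption (the actual implementation may instead evaluate only the unique predicted probabilities):

```python
import numpy as np

def best_accuracy(true_labels, predicted_labels):
    # Sweep candidate thresholds and keep the one with the highest accuracy.
    best_acc, best_th = 0.0, 0.0
    for th in np.linspace(0.0, 1.0, 101):
        preds = (predicted_labels >= th).astype(int)
        acc = float(np.mean(preds == true_labels))
        if acc > best_acc:
            best_acc, best_th = acc, float(th)
    return best_acc, best_th

y_true = np.array([0, 0, 1, 1])
y_prob = np.array([0.1, 0.3, 0.6, 0.9])
acc, th = best_accuracy(y_true, y_prob)  # perfectly separable: acc = 1.0
```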
- metrics.best_eq_accuracy(true_labels, predicted_labels)[source]¶
Finds the threshold that minimizes the difference between the per-class accuracies. This is useful for balancing performance on imbalanced datasets.
- Parameters:
true_labels (numpy.ndarray) – The ground truth binary labels.
predicted_labels (numpy.ndarray) – The continuous prediction probabilities.
- Returns:
A tuple containing (Accuracy Class 0, Accuracy Class 1, Best Threshold).
- Return type:
tuple(float, float, float)
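A sketch of the balancing search this entry describes, assuming both classes are present in `true_labels`; the threshold grid and the absolute-difference criterion are assumptions:

```python
import numpy as np

def best_eq_accuracy(true_labels, predicted_labels):
    # Find the threshold where the two per-class accuracies are closest.
    best = None
    for th in np.linspace(0.0, 1.0, 101):
        preds = (predicted_labels >= th).astype(int)
        acc0 = float(np.mean(preds[true_labels == 0] == 0))
        acc1 = float(np.mean(preds[true_labels == 1] == 1))
        gap = abs(acc0 - acc1)
        if best is None or gap < best[0]:
            best = (gap, acc0, acc1, float(th))
    return best[1], best[2], best[3]

y_true = np.array([0, 0, 1, 1])
y_prob = np.array([0.1, 0.3, 0.6, 0.9])
a0, a1, th = best_eq_accuracy(y_true, y_prob)  # both classes fully separated
```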
- metrics.best_f1_score(true_labels, predicted_labels)[source]¶
Finds the threshold that maximizes the F1 Score.
- Parameters:
true_labels (numpy.ndarray) – The ground truth binary labels.
predicted_labels (numpy.ndarray) – The continuous prediction probabilities.
- Returns:
A tuple containing the best F1 score and the corresponding threshold.
- Return type:
tuple(float, float)
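A sketch of the F1 maximization this entry describes; the grid of candidate thresholds and the convention of returning 0.0 when there are no true positives are assumptions:

```python
import numpy as np

def best_f1_score(true_labels, predicted_labels):
    # Sweep thresholds and keep the one yielding the highest F1 score.
    best_f1, best_th = 0.0, 0.0
    for th in np.linspace(0.0, 1.0, 101):
        preds = (predicted_labels >= th).astype(int)
        tp = int(np.sum((preds == 1) & (true_labels == 1)))
        fp = int(np.sum((preds == 1) & (true_labels == 0)))
        fn = int(np.sum((preds == 0) & (true_labels == 1)))
        if tp == 0:
            continue  # undefined precision/recall; treat F1 as 0
        precision = tp / (tp + fp)
        recall = tp / (tp + fn)
        f1 = 2 * precision * recall / (precision + recall)
        if f1 > best_f1:
            best_f1, best_th = f1, float(th)
    return best_f1, best_th

y_true = np.array([0, 0, 1, 1])
y_prob = np.array([0.1, 0.3, 0.6, 0.9])
f1, th = best_f1_score(y_true, y_prob)  # perfectly separable: f1 = 1.0
```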
- metrics.class_accuracy(th, true_labels, predicted_labels)[source]¶
Calculates the accuracy separately for each of the two classes, i.e. the fraction of class-0 samples predicted as 0 and the fraction of class-1 samples predicted as 1.
- Parameters:
th (float) – The decision threshold to apply.
true_labels (numpy.ndarray) – The ground truth binary labels.
predicted_labels (numpy.ndarray) – The continuous prediction probabilities.
- Returns:
A tuple containing (Accuracy Class 0, Accuracy Class 1).
- Return type:
tuple(float, float)
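A sketch of the per-class computation this entry describes, assuming both classes appear in `true_labels` (an empty class would make the mean undefined):

```python
import numpy as np

def class_accuracy(th, true_labels, predicted_labels):
    # Per-class recall: the fraction of each class classified correctly at th.
    preds = (predicted_labels >= th).astype(int)
    acc0 = float(np.mean(preds[true_labels == 0] == 0))
    acc1 = float(np.mean(preds[true_labels == 1] == 1))
    return acc0, acc1

y_true = np.array([0, 0, 0, 1])
y_prob = np.array([0.2, 0.6, 0.3, 0.9])
a0, a1 = class_accuracy(0.5, y_true, y_prob)  # one class-0 sample is above 0.5
```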
- metrics.f1_score(th, true_labels, predicted_labels)[source]¶
Calculates the F1 score for a specific threshold.
- Parameters:
th (float) – The decision threshold to apply.
true_labels (numpy.ndarray) – The ground truth binary labels.
predicted_labels (numpy.ndarray) – The continuous prediction probabilities.
- Returns:
The F1 score.
- Return type:
float
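A sketch of the single-threshold F1 computation this entry describes; the `>=` comparison and returning 0.0 when there are no true positives are assumptions:

```python
import numpy as np

def f1_score(th, true_labels, predicted_labels):
    # Harmonic mean of precision and recall at the given threshold.
    preds = (predicted_labels >= th).astype(int)
    tp = int(np.sum((preds == 1) & (true_labels == 1)))
    fp = int(np.sum((preds == 1) & (true_labels == 0)))
    fn = int(np.sum((preds == 0) & (true_labels == 1)))
    if tp == 0:
        return 0.0
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

y_true = np.array([0, 1, 1, 1])
y_prob = np.array([0.6, 0.2, 0.7, 0.8])
f1 = f1_score(0.5, y_true, y_prob)  # tp=2, fp=1, fn=1 -> f1 = 2/3
```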