class_check {HRTnomaly} | R Documentation |
Evaluate the Accuracy of Outlier Classification Results
Description
The function computes the confusion matrix between the logical output of an outlier detection algorithm and a reference (ground-truth) logical vector. It also calculates the overall accuracy of the results from the confusion matrix, together with recall, precision, and F1-scores for the two classes (regular versus outlier).
Usage
class_check(pred, truth)
Arguments
pred
A logical vector with the classification output from an anomaly detection algorithm.
truth
A logical vector with the observed classification as a reference (or ground truth).
Details
The function computes the confusion matrix using the function table. True positives, false positives, true negatives, and false negatives are then used to compute the overall accuracy, recall, precision, and F1-scores.
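The computation described above can be sketched in base R as follows. This is a minimal illustration of how the confusion matrix and the derived metrics can be obtained from two logical vectors with table(); it is not the package's exact internal implementation, and the pred/truth vectors are made-up data.

```r
# Hypothetical predictions and ground truth (TRUE = outlier)
pred  <- c(TRUE, FALSE, TRUE, TRUE, FALSE, FALSE)
truth <- c(TRUE, FALSE, FALSE, TRUE, FALSE, TRUE)

# Confusion matrix: rows = predictions, columns = reference
cm <- table(pred, truth)

# Overall accuracy: correct classifications over all cells
overall <- sum(diag(cm)) / sum(cm)

# Per-class (regular = FALSE, outlier = TRUE) recall and precision
recall    <- diag(cm) / colSums(cm)
precision <- diag(cm) / rowSums(cm)

# Harmonic mean of precision and recall
f1 <- 2 * precision * recall / (precision + recall)
```

With the toy vectors above, four of the six cells are classified correctly, so the overall accuracy is 2/3.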
Value
An object of S3 class checkwise
containing the confusion matrix, with the accuracy metrics appended as attributes.
attr(, "overall")
A numeric value between zero and one with the overall accuracy.
attr(, "recall")
A numeric vector of values between zero and one with the recall index for regular and outlier cells.
attr(, "precision")
A numeric vector of values between zero and one with the precision index for regular and outlier cells.
attr(, "f1-score")
A numeric vector of values between zero and one with the F1-scores for regular and outlier cells.
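The attributes listed above are retrieved with base R's attr(). The snippet below uses a hand-built stand-in object (constructed with structure(); not actual class_check output) purely to illustrate the access pattern.

```r
# Stand-in for a class_check() result: a confusion matrix carrying
# metric attributes, as described in the Value section
chk <- structure(table(c(TRUE, FALSE), c(TRUE, FALSE)),
                 class = "checkwise",
                 overall = 1.0,
                 recall = c(regular = 1.0, outlier = 1.0),
                 precision = c(regular = 1.0, outlier = 1.0),
                 "f1-score" = c(regular = 1.0, outlier = 1.0))

attr(chk, "overall")    # scalar overall accuracy
attr(chk, "f1-score")   # named per-class F1 vector
```

Note that the hyphenated attribute name requires the quoted form attr(x, "f1-score") rather than the $ shortcut.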
Author(s)
Luca Sartore drwolf85@gmail.com
Examples
# Load the package
library(HRTnomaly)
set.seed(2025L)
# Load the 'toy' data
data(toy)
# Detect cellwise outliers using Bayesian Analysis
res <- cellwise(toy[sample.int(100), ], 0.5, 10L)
class_check(res$outlier, res$anomaly_flag != "")