eval_ermod {BayesERtools}	R Documentation
Evaluate exposure-response model prediction performance
Description
This function evaluates the performance of an exposure-response model using various metrics.
Usage
eval_ermod(
  ermod,
  eval_type = c("training", "kfold", "test"),
  newdata = NULL,
  summary_method = c("median", "mean"),
  k = 5,
  seed_kfold = NULL
)
Arguments

ermod
  An object of class `ermod`, the exposure-response model to be
  evaluated.

eval_type
  A character string specifying the evaluation dataset. Options are
  "training" (evaluate on the training data), "kfold" (k-fold
  cross-validation), and "test" (evaluate on `newdata`).

newdata
  A data frame containing new data for evaluation when `eval_type` is
  "test". Default is NULL.

summary_method
  A character string specifying how to summarize the simulation draws.
  Options are "median" and "mean"; the default is "median".

k
  The number of folds for cross-validation. Default is 5.

seed_kfold
  Random seed for k-fold cross-validation.
Value
A tibble with calculated performance metrics, such as AUROC or RMSE, depending on the model type.
Examples
data(d_sim_binom_cov_hgly2)

d_split <- rsample::initial_split(d_sim_binom_cov_hgly2)
d_train <- rsample::training(d_split)
d_test <- rsample::testing(d_split)

ermod_bin <- dev_ermod_bin(
  data = d_train,
  var_resp = "AEFLAG",
  var_exposure = "AUCss_1000",
  var_cov = "BHBA1C_5",
  # Settings to make the example run faster
  chains = 2,
  iter = 1000
)

metrics_training <- eval_ermod(ermod_bin, eval_type = "training")
metrics_test <- eval_ermod(ermod_bin, eval_type = "test", newdata = d_test)
metrics_kfold <- eval_ermod(ermod_bin, eval_type = "kfold", k = 3)

print(metrics_training)
print(metrics_test)
print(metrics_kfold)
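Because k-fold cross-validation assigns observations to folds at random, the `seed_kfold` argument can be set to make the fold assignment reproducible across runs. A minimal sketch, reusing `ermod_bin` from the example above (the seed value 1234 is arbitrary):

```r
# With the same seed_kfold, repeated calls use the same fold splits,
# so the resulting metrics are directly comparable
metrics_kfold_a <- eval_ermod(ermod_bin, eval_type = "kfold", k = 3, seed_kfold = 1234)
metrics_kfold_b <- eval_ermod(ermod_bin, eval_type = "kfold", k = 3, seed_kfold = 1234)
```

Note that the Bayesian model is refit within each fold, so minor MCMC sampling variation between runs is still possible even with a fixed fold seed.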
[Package BayesERtools version 0.2.3 Index]