XGBoostSub_bin {BioPred} | R Documentation
XGBoost Model with Modified Loss Function for Subgroup Identification with Binary Outcomes
Description
Function for training an XGBoost model with a customized loss function for binary outcomes.
Usage
XGBoostSub_bin(
X_data,
y_data,
trt,
pi,
Loss_type = "A_learning",
params = list(),
nrounds = 50,
disable_default_eval_metric = 1,
verbose = TRUE
)
Arguments
X_data
The input feature matrix.
y_data
The binary outcome vector, coded 0/1.
trt
The treatment indicator vector. Should take values of 1 or -1, where 1 represents the treatment group and -1 represents the control group.
pi
The propensity score vector, with values between 0 and 1 representing the probability of assignment to treatment.
Loss_type
Type of loss function to use: "A_learning" or "Weight_learning".
params
A list of additional parameters passed to the xgb.train function.
nrounds
Number of boosting rounds. Default is 50.
disable_default_eval_metric
If 1, the default evaluation metric is disabled. Default is 1.
verbose
Logical. If TRUE, training progress is printed; if FALSE, no progress is printed.
Details
This function trains an XGBoost model using a customized loss function based on A-learning or weight-learning.
It requires the 'xgboost' package; make sure 'xgboost' is installed and loaded before calling this function.
The returned model can be used like a regular xgboost model.
Value
A trained XGBoostSub_bin model object.
Examples
X_data <- matrix(rnorm(100 * 10), ncol = 10) # 100 samples with 10 features
y_data <- rbinom(100, 1, 0.5) # binary outcomes (0 or 1)
trt <- sample(c(1, -1), 100, replace = TRUE) # treatment indicator (1 or -1)
pi <- runif(100, min = 0.3, max = 0.7) # propensity scores between 0 and 1
# Define XGBoost parameters
params <- list(
max_depth = 3,
eta = 0.1,
subsample = 0.8,
colsample_bytree = 0.8
)
# Train the model using A-learning loss
model_A <- XGBoostSub_bin(
X_data = X_data,
y_data = y_data,
trt = trt,
pi = pi,
Loss_type = "A_learning",
params = params,
nrounds = 5,
disable_default_eval_metric = 1,
verbose = TRUE
)
# Train the model using Weight-learning loss
model_W <- XGBoostSub_bin(
X_data = X_data,
y_data = y_data,
trt = trt,
pi = pi,
Loss_type = "Weight_learning",
params = params,
nrounds = 5,
disable_default_eval_metric = 1,
verbose = TRUE
)
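Since, per Details, the returned object behaves like a regular xgboost model, predictions on new subjects can be obtained with predict() from the 'xgboost' package. The sketch below assumes BioPred and xgboost are installed; the names X_new, scores, and subgroup, and the use of the raw score's sign to flag a candidate benefit subgroup, are illustrative assumptions rather than part of the package API.

```r
library(BioPred)
library(xgboost)

set.seed(123)
X_data <- matrix(rnorm(100 * 10), ncol = 10)  # 100 samples, 10 features
y_data <- rbinom(100, 1, 0.5)                 # binary outcomes (0 or 1)
trt    <- sample(c(1, -1), 100, replace = TRUE)
pi     <- runif(100, min = 0.3, max = 0.7)

# Train with A-learning loss (few rounds, for illustration only)
model <- XGBoostSub_bin(
  X_data = X_data, y_data = y_data, trt = trt, pi = pi,
  Loss_type = "A_learning",
  params = list(max_depth = 3, eta = 0.1),
  nrounds = 5, verbose = FALSE
)

# Score 20 new subjects with the same 10 features; predict() is assumed to
# apply because the returned model behaves like a standard xgboost booster.
X_new  <- matrix(rnorm(20 * 10), ncol = 10)
scores <- predict(model, xgb.DMatrix(data = X_new))

# Illustrative rule: positive scores mark a candidate benefit subgroup
subgroup <- which(scores > 0)
```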