XGBoostSub_sur {BioPred}    R Documentation

XGBoost Model with Modified Loss Function for Subgroup Identification with Survival Outcomes

Description

Function for training an XGBoost model with a customized loss function for subgroup identification with survival outcomes.

Usage

XGBoostSub_sur(
  X_data,
  y_data,
  trt,
  pi,
  censor,
  Loss_type = "Weight_learning",
  params = list(),
  nrounds = 50,
  disable_default_eval_metric = 1,
  verbose = TRUE
)

Arguments

X_data

The input feature matrix, with one row per subject.

y_data

The survival outcome: observed time-to-event values, one entry per row of X_data.

trt

The treatment indicator vector. Should take values of 1 or -1, where 1 represents the treatment group and -1 represents the control group.

pi

The propensity score vector, with values between 0 and 1, representing each subject's probability of assignment to the treatment group; see the illustrative snippet after this argument list for one way to obtain trt and pi.

censor

The censoring status vector. Should take values of 1 or 0, where 1 indicates a censored observation and 0 an observed event.

Loss_type

Type of loss function to use: "A_learning" or "Weight_learning". Default is "Weight_learning".

params

A list of additional parameters for the xgb.train function.

nrounds

Number of boosting rounds. Default is 50.

disable_default_eval_metric

If 1, the default XGBoost evaluation metric is disabled. Default is 1.

verbose

Logical. If TRUE, training progress will be printed; if FALSE, no progress will be printed.
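
As a hypothetical illustration of how the trt and pi inputs can be prepared (the variable names dat, arm, age, and biomarker below are assumptions, not objects provided by BioPred), a 0/1 treatment arm can be recoded to the required 1/-1 scale and propensity scores estimated by logistic regression:

dat <- data.frame(arm = rbinom(100, 1, 0.5),     # hypothetical 0/1 randomization indicator
                  age = rnorm(100, 60, 10),      # hypothetical baseline covariates
                  biomarker = rnorm(100))
trt <- ifelse(dat$arm == 1, 1, -1)               # recode to 1 (treatment) / -1 (control)
ps_fit <- glm(arm ~ age + biomarker, data = dat, family = binomial)
pi <- fitted(ps_fit)                             # estimated propensity scores in (0, 1)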

Details

XGBoostSub_sur: function for training an XGBoost model with a customized loss function for survival outcomes.

This function trains an XGBoost model using a customized loss function based on either A-learning or weight-learning, selected via the Loss_type argument.

This function requires the 'xgboost' package. Make sure 'xgboost' is installed and loaded before using this function.
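
For orientation, the sketch below shows the mechanism this relies on: xgb.train() accepts a user-supplied objective that returns the gradient and Hessian of the loss. The weighted squared-error gradients used here are placeholders only; the actual A-learning and weight-learning gradients implemented in BioPred differ and account for censoring.

library(xgboost)

set.seed(1)
X <- matrix(rnorm(100 * 10), ncol = 10)
y <- rexp(100, rate = 0.1)
trt <- sample(c(1, -1), 100, replace = TRUE)
pi_hat <- runif(100, 0.3, 0.7)

# Inverse-probability-of-treatment weights (illustrative assumption only).
w <- ifelse(trt == 1, 1 / pi_hat, 1 / (1 - pi_hat))

dtrain <- xgb.DMatrix(data = X, label = y)

custom_obj <- function(preds, dtrain) {
  labels <- getinfo(dtrain, "label")
  grad <- w * (preds - labels)  # placeholder gradient of a weighted squared error
  hess <- w                     # corresponding second derivative
  list(grad = grad, hess = hess)
}

fit <- xgb.train(
  params = list(max_depth = 3, eta = 0.1, disable_default_eval_metric = 1),
  data = dtrain,
  nrounds = 5,
  obj = custom_obj
)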

Value

A trained XGBoostSub_sur model.

Examples

library(BioPred)
library(xgboost)  # required for the underlying xgb.train call
set.seed(123)     # for reproducible simulated data

X_data <- matrix(rnorm(100 * 10), ncol = 10)  # 100 samples with 10 features
y_data <- rexp(100, rate = 0.1)  # survival times, simulated as exponential
trt <- sample(c(1, -1), 100, replace = TRUE)  # treatment indicator (1 or -1)
pi <- runif(100, min = 0.3, max = 0.7)  # propensity scores between 0 and 1
censor <- rbinom(100, 1, 0.7)  # censoring indicator (1 = censored, 0 = observed)

# Define XGBoost parameters
params <- list(
  max_depth = 3,
  eta = 0.1,
  subsample = 0.8,
  colsample_bytree = 0.8
)

# Train the model using A-learning loss
model_A <- XGBoostSub_sur(
  X_data = X_data,
  y_data = y_data,
  trt = trt,
  pi = pi,
  censor = censor,
  Loss_type = "A_learning",
  params = params,
  nrounds = 5,
  disable_default_eval_metric = 1,
  verbose = TRUE
)

# Train the model using Weight-learning loss
model_W <- XGBoostSub_sur(
  X_data = X_data,
  y_data = y_data,
  trt = trt,
  pi = pi,
  censor = censor,
  Loss_type = "Weight_learning",
  params = params,
  nrounds = 5,
  disable_default_eval_metric = 1,
  verbose = TRUE
)
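
# Assuming the fitted object behaves like the xgb.Booster returned by
# xgb.train (an assumption; the package may also provide a dedicated
# prediction helper), subject-level scores can be obtained with predict():
scores_A <- predict(model_A, xgb.DMatrix(X_data))
head(scores_A)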


[Package BioPred version 1.0.2 Index]