LSBoostRegressor.Rd
LSBoost Regressor
Usage

LSBoostRegressor(
n_estimators = 100L,
learning_rate = 0.1,
n_hidden_features = 5L,
reg_lambda = 0.1,
row_sample = 1,
col_sample = 1,
dropout = 0,
tolerance = 1e-04,
direct_link = 1L,
verbose = 1L,
seed = 123L,
solver = c("ridge", "lasso"),
activation = "relu",
n_clusters = 0,
clustering_method = "kmeans",
cluster_scaling = "standard",
degree = 0,
weights_distr = "uniform"
)
Arguments

n_estimators: int, number of boosting iterations.

learning_rate: float, controls the learning speed at training time.

n_hidden_features: int, number of nodes in successive hidden layers.

reg_lambda: float, L2 regularization parameter for successive errors in the optimizer (at training time).

row_sample: float, percentage of rows chosen from the training set.

col_sample: float, percentage of columns chosen from the training set.

dropout: float, percentage of nodes dropped from the hidden layer at training time.

tolerance: float, controls early stopping in gradient descent (at training time).

direct_link: int, indicates whether the original features are included in the model's fitting (1) or not (0).

verbose: int, progress bar displayed (1) or not (0).

seed: int, reproducibility seed for nodes_sim == 'uniform', clustering and dropout.

solver: str, type of 'weak' learner; currently 'ridge' or 'lasso'.

activation: str, activation function; currently 'relu', 'relu6', 'sigmoid' or 'tanh'.

n_clusters: int, number of clusters for clustering.

clustering_method: str, clustering method; currently 'kmeans' or 'gmm' (Gaussian Mixture Model).

cluster_scaling: str, scaling method for clustering; currently 'standard', 'minmax' or 'robust'.

degree: int, degree of polynomial interaction features.

weights_distr: str, distribution of the hidden layer's weights; currently 'uniform' or 'gaussian'.
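As a sketch of how these arguments combine in practice (assuming the mlsauce package and its Python backend are installed; the parameter values below are illustrative, not recommended defaults):

```r
# Hypothetical configuration: a slower learning rate compensated by more
# boosting iterations, a lasso weak learner instead of the default ridge,
# and k-means clustering of the features before fitting.
obj <- mlsauce::LSBoostRegressor(
  n_estimators = 250L,
  learning_rate = 0.05,
  solver = "lasso",
  n_clusters = 2,
  clustering_method = "kmeans"
)
print(obj$get_params())  # inspect the resulting configuration
```

Any argument left unset keeps the default shown in the usage section above.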
Value

An object of class LSBoostRegressor.
Examples

if (FALSE) {
library(datasets)

X <- as.matrix(datasets::mtcars[, -1])
y <- as.numeric(datasets::mtcars[, 1])

n <- dim(X)[1]
p <- dim(X)[2]

# 80/20 train/test split (sampling without replacement,
# so no training row is duplicated)
set.seed(21341)
train_index <- sample(x = 1:n, size = floor(0.8 * n), replace = FALSE)
test_index <- -train_index

X_train <- as.matrix(X[train_index, ])
y_train <- as.double(y[train_index])
X_test <- as.matrix(X[test_index, ])
y_test <- as.double(y[test_index])

obj <- mlsauce::LSBoostRegressor()
print(obj$get_params())
obj$fit(X_train, y_train)
print(obj$score(X_test, y_test))
}
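Beyond the score method, predictions on held-out data can be obtained directly; a minimal sketch, assuming the object exposes a scikit-learn-style predict method (wrapped in if (FALSE) like the example above, since it needs the Python backend):

```r
if (FALSE) {
# Predict on the test set and compute the root mean squared error by hand
preds <- obj$predict(X_test)
rmse <- sqrt(mean((preds - y_test)^2))
print(rmse)
}
```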