AdaOpt.Rd
AdaOpt classifier
Usage:

AdaOpt(
n_iterations = 50L,
learning_rate = 0.3,
reg_lambda = 0.1,
reg_alpha = 0.5,
eta = 0.01,
gamma = 0.01,
k = 3L,
tolerance = 0,
n_clusters = 0,
batch_size = 100L,
row_sample = 1,
type_dist = "euclidean-f",
cache = TRUE,
n_clusters_input = 0,
clustering_method = "kmeans",
cluster_scaling = "standard",
seed = 123L
)
Arguments:

n_iterations: number of iterations of the optimizer at training time

learning_rate: controls the speed of the optimizer at training time

reg_lambda: L2 regularization parameter for successive errors in the optimizer (at training time)

reg_alpha: L1 regularization parameter for successive errors in the optimizer (at training time)

eta: controls the slope in gradient descent (at training time)

gamma: controls the step size in gradient descent (at training time)

k: number of nearest neighbors selected at test time for classification

tolerance: controls early stopping in gradient descent (at training time)

n_clusters: number of clusters, if MiniBatch k-means is used at test time (for faster prediction)

batch_size: size of the batch, if MiniBatch k-means is used at test time (for faster prediction)

row_sample: proportion of rows sampled from the training set (by stratified subsampling, for faster prediction)

type_dist: distance used for finding the nearest neighbors; currently "euclidean-f" (euclidean distances computed on the whole matrix at once), "euclidean" (euclidean distances computed row by row), or "cosine" (cosine distance)

cache: whether or not the nearest neighbors are cached, for faster retrieval in subsequent calls

n_clusters_input: a priori number of clusters in the input data

clustering_method: either "kmeans" or "gmm" (Gaussian mixture model)

cluster_scaling: either "standard", "minmax" or "robust"

seed: reproducibility seed for the initial weak learner and clustering
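As an aside, the effect of k and type_dist = "euclidean-f" at test time can be illustrated with a minimal base-R sketch (this is an illustration on toy data, not mlsauce's internal implementation): all test-to-train distances are computed in one matrix operation, then each test row gets the majority label among its k nearest training rows.

```r
# Toy data: two well-separated classes (assumed for illustration only)
X_train <- matrix(c(0, 0, 0, 1, 5, 5, 6, 5), ncol = 2, byrow = TRUE)
y_train <- c(0L, 0L, 1L, 1L)
X_test  <- matrix(c(0.2, 0.1, 5.5, 5.2), ncol = 2, byrow = TRUE)
k <- 3L

# Squared euclidean distances computed "as a whole" (the "euclidean-f" idea):
# ||a - b||^2 = ||a||^2 + ||b||^2 - 2 * a.b, vectorized over all pairs
D2 <- outer(rowSums(X_test^2), rowSums(X_train^2), "+") -
  2 * tcrossprod(X_test, X_train)

# Majority vote among the k nearest training labels, per test row
preds <- apply(D2, 1, function(d) {
  nn <- order(d)[seq_len(k)]
  as.integer(names(which.max(table(y_train[nn]))))
})
print(preds)  # [1] 0 1
```

Computing the full distance matrix at once trades memory for speed, which is why the row-by-row "euclidean" variant is also offered.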
Value:

An object of class AdaOpt
Examples:

if (FALSE) {
library(datasets)
X <- as.matrix(iris[, 1:4])
y <- as.integer(iris[, 5]) - 1L
n <- dim(X)[1]
p <- dim(X)[2]
set.seed(21341)
# sample without replacement for a proper train/test split
train_index <- sample(x = 1:n, size = floor(0.8 * n), replace = FALSE)
test_index <- -train_index
X_train <- X[train_index, ]
y_train <- y[train_index]
X_test <- X[test_index, ]
y_test <- y[test_index]
obj <- mlsauce::AdaOpt()
print(obj$get_params())
obj$fit(X_train, y_train)
print(obj$score(X_test, y_test))
}