Sequential Bagging on Regression (SQB)
Methodology: remove one observation, train on the remaining data sampled without replacement, and, given the removed observation's input, predict its response. Replicate this N times. For each observation, draw a sample with replacement from these N replicates and average it; repeating this step N times yields N new responses y'. Train on these y' and predict each test observation N times; the average of the N predictions is the final prediction. The same procedure is applied to every observation.
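
The package description is terse, so the following base-R sketch is only one reading of the procedure, using lm() as a stand-in base learner; the function name seq_bag_predict, the subsample fraction, and the default N are assumptions for illustration, not the SQB package's API.

## One possible reading of the sequential-bagging procedure (illustrative only).
seq_bag_predict <- function(x_train, y_train, x_test, N = 50, frac = 0.6) {
  n <- nrow(x_train)
  pseudo <- matrix(NA_real_, n, N)
  # Step 1: for each observation i, predict its response N times from models
  # fit on subsamples drawn without replacement that exclude i.
  for (i in seq_len(n)) {
    for (b in seq_len(N)) {
      idx <- sample(setdiff(seq_len(n), i), size = floor(frac * (n - 1)))
      fit <- lm(y ~ ., data = data.frame(y = y_train[idx], x_train[idx, , drop = FALSE]))
      pseudo[i, b] <- predict(fit, newdata = x_train[i, , drop = FALSE])
    }
  }
  # Step 2: resample the N replicates with replacement and average, giving
  # N smoothed pseudo-responses y' per observation.
  y_prime <- matrix(NA_real_, n, N)
  for (i in seq_len(n)) {
    for (b in seq_len(N)) {
      y_prime[i, b] <- mean(sample(pseudo[i, ], replace = TRUE))
    }
  }
  # Step 3: refit on each column of pseudo-responses, predict the test
  # inputs, and average the N predictions.
  preds <- vapply(seq_len(N), function(b) {
    fit <- lm(y ~ ., data = data.frame(y = y_prime[, b], x_train))
    as.numeric(predict(fit, newdata = x_test))
  }, numeric(nrow(x_test)))
  rowMeans(matrix(preds, nrow = nrow(x_test)))
}

Here x_train and x_test are assumed to be data frames of predictors with matching column names (and no column named y).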

Non-Negative Tensor Decomposition (nnTensor)
Provides functions for performing non-negative matrix factorization, non-negative CANDECOMP/PARAFAC (CP) decomposition, non-negative Tucker decomposition, and for generating toy model data. See Andrzej Cichocki et al. (2009) <doi:10.1002/9780470747278> and the reference section of the GitHub README.md <https://…/nnTensor> for details of the methods.
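
As a pointer to what the matrix-factorization building block does, here is a minimal base-R sketch of NMF with Lee-Seung multiplicative updates; it is not the nnTensor API, and nmf_sketch is a hypothetical name.

## Factor a non-negative matrix V (n x m) as W %*% H with W, H >= 0.
nmf_sketch <- function(V, rank = 2, iters = 200, eps = 1e-9) {
  stopifnot(all(V >= 0))
  W <- matrix(runif(nrow(V) * rank), nrow(V), rank)   # random non-negative start
  H <- matrix(runif(rank * ncol(V)), rank, ncol(V))
  for (t in seq_len(iters)) {
    H <- H * (t(W) %*% V) / (t(W) %*% W %*% H + eps)  # multiplicative update keeps H >= 0
    W <- W * (V %*% t(H)) / (W %*% H %*% t(H) + eps)  # multiplicative update keeps W >= 0
  }
  list(W = W, H = H, error = sum((V - W %*% H)^2))    # squared Frobenius fit error
}

set.seed(1)
V <- matrix(runif(200), 20, 10)   # toy non-negative data
fit <- nmf_sketch(V, rank = 3)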

Covariate-Adjusted Tensor Classification in High-Dimensions (catch)
Performs classification and variable selection on high-dimensional tensors (multi-dimensional arrays) after adjusting for additional covariates (scalars or vectors), as in the CATCH model of Pan, Mai and Zhang (2018) <arXiv:1805.04421>. The low-dimensional covariates and the high-dimensional tensors are jointly modeled to predict a categorical outcome in a multi-class discriminant analysis setting. The Covariate-Adjusted Tensor Classification in High-dimensions (CATCH) model is fitted in two steps: (1) adjust for the covariates within each class; and (2) penalized estimation with the adjusted tensor using a cyclic block coordinate descent algorithm. The package can provide a solution path for the tuning parameter in the penalized estimation step. Special cases of the CATCH model include the linear discriminant analysis model and matrix (or tensor) discriminant analysis without covariates.
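
Step (1) of the two-step fit can be illustrated with a small base-R sketch that adjusts vectorized tensor predictors for the covariates within each class by taking per-class regression residuals; this is an assumption-laden illustration, not the catch package's implementation, and adjust_tensor is a hypothetical name.

## X: n x p matrix of vectorized tensor observations
## U: n x q matrix of low-dimensional covariates
## y: length-n factor of class labels
adjust_tensor <- function(X, U, y) {
  X_adj <- X
  for (cl in levels(y)) {
    idx <- which(y == cl)
    fit <- lm(X[idx, , drop = FALSE] ~ U[idx, , drop = FALSE])  # per-class multivariate regression
    X_adj[idx, ] <- residuals(fit)   # covariate-adjusted predictors for this class
  }
  X_adj  # passed on to the penalized estimation of step (2)
}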
