Simulation of Piecewise Linear Fuzzy Numbers (Sim.PLFN)
Defining fuzzy random variables and simulating from them have been two challenging statistical problems over the last three decades. This package is built around a particular definition of the fuzzy random variable and simulates fuzzy random variables as Piecewise Linear Fuzzy Numbers (PLFNs); see Coroianu et al. (2013) <doi:10.1016/j.fss.2013.02.005> for details about PLFNs. Statistical functions are provided for obtaining the membership function of common statistics, such as the mean, variance, sum, standard deviation and coefficient of variation. Practical advantages of the ‘Sim.PLFN’ package include: (1) easily generating/simulating a random sample of PLFNs, (2) drawing the membership functions of the simulated PLFNs or of a computed statistic, and (3) using the simulated PLFNs in arithmetic operations or importing them into further statistical computations. Finally, the ‘Sim.PLFN’ package is built on top of the ‘FuzzyNumbers’ package.
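A PLFN's membership function is linear between a finite set of knots. A minimal pure-Python sketch of evaluating such a membership function (the knot values below are illustrative and not taken from the package, which works in R on top of ‘FuzzyNumbers’):

```python
# Knots of an illustrative PLFN: support [0, 6], core [2, 4], with one
# extra knot on each branch (values invented for this sketch).
knots = [(0.0, 0.0), (1.0, 0.3), (2.0, 1.0),
         (4.0, 1.0), (5.0, 0.6), (6.0, 0.0)]

def membership(x, knots):
    """Piecewise linear membership function through the (x, grade) knots;
    the grade is 0 outside the support."""
    if x <= knots[0][0] or x >= knots[-1][0]:
        return 0.0
    for (x0, m0), (x1, m1) in zip(knots, knots[1:]):
        if x0 <= x <= x1:
            return m0 + (m1 - m0) * (x - x0) / (x1 - x0)
    return 0.0

print(membership(1.5, knots))  # ~0.65, on the rising branch
print(membership(3.0, knots))  # 1.0, inside the core
```

Because every branch is linear, arithmetic on PLFNs can be carried out knot-by-knot, which is what makes them convenient for simulation.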

Over Sampling for Time Series Classification (OSTSC)
The OSTSC package is a powerful oversampling approach for classifying univariate, but multinomial, time series data in R. This article provides a brief overview of the oversampling methodology implemented by the package, together with a tutorial. We begin with three test cases the user can run to quickly validate the functionality of the package. To demonstrate the performance impact of OSTSC, we then provide two medium-sized imbalanced time series datasets. Each example applies a TensorFlow implementation of a Long Short-Term Memory (LSTM) classifier, a type of Recurrent Neural Network (RNN) classifier, to imbalanced time series, and compares classifier performance with and without oversampling. Finally, larger versions of these two datasets are evaluated to demonstrate the scalability of the package. The examples show that OSTSC improves the performance of RNN classifiers applied to highly imbalanced time series data. In particular, OSTSC is observed to increase the AUC of the LSTM from 0.543 to 0.784 on a high-frequency trading dataset consisting of 30,000 time series observations.
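To illustrate the general idea of oversampling a minority class of time series, here is a naive SMOTE-style interpolation sketch in Python. This is only a stand-in for intuition; OSTSC's own structure-preserving oversampling procedure is more sophisticated and is not reproduced here:

```python
import random

def oversample_series(minority, n_new, seed=0):
    """Generate synthetic minority-class series as pointwise convex
    combinations of two randomly drawn minority series.
    Naive SMOTE-style stand-in, NOT the OSTSC algorithm."""
    rng = random.Random(seed)
    synthetic = []
    for _ in range(n_new):
        a, b = rng.sample(minority, 2)   # two distinct minority series
        w = rng.random()                 # blending weight in (0, 1)
        synthetic.append([w * x + (1 - w) * y for x, y in zip(a, b)])
    return synthetic
```

Appending the synthetic series to the training set rebalances the classes before the classifier (here, an LSTM) is fit.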

Random Perturbation of Count Matrices (perturbR)
The perturbR() function incrementally perturbs network edges and compares the resulting community detection solutions from the rewired networks with the solution found for the original network. These comparisons aid in understanding the stability of the original solution. The package requires symmetric, weighted (specifically, count) matrices/networks.
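The core idea of perturbing a symmetric count matrix can be sketched generically in Python; the fraction of perturbed edges is increased incrementally and the community solution is recomputed at each level. (This is an assumption-laden sketch of edge rewiring in general; perturbR's actual perturbation scheme may differ in detail.)

```python
import random

def perturb_counts(mat, alpha, seed=0):
    """Shuffle a fraction `alpha` of the upper-triangle entries of a
    symmetric count matrix among themselves, mirroring the result so
    the matrix stays symmetric. Generic sketch, not perturbR itself."""
    rng = random.Random(seed)
    n = len(mat)
    out = [row[:] for row in mat]
    pairs = [(i, j) for i in range(n) for j in range(i + 1, n)]
    chosen = rng.sample(pairs, max(1, int(alpha * len(pairs))))
    vals = [out[i][j] for i, j in chosen]
    rng.shuffle(vals)                    # permute the selected edge counts
    for (i, j), v in zip(chosen, vals):
        out[i][j] = out[j][i] = v        # mirror to preserve symmetry
    return out
```

Comparing the community detection solution of each rewired matrix with the original solution (for example via an agreement index) then indicates how stable the original solution is.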

Estimation and Inference in Random Attention Models (ramchoice)
It is widely documented in psychology, economics and other disciplines that socio-economic agents do not pay full attention to all available choices, rendering standard revealed-preference theory invalid. This package implements the estimation and inference procedures documented in Cattaneo, Ma, Masatlioglu and Suleymanov (2017) <http://…-Masatlioglu-Suleymanov_2017_RAM.pdf>, which utilize standard choice data to partially identify and estimate a decision maker’s preferences. For inference, several simulation-based critical values are provided.

Europe SpatialPolygonsDataFrame Builder (spMaps)
Build a custom Europe SpatialPolygonsDataFrame; if you do not know what a SpatialPolygonsDataFrame is, see SpatialPolygons() in ‘sp’. It can be used, for example, with mapLayout() in ‘antaresViz’. Antares is powerful software developed by RTE to simulate and study electric power systems (more information about ‘Antares’ here: <> ).

Gaussian Process with Histogram Intersection Kernel (gpHist)
Provides an implementation of Gaussian process regression with a histogram intersection kernel (HIK), using approximations to speed up learning and prediction. In contrast to a squared exponential kernel, an HIK offers advantages such as linear memory and learning-time requirements; however, it only provides a piecewise-linear approximation of the function. Furthermore, the number of estimated eigenvalues is reduced. The eigenvalues and eigenvectors are required for the approximation of the log-likelihood function as well as the approximation of the predicted variance of new samples. This package provides approximations for a single eigenvalue as well as for multiple eigenvalues. Further information on the variance and log-likelihood approximations, as well as the Gaussian process with HIK, can be found in the paper by Rodner et al. (2016) <doi:10.1007/s11263-016-0929-y>.
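The histogram intersection kernel itself is simple to state: k(x, y) = Σᵢ min(xᵢ, yᵢ). A minimal Python sketch of the kernel and its Gram matrix (the GP machinery, eigenvalue approximations and speedups in gpHist are not reproduced here):

```python
def hik(x, y):
    """Histogram intersection kernel: k(x, y) = sum_i min(x_i, y_i)."""
    return sum(min(a, b) for a, b in zip(x, y))

def gram(histograms):
    """Kernel (Gram) matrix for a list of histograms."""
    return [[hik(h1, h2) for h2 in histograms] for h1 in histograms]

H = [[1.0, 2.0, 0.0], [0.5, 1.0, 3.0], [2.0, 0.0, 1.0]]
K = gram(H)
```

Since min(xᵢ, ·) is piecewise linear in each coordinate, a GP posterior mean of the form Σᵢ αᵢ k(x, xᵢ) is itself piecewise linear in x, which is the source of the piecewise-linear approximation mentioned above.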