
WhatIs-X

X-12-ARIMA Seasonal Adjustment Program X-12-ARIMA is seasonal adjustment software produced by the US Census Bureau.
Features include:
· Extensive time series modeling and model selection capabilities for linear regression models with ARIMA errors (regARIMA models);
· Many seasonal and trend filter options;
· Diagnostics of the quality and stability of the adjustments achieved under the options selected;
· The ability to efficiently process many series at once.
The X-12-ARIMA seasonal adjustment program of the US Census Bureau extracts the different components (mainly the seasonal, trend, outlier and irregular components) of a monthly or quarterly time series. It is the state-of-the-art technology for seasonal adjustment used in many statistical offices. The program can include a moving holiday effect, a trading day effect and user-defined regressors, and it additionally incorporates automatic outlier detection. The procedure makes additive or multiplicative adjustments and creates an output data set containing the adjusted time series and intermediate calculations.
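A minimal sketch of running such an adjustment from Python, using statsmodels' wrapper around the Census Bureau's X-13ARIMA-SEATS program (the successor to X-12-ARIMA); it assumes the x13as binary is installed on the PATH, and the synthetic series is purely illustrative:

```python
# Minimal sketch: seasonally adjusting a monthly series through statsmodels'
# wrapper around the Census Bureau's X-13ARIMA-SEATS binary (successor to
# X-12-ARIMA). Assumes the x13as executable is installed and on the PATH
# (otherwise pass its location via x12path=...).
import numpy as np
import pandas as pd
from statsmodels.tsa.x13 import x13_arima_analysis

rng = np.random.default_rng(0)
t = np.arange(120)
# Illustrative series: trend + yearly seasonality + noise.
values = 100 + 0.5 * t + 10 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 2, 120)
series = pd.Series(values, index=pd.period_range("2010-01", periods=120, freq="M"))

result = x13_arima_analysis(series, outlier=True, trading=True)

adjusted = result.seasadj        # seasonally adjusted series
trend = result.trend             # trend component
irregular = result.irregular     # irregular component
print(adjusted.head())
```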
XDATA XDATA is developing an open source software library for big data to help overcome the challenges of effectively scaling to modern data volume and characteristics. The program is developing the tools and techniques to process and analyze large sets of imperfect, incomplete data. Its programs and publications focus on the areas of analytics, visualization, and infrastructure to efficiently fuse, analyze and disseminate these large volumes of data.
X-GAN Image reconstruction, including image restoration and denoising, is a challenging problem in the field of image computing. We present a new method, called X-GANs, for reconstruction of arbitrarily corrupted sources based on a variant of conditional generative adversarial networks (conditional GANs). In our method, a novel generator and multi-scale discriminators are proposed, as well as combined adversarial losses, which integrate a VGG perceptual loss, an adversarial perceptual loss, and an elaborate corresponding-point loss based on the analysis of image features. Our conditional GANs enable a variety of applications in image reconstruction, including image denoising, image restoration from very sparse sampling, image inpainting, and image recovery from severely polluted blocks or even color-noise-dominated images, extreme cases that have not been addressed by existing methods. We significantly improve the accuracy and quality of image reconstruction. Extensive perceptual experiments on datasets ranging from human faces to natural scenes demonstrate that images reconstructed by the presented approach are considerably more realistic than those produced by alternative methods. Our method can also be extended to handle high-ratio image compression.
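A hedged PyTorch sketch of the kind of combined objective described above, pairing an adversarial loss with a feature-space perceptual loss; the tiny feature network stands in for the pretrained VGG used in the paper, the generator and discriminator are placeholders, and the loss weights are assumptions:

```python
# Illustrative sketch of combining an adversarial loss with a perceptual loss
# computed in a feature space. FeatureNet stands in for the pretrained VGG
# features the paper uses; Generator/Discriminator are placeholders and the
# loss weights are assumptions, not the paper's settings.
import torch
import torch.nn as nn

class FeatureNet(nn.Module):          # stand-in for VGG feature extraction
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU())
    def forward(self, x):
        return self.net(x)

generator = nn.Sequential(nn.Conv2d(3, 3, 3, padding=1))                    # placeholder G
discriminator = nn.Sequential(nn.Conv2d(3, 8, 4, stride=2, padding=1), nn.ReLU(),
                              nn.Flatten(), nn.LazyLinear(1))               # placeholder D
features = FeatureNet().eval()

bce = nn.BCEWithLogitsLoss()
l1 = nn.L1Loss()

corrupted = torch.rand(4, 3, 64, 64)   # corrupted inputs (toy data)
clean = torch.rand(4, 3, 64, 64)       # ground-truth images (toy data)

fake = generator(corrupted)
d_fake = discriminator(fake)

adv_loss = bce(d_fake, torch.ones_like(d_fake))            # fool the discriminator
perc_loss = l1(features(fake), features(clean))            # perceptual (feature-space) loss
pixel_loss = l1(fake, clean)                               # pointwise reconstruction loss

g_loss = adv_loss + 10.0 * perc_loss + 100.0 * pixel_loss  # illustrative weights
g_loss.backward()
```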
xgboost A gradient boosting (GBDT, GBRT or GBM) library for large-scale and distributed machine learning, running on a single node, on Hadoop YARN and more.
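A minimal sketch of fitting a gradient-boosted model with the xgboost Python package's scikit-learn interface; the synthetic data and hyperparameters are illustrative only:

```python
# Minimal sketch: train and query an xgboost regressor on synthetic data.
import numpy as np
import xgboost as xgb

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))
y = 2.0 * X[:, 0] + np.sin(X[:, 1]) + rng.normal(scale=0.1, size=500)

model = xgb.XGBRegressor(n_estimators=200, max_depth=3, learning_rate=0.1)
model.fit(X, y)
print(model.predict(X[:5]))
```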
xGEM This work proposes xGEMs, or manifold-guided exemplars, a framework for understanding black-box classifier behavior by exploring the landscape of the underlying data manifold as data points cross decision boundaries. To do so, we train an unsupervised implicit generative model, treated as a proxy for the data manifold. We summarize black-box model behavior quantitatively by perturbing data samples along the manifold. We demonstrate xGEMs' ability to detect and quantify bias in model learning and to track changes in model behavior as training progresses.
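A sketch of the core idea under simplifying assumptions: starting from a sample's latent code, perturb the code so the generated point crosses the black-box classifier's decision boundary toward a target class while staying close to the original code. The generator and classifier below are untrained placeholders; in practice both would be trained models:

```python
# Manifold-guided exemplar sketch: optimize a latent code so the decoded point
# flips the classifier's decision while staying near the starting code.
import torch
import torch.nn as nn

latent_dim, data_dim = 8, 20
generator = nn.Sequential(nn.Linear(latent_dim, 32), nn.ReLU(), nn.Linear(32, data_dim))
classifier = nn.Sequential(nn.Linear(data_dim, 16), nn.ReLU(), nn.Linear(16, 2))

z0 = torch.randn(1, latent_dim)            # latent code of the query sample
z = z0.clone().requires_grad_(True)
target = torch.tensor([1])                 # class we want to cross into
opt = torch.optim.Adam([z], lr=0.05)
ce = nn.CrossEntropyLoss()

for step in range(200):
    opt.zero_grad()
    x = generator(z)                       # stay on the learned manifold
    loss = ce(classifier(x), target) + 0.1 * (z - z0).pow(2).sum()
    loss.backward()
    opt.step()

exemplar = generator(z).detach()           # manifold-guided counterfactual
```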
X-Means Extends k-means with efficient estimation of the number of clusters.
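A sketch of the X-Means idea built on scikit-learn's KMeans: recursively try splitting each cluster in two and keep the split only if a (here simplified) Bayesian Information Criterion improves. The BIC formula and parameters are simplified assumptions, not the reference implementation:

```python
# Simplified X-Means sketch: split clusters while a spherical-Gaussian BIC improves.
import numpy as np
from sklearn.cluster import KMeans

def bic(points, centers, labels):
    """Simplified spherical-Gaussian BIC for a k-means clustering."""
    n, d = points.shape
    k = len(centers)
    if n <= k:
        return -np.inf
    sse = sum(np.sum((points[labels == i] - c) ** 2) for i, c in enumerate(centers))
    var = sse / (d * (n - k)) + 1e-12          # pooled within-cluster variance
    log_lik = 0.0
    for i in range(k):
        n_i = int(np.sum(labels == i))
        if n_i == 0:
            continue
        log_lik += (n_i * np.log(n_i / n)
                    - 0.5 * n_i * d * np.log(2 * np.pi * var)
                    - 0.5 * (n_i - 1) * d)
    n_params = k * (d + 1)                     # centers plus one variance per cluster
    return log_lik - 0.5 * n_params * np.log(n)

def xmeans(points, k_init=2, k_max=20, seed=0):
    centers = list(KMeans(n_clusters=k_init, n_init=10,
                          random_state=seed).fit(points).cluster_centers_)
    while len(centers) < k_max:
        labels = KMeans(n_clusters=len(centers), init=np.array(centers),
                        n_init=1).fit(points).labels_
        new_centers = []
        for i, c in enumerate(centers):
            members = points[labels == i]
            if len(members) < 4:               # too small to split
                new_centers.append(c)
                continue
            child = KMeans(n_clusters=2, n_init=10, random_state=seed).fit(members)
            parent_bic = bic(members, [c], np.zeros(len(members), dtype=int))
            child_bic = bic(members, child.cluster_centers_, child.labels_)
            new_centers.extend(child.cluster_centers_ if child_bic > parent_bic else [c])
        if len(new_centers) == len(centers):   # no split improved BIC: stop
            break
        centers = new_centers
    return np.array(centers)

# Toy usage: three well-separated blobs should yield roughly three centers.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(loc=mu, scale=0.5, size=(100, 2)) for mu in (0.0, 5.0, 10.0)])
print(len(xmeans(X)))
```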
XNOR Neural Engine Binary Neural Networks (BNNs) promise to deliver accuracy comparable to conventional deep neural networks at a fraction of the cost in terms of memory and energy. In this paper, we introduce the XNOR Neural Engine (XNE), a fully digital configurable hardware accelerator IP for BNNs, integrated within a microcontroller unit (MCU) equipped with an autonomous I/O subsystem and hybrid SRAM / standard cell memory. The XNE is able to fully compute convolutional and dense layers autonomously or in cooperation with the core in the MCU to realize more complex behaviors. We show post-synthesis results in 65nm and 22nm technology for the XNE IP and post-layout results in 22nm for the full MCU, indicating that this system can drop the energy cost to 21.6 fJ per binary operation at 0.4V, while remaining flexible and performant enough to execute state-of-the-art BNN topologies such as ResNet-34 in less than 2.2 mJ per frame at 8.9 fps.
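For illustration only, the arithmetic such an engine exploits can be sketched in a few lines: with weights and activations restricted to {-1, +1}, a dot product reduces to a bitwise XNOR followed by a population count. This is a software illustration of the principle, not the XNE's actual datapath:

```python
# Binary dot product via XNOR + popcount, the operation a BNN accelerator speeds up.
import numpy as np

rng = np.random.default_rng(0)
w = rng.choice([-1, 1], size=64)          # binary weights
x = rng.choice([-1, 1], size=64)          # binary activations

# Reference: ordinary multiply-accumulate dot product.
dot_mac = int(np.dot(w, x))

# Binary version: encode -1 as bit 0 and +1 as bit 1, XNOR, then popcount.
w_bits = (w > 0).astype(np.uint8)
x_bits = (x > 0).astype(np.uint8)
xnor = 1 - (w_bits ^ x_bits)              # 1 where the signs agree
popcount = int(xnor.sum())
dot_xnor = 2 * popcount - len(w)          # equals the {-1,+1} dot product

assert dot_mac == dot_xnor
print(dot_mac, dot_xnor)
```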
Xu The exponential growth of information on the Internet has created a big challenge for retrieval systems in terms of yielding relevant results. This challenge requires automatic approaches for reformulating or expanding users' queries to increase recall. Query expansion (QE), a technique for broadening users' queries by appending additional tokens or phrases based on semantic similarity metrics, plays a crucial role in overcoming this challenge. However, such a procedure increases computational complexity and may introduce unwanted noise into information retrieval. This paper attempts to push the state of the art of QE by developing an automated technique that uses high-dimensional clustering of word vectors to create effective expansions with reduced noise. We implemented a command-line tool, named Xu, and evaluated its performance against a dataset of news articles, concluding that on average, expansions generated using this technique outperform both those generated by previous approaches and the base user query.
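A sketch of the general approach under stated assumptions: cluster vocabulary terms in word-vector space and expand a query token only with sufficiently similar terms from its own cluster, which limits noisy expansions. The toy vectors, thresholds and helper names are illustrative, not Xu's implementation:

```python
# Embedding-based query expansion with clustering used to filter out noisy terms.
import numpy as np
from sklearn.cluster import KMeans

# Toy word vectors (in practice: pretrained embeddings such as word2vec or GloVe).
vocab = {
    "car":      np.array([0.90, 0.10, 0.00]),
    "vehicle":  np.array([0.85, 0.15, 0.05]),
    "truck":    np.array([0.80, 0.20, 0.10]),
    "banana":   np.array([0.05, 0.90, 0.10]),
    "fruit":    np.array([0.10, 0.85, 0.15]),
    "election": np.array([0.10, 0.10, 0.90]),
}
words = list(vocab)
vectors = np.stack([vocab[w] for w in words])
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(vectors)

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def expand(query_tokens, threshold=0.95, max_terms=2):
    expanded = list(query_tokens)
    for tok in query_tokens:
        if tok not in vocab:
            continue
        cluster = labels[words.index(tok)]
        candidates = [(cosine(vocab[tok], vocab[w]), w)
                      for w, lab in zip(words, labels) if lab == cluster and w != tok]
        for sim, w in sorted(candidates, reverse=True)[:max_terms]:
            if sim >= threshold and w not in expanded:
                expanded.append(w)
    return expanded

print(expand(["car", "election"]))   # e.g. ['car', 'election', 'vehicle', 'truck']
```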
Xy Simulating supervised learning data. With Xy() you can conveniently simulate regression data. The simulation can be very specific, since the user has many degrees of freedom. For instance, the functional shape, and hence the polynomial degree of nonlinearity, can be manipulated. Interactions can be formed and (co)variances altered. For a more detailed motivation you can visit our blog
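Since Xy() itself is an R interface, the following is only a generic NumPy sketch of the same idea, simulating regression data with a nonlinear effect, an interaction and a user-chosen covariance; it is not the Xy() API:

```python
# Generic regression-data simulation: correlated predictors, polynomial and
# interaction effects, plus irreducible noise. Illustrative only.
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# Correlated predictors with a user-chosen covariance matrix.
cov = np.array([[1.0, 0.3],
                [0.3, 1.0]])
x1, x2 = rng.multivariate_normal(mean=[0, 0], cov=cov, size=n).T

# Functional shape: quadratic effect, linear effect, and an interaction term.
signal = 2.0 * x1 ** 2 - 1.5 * x2 + 0.8 * x1 * x2
y = signal + rng.normal(scale=1.0, size=n)      # irreducible noise

X = np.column_stack([x1, x2])
print(X.shape, y.shape)
```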