
What is …

0 | 1 | 2 | A | B | C | D | E | F | G | H | I | J | K | L | M | N | O | P | Q | R | S | T | U | V | W | X | Y | Z
|What is …| = 5732

0

0.632 Bootstrap Sampling with replacement. In a single draw, a particular instance has probability (1 − 1/n) of not being picked; after n draws with replacement, the probability that it never appears is (1 − 1/n)^n ≈ e^(−1) ≈ 0.368, and these left-out instances serve as test (out-of-bag) data. The bootstrap training sample therefore contains a fraction 1 − (1 − 1/n)^n of the distinct original instances, i.e. approximately 63.2%.
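As a quick sanity check on the 63.2% figure, the sketch below (not from the source; function and variable names are illustrative) draws bootstrap samples and measures the fraction of distinct instances that land in the training sample.

```python
import math
import random

def bootstrap_split(data, rng=random):
    """Draw a bootstrap sample (with replacement) and return (train, out_of_bag)."""
    n = len(data)
    picked = [rng.randrange(n) for _ in range(n)]
    picked_set = set(picked)
    train = [data[i] for i in picked]
    out_of_bag = [x for i, x in enumerate(data) if i not in picked_set]
    return train, out_of_bag

# Fraction of distinct instances that appear in the bootstrap training sample.
n, trials = 1000, 200
data = list(range(n))
in_sample = sum(n - len(bootstrap_split(data)[1]) for _ in range(trials)) / (trials * n)
print(f"empirical fraction in training sample: {in_sample:.3f}")   # ~0.632
print(f"theoretical 1 - (1 - 1/n)^n          : {1 - (1 - 1/n)**n:.3f}")
print(f"limit 1 - 1/e                        : {1 - math.exp(-1):.3f}")
```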

abc, abc.data, abctools

1

1-Nearest-Neighbor-Based Multiclass Learning This paper deals with Nearest-Neighbor (NN) learning algorithms in metric spaces. Initiated by Fix and Hodges in 1951, this seemingly simplistic learning paradigm remains competitive against more sophisticated methods and, in its celebrated k-NN version, has been placed on a solid theoretical foundation. Although the classic 1-NN is well known to be inconsistent in general, in recent years a series of papers has presented variations on the theme of a regularized 1-NN classifier as an alternative to the Bayes-consistent k-NN. Gottlieb et al. showed that approximate nearest neighbor search can act as a regularizer, actually improving generalization performance rather than just injecting noise. A follow-up work showed that applying Structural Risk Minimization to (essentially) the margin-regularized, data-dependent bound yields a strongly Bayes-consistent 1-NN classifier. A further development has seen margin-based regularization analyzed through the lens of sample compression: a near-optimal nearest-neighbor condensing algorithm was presented and later extended to cover semimetric spaces; an activized version also appeared. As detailed in these works, margin-regularized 1-NN methods enjoy a number of statistical and computational advantages over the traditional k-NN classifier. Salient among these are explicit data-dependent generalization bounds, and considerable runtime and memory savings. Sample compression affords additional advantages, in the form of tighter generalization bounds and increased efficiency in time and space.
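For reference, the classic (unregularized) 1-NN rule that the entry starts from can be stated in a few lines. This sketch uses Euclidean distance and illustrative names; it does not implement the margin-regularized or compression-based variants discussed above.

```python
import math

def one_nn_classify(train, query, dist=math.dist):
    """Classic 1-NN rule: return the label of the training point nearest to the query."""
    _, label = min(((dist(x, query), y) for x, y in train), key=lambda pair: pair[0])
    return label

# Toy training set of (point, label) pairs in the Euclidean plane.
train = [((0.0, 0.0), "a"), ((1.0, 1.0), "a"), ((5.0, 5.0), "b"), ((6.0, 5.0), "b")]
print(one_nn_classify(train, (0.5, 0.2)))   # -> "a"
print(one_nn_classify(train, (5.5, 4.9)))   # -> "b"
```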
1-of-n Code A special case of constant-weight codes is the one-of-N code, which encodes log_2 N bits in a code word of N bits. The one-of-two code uses the code words 01 and 10 to encode the bits ‘0’ and ‘1’. A one-of-four code can use the words 0001, 0010, 0100, 1000 to encode the two-bit values 00, 01, 10, and 11.
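A minimal encoder/decoder sketch for the one-of-N code described above (function names are illustrative): setting bit i of the N-bit word for value i reproduces the one-of-four words 0001, 0010, 0100, 1000.

```python
def one_of_n_encode(value, n):
    """Encode an integer in [0, n) as an n-bit word with exactly one bit set."""
    if not 0 <= value < n:
        raise ValueError("value out of range")
    return format(1 << value, f"0{n}b")

def one_of_n_decode(word):
    """Recover the integer from a one-of-N code word."""
    bits = int(word, 2)
    if bits == 0 or bits & (bits - 1):      # must have exactly one bit set
        raise ValueError("not a valid one-of-N code word")
    return bits.bit_length() - 1

# One-of-four code: the two-bit values 00..11 map to 0001, 0010, 0100, 1000.
for v in range(4):
    print(f"{v:02b} -> {one_of_n_encode(v, 4)}")
```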
ACDm

2

2P-DNN Machine Learning as a Service (MLaaS), such as Microsoft Azure or Amazon AWS, offers effective DNN models for machine learning tasks to small businesses and individuals who lack the data and computing power to train their own. However, this raises the issue that user privacy is exposed to the MLaaS server, since users need to upload their sensitive data to it. To preserve their privacy, users can encrypt their data before uploading it, but this makes it difficult to run the DNN model, because it is not designed to operate in the ciphertext domain. In this paper, using the Paillier homomorphic cryptosystem, we present a new Privacy-Preserving Deep Neural Network model that we call 2P-DNN. This model can fulfill machine learning tasks in the ciphertext domain, so that MLaaS can provide a privacy-preserving machine learning service to users. We build our 2P-DNN model based on LeNet-5 and test it with the encrypted MNIST dataset. The classification accuracy is more than 97%, which is close to that of LeNet-5 running on plaintext MNIST and higher than that of other existing privacy-preserving machine learning models.
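The enabling property here is Paillier's additive homomorphism: ciphertexts can be added, and multiplied by plaintext constants, without decryption, which is what lets a server evaluate the linear parts of a network (convolutions, fully connected layers) on encrypted inputs. The sketch below is not the authors' 2P-DNN implementation; it is a minimal textbook Paillier scheme with illustrative names and insecurely small primes, used only to show an encrypted dot product.

```python
import math
import random

def L(u, n):                          # Paillier's L function: L(u) = (u - 1) / n
    return (u - 1) // n

def keygen(p=1000003, q=1000033):     # toy primes; real deployments need >= 1024-bit primes
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    mu = pow(lam, -1, n)              # valid because we fix the generator g = n + 1
    return n, (lam, mu, n)            # (public key, private key)

def encrypt(n, m, rng=random):
    n2 = n * n
    r = rng.randrange(1, n)           # random blinding factor
    return (pow(n + 1, m, n2) * pow(r, n, n2)) % n2

def decrypt(sk, c):
    lam, mu, n = sk
    return (L(pow(c, lam, n * n), n) * mu) % n

def add(n, c1, c2):                   # Enc(m1) * Enc(m2) mod n^2  ->  Enc(m1 + m2)
    return (c1 * c2) % (n * n)

def scalar_mul(n, c, k):              # Enc(m)^k mod n^2           ->  Enc(k * m)
    return pow(c, k, n * n)

# A server holding plaintext weights w evaluates w . x on a client's encrypted x.
pk, sk = keygen()
x = [3, 1, 4]                         # client's private input
w = [2, 5, 7]                         # server's model weights (plaintext)
cts = [encrypt(pk, xi) for xi in x]   # client encrypts and uploads
acc = encrypt(pk, 0)
for ci, wi in zip(cts, w):
    acc = add(pk, acc, scalar_mul(pk, ci, wi))
print(decrypt(sk, acc))               # client decrypts: 3*2 + 1*5 + 4*7 = 39
```

Non-linear layers (activations, pooling) cannot be computed this way, which is why schemes in this space either approximate them or hand intermediate results back to the client.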