Transformer: A Novel Neural Network Architecture for Language Understanding

Neural networks, in particular recurrent neural networks (RNNs), are now at the core of the leading approaches to language understanding tasks such as language modeling, machine translation and question answering. In Attention Is All You Need, we introduce the Transformer, a novel neural network architecture based on a self-attention mechanism that we believe to be particularly well suited for language understanding. In our paper, we show that the Transformer outperforms both recurrent and convolutional models on academic English-to-German and English-to-French translation benchmarks. On top of higher translation quality, the Transformer requires less computation to train and is a much better fit for modern machine learning hardware, speeding up training by up to an order of magnitude.
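To make the self-attention idea concrete, here is a minimal toy sketch of scaled dot-product attention, the core operation the paper builds on, written in base R. The matrices Q, K and V, the three positions and the dimension d_k are made-up illustrative values, not something taken from the post:

    # Toy scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V
    set.seed(1)
    d_k <- 4                                    # key/query dimension (illustrative)
    Q <- matrix(rnorm(3 * d_k), nrow = 3)       # 3 query positions
    K <- matrix(rnorm(3 * d_k), nrow = 3)       # 3 key positions
    V <- matrix(rnorm(3 * d_k), nrow = 3)       # 3 value vectors
    scores   <- Q %*% t(K) / sqrt(d_k)          # how strongly each position attends to every other
    weights  <- sweep(exp(scores), 1, rowSums(exp(scores)), "/")  # row-wise softmax
    attended <- weights %*% V                   # each output row is a weighted mix of all value vectors

Every output position can draw on every input position in a single step, which removes the sequential dependency that makes recurrent models slow to train.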


Train, Score, Repeat, Watch Out! Zillow’s Andrew Martin on modeling pitfalls in a dynamic world.

The $1 Million Zillow Prize is a Kaggle competition challenging data scientists to push the accuracy of Zestimates (automated home value estimates). As the competition heats up, we’ve invited Andrew Martin, Sr. Data Science Manager at Zillow, to write about how his team handles the challenges of delivering new predictions on a daily basis and how the mechanics of the Zillow Prize competition have been structured to account for these challenges.


Top 10 Machine Learning Use Cases: Part 1

This post is the first in a series whose aim is to shake up our intuitions about what machine learning is making possible in specific sectors: to look beyond the set of use cases that always come to mind.


How rOpenSci uses Code Review to Promote Reproducible Science

At rOpenSci, we create and curate software to help scientists with the data life cycle. These tools access, download, manage, and archive scientific data in open, reproducible ways. Early on, we realized this could only be a community effort. The variety of scientific data and workflows could only be tackled by drawing on the contributions of scientists with field-specific expertise. With the community approach came challenges. How could we ensure the quality of code written by scientists without formal training in software development practices? How could we drive adoption of best practices among our contributors? How could we create a community that would support each other in this work? We have had great success addressing these challenges via peer review. We draw elements from a process familiar to our target community – academic peer review – and a practice from the software development world – production code review – to create a system that fosters software quality, ongoing education, and community development.


RcppAnnoy 0.0.9

A new version, 0.0.9, of RcppAnnoy, our Rcpp-based R integration of the nifty Annoy library by Erik Bernhardsson, is now on CRAN. Annoy is a small and lightweight C++ template header library for very fast approximate nearest neighbours.
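For readers who want to try it from R, a minimal usage sketch: the dimension, item count and number of trees below are illustrative choices rather than values from the post, and the method names follow the examples shipped with the package.

    library(RcppAnnoy)
    d <- 10                                   # dimension of the vectors to index
    a <- new(AnnoyEuclidean, d)               # index using Euclidean distance
    set.seed(1)
    for (i in 0:99) a$addItem(i, runif(d))    # Annoy items are 0-indexed
    a$build(50)                               # build 50 trees; more trees improves recall
    a$getNNsByItem(0, 5)                      # 5 approximate nearest neighbours of item 0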


Probability functions intermediate

In this set of exercises, we are going to explore some of the probability functions in R through practical applications. Basic probability knowledge is required. In case you are not familiar with the apply function, check the R documentation. Note: we are going to use random number functions in R, such as runif. One issue with these functions is that every time you run them, you obtain different values. To make your results reproducible, you can set the seed with set.seed() (passing any integer) before calling a random function. (If you are not familiar with seeds, think of them as the tracking number of your random number process.) For this set of exercises, we will use set.seed(1). Don't forget to specify it before every exercise that includes random numbers.
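As a quick illustration of why the seed matters, a minimal sketch (the printed values are indicative of what R returns for this seed):

    set.seed(1)
    runif(3)     # e.g. 0.2655087 0.3721239 0.5728534
    runif(3)     # three different values: the generator has moved on
    set.seed(1)
    runif(3)     # the same three values as the first call, so the result is reproducible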


Text featurization with the Microsoft ML package

Last week I wrote about how you can use the MicrosoftML package in Microsoft R to featurize images: reduce an image to a vector of 4096 numbers that quantify the essential characteristics of the image, according to an AI vision model. You can perform a similar featurization process with text as well, but in this case you have much more control over the features used to represent the text.
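As a rough sketch of what that looks like in code, assuming the MicrosoftML featurizeText() transform applied through rxFeaturize(); the two-row data frame, its column name and the output variable name are made up for illustration and are not taken from the post:

    library(MicrosoftML)
    reviews <- data.frame(text = c("the camera is great",
                                   "battery life is terrible"),
                          stringsAsFactors = FALSE)
    # featurizeText turns the raw strings into n-gram count features; its options
    # (n-gram length, stop-word removal, weighting) are where you control how the
    # text is represented.
    feats <- rxFeaturize(data = reviews,
                         mlTransforms = list(featurizeText(vars = c(Features = "text"))))
    head(feats)

Unlike the fixed 4096-dimensional image featurizer, these options let you decide which features stand in for the text.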


TensorFlow Tutorial and Examples for Beginners

This tutorial was designed to make diving into TensorFlow easy, through examples. For readability, it includes both notebooks and source code with explanations. It is suitable for beginners who want clear and concise TensorFlow examples. Besides the traditional ‘raw’ TensorFlow implementations, you can also find the latest TensorFlow API practices (such as layers, estimator, dataset, …).


177 Data Science Key Terms, Explained

This is a collection of 177 data science key terms, explained with a no-nonsense, concise approach. Read on to find terminology related to Big Data, machine learning, natural language processing, descriptive statistics, and much more.