**10 Machine Learning Algorithms You Should Know in 2018**

• Decision tree

• Random forest

• Logistic regression

• Support vector machine

• Naive Bayes

• k-Nearest Neighbors

• k-means

• AdaBoost

• Neural network

• Markov
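
To make one of the listed algorithms concrete, here is a minimal k-Nearest Neighbors classifier in plain Python. This is an illustrative sketch only (the names `knn_predict` and the toy data are invented here), not code from the linked article:

```python
from collections import Counter
import math

def knn_predict(train_points, train_labels, query, k=3):
    """Classify `query` by majority vote among its k nearest training points."""
    # Euclidean distance from the query to every training point.
    dists = [
        (math.dist(p, query), label)
        for p, label in zip(train_points, train_labels)
    ]
    # Take the k closest points and return their most common label.
    k_nearest = sorted(dists)[:k]
    votes = Counter(label for _, label in k_nearest)
    return votes.most_common(1)[0][0]

# Toy example: two well-separated clusters on a line.
X = [(0.0,), (1.0,), (2.0,), (10.0,), (11.0,), (12.0,)]
y = ["low", "low", "low", "high", "high", "high"]
near_first = knn_predict(X, y, (1.5,))
near_second = knn_predict(X, y, (10.5,))
```

Queries near the first cluster vote "low" and queries near the second vote "high"; the only hyperparameter is k, which trades off noise sensitivity against decision-boundary smoothness.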

**Unreasonable Effectiveness of Deep Learning**

We show how the well-known rules of backpropagation arise from a weighted combination of finite automata. By recasting a finite automaton as a predictor, we combine the set of all k-state finite automata using a weighted majority algorithm. This aggregated prediction algorithm can be simplified using symmetry, and we prove the equivalence of the simplified algorithm. We demonstrate that it is equivalent to a form of backpropagation acting on a fully connected k-node neural network. The weighted majority algorithm therefore allows a bound on the general performance of deep-learning approaches to prediction via known results from online statistics. The presented framework raises more detailed questions about network topology; it is a bridge to the well-studied techniques of semigroup theory, which can be applied to determine what specific network topologies are capable of predicting. This informs both the design of artificial networks and the exploration of neuroscience models.
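The weighted majority algorithm the abstract builds on can be sketched in a few lines. This is a generic illustration of the classic algorithm (experts as prediction streams, multiplicative penalty `beta` for mistakes), not the paper's automaton construction:

```python
def weighted_majority(expert_predictions, outcomes, beta=0.5):
    """Run the weighted majority algorithm over a pool of experts.

    expert_predictions: one 0/1 prediction stream per expert.
    outcomes: the true 0/1 outcome at each step.
    Returns the algorithm's predictions and the final expert weights.
    """
    n = len(expert_predictions)
    weights = [1.0] * n
    predictions = []
    for t, outcome in enumerate(outcomes):
        # Predict by weighted vote: the side with more total weight wins.
        vote_one = sum(w for w, preds in zip(weights, expert_predictions) if preds[t] == 1)
        vote_zero = sum(weights) - vote_one
        predictions.append(1 if vote_one >= vote_zero else 0)
        # Multiplicatively penalize every expert that was wrong this step.
        weights = [
            w * (beta if preds[t] != outcome else 1.0)
            for w, preds in zip(weights, expert_predictions)
        ]
    return predictions, weights

# Three experts: one always right, one always wrong, one alternating.
experts = [[1, 1, 1, 1], [0, 0, 0, 0], [1, 0, 1, 0]]
truth = [1, 1, 1, 1]
preds, final_weights = weighted_majority(experts, truth)
```

The always-correct expert is never penalized, so its weight stays at 1.0 while the others decay geometrically; the standard mistake bound for this scheme is logarithmic in the number of experts, which is what lets the paper bound the aggregated predictor's performance.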

**Rethinking–or Remembering–Generalization in Neural Networks**

In this post, I discuss part of our argument, looking at the basic ideas of Statistical Learning Theory (SLT) and how they actually compare to what we do in practice. In a future post, I will describe the arguments from Statistical Mechanics (Stat Mech), and why they provide a better, albeit more complicated, theory.

**Fastai Collaborative Filtering with R and Reticulate**

Jeremy Howard and Rachel Thomas are the founders of fast.ai, whose aim is to make deep learning accessible to all. They offer a course called Practical Deep Learning for Coders (Part 1). The last session, taught by Jeremy, was in Fall 2017, and the videos were released in early January 2018. Their approach is top-down: different applications are shown first as black boxes, followed by progressive peeling of the black box to teach the details of how things work. The course uses Python, and they have developed a Python library, fastai, that is a wrapper around PyTorch.
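The collaborative filtering that fastai teaches boils down to learning user and item embeddings whose dot product approximates observed ratings. Below is a generic matrix-factorization sketch of that idea in plain Python, fit by stochastic gradient descent; it is not fastai's or reticulate's actual API, and the names `factorize` and `predict` and the toy ratings are invented for illustration:

```python
import random

def factorize(ratings, n_users, n_items, n_factors=2, lr=0.02, epochs=1000, seed=0):
    """Fit user/item embeddings so their dot product approximates the ratings."""
    rng = random.Random(seed)
    U = [[rng.uniform(-0.1, 0.1) for _ in range(n_factors)] for _ in range(n_users)]
    V = [[rng.uniform(-0.1, 0.1) for _ in range(n_factors)] for _ in range(n_items)]
    for _ in range(epochs):
        for u, i, r in ratings:
            pred = sum(U[u][f] * V[i][f] for f in range(n_factors))
            err = r - pred
            for f in range(n_factors):
                uf, vf = U[u][f], V[i][f]
                # Gradient step on the squared error of this observed rating.
                U[u][f] += lr * err * vf
                V[i][f] += lr * err * uf
    return U, V

def predict(U, V, u, i):
    """Predicted rating: dot product of a user and an item embedding."""
    return sum(a * b for a, b in zip(U[u], V[i]))

# Tiny ratings table: (user, item, rating) triples.
ratings = [(0, 0, 5.0), (0, 1, 1.0), (1, 0, 4.0), (1, 1, 1.0), (2, 1, 5.0)]
U, V = factorize(ratings, n_users=3, n_items=2)
```

After training, `predict(U, V, 0, 0)` recovers roughly the observed rating of 5, and unseen user/item pairs get scores from the same dot product; fastai's collab learner adds biases and trains the embeddings with PyTorch, but the underlying idea is this factorization.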
