Introduction to Gradient Descent Algorithm (along with variants) in Machine Learning

Optimization is always the ultimate goal, whether you are dealing with a real-life problem or building a software product. I, as a computer science student, always fiddled with optimizing my code to the extent that I could brag about its fast execution. Optimization basically means getting the optimal output for your problem. If you read the recent article on optimization, you would be acquainted with how optimization plays an important role in our real life. Optimization in machine learning is slightly different. Generally, while optimizing, we know exactly what our data looks like and what areas we want to improve. But in machine learning we have no clue what our “new data” looks like, let alone how to optimize on it. So in machine learning, we perform optimization on the training data and check its performance on new validation data.
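
To make the idea concrete, here is a minimal sketch of plain (batch) gradient descent fitting a linear model on synthetic data. The data, learning rate, and iteration count are illustrative assumptions, not values from the article.

```python
# Minimal sketch: batch gradient descent for linear regression on synthetic
# data. The learning rate and iteration count are illustrative choices.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 1))
y = 3.0 * X[:, 0] + 2.0 + rng.normal(scale=0.1, size=100)

# Add a bias column so the intercept is learned along with the weight.
Xb = np.hstack([X, np.ones((X.shape[0], 1))])
w = np.zeros(2)
lr = 0.1

for _ in range(500):
    preds = Xb @ w
    grad = Xb.T @ (preds - y) / len(y)  # gradient of the mean squared error
    w -= lr * grad                      # step against the gradient

print(w)  # should end up close to [3.0, 2.0]
```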


Deep Learning Resource Matrix

For those of you who have an interest and/or involvement in “Deep Learning”, or want to learn more, I’ve created this matrix. It is by no means all-inclusive, but it will provide you with a landscape of some Deep Learning resources to get you started or complement resources you might already have. The original version is available here as a 5-page PDF document.


AI vs Deep Learning vs Machine Learning

Which of these terms means the same thing: AI, Deep Learning, Machine Learning? Are you sure? While there is overlap, none of these is a complete subset of the others, and none completely explains the others.


5 algorithms to train a neural network

The procedure used to carry out the learning process in a neural network is called the training algorithm. There are many different training algorithms, with different characteristics and performance.


K-Means & Other Clustering Algorithms: A Quick Intro with Python

Clustering is the grouping of objects together so that objects belonging to the same group (cluster) are more similar to each other than to those in other groups (clusters). In this intro cluster analysis tutorial, we’ll check out a few algorithms in Python.
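
As a taste of what the tutorial covers, here is a quick K-Means sketch with scikit-learn; the synthetic blobs and the choice of k=3 are illustrative assumptions, not taken from the tutorial itself.

```python
# K-Means on synthetic blobs with scikit-learn (illustrative parameters).
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

X, _ = make_blobs(n_samples=300, centers=3, random_state=42)

kmeans = KMeans(n_clusters=3, n_init=10, random_state=42)
labels = kmeans.fit_predict(X)

print(kmeans.cluster_centers_)  # learned cluster centers
print(labels[:10])              # cluster assignment for the first few points
```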


Iris Dataset and Xgboost Simple Tutorial

I had the opportunity to start using the xgboost machine learning algorithm; it is fast and shows good results. Here I will be using multiclass prediction with the iris dataset from scikit-learn.
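
For reference, here is a minimal multiclass sketch along those lines with xgboost and the iris data from scikit-learn; the hyperparameters are illustrative assumptions rather than the ones used in the post.

```python
# Multiclass prediction on iris with xgboost (illustrative hyperparameters).
import xgboost as xgb
from sklearn.datasets import load_iris
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

iris = load_iris()
X_train, X_test, y_train, y_test = train_test_split(
    iris.data, iris.target, test_size=0.2, random_state=42)

dtrain = xgb.DMatrix(X_train, label=y_train)
dtest = xgb.DMatrix(X_test, label=y_test)

params = {"objective": "multi:softmax", "num_class": 3, "max_depth": 3}
bst = xgb.train(params, dtrain, num_boost_round=20)

preds = bst.predict(dtest).astype(int)  # softmax objective returns class labels
print(accuracy_score(y_test, preds))
```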


Software Engineering vs Machine Learning Concepts

Not all core concepts from software engineering translate into the machine learning universe. Here are some differences I’ve noticed.


Building views with R


R 3.3.3 is released!

R 3.3.3 (codename “Another Canoe”) was released yesterday.


Ensemble Learning in R

Previous research in data mining has devised numerous algorithms for learning tasks. While an individual algorithm might already work decently, one can usually obtain better predictive performance by combining several. This approach is referred to as ensemble learning. Common examples include random forests and boosting, AdaBoost in particular.
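
The linked article works in R; as a language-neutral illustration of the same idea, here is a scikit-learn sketch that combines several individual learners into a single voting ensemble. The choice of base models and dataset is an illustrative assumption, not taken from the article.

```python
# Illustrative ensemble: combine several learners into one voting classifier.
from sklearn.datasets import load_iris
from sklearn.ensemble import (AdaBoostClassifier, RandomForestClassifier,
                              VotingClassifier)
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = load_iris(return_X_y=True)

ensemble = VotingClassifier(estimators=[
    ("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
    ("ada", AdaBoostClassifier(random_state=0)),
    ("lr", LogisticRegression(max_iter=1000)),
])

# The combined model is evaluated like any single estimator.
print(cross_val_score(ensemble, X, y, cv=5).mean())
```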