Ople: AI-Driven Data Science – World-class deep learning in days, not months

Ople uses advanced machine learning to automate and optimize Data Science processes, accelerating the delivery of new, more accurate models from months to minutes. With Ople, business executives will gain a first-mover advantage on new opportunities, team leaders will deliver faster, more accurate results, and Data Scientists will be able to focus on actual business objectives, not just workflow.


New Decimal Systems – Great Sandbox for Data Scientists and Mathematicians

We illustrate pattern recognition techniques applied to an interesting mathematical problem: the representation of a number in non-conventional systems, generalizing the familiar base-2 or base-10 systems. The emphasis is on data science rather than mathematical theory, and the style is that of a tutorial, requiring minimal knowledge of mathematics or statistics. However, some off-the-beaten-path, state-of-the-art number theory research is discussed here, in a way that is accessible to college students after a first course in statistics. This article is also peppered with mathematical and statistical oddities, for instance the fact that there are units of information smaller than the bit. You will also learn how the discovery process works, as I have included research that I thought would lead me to interesting results, but did not. In most published research, only the final, successful results are presented, while the bulk of the work leads to dead ends and never reaches the reader. Here is your chance to discover these hidden steps, and my thought process!
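
To make the central idea concrete, here is a minimal sketch (my own illustration, not code from the article) of the greedy digit expansion of a number x in (0, 1) in an arbitrary real base b > 1. For integer b this reproduces the familiar binary or decimal digits; for non-integer b (say, the golden ratio) it produces one of the non-conventional representations the article studies.

```python
import math

def greedy_digits(x, b, n_digits=10):
    """First n_digits of x in base b via the greedy algorithm.

    At each step, multiply by the base, peel off the integer part as
    the next digit, and keep the fractional remainder. For non-integer
    b the digits lie in {0, ..., ceil(b) - 1}.
    """
    digits = []
    for _ in range(n_digits):
        x *= b
        d = math.floor(x)
        digits.append(d)
        x -= d
    return digits

# Base 10 recovers the familiar decimal digits of pi - 3:
print(greedy_digits(math.pi - 3, 10))          # [1, 4, 1, 5, 9, 2, 6, 5, 3, 5]
# A golden-ratio base gives a non-conventional expansion of the same number:
print(greedy_digits(math.pi - 3, (1 + 5 ** 0.5) / 2))
```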


GFLASSO: Graph-Guided Fused LASSO in R

While the field of machine learning advances swiftly with the development of increasingly sophisticated techniques, little attention has been given to high-dimensional multi-task problems that require the simultaneous prediction of multiple responses. This tutorial will show you the power of the Graph-Guided Fused LASSO (GFLASSO) in predicting multiple responses under a single regularized linear regression framework.
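
The tutorial itself is in R; as a rough orientation, here is a Python sketch (my own, under one common formulation of GFLASSO, not the tutorial's code) of the penalty that sits on top of the usual squared-error loss: an L1 term for sparsity, plus a fusion term that pulls the coefficient profiles of two responses toward each other whenever they are connected in the response graph.

```python
import numpy as np

def gflasso_penalty(B, edges, lam, gamma):
    """B: (p, K) coefficient matrix over p features and K responses.
    edges: list of (k, l, r_kl) tuples, where r_kl is the correlation
    between responses k and l in the response graph.
    The full objective adds the fit term ||Y - XB||_F^2 to this penalty.
    """
    l1 = lam * np.abs(B).sum()  # sparsity across all coefficients
    fusion = sum(
        abs(r) * np.abs(B[:, k] - np.sign(r) * B[:, l]).sum()
        for k, l, r in edges
    )
    return l1 + gamma * fusion
```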


Visualizing Time Series Data of Stock Prices

Time series data, simply put, is a set of data points collected at regular time intervals. We encounter time series data every day in our lives: stock prices, real estate market prices, energy usage at our homes, and so on. So why should we care about this data? Because understanding time series data, especially stock prices, could put you on a path to making $$$. Visualizing time series data plays a key role in identifying patterns in graphs and predicting future observations for making informed decisions. Some properties associated with time series data are trend (upward, downward, or stationary), seasonality (repeating patterns driven by seasonal factors), and cyclicality (patterns with no fixed period of repetition). Instead of focusing on forecasting analyses, we'll guide you through the first step in time series analysis: visualization.
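
As a taste of that first step, here is a minimal sketch (using synthetic random-walk data, not the article's dataset) of a common starting visualization: plotting a price series alongside a rolling mean, which smooths short-term noise and makes the trend visible.

```python
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

# Synthetic "daily close" prices: a random walk over business days.
rng = np.random.default_rng(0)
dates = pd.date_range("2017-01-01", periods=500, freq="B")
price = pd.Series(100 + rng.normal(0.05, 1, len(dates)).cumsum(), index=dates)

ax = price.plot(label="daily close", alpha=0.6)
price.rolling(window=30).mean().plot(ax=ax, label="30-day rolling mean")
ax.set_xlabel("date")
ax.set_ylabel("price")
ax.legend()
plt.show()
```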


Announcing the Google Cloud Platform Research Credits Program

Scientists across nearly every discipline are researching ever larger and more complex data sets, using tremendous amounts of compute power to learn, make discoveries, and build new tools that few could have imagined only a few years ago. Traditionally, this kind of research has been limited by the availability of resources, with only the largest universities or industry partners able to pursue these endeavors. However, cloud computing has been removing obstacles that many researchers used to face, enabling projects that use machine learning tools to understand and address student questions, projects that study robotic interactions with humans, and many more. To ensure that more researchers have access to powerful cloud tools, we're launching Google Cloud Platform (GCP) research credits, a new program aimed at supporting faculty in qualified regions who want to take advantage of GCP's compute, analytics, and machine learning capabilities for research. Higher education researchers can use GCP research credits in a multitude of ways; below are just three examples to illustrate how GCP can help propel your research forward.


Building Convolutional Neural Network using NumPy from Scratch

Using existing models from ML/DL libraries can be helpful in some cases, but to gain better control and understanding, you should try to implement them yourself. This article shows how a CNN can be implemented using just NumPy. The convolutional neural network (CNN) is the state-of-the-art technique for analyzing multidimensional signals such as images. Libraries such as TensorFlow and Keras already implement CNNs; they isolate the developer from some of the details and provide an abstract API to make life easier and avoid complexity in the implementation. In practice, however, such details can make a difference, and sometimes the data scientist has to work through them to improve performance. The solution in such situations is to build every piece of the model yourself, which gives the highest possible level of control over the network. Implementing such models yourself is also a good way to understand them better. In this article, a CNN is created using only the NumPy library. Just three layers are created: convolution (conv for short), ReLU, and max pooling.
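
To give a flavor of the approach, here is a minimal sketch (simplified to a single 2-D filter; not the article's exact code) of those three layers in pure NumPy: a valid convolution, ReLU, and 2x2 max pooling with stride 2.

```python
import numpy as np

def conv2d(img, kernel):
    """Valid cross-correlation of a 2-D image with a 2-D kernel."""
    kh, kw = kernel.shape
    oh, ow = img.shape[0] - kh + 1, img.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    return np.maximum(x, 0)

def max_pool(x, size=2, stride=2):
    """Take the max over non-overlapping size x size windows."""
    oh = (x.shape[0] - size) // stride + 1
    ow = (x.shape[1] - size) // stride + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = x[i * stride:i * stride + size,
                          j * stride:j * stride + size].max()
    return out

img = np.random.rand(8, 8)
edge_filter = np.array([[1.0, 0.0, -1.0]] * 3)  # simple vertical-edge detector
feature_map = max_pool(relu(conv2d(img, edge_filter)))
print(feature_map.shape)  # (3, 3)
```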


Deep Learning – A Primer – Slide Deck

Deep Learning is one of the ‘hot’ topics in the AI area – a lot of hype, a lot of inflated expectations, but also quite a few impressive success stories. As some AI experts already predict that Deep Learning will become ‘Software 2.0’, it might be a good time to have a closer look at the topic. In this session I will try to give a comprehensive overview of Deep Learning. We will start with a bit of history and some theoretical foundations that we will use to create a little Deep Learning taxonomy. Then we will have a look at current and upcoming application areas: Where can we apply Deep Learning successfully, and what differentiates it from other approaches? Afterwards we will examine the ecosystem: Which tools and libraries are available? What are their strengths and weaknesses? And to complete the session, we will look into some practical code examples and the typical pitfalls of Deep Learning. After this session, you will have a much better idea of the why, what, and how of Deep Learning, including if and how you might want to apply it to your own work.


Choosing the Right Metric for Evaluating ML Models – Part 1

In the world of postmodernism, Relativism has been, in its various guises, one of the most popular and most reviled philosophical doctrines. According to Relativism, there is no universal and objective truth; rather, each point of view has its own truth. You must be wondering why I am discussing it and how it is even related to Data Science. Well, in this post, I will discuss the usefulness of each error metric depending on the objective and the problem we are trying to solve. When someone tells you that “the USA is the best country”, the first question you should ask is on what basis the statement is being made. Are we judging each country on the basis of its economic status, its health facilities, or something else? Similarly, each machine learning model is trying to solve a problem with a different objective using a different dataset, and hence it is important to understand the context before choosing a metric.
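
A tiny illustration (my own numbers, not from the post) of why the choice matters: MAE and RMSE can rank the same two models differently, because RMSE penalizes large errors much more heavily than small ones.

```python
import numpy as np

y_true  = np.array([10.0, 12.0, 11.0, 13.0, 12.0])
model_a = np.array([11.0, 13.0, 12.0, 14.0, 13.0])  # consistently off by 1
model_b = np.array([10.0, 12.0, 11.0, 13.0, 16.0])  # perfect except one big miss

def mae(y, p):
    return np.mean(np.abs(y - p))

def rmse(y, p):
    return np.sqrt(np.mean((y - p) ** 2))

print(mae(y_true, model_a), rmse(y_true, model_a))  # 1.0, 1.0
print(mae(y_true, model_b), rmse(y_true, model_b))  # 0.8, ~1.79
# MAE prefers model B; RMSE prefers model A. Which is "better" depends on
# whether large errors are disproportionately costly for your problem.
```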