Unlike my regular blog posts, this one is going to be a very short one - crisp and to the point. Deep Learning has been touted as the next big thing in data analytics, and things have gotten so hyped that a lot of people (even practitioners) have started to consider it magic. I'm … Continue reading Neural Networks – A Linear Algebra Perspective
Mixture Density Networks: Probabilistic Regression for Uncertainty Estimation
Uncertainty is all around us. It is present in every decision we make, every action we take. And this is especially true in business decisions, where we plan for the future. But in spite of that, most of the predictive models we use in business ignore uncertainty. Suppose you are the manager of the … Continue reading Mixture Density Networks: Probabilistic Regression for Uncertainty Estimation
Neural Oblivious Decision Ensembles (NODE) – A State-of-the-Art Deep Learning Algorithm for Tabular Data
Deep Learning brought about revolutions in many machine learning problems in the fields of Computer Vision, Natural Language Processing, Reinforcement Learning, etc. But tabular data still remains firmly in the territory of classical machine learning algorithms, namely the gradient boosting algorithms (I have a whole series on different Gradient Boosting algorithms, if you are interested). Intuitively, this is strange, … Continue reading Neural Oblivious Decision Ensembles (NODE) – A State-of-the-Art Deep Learning Algorithm for Tabular Data
PyTorch Tabular – A Framework for Deep Learning for Tabular Data
It is common knowledge that Gradient Boosting models, more often than not, kick the asses of every other machine learning model when it comes to Tabular Data. I have written extensively about Gradient Boosting, the theory behind it, and covered the different implementations like XGBoost, LightGBM, CatBoost, NGBoost, etc. in detail. The unreasonable effectiveness of Deep … Continue reading PyTorch Tabular – A Framework for Deep Learning for Tabular Data
How to Train and Deploy Custom AI-Generated Quotes using GPT2, FastAPI, and ReactJS
The Problem Good quotes help make us stronger. What is truly inspiring about quotes is not their tone or contentedness but how those who share them reflect life experiences that really serve others. I didn't write the above quote about quotes (Quote-ception), but an AI model I trained did. And it says it better than I … Continue reading How to Train and Deploy Custom AI-Generated Quotes using GPT2, FastAPI, and ReactJS
Intermittent Demand Forecasting with Deep Renewal Processes
Let's face it: anyone who has worked on Time Series Forecasting problems in retail, logistics, e-commerce, etc. will have definitely cursed that long tail which never behaves. The dreaded intermittent time series makes the job of a forecaster difficult. This nuisance renders most of the standard forecasting techniques impractical and raises questions about the … Continue reading Intermittent Demand Forecasting with Deep Renewal Processes
Does ImageNet Pretraining Work for Chest Radiography Images (COVID-19)?
We are under siege. A siege by an unknown enemy. An enemy by which we are befuddled. And unless you were living under a rock for the past couple of months (like Jared Leto), you know what I'm talking about - COVID-19. Whether you turn on the news or scroll through social media, the majority of … Continue reading Does ImageNet Pretraining Work for Chest Radiography Images (COVID-19)?
The Gradient Boosters VI(A): Natural Gradient
We are taking a brief detour from the series to understand what the Natural Gradient is. The next algorithm we examine in the Gradient Boosting world is NGBoost, and to understand it completely, we need to understand what Natural Gradients are. Pre-reads: I will be talking about KL Divergence, and if you are unfamiliar with the … Continue reading The Gradient Boosters VI(A): Natural Gradient
Deep Learning and Information Theory
If you have tried to understand the maths behind machine learning, including deep learning, you would have come across topics from Information Theory - Entropy, Cross Entropy, KL Divergence, etc. The concepts from information theory are ever present in the realm of machine learning, right from the splitting criteria of a Decision Tree to loss … Continue reading Deep Learning and Information Theory
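The excerpt above names entropy, cross-entropy, and KL divergence. As a minimal, self-contained numeric sketch (not from the post itself, with the two example distributions `p` and `q` chosen purely for illustration), the three quantities are tied together by the identity H(p, q) = H(p) + D_KL(p || q):

```python
import math

def entropy(p):
    """H(p) = -sum_i p_i * log(p_i), in nats."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def cross_entropy(p, q):
    """H(p, q) = -sum_i p_i * log(q_i)."""
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q) if pi > 0)

def kl_divergence(p, q):
    """D_KL(p || q) = sum_i p_i * log(p_i / q_i)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.7, 0.2, 0.1]   # "true" distribution
q = [0.5, 0.3, 0.2]   # a model's predicted distribution

h = entropy(p)
ce = cross_entropy(p, q)
kl = kl_divergence(p, q)

# Cross-entropy decomposes into entropy plus KL divergence,
# which is why minimizing cross-entropy loss (for fixed p)
# is the same as minimizing the KL divergence to p.
assert abs(ce - (h + kl)) < 1e-12
```

Note that KL divergence is always non-negative and is zero exactly when the two distributions match, which is what makes it usable as a measure of how far a model's predictions are from the target distribution.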