Author: Sudhanshu Kumar

XGBoost for Regression[Case Study]

Using Gradient Boosting for Regression Problems Introduction: The goal of this blog post is to equip beginners with the basics of the gradient boosting regressor algorithm and help them quickly build their first model. We will mainly focus on the modeling side; the data cleaning and preprocessing parts will be covered in detail in an upcoming post. Gradient Boosting for regression builds an ad...
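
As a rough preview of the modeling step, here is a minimal gradient boosting regression sketch on synthetic data; the dataset and hyperparameters are placeholders, not the case study's actual values.

# Minimal gradient boosting regression sketch (synthetic data, illustrative settings)
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

X, y = make_regression(n_samples=500, n_features=8, noise=10.0, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = GradientBoostingRegressor(n_estimators=200, learning_rate=0.1, max_depth=3, random_state=42)
model.fit(X_train, y_train)

preds = model.predict(X_test)
print("Test RMSE:", mean_squared_error(y_test, preds) ** 0.5)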

XGBoost for Classification[Case Study]

Boost Your ML Skills with XGBoost Introduction: In this blog we will discuss one of the popular boosting ensemble algorithms, XGBoost. XGBoost is among the most widely used machine learning algorithms today. Whether the problem is regression or classification, it frequently delivers better results than many other ML algorithms. Extreme Gradient Boosting (xgboost) is similar to gradient bo...
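
A minimal sketch of fitting an XGBoost classifier, assuming the xgboost package is installed; the synthetic data and hyperparameters are illustrative only.

# Minimal XGBoost classification sketch (synthetic data, illustrative settings)
from xgboost import XGBClassifier
from sklearn.datasets import make_classification
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

clf = XGBClassifier(n_estimators=100, max_depth=4, learning_rate=0.1)
clf.fit(X_train, y_train)
print("Test accuracy:", accuracy_score(y_test, clf.predict(X_test)))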

Understanding Principal Component Analysis(PCA)

Principal Component Analysis: Implement from Scratch and Validate with the sklearn Framework Introduction: “Excess of everything is bad.” That line holds especially true in machine learning. When the data has too many dimensions, pattern learning becomes a problem. Too much information hurts two things: compute and execution time, and the quality of the model fit. When the dimension of...
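
A rough sketch of the from-scratch-versus-sklearn comparison the post describes is shown below; the random data, the choice of two components, and the tolerance are assumptions for illustration only.

# From-scratch PCA via eigendecomposition of the covariance matrix, cross-checked against sklearn
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))

# Center the data, compute the covariance matrix, and take the top-2 eigenvectors
X_centered = X - X.mean(axis=0)
cov = np.cov(X_centered, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)      # eigenvalues returned in ascending order
order = np.argsort(eigvals)[::-1]           # sort descending by explained variance
components = eigvecs[:, order[:2]]          # first two principal components
X_scratch = X_centered @ components

# sklearn's PCA should agree up to the sign of each component
X_sklearn = PCA(n_components=2).fit_transform(X)
print("Projections match (up to sign):",
      np.allclose(np.abs(X_scratch), np.abs(X_sklearn), atol=1e-6))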

Simple Logistic Regression[Case Study]

Logic Behind Simple Logistic Regression Introduction: The goal of this blog post is to get beginners started with the fundamental concepts of simple logistic regression and help them quickly build their first simple logistic regression model. We will mainly focus on learning to build your first logistic regression model; the data cleaning and preprocessing parts will be covered ...

Simple Linear Regression[Case Study]

Simple Progression Towards Simple Linear Regression Introduction: The goal of this blog post is to get beginners started with the basics of linear regression concepts and help them quickly build their first linear regression model. We will mainly focus on the modeling side; the data cleaning and preprocessing parts will be covered in detail in an upcoming post. Linear Regression are...
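
A bare-bones sketch of a single-feature linear regression fit on made-up numbers (purely illustrative; not the post's dataset):

# Simple (one-feature) linear regression sketch on hypothetical data
import numpy as np
from sklearn.linear_model import LinearRegression

X = np.array([[1], [2], [3], [4], [5], [6]])   # single feature, e.g. years of experience
y = np.array([30, 35, 42, 48, 55, 61])         # target values (made up)

model = LinearRegression().fit(X, y)
print("slope:", model.coef_[0], "intercept:", model.intercept_)
print("prediction for x=7:", model.predict([[7]])[0])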

Random Forest for Regression[Case Study]

Using Random Forests for Regression Problems Introduction: The goal of this blog post is to equip beginners with the basics of the Random Forest Regressor algorithm and help them quickly build their first model. We will mainly focus on the modeling side; the data cleaning and preprocessing parts will be covered in detail in an upcoming post. Ensemble methods are supervised learning models which...
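
A minimal random forest regression sketch on synthetic data (the data and hyperparameters are placeholders, not the case study's):

# Minimal random forest regression sketch
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

X, y = make_regression(n_samples=500, n_features=6, noise=5.0, random_state=1)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=1)

rf = RandomForestRegressor(n_estimators=200, random_state=1)   # an ensemble of decision trees
rf.fit(X_train, y_train)
print("Test R^2:", r2_score(y_test, rf.predict(X_test)))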

Random Forest for Car Quality[Case Study]

Find Your Way Out of the Data Forest with Random Forest Introduction: In this blog we will discuss one of the most widely used ensemble machine learning algorithms, Random Forest. The goal of this blog post is to get beginners started with the fundamental concepts of a Random Forest and help them quickly build their first Random Forest model. The motive behind this tutorial is to get you s...

Polynomial Logistic Regression[Case Study]

Understand the Power of Polynomials with Polynomial Regression Polynomial regression is a special case of linear regression; the main idea lies in how you select your features. Consider multivariate regression with two variables, x1 and x2. Linear regression will look like this: y = a1 * x1 + a2 * x2. Now suppose you want polynomial regression (let’s make it a degree-2 polynomial). We will c...
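
A small sketch of that degree-2 feature construction, using sklearn's PolynomialFeatures on placeholder x1/x2 values (the numbers and pipeline are illustrative, not the post's code):

# Degree-2 polynomial feature expansion for two variables x1 and x2
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

X = np.array([[1.0, 2.0], [2.0, 3.0], [3.0, 5.0], [4.0, 4.0]])   # columns: x1, x2
y = np.array([5.0, 14.0, 34.0, 33.0])                            # made-up targets

poly = PolynomialFeatures(degree=2, include_bias=False)
print(poly.fit_transform(X)[0])                       # first row expanded to x1, x2, x1^2, x1*x2, x2^2
print(poly.get_feature_names_out(["x1", "x2"]))       # names of the generated features

# "Polynomial regression" is then just linear regression on the expanded features
model = make_pipeline(PolynomialFeatures(degree=2, include_bias=False), LinearRegression())
model.fit(X, y)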

PCA for Fast ML

Speeding Up and Benchmarking Logistic Regression With PCA Introduction: When the data has too many dimensions, pattern learning becomes a problem. Too much information hurts two things: compute and execution time, and the quality of the model fit. When the dimension of the data is too high we need to find a way to reduce it, but that reduction has to be done in such a way tha...
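
As a rough illustration of the benchmarking idea, the sketch below times logistic regression with and without PCA on sklearn's digits dataset; the dataset, the 95% variance threshold, and the max_iter value are stand-ins, not the post's actual setup.

# Logistic regression on raw features vs. PCA-reduced features: time and accuracy
import time
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_digits(return_X_y=True)              # 64 pixel features per sample
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

for n_components in (None, 0.95):                # None = no PCA, 0.95 = keep 95% of the variance
    Xtr, Xte = X_train, X_test
    if n_components is not None:
        pca = PCA(n_components=n_components).fit(Xtr)
        Xtr, Xte = pca.transform(Xtr), pca.transform(Xte)
    start = time.time()
    clf = LogisticRegression(max_iter=5000).fit(Xtr, y_train)
    print(n_components, round(time.time() - start, 3), "s, accuracy:", clf.score(Xte, y_test))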

Naive Bayes Algorithm [Case Study]

Introduction: Naive Bayes is a classification technique based on Bayes’ Theorem with an assumption of independence among predictors. In simple terms, a Naive Bayes classifier assumes that the presence of a particular feature in a class is unrelated to the presence of any other feature. For example, a dress may be considered to be a shirt if it is red, pr...
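
To make the independence assumption concrete, here is a small sketch using sklearn's GaussianNB on synthetic data; the dataset and settings are placeholders rather than the post's actual case study.

# Naive Bayes sketch: under the independence assumption,
# P(class | features) is proportional to P(class) * product over i of P(x_i | class)
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

X, y = make_classification(n_samples=400, n_features=5, random_state=7)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=7)

nb = GaussianNB()                  # models each feature as Gaussian within each class
nb.fit(X_train, y_train)
print("Test accuracy:", nb.score(X_test, y_test))
print("Class probabilities for one sample:", nb.predict_proba(X_test[:1]))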

Multivariate MultiLabel Classification with Logistic Regression[Case Study]

Multivariate multilabel classification with Logistic Regression Introduction: The goal of this blog post is to show you how logistic regression can be applied to do multi-class classification. We will mainly focus on learning to build a multivariate logistic regression model for multi-class classification. The data cleaning and preprocessing parts will be covered in detail in an upcoming post...
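
A minimal sketch of multi-class classification with logistic regression, using the iris dataset as a stand-in (the post's own dataset and preprocessing are not reproduced here):

# Multi-class classification with logistic regression on a three-class dataset
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)          # three classes, four features
X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)

clf = LogisticRegression(max_iter=1000)    # handles multiple classes out of the box
clf.fit(X_train, y_train)
print("Test accuracy:", clf.score(X_test, y_test))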

Multivariate Linear Regression[Case Study]

Learn to Make Predictions Using Multiple Variables Introduction: The goal of this blog post is to equip beginners with the basics of the Linear Regression algorithm with multiple features, also known as multivariable Linear Regression, and help them quickly build their first model. We will mainly focus on the modeling side; the data cleaning and preprocessing parts will be covered in ...
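
A minimal multivariable linear regression sketch with several features and one target, again on synthetic stand-in data rather than the case study's dataset:

# Linear regression with multiple features (multivariable linear regression)
from sklearn.datasets import make_regression
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

X, y = make_regression(n_samples=300, n_features=4, noise=8.0, random_state=3)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=3)

model = LinearRegression().fit(X_train, y_train)
print("coefficient per feature:", model.coef_)
print("Test R^2:", model.score(X_test, y_test))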
