Algorithm

Binary Search in Scala (Iterative, Recursive, and Tail-Recursive Approaches)

Binary Search is a fast and efficient algorithm for finding an element in a sorted list of elements. It works on the divide-and-conquer technique.

Data structure: Array
Time complexity: worst case O(log n), average case O(log n), best case O(1)
Space complexity: O(1) for the iterative version (plain recursion uses O(log n) stack space, which tail recursion avoids)

Let's see how it can be implemented in Scala with different approaches:
– Iterative approach
– Recursive approach
– Tail-recursive approach

Driver program:

val arr = Array(1, 2, 4, 5, 6, 7)
val target = 7
println(binarySearch_iterative(arr, target) match {
  case -1 => s"$target doesn't exist in ${arr.mkString("[", ",", "]")}"
  case index => s"$target exists at index $index"
})
println(binarySearch_Recursive(arr, target)() match {
  case -1 => s"$target doesn't match"
  case index => s"$target exists a...
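The search routines that the driver program calls are cut off in this excerpt, so here is a hedged sketch of the iterative and tail-recursive variants. The name binarySearch_iterative matches the driver; the tail-recursive helper's name and exact shape are illustrative assumptions, not the post's code.

```scala
import scala.annotation.tailrec

// Iterative variant: shrink the [low, high] window until the target is found.
// Returns the index of `target` in the sorted array, or -1 if absent.
def binarySearch_iterative(arr: Array[Int], target: Int): Int = {
  var low = 0
  var high = arr.length - 1
  while (low <= high) {
    val mid = low + (high - low) / 2 // avoids overflow of (low + high)
    if (arr(mid) == target) return mid
    else if (arr(mid) < target) low = mid + 1
    else high = mid - 1
  }
  -1
}

// Tail-recursive variant: same logic, with the window carried as parameters.
// @tailrec makes the compiler verify the call compiles to a loop (O(1) stack).
def binarySearch_tailRecursive(arr: Array[Int], target: Int): Int = {
  @tailrec
  def loop(low: Int, high: Int): Int =
    if (low > high) -1
    else {
      val mid = low + (high - low) / 2
      if (arr(mid) == target) mid
      else if (arr(mid) < target) loop(mid + 1, high)
      else loop(low, mid - 1)
    }
  loop(0, arr.length - 1)
}
```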

Find the average of all contiguous subarrays of a fixed size

Given an array, find the average of all contiguous subarrays of size 'n' in it.

Array: [1, 3, 2, 6, -1, 4, 1, 8, 2], n=5
Output: [2.2, 2.8, 2.4, 3.6, 2.8]

Solution: the Sliding Window technique solves this in a single pass by reusing the previous window's sum instead of recomputing it.
Time complexity: O(n)
Space complexity: O(1) (excluding the output array)
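The post's solution code is not included in this excerpt; a minimal sliding-window sketch in Scala, with an assumed function name, could look like this:

```scala
// Sliding window: keep a running window sum, add the element entering the
// window and subtract the one leaving it, so each element is visited once.
def slidingWindowAverages(arr: Array[Int], n: Int): Array[Double] = {
  require(n > 0 && n <= arr.length, "window size must fit in the array")
  val result = new Array[Double](arr.length - n + 1)
  var windowSum = arr.take(n).sum          // sum of the first window
  result(0) = windowSum.toDouble / n
  for (end <- n until arr.length) {
    windowSum += arr(end) - arr(end - n)   // slide the window one step right
    result(end - n + 1) = windowSum.toDouble / n
  }
  result
}
```

On the example above, slidingWindowAverages(Array(1, 3, 2, 6, -1, 4, 1, 8, 2), 5) yields [2.2, 2.8, 2.4, 3.6, 2.8].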

Find a pair in the array whose sum is equal to the given target

Given a sorted array of numbers and a target sum, find a pair in the array whose sum equals the given target. Write a function that returns the indices of the two numbers (i.e. the pair) that add up to the target.

Example 1:
Input: [1, 2, 3, 4, 6], target=6
Output: [1, 3]
Explanation: the numbers at indices 1 and 3 add up to 6: 2+4=6

Solution: we can use the Two Pointers approach, starting one pointer at each end of the array and moving them inward based on the current sum.
Time complexity: O(n)
Space complexity: O(1)
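The two-pointers idea above can be sketched in Scala as follows (the function name and the [-1, -1] "not found" convention are assumptions of this sketch, not the post's code):

```scala
// Two pointers on a sorted array: if the current sum is too small, only moving
// the left pointer right can increase it; if too large, only moving the right
// pointer left can decrease it. Returns the pair of indices, or [-1, -1].
def pairWithTargetSum(arr: Array[Int], target: Int): Array[Int] = {
  var left = 0
  var right = arr.length - 1
  while (left < right) {
    val sum = arr(left) + arr(right)
    if (sum == target) return Array(left, right)
    else if (sum < target) left += 1
    else right -= 1
  }
  Array(-1, -1)
}
```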

Naive Bayes Algorithm [Case Study]

Introduction: Naive Bayes is a classification technique based on Bayes' Theorem with an assumption of independence among predictors. In simple terms, a Naive Bayes classifier assumes that the presence of a particular feature in a class is unrelated to the presence of any other feature. For example, a piece of clothing may be classified as a shirt if it is red, printed, and has full sleeves. Even if these features depend on each other or on the existence of the other features, each of them independently contributes to the probability that the item is a shirt, and that is why the method is known as 'naive'. Classification in Machine Learning is a technique of learning where a particular instance is mapped to one of many labels. The labels are pre...
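The independence assumption described above reduces to scoring each label by P(label) times the product of the per-feature likelihoods P(feature value | label). A toy categorical Naive Bayes in Scala, with a made-up clothing dataset echoing the shirt example (not the post's case-study data), might look like:

```scala
// Toy categorical Naive Bayes: score(label) = prior * product of per-feature
// likelihoods, estimated by counting, with add-one (Laplace) smoothing so an
// unseen feature value never zeroes out the whole product.
case class Example(features: Vector[String], label: String)

def naiveBayesPredict(train: Seq[Example], query: Vector[String]): String = {
  val labels = train.map(_.label).distinct
  def score(label: String): Double = {
    val subset = train.filter(_.label == label)
    val prior = subset.size.toDouble / train.size
    val likelihood = query.zipWithIndex.map { case (value, i) =>
      val matches = subset.count(_.features(i) == value)
      (matches + 1.0) / (subset.size + 2.0) // Laplace smoothing
    }.product
    prior * likelihood
  }
  labels.maxBy(score)
}
```

For instance, trained on (red, printed) -> shirt, (red, plain) -> shirt, (blue, plain) -> dress, the query (red, printed) scores highest for shirt.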

Multivariate Multi-Label Classification with Logistic Regression [Case Study]

Introduction: The goal of this blog post is to show how logistic regression can be applied to multi-class classification. We will mainly focus on learning to build a multivariate logistic regression model for multi-class classification. The data cleaning and preprocessing parts will be covered in detail in an upcoming post. Logistic regression is one of the most fundamental and widely used Machine Learning algorithms, and it is usually among the first few topics people pick up while learning predictive modeling. Despite its name, logistic regression is not a regression algorithm but a probabilistic classification model. Classification in Machine Learning is a technique of learning where a particular instance...
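The multi-class prediction step can be sketched independently of training: each class gets a linear score, and a softmax turns the scores into probabilities. The weights below are hypothetical placeholders; fitting them (the subject of the post) is out of scope for this sketch.

```scala
// Softmax: exponentiate each score and normalize so the outputs sum to 1.
// Subtracting the max score first keeps math.exp from overflowing.
def softmax(scores: Array[Double]): Array[Double] = {
  val maxScore = scores.max
  val exps = scores.map(s => math.exp(s - maxScore))
  val total = exps.sum
  exps.map(_ / total)
}

// One weight row per class; score_k = w_k . x. The predicted class is the
// index with the highest softmax probability (equivalently, highest score).
def predictClass(weights: Array[Array[Double]], x: Array[Double]): Int = {
  val scores = weights.map(w => w.zip(x).map { case (a, b) => a * b }.sum)
  softmax(scores).zipWithIndex.maxBy(_._1)._2
}
```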

Multivariate Linear Regression [Case Study]

Learn to make predictions using multiple variables. Introduction: The goal of this blog post is to equip beginners with the basics of the Linear Regression algorithm with multiple features and to help them quickly build their first model. This is also known as multivariable Linear Regression. We will mainly focus on the modeling side; the data cleaning and preprocessing parts will be covered in detail in an upcoming post. A multivariable model can be thought of as a model in which multiple variables appear on the right-hand side of the model equation. This type of statistical model can be used to assess the relationship between a number of variables. A simple linear regression model has a continuous outcome and one predictor, whereas a multiple or multivariable linear regression...
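The "multiple variables on the right-hand side" description corresponds to the equation ŷ = b + w1·x1 + ... + wn·xn. A minimal Scala sketch of the prediction and a batch gradient-descent fit follows; the function names, learning rate, and toy data are illustrative assumptions, not the post's case study.

```scala
// Prediction for a multivariable linear model: bias plus a dot product.
def predict(weights: Array[Double], bias: Double, features: Array[Double]): Double =
  bias + weights.zip(features).map { case (w, x) => w * x }.sum

// Batch gradient descent on mean squared error: at each epoch, compute the
// residuals and step every coefficient against its gradient.
def fitGradientDescent(xs: Array[Array[Double]], ys: Array[Double],
                       lr: Double = 0.01, epochs: Int = 5000): (Array[Double], Double) = {
  val n = xs.head.length
  val w = Array.fill(n)(0.0)
  var b = 0.0
  for (_ <- 0 until epochs) {
    val errors = xs.zip(ys).map { case (x, y) => predict(w, b, x) - y }
    for (j <- 0 until n)
      w(j) -= lr * 2.0 * xs.indices.map(i => errors(i) * xs(i)(j)).sum / xs.length
    b -= lr * 2.0 * errors.sum / errors.length
  }
  (w, b)
}
```

On toy data generated from y = 2x + 3, the fit recovers coefficients close to w = 2 and b = 3.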

k-Nearest Neighbors Classification Algorithm [Case Study]

Predicting car quality with the help of neighbors. Introduction: The goal of this blog post is to get beginners started with the fundamental concepts of the k-Nearest Neighbour Classification algorithm, popularly known as the KNN classifier. We will mainly focus on learning to build your first KNN model. The data cleaning and preprocessing parts will be covered in detail in an upcoming post. Classification in Machine Learning is a technique of learning where a particular instance is mapped to one of many labels. The labels are prespecified to train your model. The machine learns the pattern from the data in such a way that the learned representation successfully maps the original dimensions to the suggested label/class without any further intervention from a human expert. How does...
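The core KNN idea fits in a few lines: classify a query point by majority vote among the k training points nearest to it. This sketch uses Euclidean distance and made-up 2-D data, not the post's car-quality dataset.

```scala
// k-nearest neighbours: sort training points by distance to the query, take
// the k closest, and return the most common label among them.
def knnClassify(train: Seq[(Array[Double], String)],
                query: Array[Double], k: Int): String = {
  def distance(a: Array[Double], b: Array[Double]): Double =
    math.sqrt(a.zip(b).map { case (x, y) => (x - y) * (x - y) }.sum)
  train
    .sortBy { case (point, _) => distance(point, query) }
    .take(k)
    .groupBy(_._2)         // group the k neighbours by label
    .maxBy(_._2.size)._1   // majority label wins
}
```

Note that an odd k avoids ties in two-class problems; a production version would also normalize the features so no dimension dominates the distance.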

K-Means Clustering Algorithm

Introduction: The goal of this blog post is to get beginners started with the fundamental concepts of the K-Means clustering algorithm. We will mainly focus on learning to build your first K-Means clustering model. The data cleaning and preprocessing parts will be covered in detail in an upcoming post. Clustering: clustering can be considered the most important unsupervised learning problem; as with every other problem of this kind, it deals with finding structure in a collection of unlabeled data. A loose definition of clustering could be "the process of organizing objects into groups whose members are similar in some way". A cluster is therefore a collection of objects which are similar to one another and dissimilar to the objects belonging to other clusters. We can show this w...
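The grouping described above is what K-Means (Lloyd's algorithm) computes: assign each point to its nearest centroid, recompute each centroid as the mean of its cluster, and repeat. A sketch on 1-D points for brevity; seeding centroids from the first k points is a simplification of the usual random initialization.

```scala
// Lloyd's algorithm on 1-D points: alternate assignment and mean-update
// steps for a fixed number of iterations.
def kMeans(points: Seq[Double], k: Int, iterations: Int = 20): Seq[Double] = {
  var centroids = points.take(k).toVector
  for (_ <- 0 until iterations) {
    // assignment step: bucket each point under its nearest centroid
    val clusters = points.groupBy(p => centroids.minBy(c => math.abs(c - p)))
    // update step: move each centroid to its cluster's mean (keep it if empty)
    centroids = centroids.map(c =>
      clusters.get(c).map(ps => ps.sum / ps.size).getOrElse(c))
  }
  centroids
}
```

On points clumped around 1 and 10 with k=2, the centroids converge to roughly 1.0 and 10.0; real data needs multiple dimensions, random restarts, and a convergence test rather than a fixed iteration count.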
