-
Cost Function of Logistic Regression
With a convex cost function, gradient descent can be guaranteed to converge to the global minimum. If we use the mean squared error for logistic regression, the cost function is non-convex, so it is more difficult for gradient descent to find optimal values for w and b. Linear Regression => squared error cost -> convex. Logistic Regression ==> if we […]
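As a minimal sketch of the contrast described above (not from the original post; the function names and use of NumPy are my own assumptions), the convex log-loss cost can be written next to the non-convex squared-error alternative:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def logistic_cost(X, y, w, b):
    """Log-loss cost: convex in w and b for logistic regression."""
    f = sigmoid(X @ w + b)  # predicted probabilities
    return -np.mean(y * np.log(f) + (1 - y) * np.log(1 - f))

def squared_error_cost(X, y, w, b):
    """Mean squared error: non-convex when f is a sigmoid of a linear model."""
    f = sigmoid(X @ w + b)
    return np.mean((f - y) ** 2) / 2
```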
-
Logistic Regression
For binary classification problems, y can only be one of two values/two classes/two categories: yes or no, true or false, 1 or 0. Logistic Regression fits an S-shaped curve to the dataset; it uses the sigmoid function. The sigmoid function outputs a value between 0 and 1: g(z) = 1 / (1 + e^-z). 0 < […]
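A quick illustrative sketch of the sigmoid function g(z) = 1 / (1 + e^-z) mentioned above (the example inputs are made up):

```python
import numpy as np

def sigmoid(z):
    """g(z) = 1 / (1 + e^-z); the output is always between 0 and 1."""
    return 1.0 / (1.0 + np.exp(-z))

print(sigmoid(0))     # 0.5
print(sigmoid(100))   # very close to 1
print(sigmoid(-100))  # very close to 0
```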
-
Feature Scaling
A good model is more likely to learn a small parameter value, like 0.1, when the possible range of a feature's values is large, like 2000. In other words, when the possible values of a feature are small, a reasonable value for its parameter will be large. prediction price […]
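As a sketch of feature scaling (assuming z-score normalization; the feature values below are made up), rescaling puts a large-range feature like house size and a small-range feature like bedroom count on a comparable scale:

```python
import numpy as np

# Hypothetical features: size in square feet (large range), number of bedrooms (small range)
X = np.array([[2104, 5],
              [1416, 3],
              [852,  2]], dtype=float)

# Z-score normalization: each column ends up with mean 0 and standard deviation 1
mu = X.mean(axis=0)
sigma = X.std(axis=0)
X_scaled = (X - mu) / sigma

print(X_scaled)
```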
-
Linear Regression with Multiple Features
The screenshot below is taken from https://www.coursera.org/learn/machine-learning/home/week/2. To enable gradient descent to run much faster: Feature Scaling.
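A minimal sketch of the multiple-feature prediction f(x) = w·x + b using a dot product (the weights and feature values here are illustrative, not from the post):

```python
import numpy as np

def predict(x, w, b):
    """Vectorized prediction for linear regression with multiple features."""
    return np.dot(w, x) + b

w = np.array([0.1, 4.0, -2.0, 10.0])  # one weight per feature (made-up values)
b = 80.0
x = np.array([1200, 3, 1, 40])        # e.g. size, bedrooms, floors, age (made-up)

print(predict(x, w, b))
```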
-
Gradient Descent
Gradient Descent finds better and better values for the parameters w and b. It uses the equation below. α (learning rate): controls how big a step you take. It is a small number between 0 and 1; it might be 0.01. Derivative term: the direction in which you want to take your baby step. The derivative term in combination […]
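A hedged sketch of one update w := w - α·∂J/∂w, b := b - α·∂J/∂b for single-feature linear regression (the gradient formulas follow the standard squared-error cost; variable names are my own):

```python
import numpy as np

def gradient_descent_step(x, y, w, b, alpha=0.01):
    """One simultaneous update of w and b using squared-error gradients."""
    f = w * x + b                  # current predictions
    dj_dw = np.mean((f - y) * x)   # ∂J/∂w
    dj_db = np.mean(f - y)         # ∂J/∂b
    w = w - alpha * dj_dw          # step toward lower cost
    b = b - alpha * dj_db
    return w, b
```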
-
Cost Function
The first important step in applying linear regression is to define the cost function. The cost function tells us how well the model is doing/how well it fits. Based on the result, we should look for more suitable values of w and b. We want to find values of w and b that make the cost function […]
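As a sketch of the squared-error cost used for linear regression (the standard J(w, b) from the course; the function and variable names are mine):

```python
import numpy as np

def compute_cost(x, y, w, b):
    """J(w, b) = (1 / (2m)) * sum((w*x_i + b - y_i)^2)"""
    m = len(x)
    f = w * x + b
    return np.sum((f - y) ** 2) / (2 * m)
```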
-
Linear Regression
To train the model: You feed the training set (both the input features and the output targets) to your learning algorithm. Then your supervised learning algorithm produces some function f, which is called the model. The function takes a new input x and estimates or makes a prediction (y hat) for y. Here, we randomly gave the values w […]
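A minimal sketch of the model f(x) = w*x + b producing y hat for a new input x, with w and b chosen at random as in the excerpt (everything here is illustrative):

```python
import numpy as np

def f(x, w, b):
    """Linear regression model: returns y-hat for input x."""
    return w * x + b

w, b = np.random.rand(), np.random.rand()  # randomly chosen parameters
x_new = 1.5
y_hat = f(x_new, w, b)
print(y_hat)
```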
-
Supervised Learning
Supervised learning learns from data labeled with the right answer (label y). Regression: predict a number among infinitely many possible numbers/outputs. Linear Regression: fitting a straight line to your data. Classification: predict a category from among a limited set of categories. There are only a few possible outputs/categories/classes. Categories can be non-numeric or numeric. The learning algorithm has to decide how to […]