
Logistic regression and linear regression have different cost functions, but I don't understand how gradient descent for logistic regression ends up with the same update rule as for linear regression.

We get the gradient descent formula for linear regression by differentiating the squared-error cost function. However, logistic regression uses a logarithmic cost function instead. I think I am lost here.

  • Have you tried to take the derivative yourself? I mean, are you familiar with derivatives and the chain rule? – Commented Jan 22, 2018 at 10:42
  • @Media I don't know advanced calculus, but it still seemed weird to me that both cost functions' derivatives lead to the same formula. – Huzo, Commented Jan 22, 2018 at 12:13
  • Take a look here, which may help you find all the material needed for ML. – Commented Jan 22, 2018 at 12:17

1 Answer


Gradient descent is a universal method: you can use it with essentially any loss function found in common ML algorithms.
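To illustrate (a minimal sketch with hypothetical toy data, not from the original post): the same gradient descent loop serves both models, because both gradients take the form $\frac{1}{m}X^T(h - y)$; only the hypothesis function changes.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gradient_descent(X, y, predict, lr=0.1, steps=2000):
    """Generic batch gradient descent.

    For both models the gradient is (1/m) * X^T (h - y);
    `predict` maps the linear score X @ theta to predictions h.
    """
    theta = np.zeros(X.shape[1])
    m = len(y)
    for _ in range(steps):
        h = predict(X @ theta)
        theta -= lr * X.T @ (h - y) / m
    return theta

# Hypothetical toy data: bias column plus one feature.
X = np.c_[np.ones(4), [0.0, 1.0, 2.0, 3.0]]
y = np.array([0.0, 0.0, 1.0, 1.0])

theta_lin = gradient_descent(X, y, lambda z: z)  # linear regression
theta_log = gradient_descent(X, y, sigmoid)      # logistic regression
```

The update line is literally identical in both calls; what differs is the choice of hypothesis (identity vs. sigmoid), and it is the matching choice of cost function that makes the gradients coincide in form.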

In your case, you only have to differentiate the logarithmic cost function. You can find a detailed calculation at

https://math.stackexchange.com/questions/477207/derivative-of-cost-function-for-logistic-regression
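In short (a sketch of the standard derivation, using the usual Andrew-Ng-style notation $h_\theta$, $x^{(i)}$, $y^{(i)}$): with the sigmoid hypothesis

```latex
h_\theta(x) = \sigma(\theta^T x) = \frac{1}{1 + e^{-\theta^T x}},
\qquad
J(\theta) = -\frac{1}{m}\sum_{i=1}^{m}\Bigl[\,y^{(i)}\log h_\theta(x^{(i)}) + \bigl(1 - y^{(i)}\bigr)\log\bigl(1 - h_\theta(x^{(i)})\bigr)\Bigr],
```

the identity $\sigma'(z) = \sigma(z)\bigl(1 - \sigma(z)\bigr)$ makes the chain-rule factors cancel, leaving

```latex
\frac{\partial J}{\partial \theta_j}
= \frac{1}{m}\sum_{i=1}^{m}\bigl(h_\theta(x^{(i)}) - y^{(i)}\bigr)\,x_j^{(i)},
```

which is the same expression as the squared-error gradient in linear regression. The formulas look identical only because $h_\theta$ means a different function in each case.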

