this post was submitted on 19 Nov 2023

Machine Learning
I'm trying to teach a lesson on gradient descent from a more statistical and theoretical perspective, and need a good example to show its usefulness.

What is the simplest possible algebraic function that would be impossible, or at least quite difficult, to optimize by setting its first derivative to 0, but that is easy to handle with gradient descent? I would preferably like to demonstrate this in the context of linear regression or some other extremely simple machine learning model.

[–] idkname999@alien.top 1 points 11 months ago (2 children)

Logistic Regression is a simple machine learning model with no closed form solution.
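To illustrate the point, here is a minimal sketch (not from the thread) of fitting logistic regression by gradient ascent on the log-likelihood; the synthetic data, the true weights, and the learning rate are all assumptions chosen for the demo:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: an intercept column plus one standard-normal feature.
# The "true" weights below are an arbitrary choice for this sketch.
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])
true_w = np.array([-0.5, 2.0])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Generate binary labels from the logistic model.
y = rng.binomial(1, sigmoid(X @ true_w))

# Gradient ascent on the log-likelihood: there is no closed-form
# solution for w, but the gradient is easy to compute and follow.
w = np.zeros(2)
lr = 0.1
for _ in range(2000):
    grad = X.T @ (y - sigmoid(X @ w))  # score function (gradient of log-likelihood)
    w += lr * grad / n                 # averaged ascent step

print(w)  # should land near true_w, up to sampling noise
```

Because the log-likelihood of logistic regression is concave, plain gradient ascent with a small enough step converges to the (unique) maximum likelihood estimate, which is exactly what makes it a good classroom contrast to ordinary least squares.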

[–] ravshanbeksk@alien.top 1 points 11 months ago (1 children)

You can write a log-likelihood function, take its derivative, and set it equal to zero.

[–] DoctorFuu@alien.top 1 points 11 months ago

No, you can't actually solve it. To convince yourself, just try to do it.
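To spell out why (this derivation is added for context, not part of the original comment): for labels $y_i \in \{0,1\}$, features $x_i$, and $\sigma(z) = 1/(1+e^{-z})$, the logistic log-likelihood and its first-order condition are

```latex
\ell(\beta) = \sum_{i=1}^{n} \left[ y_i \, x_i^\top \beta - \log\!\left(1 + e^{x_i^\top \beta}\right) \right],
\qquad
\nabla \ell(\beta) = \sum_{i=1}^{n} \left( y_i - \sigma(x_i^\top \beta) \right) x_i = 0 .
```

Because $\beta$ sits inside the sigmoid, these score equations are transcendental in $\beta$ and have no closed-form solution, so an iterative method such as gradient ascent (or Newton-Raphson) is required.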

[–] neyman-pearson@alien.top 1 points 11 months ago

Why is this not the most upvoted answer?