I’m trying to teach a lesson on gradient descent from a more statistical and theoretical perspective, and need a good example to show its usefulness.

What is the simplest possible algebraic function that would be impossible, or at least rather difficult, to optimize by setting its first derivative to 0, but easily doable with gradient descent? I'd preferably like to demonstrate this in the context of linear regression or some extremely simple machine learning model.

  • ravshanbeksk@alien.top

    Give an example of linear regression where the zero conditional mean assumption doesn't hold, like an autoregressive process. You can't just take the derivative and set it to zero there, because the objective isn't differentiable everywhere. Gradient descent, however, doesn't care about any of that.
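
    A minimal sketch of that idea, with assumptions of my own on top of the comment: it fits an AR(1) coefficient by subgradient descent on the mean *absolute* one-step prediction error, since |·| is non-differentiable at zero and setting its derivative to zero yields no closed form. The true coefficient 0.7, the noise scale, and names like `phi_hat` and `lr` are all illustrative choices, not from the comment.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Simulate an AR(1) process: y_t = 0.7 * y_{t-1} + noise.
    n = 500
    y = np.zeros(n)
    for t in range(1, n):
        y[t] = 0.7 * y[t - 1] + rng.normal(scale=0.5)

    x, target = y[:-1], y[1:]  # lagged regressor and response

    phi_hat = 0.0  # initial guess for the AR coefficient
    lr = 0.01      # fixed step size

    for step in range(1000):
        resid = target - phi_hat * x
        # Subgradient of mean(|resid|) w.r.t. phi_hat; np.sign returns 0
        # at the kink resid == 0, which is a valid subgradient there.
        grad = -np.mean(np.sign(resid) * x)
        phi_hat -= lr * grad

    print(f"estimated phi: {phi_hat:.3f}")  # should land near 0.7
    ```

    With a fixed step size the iterates only hover in a neighborhood of the minimizer rather than converging exactly, which is itself a useful teaching point about subgradient methods.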