  1. Vanishing and Exploding Gradients Problems in Deep Learning

    Nov 15, 2025 · To train deep neural networks effectively, it is important to manage the vanishing and exploding gradient problems. These issues occur during backpropagation when gradients …
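
To make that concrete, here is a minimal NumPy sketch (an illustration, not code from the article above) of how a gradient's norm can shrink or blow up as it is multiplied through many layers during backpropagation; the layer count, width, and weight scales are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n_layers, width = 50, 64
grad0 = rng.normal(size=width)            # gradient arriving at the output layer

for scale in (0.5, 1.5):                  # small vs. large weight scale
    W = scale * rng.normal(size=(width, width)) / np.sqrt(width)
    g = grad0.copy()
    for _ in range(n_layers):             # chain rule: multiply by W^T per layer
        g = W.T @ g
    print(f"scale={scale}: gradient norm after {n_layers} layers = {np.linalg.norm(g):.3e}")
```

With the small scale the norm decays toward zero (vanishing); with the large scale it grows by orders of magnitude (exploding).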

  2. Finding gradients (practice) | Khan Academy

    Find the gradient of f(x, y) = 2xy + sin(x). Given a scalar field, what is the gradient?
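
A worked answer to that first prompt, taking the partial derivatives term by term:

```latex
\nabla f(x, y)
= \left( \frac{\partial f}{\partial x},\; \frac{\partial f}{\partial y} \right)
= \bigl( 2y + \cos(x),\; 2x \bigr)
```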

  3. Meaning of the Gradient

    In 1-variable calculus, the derivative gives you an equation for the slope at any x-value along f(x). You can then plug in an x-value to find the actual slope at that point.
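
A one-line example of that idea (my illustration, not taken from the source page):

```latex
f(x) = x^2 \;\implies\; f'(x) = 2x \;\implies\; \text{slope at } x = 3 \text{ is } f'(3) = 6
```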

  4. Gradient descent - Wikipedia

    Gradient descent is a method for unconstrained mathematical optimization. It is a first-order iterative algorithm for minimizing a differentiable …
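
The iteration behind that description is x_{k+1} = x_k − γ∇f(x_k); below is a minimal Python sketch on an illustrative quadratic (the objective and step size are my assumptions, not taken from the Wikipedia article).

```python
def grad_f(x):
    """Gradient of the illustrative objective f(x) = (x - 3)^2."""
    return 2.0 * (x - 3.0)

x, step = 0.0, 0.1             # starting point and step size (assumed)
for _ in range(100):
    x -= step * grad_f(x)      # first-order update: x <- x - step * grad
print(x)                       # approaches the minimizer x* = 3
```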

  5. Dec 6, 2022 · We then illustrate the application of gradient descent to a loss function which is not merely mean squared loss (Section 3.3). And we present an important method known as stochastic gradient …
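
As a hedged sketch of the stochastic variant mentioned there (the synthetic data and logistic loss are my illustrative choices, not the source's example): instead of the full-batch gradient, each step uses a gradient estimate computed from a random minibatch.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 2))
y = (X @ np.array([2.0, -1.0]) > 0).astype(float)  # synthetic labels

w, lr, batch = np.zeros(2), 0.1, 32
for _ in range(500):
    idx = rng.integers(0, len(X), size=batch)      # sample a minibatch
    Xb, yb = X[idx], y[idx]
    p = 1.0 / (1.0 + np.exp(-Xb @ w))              # sigmoid predictions
    w -= lr * Xb.T @ (p - yb) / batch              # gradient of the log loss
print(w)
```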

  6. Gradient Practice Questions – Corbettmaths

    Sep 2, 2019 · Gradient Practice Questions. Click here for Questions. Click here for Answers. Watch video on YouTube.

  7. 13.6E: Directional Derivatives and the Gradient (Exercises)

    This page titled 13.6E: Directional Derivatives and the Gradient (Exercises) is shared under a CC BY-NC-SA 4.0 license and was authored, remixed, and/or curated by OpenStax via source content that …
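
The central identity those exercises practice, stated for reference together with a worked instance (the specific function, point, and direction are illustrative):

```latex
D_{\mathbf{u}} f = \nabla f \cdot \mathbf{u}, \qquad \|\mathbf{u}\| = 1.
\quad \text{E.g. } f(x,y) = x^2 y:\ \nabla f = (2xy,\ x^2),\ \nabla f(1,2) = (4, 1),\ 
\mathbf{u} = \left(\tfrac{3}{5}, \tfrac{4}{5}\right)
\implies D_{\mathbf{u}} f(1,2) = \tfrac{12}{5} + \tfrac{4}{5} = \tfrac{16}{5}
```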

  8. Gradient - Practice problems by Leading Lesson

    Study guide and practice problems on 'Gradient'.

  9. Calculus III - Gradient Vector, Tangent Planes and Normal Lines ...

    Nov 16, 2022 · Here is a set of practice problems to accompany the Gradient Vector, Tangent Planes and Normal Lines section of the Applications of Partial Derivatives chapter of the notes for Paul …
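
The key fact behind that problem set: the gradient of F is normal to the level surface F(x, y, z) = k, which gives the tangent plane at a point (x_0, y_0, z_0) on the surface:

```latex
\nabla F(x_0, y_0, z_0) \cdot \langle x - x_0,\; y - y_0,\; z - z_0 \rangle = 0
```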

  10. Understanding Vanishing and Exploding Gradient Problems

    Oct 5, 2024 · In deep neural networks (DNNs), the vanishing gradient problem is a notorious issue that plagues training, especially when using activation functions like sigmoid and tanh. The problem arises...
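
A quick numerical illustration of that claim (my sketch, not the article's code): the sigmoid's derivative never exceeds 0.25, so the chain-rule product of such factors across many layers decays geometrically with depth.

```python
import numpy as np

def sigmoid_grad(z):
    s = 1.0 / (1.0 + np.exp(-z))
    return s * (1.0 - s)              # maximum value is 0.25, attained at z = 0

depth = 30
factor = sigmoid_grad(0.0)            # best case per layer
print(factor ** depth)                # ~8.7e-19: the gradient has vanished
```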