Added Gradient Descent with Momentum to Machine Learning Algorithms
Describe your change:
- [x] Add an algorithm? I have added an implementation of "Gradient Descent with Momentum". Momentum accelerates gradient descent by introducing a velocity term that smooths the updates and improves convergence, especially where the cost surface is irregular (e.g. ravines or saddle points). Each update adds a fraction of the previous velocity to the current gradient step, so the optimizer carries its previous direction of movement, damping oscillations and reaching the minimum faster (a minimal sketch of the update rule follows the checklist below).
- [x] I have read CONTRIBUTING.md.
- [x] This pull request is all my own work -- I have not plagiarized.
- [x] I know that pull requests will not be merged if they fail the automated tests.
- [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
- [x] All new Python files are placed inside an existing directory.
- [x] All filenames are in all lowercase characters with no spaces or dashes.
- [x] All functions and variable names follow Python naming conventions.
- [x] All function parameters and return values are annotated with Python type hints.
- [x] All functions have doctests that pass the automated testing.
- [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation.
- [ ] If this pull request resolves one or more open issues then the description above includes the issue number(s) with a closing keyword: "Fixes #ISSUE-NUMBER".
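
For reviewers, here is a minimal sketch of the update rule described above. It is not the exact code in this PR; the function name `gradient_descent_momentum`, its parameters, and the doctest are illustrative assumptions, shown only to make the momentum update concrete:

```python
from collections.abc import Callable


def gradient_descent_momentum(
    gradient: Callable[[float], float],
    start: float,
    learning_rate: float = 0.1,
    momentum: float = 0.9,
    iterations: int = 100,
) -> float:
    """
    Minimize a differentiable 1-D function with gradient descent plus momentum.

    The velocity accumulates a decaying sum of past gradients, so each step
    keeps a fraction of the previous direction of movement:

        velocity = momentum * velocity - learning_rate * gradient(x)
        x = x + velocity

    Reference: https://en.wikipedia.org/wiki/Stochastic_gradient_descent#Momentum

    >>> x_min = gradient_descent_momentum(lambda x: 2 * x, start=5.0, iterations=200)
    >>> abs(x_min) < 1e-3  # f(x) = x**2 has its minimum at x = 0
    True
    """
    x = start
    velocity = 0.0
    for _ in range(iterations):
        # Blend the previous direction (velocity) with the new gradient step.
        velocity = momentum * velocity - learning_rate * gradient(x)
        x += velocity
    return x
```

Setting `momentum=0.0` in this sketch recovers plain gradient descent, which makes it easy to compare the two behaviours side by side.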
This is my first contribution, so please review this pull request and let me know if any changes are required. Thanks!