bsluther/cs491-fnn

This is a ground-up implementation (almost: we used NumPy) of a multilayer feedforward neural network, completed as a group project for CS 491, "Neural Networks", at the University of New Mexico in Fall 2024. My contributions included the backpropagation implementation and supporting modules, testing, hyperparameter tuning, and an illustration of how the second-order Newton method can be used during gradient descent. Deriving the formulas for the second-order derivatives used in the Newton method was an interesting exercise; I'm still not sure I got them right, although I was able to confirm them against a concrete example. It would be fun to revisit them with more experience and either confirm them or find the errors. My understanding is that, in general, the cost of computing the second-order derivatives is too large to justify their increased accuracy, so the second-order Newton method is not generally used for training neural networks.
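To illustrate the idea (this is a hypothetical sketch, not code from the repo): for a quadratic loss f(w) = ½‖Xw − y‖², the gradient is Xᵀ(Xw − y) and the Hessian is XᵀX, so a single Newton step w − H⁻¹g lands exactly on the minimizer, while a plain gradient-descent step only moves a small distance toward it. All variable names here are illustrative.

```python
import numpy as np

# Hypothetical illustration: compare one Newton step with one gradient-descent
# step on a quadratic least-squares loss f(w) = 0.5 * ||X w - y||^2.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w                      # targets constructed so true_w is the exact minimizer

w = np.zeros(3)                     # starting point
grad = X.T @ (X @ w - y)            # first-order information (gradient)
hess = X.T @ X                      # second-order information (Hessian)

# Newton step: solve H d = g rather than forming the inverse explicitly.
w_newton = w - np.linalg.solve(hess, grad)

# Plain gradient-descent step with a small fixed learning rate.
w_gd = w - 0.01 * grad

print(np.allclose(w_newton, true_w))   # Newton solves the quadratic in one step
```

For a neural network the Hessian is with respect to all weights at once, so forming and inverting it scales quadratically in memory and roughly cubically in time with the parameter count, which is the cost the paragraph above refers to.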
