This repository compares several types of neural networks on the MNIST dataset. Specifically, I use a feedforward neural network (FNN), a convolutional neural network (CNN), and a Transformer neural network (TNN) to see how their accuracies differ on a relatively simple task: classifying the digit shown in an image.

After testing, I have concluded that the CNN generally achieves higher accuracy than the other two. The TNN comes last, but I hypothesize that with a larger parameter count the TNN would beat an FNN with the same number of parameters.

This was a collaboration between multiple undergraduate students, including myself, John Beans, Aaron Oster, and Nikolaos Rafailidis, as well as graduate student Logan Hallee, all attending the University of Delaware.
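Since the hypothesis above is framed in terms of matching parameter counts across architectures, here is a minimal sketch of how those counts can be computed by hand. The layer sizes below are illustrative assumptions, not the architectures actually used in this repo:

```python
# Hedged sketch: counting trainable parameters for hypothetical MNIST models.
# The layer sizes here are assumptions for illustration only.

def fnn_params(layers):
    """Dense stack: each consecutive pair contributes weights (in*out) plus biases (out)."""
    return sum(i * o + o for i, o in zip(layers, layers[1:]))

def conv_params(in_ch, out_ch, k):
    """One conv layer: in_ch*out_ch kernels of size k*k, plus one bias per output channel."""
    return in_ch * out_ch * k * k + out_ch

# Example: a 784-128-10 FNN vs. a tiny two-conv CNN with a linear head.
fnn = fnn_params([784, 128, 10])
cnn = (conv_params(1, 16, 3)            # 1 -> 16 channels, 3x3 kernels
       + conv_params(16, 32, 3)         # 16 -> 32 channels, 3x3 kernels
       + fnn_params([32 * 7 * 7, 10]))  # flatten 7x7x32 feature map, classify
print(fnn)  # 101770
print(cnn)  # 20490
```

A tally like this makes "same number of parameters" a concrete, checkable condition when comparing the FNN against a scaled-up TNN.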
MiniSharpie/NeuralNetworks-MNIST-dataset