Welcome to the Fashion MNIST Classifier—a neural network built from scratch using only NumPy! That’s right, no fancy libraries like TensorFlow or PyTorch here. Just pure, unadulterated math, loops, and a whole lot of coffee. ☕
This project is your go-to stylist for classifying Fashion MNIST images. Whether it’s a snazzy T-shirt or a classy ankle boot, this model will tell you what’s what. And the best part? It’s all done from scratch—because why use training wheels when you can build the whole bike? 🚴♂️
Features:
- Built from Scratch: No TensorFlow, no PyTorch, no shortcuts. Just NumPy and sheer willpower.
- Adam Optimizer: Because even neural networks need a personal trainer to stay in shape. Adam keeps the gradients fit and the learning rates adaptive. 💪
- ReLU & Softmax: The dynamic duo of activation functions. ReLU brings the energy, and Softmax keeps things chill with probabilities (these two, plus the Adam update, are sketched in NumPy right after this list).
- L2 Regularization: To prevent the model from overfitting like a pair of skinny jeans after Thanksgiving. 🦃
- Streamlit App: A sleek, user-friendly interface to upload images and get predictions. It’s like Tinder, but for fashion. 👗❤️👢
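If you want to peek under the hood, here's a minimal NumPy sketch of what those pieces tend to look like. The function names and default hyperparameters below are illustrative, not this repo's exact API:

```python
import numpy as np

def relu(z):
    # ReLU: let positive values through, clamp negatives to zero
    return np.maximum(0, z)

def softmax(z):
    # Shift by the row-wise max for numerical stability, then normalize into probabilities
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def adam_step(param, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    # One Adam update: track running means of the gradient (m) and squared gradient (v),
    # correct their startup bias, then take an adaptively scaled step.
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad ** 2
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    param = param - lr * m_hat / (np.sqrt(v_hat) + eps)
    return param, m, v
```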
The Neural Network:
- Input Layer: 784 neurons (because 28x28 pixels = 784, not because I like big numbers).
- Hidden Layers: Two hidden layers with 128 and 64 neurons, respectively. Think of them as the middle managers of the neural network.
- Output Layer: 10 neurons (one for each Fashion MNIST class). It’s like a fashion show where every class gets a turn on the runway. A toy forward pass with these shapes follows below.
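In code, that stack of layers boils down to a few matrix multiplies. Here's an illustrative forward pass with those exact shapes; the weight names and He-style initialization are assumptions for the sake of the sketch, not necessarily what the repo uses:

```python
import numpy as np

def relu(z):   # same helper as in the sketch above
    return np.maximum(0, z)

def softmax(z):  # same helper as in the sketch above
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

rng = np.random.default_rng(0)
# 784 -> 128 -> 64 -> 10, with He-style scaling for the ReLU layers (an assumption for illustration)
W1, b1 = rng.standard_normal((784, 128)) * np.sqrt(2 / 784), np.zeros(128)
W2, b2 = rng.standard_normal((128, 64)) * np.sqrt(2 / 128), np.zeros(64)
W3, b3 = rng.standard_normal((64, 10)) * np.sqrt(2 / 64), np.zeros(10)

def forward(X):
    # X: (batch, 784) flattened 28x28 images scaled to [0, 1]
    a1 = relu(X @ W1 + b1)         # (batch, 128) -- first middle manager
    a2 = relu(a1 @ W2 + b2)        # (batch, 64)  -- second middle manager
    return softmax(a2 @ W3 + b3)   # (batch, 10)  -- class probabilities
```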
Training:
- The model is trained using cross-entropy loss (because we like to measure how wrong we are); a quick sketch of the loss is below.
- The Adam optimizer keeps things running smoothly, like a well-oiled treadmill.
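As a rough sketch, cross-entropy combined with the L2 penalty from the feature list might look like this; the epsilon clamp and lambda value are illustrative defaults, not the repo's actual settings:

```python
import numpy as np

def loss(probs, y_onehot, weights, lam=1e-4):
    # Cross-entropy: average negative log-probability assigned to the true class
    n = y_onehot.shape[0]
    data_loss = -np.sum(y_onehot * np.log(probs + 1e-12)) / n
    # L2 penalty: keeps the weights from ballooning (the skinny-jeans insurance)
    l2_loss = (lam / 2) * sum(np.sum(W ** 2) for W in weights)
    return data_loss + l2_loss
```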
Streamlit App:
- Upload an image, and the model will tell you what it is. It’s like having a fashion guru in your pocket (a minimal sketch of the app wiring is below). 📱✨
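Here's roughly what that wiring could look like. `predict` below is a hypothetical stand-in for however the repo actually exposes its forward pass, not a real function name:

```python
import numpy as np
import streamlit as st
from PIL import Image

st.title("Fashion MNIST Classifier 👗")

uploaded = st.file_uploader("Upload a clothing image", type=["png", "jpg", "jpeg"])
if uploaded is not None:
    # Squash the upload into what the network expects: 28x28 grayscale, flattened to 784 values in [0, 1]
    img = Image.open(uploaded).convert("L").resize((28, 28))
    x = np.asarray(img, dtype=np.float32).reshape(1, 784) / 255.0

    st.image(img, caption="What the network sees", width=150)

    probs = predict(x)  # hypothetical stand-in for the repo's forward pass; returns shape (1, 10)
    labels = ["T-shirt/top", "Trouser", "Pullover", "Dress", "Coat",
              "Sandal", "Shirt", "Sneaker", "Bag", "Ankle boot"]
    st.write(f"Looks like a **{labels[int(np.argmax(probs))]}** to me!")
```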
Why Build This From Scratch:
- To Prove a Point: You don’t need fancy libraries to build a neural network. Sometimes, all you need is NumPy and a dream.
- To Learn: I wanted to understand the nuts and bolts of neural networks, from forward propagation to backpropagation (and all the math in between).
- To Flex: Let’s be honest, building a neural network from scratch is a flex. 💪😎
Check out the live demo of the app (https://fashionmnistclassifier-wewetkapheygy2edcnrnr8.streamlit.app).