This repository contains a simple neural network implemented from scratch in the Zig programming language. The network is designed for common machine learning tasks such as classification and regression.
- Flexible Architecture: The neural network architecture is highly customizable, allowing you to define the number of layers, the number of neurons in each layer, and the activation functions used.
- Feedforward and Backpropagation: The network supports feedforward propagation for making predictions as well as backpropagation for training on labeled data.
- Activation Functions: Popular activation functions are included, such as sigmoid, ReLU, and tanh, providing flexibility in designing the network.
- Loss Functions: Different loss functions, including mean squared error (MSE) and cross-entropy, can be utilized to measure the network's performance.
- Gradient Descent Optimization: The network employs gradient descent optimization algorithms, such as stochastic gradient descent (SGD), to iteratively update the weights and biases during training.
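To give a feel for how such a library fits together, here is an illustrative sketch of a dense layer's forward pass in Zig. The function name, parameter names, and row-major weight layout are assumptions for the example, not this repository's actual API.

```zig
// Hypothetical forward pass for one fully connected layer.
// `weights` is row-major: output.len rows by input.len columns.
fn denseForward(weights: []const f64, biases: []const f64, input: []const f64, output: []f64) void {
    for (output, 0..) |*out, i| {
        // Start from the bias, then accumulate the weighted inputs.
        var sum = biases[i];
        for (input, 0..) |x, j| {
            sum += weights[i * input.len + j] * x;
        }
        out.* = sum;
    }
}
```

Stacking several such layers, each followed by an activation function, yields the full feedforward computation.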
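The activation and loss functions listed above are small pure functions. A minimal sketch, with names chosen for the example rather than taken from the repository:

```zig
const std = @import("std");

// Illustrative activation functions (names are assumptions).
fn sigmoid(x: f64) f64 {
    return 1.0 / (1.0 + @exp(-x));
}

fn relu(x: f64) f64 {
    return if (x > 0) x else 0;
}

fn tanhAct(x: f64) f64 {
    return std.math.tanh(x);
}

// Mean squared error averaged over a batch of scalar predictions.
fn mse(predictions: []const f64, targets: []const f64) f64 {
    var sum: f64 = 0;
    for (predictions, targets) |p, t| {
        const d = p - t;
        sum += d * d;
    }
    return sum / @as(f64, @floatFromInt(predictions.len));
}
```

During backpropagation each of these would be paired with its derivative, which is what makes gradient-based training possible.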
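The SGD update itself is a single loop: each weight is nudged against its gradient, scaled by the learning rate. A hedged sketch, with hypothetical names:

```zig
// Illustrative SGD step; `lr` is the learning rate.
// The same update is applied to biases with their own gradients.
fn sgdStep(weights: []f64, grads: []const f64, lr: f64) void {
    for (weights, grads) |*w, g| {
        w.* -= lr * g;
    }
}
```

Repeating this step over many mini-batches is what "iteratively update the weights and biases during training" amounts to in practice.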