This project demonstrates a basic single-layer neural network (perceptron) implemented in Rust. It uses the sigmoid activation function and is trained using gradient descent.
- Single-layer neural network (perceptron)
- Sigmoid activation function
- Customizable learning rate and input size
- Simple training and prediction API
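To make the activation concrete, here is a minimal sketch of the sigmoid and its derivative (the function names are illustrative and may not match the repository's exact API):

```rust
// Sigmoid squashes any real input into (0, 1). Its derivative has a
// convenient closed form in terms of the sigmoid's own output:
// sigmoid'(x) = sigmoid(x) * (1 - sigmoid(x)).
fn sigmoid(x: f64) -> f64 {
    1.0 / (1.0 + (-x).exp())
}

fn sigmoid_derivative(x: f64) -> f64 {
    let s = sigmoid(x);
    s * (1.0 - s)
}

fn main() {
    println!("sigmoid(0.0)  = {}", sigmoid(0.0)); // 0.5
    println!("sigmoid'(0.0) = {}", sigmoid_derivative(0.0)); // 0.25
}
```

The derivative peaks at 0.25 when the output is 0.5 and shrinks toward zero as the output saturates, which is why gradient-descent updates slow down for confident predictions.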
The example in `main.rs` trains the network on a simple dataset:
```rust
let inputs = vec![
    vec![0.0, 1.0],
    vec![1.0, 0.0],
    vec![1.0, 1.0],
    vec![0.0, 0.0],
    vec![0.5, 0.5],
    vec![0.2, 0.8],
    vec![0.8, 0.2],
    vec![0.3, 0.7],
    vec![0.7, 0.3],
];
// One target per input. Only the first four labels appeared in the original
// snippet; the remaining five are assumed here, following the same
// "coordinates differ -> 1" pattern.
let outputs = vec![1.0, 1.0, 0.0, 0.0, 0.0, 1.0, 1.0, 1.0, 1.0];
```
```sh
git clone https://github.com/yourusername/nn-rust.git
cd nn-rust
cargo run
```
The program will print predictions for each input after training.
- `NeuralNetwork` struct: holds weights, bias, and learning rate.
- `sigmoid` and `derivative`: the activation function and its derivative.
- `train`: trains the network using gradient descent.
- `predict`: makes predictions for new inputs.
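Putting those pieces together, a self-contained sketch of the structure might look like the following. The names mirror the list above, but the exact signatures are assumptions; weights start at fixed values so the snippet runs without the `rand` crate, and the data is a linearly separable OR-style set that a single-layer network can actually learn:

```rust
// Illustrative sketch of the perceptron's structure; not the repository's
// exact implementation. Weights are initialized to a fixed value here
// (the real project uses the `rand` crate for this).
struct NeuralNetwork {
    weights: Vec<f64>,
    bias: f64,
    learning_rate: f64,
}

fn sigmoid(x: f64) -> f64 {
    1.0 / (1.0 + (-x).exp())
}

// Sigmoid derivative expressed via the sigmoid's output s = sigmoid(x).
fn derivative(s: f64) -> f64 {
    s * (1.0 - s)
}

impl NeuralNetwork {
    fn new(input_size: usize, learning_rate: f64) -> Self {
        NeuralNetwork {
            weights: vec![0.5; input_size],
            bias: 0.0,
            learning_rate,
        }
    }

    // Weighted sum of inputs plus bias, squashed through the sigmoid.
    fn predict(&self, input: &[f64]) -> f64 {
        let sum: f64 = self
            .weights
            .iter()
            .zip(input)
            .map(|(w, x)| w * x)
            .sum::<f64>()
            + self.bias;
        sigmoid(sum)
    }

    // One gradient-descent update per sample, repeated for `epochs` passes.
    fn train(&mut self, inputs: &[Vec<f64>], targets: &[f64], epochs: usize) {
        for _ in 0..epochs {
            for (input, &target) in inputs.iter().zip(targets) {
                let output = self.predict(input);
                let error = target - output;
                let delta = error * derivative(output) * self.learning_rate;
                for (w, x) in self.weights.iter_mut().zip(input) {
                    *w += delta * x;
                }
                self.bias += delta;
            }
        }
    }
}

fn main() {
    let mut nn = NeuralNetwork::new(2, 0.5);
    // OR-style data: linearly separable, so one layer suffices.
    let inputs = vec![
        vec![0.0, 0.0],
        vec![0.0, 1.0],
        vec![1.0, 0.0],
        vec![1.0, 1.0],
    ];
    let targets = vec![0.0, 1.0, 1.0, 1.0];
    nn.train(&inputs, &targets, 5000);
    for input in &inputs {
        println!("{:?} -> {:.3}", input, nn.predict(input));
    }
}
```

After training, the prediction for `[0.0, 0.0]` should fall below 0.5 and the other three should rise above it.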
- `rand` for random weight initialization.
Add to your `Cargo.toml`:

```toml
[dependencies]
rand = "0.8"
```
MIT License
This is a simple educational example and not intended for production use.