In this tutorial, we'll demonstrate how to implement a regression model using both BabyTorch and PyTorch. We'll predict values from a synthetic dataset created with make_regression.
import numpy as np
from sklearn.datasets import make_regression
import matplotlib.pyplot as plt
# Create regression data
x, y = make_regression(n_samples=100, n_features=1, noise=20, random_state=0)
y = (y.reshape(-1, 1) + 1) * .5
y /= np.max(y)
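To confirm what these transforms produce, here is a quick sanity check on shapes and scale (plain NumPy and scikit-learn, no framework needed; a sketch, not part of the tutorial code):

```python
import numpy as np
from sklearn.datasets import make_regression

# Recreate the dataset exactly as above
x, y = make_regression(n_samples=100, n_features=1, noise=20, random_state=0)
y = (y.reshape(-1, 1) + 1) * .5
y /= np.max(y)

print(x.shape)  # (100, 1) -- one feature per sample
print(y.shape)  # (100, 1) -- reshaped to a column vector
print(y.max())  # 1.0 after dividing by the maximum
```

Note that this scaling only pins the maximum at 1.0; with noise, some targets can still be negative.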
Import Libraries
- Import necessary modules from BabyTorch.
import babytorch
import babytorch.nn as nn
from babytorch.optim import SGD
Model Definition
- Define a neural network with multiple linear and ReLU layers.
model = nn.Sequential(
    nn.Linear(1, 8, nn.ReLU()),
    nn.Linear(8, 16, nn.ReLU()),
    nn.Linear(16, 8, nn.ReLU()),
    nn.Linear(8, 1)
)
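Under the hood, this Sequential model is just a chain of affine maps with ReLU between them. A minimal NumPy sketch of the same forward pass (layer names and the random initialization are assumptions, purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# One weight matrix and bias per Linear layer: 1 -> 8 -> 16 -> 8 -> 1
sizes = [1, 8, 16, 8, 1]
weights = [rng.normal(size=(m, n)) for m, n in zip(sizes[:-1], sizes[1:])]
biases = [np.zeros(n) for n in sizes[1:]]

def forward(x):
    h = x
    for i, (W, b) in enumerate(zip(weights, biases)):
        h = h @ W + b                 # affine map: Linear layer
        if i < len(weights) - 1:      # ReLU after every layer except the last
            h = np.maximum(h, 0.0)
    return h

x = rng.normal(size=(100, 1))
print(forward(x).shape)  # (100, 1)
```

The final layer has no ReLU, so the model can output any real value, which is what a regression target requires.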
Training
- Set up the optimizer and loss function, then train the model.
optimizer = SGD(model.parameters(), learning_rate=0.1)
criterion = nn.MSELoss()

losses = []
for k in range(2000):
    y_pred = model(x)
    loss = criterion(y_pred, y)
    losses.append(loss)   # track the loss for plotting later
    loss.backward()
    optimizer.step()
    model.zero_grad()
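What `loss.backward()` followed by `optimizer.step()` does for MSE can be written out by hand: for predictions ŷ, MSE = mean((ŷ − y)²), so dMSE/dŷ = 2(ŷ − y)/n. A sketch of one SGD step on a single linear layer (pure NumPy; all names here are hypothetical, not BabyTorch API):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(100, 1))
y = 3.0 * x + 1.0                          # a known linear target

W = np.zeros((1, 1))                       # one Linear layer, for clarity
b = np.zeros(1)
lr = 0.1

y_pred = x @ W + b                         # forward pass
loss_before = np.mean((y_pred - y) ** 2)   # MSE, as the criterion computes it

grad_pred = 2.0 * (y_pred - y) / len(x)    # dMSE/dy_pred
W -= lr * (x.T @ grad_pred)                # chain rule back to W, then the "step"
b -= lr * grad_pred.sum(axis=0)

loss_after = np.mean(((x @ W + b) - y) ** 2)
print(loss_after < loss_before)  # True: one step already reduces the loss
```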
Visualization
- Visualize the training losses and model predictions.
# Assuming Grapher() is set up for plotting
Grapher().plot_loss(losses)
Import Libraries
- Use similar libraries from PyTorch.
import torch
import torch.nn as nn
import torch.optim as optim
Model Definition
- Define an equivalent model in PyTorch.
model = nn.Sequential(
    nn.Linear(1, 8),
    nn.ReLU(),
    nn.Linear(8, 16),
    nn.ReLU(),
    nn.Linear(16, 8),
    nn.ReLU(),
    nn.Linear(8, 1)
)
Training
- Configure and execute the training loop in PyTorch.
# Convert the data to tensors once, rather than on every iteration
x_t = torch.tensor(x, dtype=torch.float32)
y_t = torch.tensor(y, dtype=torch.float32)

optimizer = optim.SGD(model.parameters(), lr=0.1)
criterion = nn.MSELoss()
for k in range(2000):
    optimizer.zero_grad()
    y_pred = model(x_t)
    loss = criterion(y_pred, y_t)
    loss.backward()
    optimizer.step()
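This loop mirrors the BabyTorch version, with `zero_grad()` at the top of each iteration instead of the bottom. The mechanics being iterated (reset gradients → forward → loss → backward → step) can be seen without either framework; a pure-NumPy run on a linear target (all names hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(100, 1))
y = 3.0 * x + 1.0                            # target: slope 3, intercept 1

W, b, lr = np.zeros((1, 1)), np.zeros(1), 0.1
for _ in range(2000):
    y_pred = x @ W + b                       # forward
    grad_pred = 2.0 * (y_pred - y) / len(x)  # backward: dMSE/dy_pred
    W -= lr * (x.T @ grad_pred)              # step
    b -= lr * grad_pred.sum(axis=0)
    # gradients are recomputed fresh each pass, which is what zero_grad() guarantees

print(W[0, 0], b[0])  # converges to roughly 3.0 and 1.0
```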
Plot Results
- Plot the final predictions against the actual data.
# Sort by x so the prediction curve plots left to right
order = x[:, 0].argsort()
x_sorted = x[order]
with torch.no_grad():
    y_predictions_sorted = model(torch.tensor(x_sorted, dtype=torch.float32)).numpy()

plt.scatter(x, y, color='red')                          # Actual data
plt.plot(x_sorted, y_predictions_sorted, color='blue')  # Model predictions
plt.show()
This tutorial provides a side-by-side implementation of regression models in BabyTorch and PyTorch, highlighting their similarities and key syntax differences. Both frameworks are powerful tools for building neural networks, with BabyTorch serving as a simplified entry point for learning and transitioning to PyTorch.
The full code is available here.