Commit ffa9c84

Update README.md as per review comments

1 parent: 74d46ab

File tree: AI-and-Analytics/Getting-Started-Samples/IntelJAX_GettingStarted

1 file changed: +4 -3 lines
AI-and-Analytics/Getting-Started-Samples/IntelJAX_GettingStarted/README.md

Lines changed: 4 additions & 3 deletions
@@ -11,7 +11,7 @@ The `JAX Getting Started` sample demonstrates how to train a JAX model and run i
 
 JAX is a high-performance numerical computing library that enables automatic differentiation. It provides features like just-in-time compilation and efficient parallelization for machine learning and scientific computing tasks.
 
-This sample code shows how to get started with JAX in CPU. The sample code defines a simple neural network that trains on the MNIST dataset using JAX for parallel computations across multiple CPU cores. The network trains over multiple epochs, evaluates accuracy, and adjusts parameters using stochastic gradient descent across devices.
+This sample code shows how to get started with JAX on CPU. The sample code defines a simple neural network that trains on the MNIST dataset using JAX for parallel computations across multiple CPU cores. The network trains over multiple epochs, evaluates accuracy, and adjusts parameters using stochastic gradient descent across devices.
 
 ## Prerequisites
 
@@ -25,7 +25,8 @@ This sample code shows how to get started with JAX in CPU. The sample code defin
 
 ## Key Implementation Details
 
-The example implementation involves a python file 'spmd_mnist_classifier_fromscratch.py' under the examples directory from the jax repo [(https://github.com/google/jax/)].
+The getting-started sample code uses the python file 'spmd_mnist_classifier_fromscratch.py' under the examples directory in the
+[jax repository](https://github.com/google/jax/).
 It implements a simple neural network's training and inference for mnist images. The images are downloaded to a temporary directory when the example is run first.
 - **init_random_params** initializes the neural network weights and biases for each layer.
 - **predict** computes the forward pass of the network, applying weights, biases, and activations to inputs.
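The two helpers named in the diff above can be sketched in JAX roughly as follows. This is an illustrative sketch only, not the actual code from spmd_mnist_classifier_fromscratch.py: the function names come from the README, while the layer sizes, init scale, and the tanh/log-softmax choices are assumptions.

```python
import jax.numpy as jnp
from jax import random

def init_random_params(key, layer_sizes, scale=0.1):
    """Initialize (weights, biases) pairs for each layer from a PRNG key."""
    params = []
    for m, n in zip(layer_sizes[:-1], layer_sizes[1:]):
        key, wkey, bkey = random.split(key, 3)
        w = scale * random.normal(wkey, (m, n))  # weight matrix, shape (m, n)
        b = scale * random.normal(bkey, (n,))    # bias vector, shape (n,)
        params.append((w, b))
    return params

def predict(params, inputs):
    """Forward pass: affine layers with tanh, log-softmax on the output."""
    activations = inputs
    for w, b in params[:-1]:
        activations = jnp.tanh(jnp.dot(activations, w) + b)
    final_w, final_b = params[-1]
    logits = jnp.dot(activations, final_w) + final_b
    # log-softmax so each row is a log-probability distribution over classes
    return logits - jnp.log(jnp.sum(jnp.exp(logits), axis=1, keepdims=True))
```

For MNIST the input layer would be 784 (28x28 flattened pixels) and the output layer 10 (digit classes).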
@@ -84,7 +85,7 @@ Go to the section which corresponds to the installation method chosen in [AI Too
 ### Docker
 AI Tools Docker images already have Get Started samples pre-installed. Refer to [Working with Preset Containers](https://github.com/intel/ai-containers/tree/main/preset) to learn how to run the docker and samples.
 ## Example Output
-1. With the initial run, you should see results similar to the following:
+1. When the program is run, you should see results similar to the following:
 
 ```
 downloaded https://storage.googleapis.com/cvdf-datasets/mnist/train-images-idx3-ubyte.gz to /tmp/jax_example_data/
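The README's "stochastic gradient descent across devices" refers to JAX's SPMD data parallelism, where each CPU "device" computes gradients on its shard of the batch and the gradients are averaged before the update. A minimal sketch of one such step, using `pmap` with a toy linear model rather than the sample's MNIST network (the `XLA_FLAGS` trick, the model, and the learning rate are assumptions for illustration):

```python
import os
# Expose multiple CPU "devices" so pmap can shard work across cores;
# this must be set before jax is imported.
os.environ.setdefault("XLA_FLAGS", "--xla_force_host_platform_device_count=4")

from functools import partial
import jax
import jax.numpy as jnp
from jax import grad, lax, pmap

LEARNING_RATE = 0.1

def loss(w, xs, ys):
    """Mean squared error of a toy linear model."""
    preds = xs @ w
    return jnp.mean((preds - ys) ** 2)

@partial(pmap, axis_name="batch")
def sgd_step(w, xs, ys):
    """One data-parallel SGD step: per-device grads, averaged with pmean."""
    g = grad(loss)(w, xs, ys)
    g = lax.pmean(g, axis_name="batch")  # average gradients across devices
    return w - LEARNING_RATE * g
```

Parameters are replicated on every device (leading axis of size `jax.local_device_count()`), while each device receives its own slice of the data; `lax.pmean` keeps the replicas in sync after each update.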
