
Commit 2cf895a

Update README.md to development (#2323)

* Update README.md
* update readme

1 parent 95cf11c commit 2cf895a

File tree

1 file changed: +88 -27 lines

AI-and-Analytics/Getting-Started-Samples/Intel_Extension_For_PyTorch_GettingStarted/README.md

Lines changed: 88 additions & 27 deletions
@@ -48,64 +48,120 @@ The sample uses pretrained model provided by Intel and published as part of [Int
## Environment Setup

You will need to download and install the following toolkits, tools, and components to use the sample.

**1. Get Intel® AI Tools**

Required AI Tools: Intel® Extension for PyTorch* - GPU

If you have not already, select and install these Tools via [AI Tools Selector](https://www.intel.com/content/www/us/en/developer/tools/oneapi/ai-tools-selector.html). AI and Analytics samples are validated on the AI Tools Offline Installer. It is recommended to select the Offline Installer option in the AI Tools Selector.

>**Note**: If the Docker option is chosen in the AI Tools Selector, refer to [Working with Preset Containers](https://github.com/intel/ai-containers/tree/main/preset) to learn how to run the Docker container and the samples.
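
For orientation only, a containerized run of this sample usually follows the pattern sketched below. The image name is a placeholder, `--device /dev/dri` assumes an Intel GPU exposed through the standard render nodes, and the linked preset-containers guide remains the authoritative reference.
```
# Hypothetical invocation; replace <preset-container-image> with the image
# chosen from the preset-containers guide.
docker run -it --rm \
  --device /dev/dri \
  -v "$PWD":/workspace -w /workspace \
  -p 8888:8888 \
  <preset-container-image> \
  jupyter notebook --ip=0.0.0.0 --port 8888 --allow-root
```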
**2. (Offline Installer) Activate the AI Tools bundle base environment**

If the default path is used during the installation of AI Tools:
```
source $HOME/intel/oneapi/intelpython/bin/activate
```
If a non-default path is used:
```
source <custom_path>/bin/activate
```
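As an optional sanity check, confirm that the activated base environment now provides the bundled Python interpreter (the exact path depends on your install location):
```
# Both commands are standard; the reported path should point into the AI Tools install.
which python
python --version
```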
**3. (Offline Installer) Activate relevant Conda environment**

```
conda activate pytorch-gpu
```
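Optionally, verify that the `pytorch-gpu` environment can import PyTorch and Intel® Extension for PyTorch*; this is a minimal sketch, and `torch.xpu.is_available()` simply reports whether an Intel GPU device is visible on this machine.
```
python -c "import torch, intel_extension_for_pytorch as ipex; print(torch.__version__, ipex.__version__, torch.xpu.is_available())"
```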
**4. Clone the GitHub repository**

```
git clone https://github.com/oneapi-src/oneAPI-samples.git
cd oneAPI-samples/AI-and-Analytics/Getting-Started-Samples/Intel_Extension_For_PyTorch_GettingStarted/
```

**5. Install dependencies**

>**Note**: Before running the following commands, make sure your Conda/Python environment with AI Tools installed is activated.

```
pip install -r requirements.txt
pip install notebook
```
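Optionally, confirm the installation before moving on; `jupyter --version` and `pip check` are standard commands and not specific to this sample.
```
jupyter --version   # prints the versions of the installed Jupyter components
pip check           # reports broken or missing Python package dependencies
```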
For Jupyter Notebook, refer to [Installing Jupyter](https://jupyter.org/install) for detailed installation instructions.

## Run the Sample
>**Note**: Before running the sample, make sure [Environment Setup](#environment-setup) is completed.

Go to the section that corresponds to the installation method chosen in [AI Tools Selector](https://www.intel.com/content/www/us/en/developer/tools/oneapi/ai-tools-selector.html) for the relevant instructions:
* [AI Tools Offline Installer (Validated)](#ai-tools-offline-installer-validated)
* [Conda/PIP](#condapip)
* [Docker](#docker)

### AI Tools Offline Installer (Validated)
**1. Register Conda kernel to Jupyter Notebook kernel**

If the default path is used during the installation of AI Tools:
```
$HOME/intel/oneapi/intelpython/envs/pytorch-gpu/bin/python -m ipykernel install --user --name=pytorch-gpu
```
If a non-default path is used:
```
<custom_path>/bin/python -m ipykernel install --user --name=pytorch-gpu
```
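After registering the kernel, you can confirm that it is visible to Jupyter:
```
jupyter kernelspec list   # the listing should include a pytorch-gpu entry
```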
**2. Launch Jupyter Notebook**

```
jupyter notebook --ip=0.0.0.0 --port 8888 --allow-root
```
**3. Follow the instructions to open the URL with the token in your browser**

**4. Select the Notebook**

```
ResNet50_Inference.ipynb
```
**5. Change the kernel to `pytorch-gpu`**

**6. Run every cell in the Notebook in sequence**

### Conda/PIP
> **Note**: Before running the instructions below, make sure your Conda/Python environment with AI Tools installed is activated.

**1. Register Conda/Python kernel to Jupyter Notebook kernel**

For Conda:
```
<CONDA_PATH_TO_ENV>/bin/python -m ipykernel install --user --name=pytorch-gpu
```
To find `<CONDA_PATH_TO_ENV>`, run `conda env list` and use the path listed next to your Conda environment.
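
For illustration, `conda env list` prints something like the following (the paths here are hypothetical); `<CONDA_PATH_TO_ENV>` is the directory shown next to your environment name, e.g. the `pytorch-gpu` line below.
```
# conda environments:
#
base                      /home/<user>/miniconda3
pytorch-gpu               /home/<user>/miniconda3/envs/pytorch-gpu
```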
For PIP:
```
python -m ipykernel install --user --name=pytorch-gpu
```

**2. Launch Jupyter Notebook**
```
jupyter notebook --ip=0.0.0.0 --port 8888 --allow-root
```

**3. Follow the instructions to open the URL with the token in your browser**

**4. Select the Notebook**
```
ResNet50_Inference.ipynb
```
**5. Change the kernel to `pytorch-gpu`**

**6. Run every cell in the Notebook in sequence**


### Docker
@@ -121,6 +177,11 @@ With successful execution, it will print out `[CODE_SAMPLE_COMPLETED_SUCCESSFULL
## License

Code samples are licensed under the MIT license. See
[License.txt](https://github.com/oneapi-src/oneAPI-samples/blob/master/License.txt)
for details.

Third party program Licenses can be found here:
[third-party-programs.txt](https://github.com/oneapi-src/oneAPI-samples/blob/master/third-party-programs.txt)

*Other names and brands may be claimed as the property of others. [Trademarks](https://www.intel.com/content/www/us/en/legal/trademarks.html)
