Commit a385d61

Merge pull request #276763 from sdgilley/sdg-rename-notebook: new notebook names

2 parents 84700df + e78d279

9 files changed: +190 -190 lines

articles/machine-learning/feature-retrieval-concepts.md

Lines changed: 6 additions & 6 deletions
@@ -69,7 +69,7 @@ featurestore1.generate_feature_retrieval_spec("./feature_retrieval_spec_folder",

 ```

-Find detailed examples in the **2. Experiment and train models using features.ipynb** notebook, hosted at [this resource](https://github.com/Azure/azureml-examples/tree/main/sdk/python/featurestore_sample/notebooks/sdk_only).
+Find detailed examples in the **2.Experiment-train-models-using-features.ipynb** notebook, hosted at [this resource](https://github.com/Azure/azureml-examples/tree/main/sdk/python/featurestore_sample/notebooks/sdk_only).

 The function generates a YAML file artifact, which has a structure similar to the structure in this example:
 ```yaml
@@ -149,7 +149,7 @@ import shutil
 shutil.copy(os.path.join(args.training_data, "feature_retrieval_spec.yaml"), args.model_output)
 ```

-Review the **2. Experiment and train models using features.ipynb** notebook, hosted at [this resource](https://github.com/Azure/azureml-examples/tree/main/sdk/python/featurestore_sample/notebooks/sdk_only), for a complete pipeline example that uses a built-in feature retrieval component to generate training data and run the training job with the packaging.
+Review the **2.Experiment-train-models-using-features.ipynb** notebook, hosted at [this resource](https://github.com/Azure/azureml-examples/tree/main/sdk/python/featurestore_sample/notebooks/sdk_only), for a complete pipeline example that uses a built-in feature retrieval component to generate training data and run the training job with the packaging.

 For training data generated by other methods, the feature retrieval specification can be passed as an input to the training job, and then handle the copy and package process in the training script.

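For orientation, here is a minimal sketch of the packaging step this hunk describes: a training script that copies the generated `feature_retrieval_spec.yaml` into the model output folder. Only the `shutil.copy` line comes from the diff; the argument parsing around it is an assumption.

```python
# Hedged sketch: package the feature retrieval spec with the model output.
# The argument names mirror the snippet above; everything else is illustrative.
import argparse
import os
import shutil

parser = argparse.ArgumentParser()
parser.add_argument("--training_data", type=str)  # folder that contains feature_retrieval_spec.yaml
parser.add_argument("--model_output", type=str)   # folder registered as the model
args = parser.parse_args()

# Copy the spec next to the model artifacts so inference can reuse it.
shutil.copy(os.path.join(args.training_data, "feature_retrieval_spec.yaml"), args.model_output)
```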
@@ -172,7 +172,7 @@ def init()
 init_online_lookup(features, credential)
 ```

-Visit the **4. Enable online store and run online inference.ipynb** notebook, hosted at [this resource](https://github.com/Azure/azureml-examples/tree/main/sdk/python/featurestore_sample/notebooks/sdk_only), for a detailed code snippet.
+Visit the **4.Enable-online-store-run-inference.ipynb** notebook, hosted at [this resource](https://github.com/Azure/azureml-examples/tree/main/sdk/python/featurestore_sample/notebooks/sdk_only), for a detailed code snippet.

 ## Use feature retrieval specification in batch inference

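To make the online-lookup snippet in this hunk more concrete, here is a heavily hedged sketch of the scoring-script shape. `init_online_lookup` appears in the diff; `get_online_features` and the way `features` is resolved from the packaged spec are assumptions about the `azureml-featurestore` package and may differ by version.

```python
# Hedged sketch of an online scoring script; see the linked notebook for the real code.
import json

import pandas as pd
from azure.identity import ManagedIdentityCredential
from azureml.featurestore import get_online_features, init_online_lookup  # assumed imports

features = []  # assumption: feature objects resolved from the packaged feature_retrieval_spec.yaml


def init():
    # Set up the online lookup once, when the deployment starts.
    credential = ManagedIdentityCredential()
    init_online_lookup(features, credential)


def run(raw_data):
    # Look up feature values for the incoming observation rows, then score them.
    obs_df = pd.DataFrame(json.loads(raw_data)["data"])  # assumed payload shape
    feature_df = get_online_features(features, obs_df)   # assumed helper
    return feature_df.to_json(orient="records")
```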
@@ -183,7 +183,7 @@ Batch inference requires:

 The feature retrieval specification used in step 1 operates the same way as it does to [generate training data](#use-feature-retrieval-specification-to-create-training-data). The built-in feature retrieval component generates the inference data. As long as the feature retrieval specification is packaged with the model, the model can serve, as a convenience, as the input to the component. This approach is an alternative to directly passing the inference data in the feature retrieval specification.

-Visit the **3. Enable recurrent materialization and run batch inference.ipynb** notebook, hosted at [this resource](https://github.com/Azure/azureml-examples/tree/main/sdk/python/featurestore_sample/notebooks/sdk_only), for a detailed code snippet.
+Visit the **3.Enable-recurrent-materialization-run-batch-inference.ipynb** notebook, hosted at [this resource](https://github.com/Azure/azureml-examples/tree/main/sdk/python/featurestore_sample/notebooks/sdk_only), for a detailed code snippet.

 ## Built-in feature retrieval component

@@ -220,8 +220,8 @@ To use the component, reference its component ID in a pipeline job YAML file, or

 Review these notebooks for examples of the built-in component, both hosted at [this resource](https://github.com/Azure/azureml-examples/tree/main/sdk/python/featurestore_sample/notebooks/sdk_only):

-- **2. Experiment and train models using features.ipynb**
-- **3. Enable recurrent materialization and run batch inference.ipynb**
+- **2.Experiment-train-models-using-features.ipynb**
+- **3.Enable-recurrent-materialization-run-batch-inference.ipynb**

 ## Next steps

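As a rough illustration of the hunk header's point about referencing the component ID, here is a hedged sketch that pulls the built-in feature retrieval component from the shared `azureml` registry with the `azure-ai-ml` SDK. The component name and its input and output names are assumptions; confirm them against the component ID in your workspace.

```python
# Hedged sketch: wire the built-in feature retrieval component into a pipeline.
from azure.ai.ml import Input, MLClient, dsl
from azure.identity import DefaultAzureCredential

registry_client = MLClient(credential=DefaultAzureCredential(), registry_name="azureml")
feature_retrieval_component = registry_client.components.get(
    name="feature_retrieval", label="latest"  # assumed component name
)


@dsl.pipeline(description="Generate training data with the built-in feature retrieval component")
def feature_retrieval_pipeline(observation_data: Input, feature_retrieval_spec: Input):
    retrieval_step = feature_retrieval_component(
        observation_data=observation_data,
        observation_data_format="parquet",              # assumed input name and value
        feature_retrieval_spec=feature_retrieval_spec,  # or pass the packaged model instead
    )
    return {"training_data": retrieval_step.outputs.output_data}  # assumed output name
```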
articles/machine-learning/feature-set-materialization-concepts.md

Lines changed: 2 additions & 2 deletions
@@ -53,7 +53,7 @@ To avoid the limit, users should run backfill jobs in advance to [fill the gaps]

 Before you run a data materialization job, enable the offline and/or online data materializations at the feature set level.

-[!notebook-python[] (~/azureml-examples-temp-fix/sdk/python/featurestore_sample/notebooks/sdk_only/4. Enable online store and run online inference.ipynb?name=enable-accounts-material)]
+[!notebook-python[] (~/azureml-examples-main/sdk/python/featurestore_sample/notebooks/sdk_only/4.Enable-online-store-run-inference.ipynb?name=enable-accounts-material)]

 You can submit the data materialization jobs as a:

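As a hedged sketch of "enable the offline and/or online data materializations at the feature set level", an `azure-ai-ml` pattern along these lines is typically used; the compute size is illustrative, and `fs_client` is assumed to be an `MLClient` scoped to the feature store workspace.

```python
# Hedged sketch: turn on offline and online materialization for a feature set.
from azure.ai.ml.entities import MaterializationComputeResource, MaterializationSettings

# Assumption: fs_client is an MLClient scoped to the feature store workspace.
transactions_fset = fs_client.feature_sets.get(name="transactions", version="1")
transactions_fset.materialization_settings = MaterializationSettings(
    offline_enabled=True,
    online_enabled=True,
    resource=MaterializationComputeResource(instance_type="standard_e8s_v3"),  # illustrative size
)
poller = fs_client.feature_sets.begin_create_or_update(transactions_fset)
print(poller.result())
```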
@@ -75,7 +75,7 @@ User can submit a backfill request with:
 - A list of data materialization status values - Incomplete, Complete, or None
 - A feature window (optional)

-[!notebook-python[] (~/azureml-examples-temp-fix/sdk/python/featurestore_sample/notebooks/sdk_only/1. Develop a feature set and register with managed feature store.ipynb?name=backfill-txns-fset)]
+[!notebook-python[] (~/azureml-examples-main/sdk/python/featurestore_sample/notebooks/sdk_only/1.Develop-feature-set-and-register.ipynb?name=backfill-txns-fset)]

 After submission of the backfill request, a new materialization job is created for each *data interval* that has a matching data materialization status (Incomplete, Complete, or None). Additionally, the relevant data intervals must fall within the defined *feature window*. If the data materialization status is `Pending` for a *data interval*, no materialization job is submitted for that interval.

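A hedged sketch of the backfill request described in this hunk, again assuming `fs_client` is an `MLClient` scoped to the feature store; the parameter names follow the `azure-ai-ml` SDK and may differ by version, and the dates are illustrative.

```python
# Hedged sketch: request backfill for a feature window and selected data statuses.
from datetime import datetime

poller = fs_client.feature_sets.begin_backfill(
    name="transactions",
    version="1",
    feature_window_start_time=datetime(2023, 1, 1),  # optional feature window
    feature_window_end_time=datetime(2023, 4, 1),
    data_status=["None", "Incomplete"],              # data materialization status values
)
print(poller.result())  # one materialization job per matching data interval
```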
articles/machine-learning/tutorial-develop-feature-set-with-custom-source.md

Lines changed: 10 additions & 10 deletions
@@ -46,7 +46,7 @@ You don't need to explicitly install these resources for this tutorial, because

 ### Configure the Azure Machine Learning Spark notebook

-You can create a new notebook and execute the instructions in this tutorial step by step. You can also open and run the existing notebook *featurestore_sample/notebooks/sdk_only/5. Develop a feature set with custom source.ipynb*. Keep this tutorial open and refer to it for documentation links and more explanation.
+You can create a new notebook and execute the instructions in this tutorial step by step. You can also open and run the existing notebook *featurestore_sample/notebooks/sdk_only/5.Develop-feature-set-custom-source.ipynb*. Keep this tutorial open and refer to it for documentation links and more explanation.

 1. On the top menu, in the **Compute** dropdown list, select **Serverless Spark Compute** under **Azure Machine Learning Serverless Spark**.

@@ -61,17 +61,17 @@ You can create a new notebook and execute the instructions in this tutorial step
 ## Set up the root directory for the samples
 This code cell sets up the root directory for the samples. It needs about 10 minutes to install all dependencies and start the Spark session.

-[!notebook-python[] (~/azureml-examples-temp-fix/sdk/python/featurestore_sample/notebooks/sdk_only/5. Develop a feature set with custom source.ipynb?name=root-dir)]
+[!notebook-python[] (~/azureml-examples-main/sdk/python/featurestore_sample/notebooks/sdk_only/5.Develop-feature-set-custom-source.ipynb?name=root-dir)]

 ## Initialize the CRUD client of the feature store workspace
 Initialize the `MLClient` for the feature store workspace, to cover the create, read, update, and delete (CRUD) operations on the feature store workspace.

-[!notebook-python[] (~/azureml-examples-temp-fix/sdk/python/featurestore_sample/notebooks/sdk_only/5. Develop a feature set with custom source.ipynb?name=init-fset-crud-client)]
+[!notebook-python[] (~/azureml-examples-main/sdk/python/featurestore_sample/notebooks/sdk_only/5.Develop-feature-set-custom-source.ipynb?name=init-fset-crud-client)]

 ## Initialize the feature store core SDK client
 As mentioned earlier, this tutorial uses the Python feature store core SDK (`azureml-featurestore`). This initialized SDK client covers create, read, update, and delete (CRUD) operations on feature stores, feature sets, and feature store entities.

-[!notebook-python[] (~/azureml-examples-temp-fix/sdk/python/featurestore_sample/notebooks/sdk_only/5. Develop a feature set with custom source.ipynb?name=init-fs-core-sdk)]
+[!notebook-python[] (~/azureml-examples-main/sdk/python/featurestore_sample/notebooks/sdk_only/5.Develop-feature-set-custom-source.ipynb?name=init-fs-core-sdk)]

 ## Custom source definition
 You can define your own source loading logic from any data storage that has a custom source definition. Implement a source processor user-defined function (UDF) class (`CustomSourceTransformer` in this tutorial) to use this feature. This class should define an `__init__(self, **kwargs)` function, and a `process(self, start_time, end_time, **kwargs)` function. The `kwargs` dictionary is supplied as a part of the feature set specification definition. This definition is then passed to the UDF. The `start_time` and `end_time` parameters are calculated and passed to the UDF function.
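The contract in the paragraph above maps to a class roughly like the following sketch; the `kwargs` keys, the JSON source, and the timestamp column are assumptions for illustration.

```python
# Hedged sketch of a custom source processor: __init__(self, **kwargs) plus
# process(self, start_time, end_time, **kwargs), as described above.
class CustomSourceTransformer:
    def __init__(self, **kwargs):
        # kwargs comes from the feature set specification definition.
        self.path = kwargs.get("source_path")  # assumed key
        self.timestamp_column = kwargs.get("timestamp_column_name", "timestamp")
        if not self.path:
            raise Exception("`source_path` is not provided")

    def process(self, start_time, end_time, **kwargs):
        # Load the raw data and trim it to the requested feature window.
        from pyspark.sql import SparkSession
        from pyspark.sql.functions import col

        spark = SparkSession.builder.getOrCreate()
        df = spark.read.json(self.path)  # assumed source format
        if start_time:
            df = df.filter(col(self.timestamp_column) >= start_time)
        if end_time:
            df = df.filter(col(self.timestamp_column) < end_time)
        return df
```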
@@ -111,11 +111,11 @@ class CustomSourceTransformer:
 ## Create a feature set specification with a custom source, and experiment with it locally
 Now, create a feature set specification with a custom source definition, and use it in your development environment to experiment with the feature set. The tutorial notebook attached to **Serverless Spark Compute** serves as the development environment.

-[!notebook-python[] (~/azureml-examples-temp-fix/sdk/python/featurestore_sample/notebooks/sdk_only/5. Develop a feature set with custom source.ipynb?name=create-fs-custom-src)]
+[!notebook-python[] (~/azureml-examples-main/sdk/python/featurestore_sample/notebooks/sdk_only/5.Develop-feature-set-custom-source.ipynb?name=create-fs-custom-src)]

 Next, define a feature window, and display the feature values in this feature window.

-[!notebook-python[] (~/azureml-examples-temp-fix/sdk/python/featurestore_sample/notebooks/sdk_only/5. Develop a feature set with custom source.ipynb?name=display-features)]
+[!notebook-python[] (~/azureml-examples-main/sdk/python/featurestore_sample/notebooks/sdk_only/5.Develop-feature-set-custom-source.ipynb?name=display-features)]

 ### Export as a feature set specification
 To register the feature set specification with the feature store, first save that specification in a specific format. Review the generated `transactions_custom_source` feature set specification. Open this file from the file tree to see the specification: `featurestore/featuresets/transactions_custom_source/spec/FeaturesetSpec.yaml`.
@@ -129,22 +129,22 @@ To learn more about the specification, see [Understanding top-level entities in

 Feature set specification persistence offers another benefit: the feature set specification can be source controlled.

-[!notebook-python[] (~/azureml-examples-temp-fix/sdk/python/featurestore_sample/notebooks/sdk_only/5. Develop a feature set with custom source.ipynb?name=dump-txn-fs-spec)]
+[!notebook-python[] (~/azureml-examples-main/sdk/python/featurestore_sample/notebooks/sdk_only/5.Develop-feature-set-custom-source.ipynb?name=dump-txn-fs-spec)]

 ## Register the transaction feature set with the feature store
 Use this code to register a feature set asset loaded from custom source with the feature store. You can then reuse that asset, and easily share it. Registration of a feature set asset offers managed capabilities, including versioning and materialization.

-[!notebook-python[] (~/azureml-examples-temp-fix/sdk/python/featurestore_sample/notebooks/sdk_only/5. Develop a feature set with custom source.ipynb?name=register-txn-fset)]
+[!notebook-python[] (~/azureml-examples-main/sdk/python/featurestore_sample/notebooks/sdk_only/5.Develop-feature-set-custom-source.ipynb?name=register-txn-fset)]

 Obtain the registered feature set, and print related information.

-[!notebook-python[] (~/azureml-examples-temp-fix/sdk/python/featurestore_sample/notebooks/sdk_only/5. Develop a feature set with custom source.ipynb?name=get-txn-fset)]
+[!notebook-python[] (~/azureml-examples-main/sdk/python/featurestore_sample/notebooks/sdk_only/5.Develop-feature-set-custom-source.ipynb?name=get-txn-fset)]

 ## Test feature generation from registered feature set
 Use the `to_spark_dataframe()` function of the feature set to test the feature generation from the registered feature set, and display the features.
 print-txn-fset-sample-values

-[!notebook-python[] (~/azureml-examples-temp-fix/sdk/python/featurestore_sample/notebooks/sdk_only/5. Develop a feature set with custom source.ipynb?name=print-txn-fset-sample-values)]
+[!notebook-python[] (~/azureml-examples-main/sdk/python/featurestore_sample/notebooks/sdk_only/5.Develop-feature-set-custom-source.ipynb?name=print-txn-fset-sample-values)]

 You should be able to successfully fetch the registered feature set as a Spark dataframe, and then display it. You can now use these features for a point-in-time join with observation data, and the subsequent steps in your machine learning pipeline.

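A hedged sketch of the registration and read-back steps covered in this hunk, assuming `fs_client` (the feature store `MLClient`) and `featurestore` (the `azureml-featurestore` client) were initialized earlier in the notebook; the entity reference and spec path are illustrative.

```python
# Hedged sketch: register the feature set asset, then read it back as a Spark dataframe.
from azure.ai.ml.entities import FeatureSet, FeatureSetSpecification

transactions_fset_config = FeatureSet(
    name="transactions_custom_source",
    version="1",
    description="transactions feature set loaded from a custom source",
    entities=["azureml:account:1"],  # assumed entity reference
    stage="Development",
    specification=FeatureSetSpecification(
        path="featurestore/featuresets/transactions_custom_source/spec"
    ),
)
poller = fs_client.feature_sets.begin_create_or_update(transactions_fset_config)
print(poller.result())

# Fetch the registered feature set and test feature generation.
registered_fset = featurestore.feature_sets.get(name="transactions_custom_source", version="1")
registered_fset.to_spark_dataframe().show(5)
```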
articles/machine-learning/tutorial-enable-recurrent-materialization-run-batch-inference.md

Lines changed: 12 additions & 12 deletions
@@ -49,11 +49,11 @@ Before you proceed with this tutorial, be sure to complete the first and second

 2. Start the Spark session.

-[!notebook-python[] (~/azureml-examples-temp-fix/sdk/python/featurestore_sample/notebooks/sdk_only/3. Enable recurrent materialization and run batch inference.ipynb?name=start-spark-session)]
+[!notebook-python[] (~/azureml-examples-main/sdk/python/featurestore_sample/notebooks/sdk_only/3.Enable-recurrent-materialization-run-batch-inference.ipynb?name=start-spark-session)]

 3. Set up the root directory for the samples.

-[!notebook-python[] (~/azureml-examples-temp-fix/sdk/python/featurestore_sample/notebooks/sdk_only/3. Enable recurrent materialization and run batch inference.ipynb?name=root-dir)]
+[!notebook-python[] (~/azureml-examples-main/sdk/python/featurestore_sample/notebooks/sdk_only/3.Enable-recurrent-materialization-run-batch-inference.ipynb?name=root-dir)]

 4. Set up the CLI.
 ### [Python SDK](#tab/python)
@@ -64,33 +64,33 @@ Before you proceed with this tutorial, be sure to complete the first and second

 1. Install the Azure Machine Learning extension.

-[!notebook-python[] (~/azureml-examples-temp-fix/sdk/python/featurestore_sample/notebooks/sdk_and_cli/3. Enable recurrent materialization and run batch inference.ipynb?name=install-ml-ext-cli)]
+[!notebook-python[] (~/azureml-examples-main/sdk/python/featurestore_sample/notebooks/sdk_and_cli/3.Enable-recurrent-materialization-run-batch-inference.ipynb?name=install-ml-ext-cli)]

 2. Authenticate.

-[!notebook-python[] (~/azureml-examples-temp-fix/sdk/python/featurestore_sample/notebooks/sdk_and_cli/3. Enable recurrent materialization and run batch inference.ipynb?name=auth-cli)]
+[!notebook-python[] (~/azureml-examples-main/sdk/python/featurestore_sample/notebooks/sdk_and_cli/3.Enable-recurrent-materialization-run-batch-inference.ipynb?name=auth-cli)]

 3. Set the default subscription.

-[!notebook-python[] (~/azureml-examples-temp-fix/sdk/python/featurestore_sample/notebooks/sdk_and_cli/3. Enable recurrent materialization and run batch inference.ipynb?name=set-default-subs-cli)]
+[!notebook-python[] (~/azureml-examples-main/sdk/python/featurestore_sample/notebooks/sdk_and_cli/3.Enable-recurrent-materialization-run-batch-inference.ipynb?name=set-default-subs-cli)]

 ---

 5. Initialize the project workspace CRUD (create, read, update, and delete) client.

 The tutorial notebook runs from this current workspace.

-[!notebook-python[] (~/azureml-examples-temp-fix/sdk/python/featurestore_sample/notebooks/sdk_only/3. Enable recurrent materialization and run batch inference.ipynb?name=init-ws-crud-client)]
+[!notebook-python[] (~/azureml-examples-main/sdk/python/featurestore_sample/notebooks/sdk_only/3.Enable-recurrent-materialization-run-batch-inference.ipynb?name=init-ws-crud-client)]

 6. Initialize the feature store variables.

 Be sure to update the `featurestore_name` value, to reflect what you created in the first tutorial.

-[!notebook-python[] (~/azureml-examples-temp-fix/sdk/python/featurestore_sample/notebooks/sdk_only/3. Enable recurrent materialization and run batch inference.ipynb?name=init-fs-crud-client)]
+[!notebook-python[] (~/azureml-examples-main/sdk/python/featurestore_sample/notebooks/sdk_only/3.Enable-recurrent-materialization-run-batch-inference.ipynb?name=init-fs-crud-client)]

 7. Initialize the feature store SDK client.

-[!notebook-python[] (~/azureml-examples-temp-fix/sdk/python/featurestore_sample/notebooks/sdk_only/3. Enable recurrent materialization and run batch inference.ipynb?name=init-fs-core-sdk)]
+[!notebook-python[] (~/azureml-examples-main/sdk/python/featurestore_sample/notebooks/sdk_only/3.Enable-recurrent-materialization-run-batch-inference.ipynb?name=init-fs-core-sdk)]

 ## Enable recurrent materialization on the transactions feature set

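For steps 5 through 7 in the hunk above, here is a hedged sketch of the three clients. The placeholder values are assumptions, and `AzureMLOnBehalfOfCredential` is what the sample notebooks use on serverless Spark compute; credentials and exact client signatures may differ in your setup.

```python
# Hedged sketch: project workspace client, feature store CRUD client, and core SDK client.
from azure.ai.ml import MLClient
from azure.ai.ml.identity import AzureMLOnBehalfOfCredential
from azureml.featurestore import FeatureStoreClient

credential = AzureMLOnBehalfOfCredential()

# 5. CRUD client for the project workspace this notebook runs in.
ws_client = MLClient(
    credential,
    subscription_id="<PROJECT_SUBSCRIPTION_ID>",
    resource_group_name="<PROJECT_RESOURCE_GROUP>",
    workspace_name="<PROJECT_WORKSPACE_NAME>",
)

# 6. CRUD client for the feature store; update featurestore_name to match the first tutorial.
featurestore_name = "<FEATURE_STORE_NAME>"
fs_client = MLClient(
    credential,
    subscription_id="<FEATURE_STORE_SUBSCRIPTION_ID>",
    resource_group_name="<FEATURE_STORE_RESOURCE_GROUP>",
    workspace_name=featurestore_name,
)

# 7. Feature store core SDK client (azureml-featurestore).
featurestore = FeatureStoreClient(
    credential=credential,
    subscription_id="<FEATURE_STORE_SUBSCRIPTION_ID>",
    resource_group_name="<FEATURE_STORE_RESOURCE_GROUP>",
    name=featurestore_name,
)
```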
@@ -109,15 +109,15 @@ To handle inference of the model in production, you might want to set up recurre

 As explained in earlier tutorials, after data is materialized (backfill or recurrent materialization), feature retrieval uses the materialized data by default.

-[!notebook-python[] (~/azureml-examples-temp-fix/sdk/python/featurestore_sample/notebooks/sdk_only/3. Enable recurrent materialization and run batch inference.ipynb?name=enable-recurrent-mat-txns-fset)]
+[!notebook-python[] (~/azureml-examples-main/sdk/python/featurestore_sample/notebooks/sdk_only/3.Enable-recurrent-materialization-run-batch-inference.ipynb?name=enable-recurrent-mat-txns-fset)]

 ## (Optional) Save the YAML file for the feature set asset

 You use the updated settings to save the YAML file.

 ### [Python SDK](#tab/python)

-[!notebook-python[] (~/azureml-examples-temp-fix/sdk/python/featurestore_sample/notebooks/sdk_only/3. Enable recurrent materialization and run batch inference.ipynb?name=dump-txn-fset-with-mat-yaml)]
+[!notebook-python[] (~/azureml-examples-main/sdk/python/featurestore_sample/notebooks/sdk_only/3.Enable-recurrent-materialization-run-batch-inference.ipynb?name=dump-txn-fset-with-mat-yaml)]

 ### [Azure CLI](#tab/cli)

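A hedged sketch of the recurrent materialization setting this hunk refers to; the interval, start time, and compute size are illustrative, and `fs_client` is assumed from the earlier setup steps.

```python
# Hedged sketch: attach a recurrence schedule to the feature set's materialization settings.
from datetime import datetime

from azure.ai.ml.entities import (
    MaterializationComputeResource,
    MaterializationSettings,
    RecurrenceTrigger,
)

transactions_fset_config = fs_client.feature_sets.get(name="transactions", version="1")
transactions_fset_config.materialization_settings = MaterializationSettings(
    offline_enabled=True,
    schedule=RecurrenceTrigger(frequency="hour", interval=3, start_time=datetime(2024, 4, 15)),
    resource=MaterializationComputeResource(instance_type="standard_e8s_v3"),  # illustrative size
)
fs_poller = fs_client.feature_sets.begin_create_or_update(transactions_fset_config)
print(fs_poller.result())
```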
@@ -138,7 +138,7 @@ The batch inference has these steps:
 > [!NOTE]
 > You use a job for batch inference in this example. You can also use batch endpoints in Azure Machine Learning.

-[!notebook-python[] (~/azureml-examples-temp-fix/sdk/python/featurestore_sample/notebooks/sdk_only/3. Enable recurrent materialization and run batch inference.ipynb?name=run-batch-inf-pipeline)]
+[!notebook-python[] (~/azureml-examples-main/sdk/python/featurestore_sample/notebooks/sdk_only/3.Enable-recurrent-materialization-run-batch-inference.ipynb?name=run-batch-inf-pipeline)]

 ### Inspect the output data for batch inference

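A hedged sketch of submitting the batch inference pipeline as a job, assuming `ws_client` is the project workspace `MLClient`; the YAML path is a placeholder, since the exact location depends on the sample repository layout referenced in the next hunk.

```python
# Hedged sketch: load the batch inference pipeline YAML and run it as a job.
from azure.ai.ml import load_job

batch_inference_pipeline = load_job(
    source="<path-to>/batch_inference_pipeline.yaml"  # placeholder path
)
batch_inference_run = ws_client.jobs.create_or_update(batch_inference_pipeline)
ws_client.jobs.stream(batch_inference_run.name)  # follow the job logs until completion
```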
@@ -151,7 +151,7 @@ In the pipeline view:

 In the batch inference pipeline (*/project/fraud_mode/pipelines/batch_inference_pipeline.yaml*) outputs, because you didn't provide `name` or `version` values for `outputs` of `inference_step`, the system created an untracked data asset with a GUID as the name value and `1` as the version value. In this cell, you derive and then display the data path from the asset.

-[!notebook-python[] (~/azureml-examples-temp-fix/sdk/python/featurestore_sample/notebooks/sdk_only/3. Enable recurrent materialization and run batch inference.ipynb?name=inspect-batch-inf-output-data)]
+[!notebook-python[] (~/azureml-examples-main/sdk/python/featurestore_sample/notebooks/sdk_only/3.Enable-recurrent-materialization-run-batch-inference.ipynb?name=inspect-batch-inf-output-data)]

 ## Clean up