
Commit cd3c234

Merge pull request #1680 from fbsolo-ms1/update-tutorial-develop-feature-set-with-custom-source
Freshness update for tutorial-develop-feature-set-with-custom-source.md . . .
2 parents 46cb645 + abf8d46 commit cd3c234

File tree

1 file changed: +13, -13 lines changed


articles/machine-learning/tutorial-develop-feature-set-with-custom-source.md

Lines changed: 13 additions & 13 deletions
@@ -9,7 +9,7 @@ ms.subservice: core
 ms.topic: tutorial
 author: fbsolo-ms1
 ms.author: franksolomon
-ms.date: 11/28/2023
+ms.date: 11/21/2024
 ms.reviewer: yogipandey
 ms.custom:
 - sdkv2
@@ -20,7 +20,7 @@ ms.custom:
 
 # Tutorial 5: Develop a feature set with a custom source
 
-An Azure Machine Learning managed feature store lets you discover, create, and operationalize features. Features serve as the connective tissue in the machine learning lifecycle, starting from the prototyping phase, where you experiment with various features. That lifecycle continues to the operationalization phase, where you deploy your models, and inference steps look up the feature data. For more information about feature stores, see [feature store concepts](./concept-what-is-managed-feature-store.md).
+An Azure Machine Learning managed feature store lets you discover, create, and operationalize features. Features serve as the connective tissue in the machine learning lifecycle, starting from the prototyping phase, where you experiment with various features. That lifecycle continues to the operationalization phase, where you deploy your models, and inference steps look up the feature data. For more information about feature stores, visit the [feature store concepts](./concept-what-is-managed-feature-store.md) resource.
 
 Part 1 of this tutorial series showed how to create a feature set specification with custom transformations, enable materialization and perform a backfill. Part 2 showed how to experiment with features in the experimentation and training flows. Part 3 explained recurrent materialization for the `transactions` feature set, and showed how to run a batch inference pipeline on the registered model. Part 4 described how to run batch inference.
 
@@ -36,27 +36,27 @@ In this tutorial, you'll
 > [!NOTE]
 > This tutorial uses an Azure Machine Learning notebook with **Serverless Spark Compute**.
 
-* Make sure you complete the previous tutorials in this series. This tutorial reuses feature store and other resources created in those earlier tutorials.
+* Be sure to complete the previous tutorials in this series. This tutorial reuses the feature store and other resources created in those earlier tutorials.
 
 ## Set up
 
-This tutorial uses the Python feature store core SDK (`azureml-featurestore`). The Python SDK is used for create, read, update, and delete (CRUD) operations, on feature stores, feature sets, and feature store entities.
+This tutorial uses the Python feature store core SDK (`azureml-featurestore`). The Python SDK is used for create, read, update, and delete (CRUD) operations on feature stores, feature sets, and feature store entities.
 
 You don't need to explicitly install these resources for this tutorial, because in the set-up instructions shown here, the `conda.yml` file covers them.
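For readers who want to see what working with that SDK looks like, a minimal connection sketch follows. It is illustrative rather than code from this notebook: the subscription, resource group, and feature store names are placeholders, and the credential type is assumed to match the serverless Spark setup this tutorial series uses.

```python
# Hedged sketch: connect to the feature store created in the earlier tutorials.
# All identifiers below are placeholders; substitute your own values.
from azure.ai.ml.identity import AzureMLOnBehalfOfCredential  # assumed credential for serverless Spark
from azureml.featurestore import FeatureStoreClient

featurestore = FeatureStoreClient(
    credential=AzureMLOnBehalfOfCredential(),
    subscription_id="<featurestore-subscription-id>",
    resource_group_name="<featurestore-resource-group>",
    name="<featurestore-name>",
)

# Registered feature sets can then be fetched through this client, for example
# (name and version are placeholders):
# transactions_fset = featurestore.feature_sets.get("transactions", "1")
```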
 
 ### Configure the Azure Machine Learning Spark notebook
 
-You can create a new notebook and execute the instructions in this tutorial step by step. You can also open and run the existing notebook *featurestore_sample/notebooks/sdk_only/5.Develop-feature-set-custom-source.ipynb*. Keep this tutorial open and refer to it for documentation links and more explanation.
+You can create a new notebook and execute the instructions in this tutorial, step by step. You can also open and run the existing notebook *featurestore_sample/notebooks/sdk_only/5.Develop-feature-set-custom-source.ipynb*. Keep this tutorial open and refer to it for documentation links and more explanation.
 
 1. On the top menu, in the **Compute** dropdown list, select **Serverless Spark Compute** under **Azure Machine Learning Serverless Spark**.
 
-2. Configure the session:
+1. Configure the session:
 
-1. Select **Configure session** in the top status bar.
-2. Select the **Python packages** tab, select **Upload Conda file**.
-3. Select **Upload Conda file**.
-4. Upload the *conda.yml* file that you [uploaded in the first tutorial](./tutorial-get-started-with-feature-store.md#prepare-the-notebook-environment).
-5. Optionally, increase the session time-out (idle time) to avoid frequent prerequisite reruns.
+1. Select **Configure session** in the top status bar
+1. Select the **Python packages** tab, select **Upload Conda file**
+1. Select **Upload Conda file**
+1. Upload the *conda.yml* file that you [uploaded in the first tutorial](./tutorial-get-started-with-feature-store.md#prepare-the-notebook-environment)
+1. Optionally, increase the session time-out (idle time) to avoid frequent prerequisite reruns
 
 ## Set up the root directory for the samples
 This code cell sets up the root directory for the samples. It needs about 10 minutes to install all dependencies and start the Spark session.
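A minimal sketch of the kind of root-directory cell described here, with the user alias in the path as a placeholder for wherever you uploaded the samples:

```python
import os

# Hedged sketch: point at the uploaded samples folder.
# "<your_user_alias>" is a placeholder; adjust it to the directory you uploaded the samples to.
root_dir = "./Users/<your_user_alias>/featurestore_sample"

if os.path.isdir(root_dir):
    print("The folder exists.")
else:
    print("The folder does not exist. Create it, or fix the path before continuing.")
```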
@@ -118,14 +118,14 @@ Next, define a feature window, and display the feature values in this feature wi
 [!notebook-python[] (~/azureml-examples-main/sdk/python/featurestore_sample/notebooks/sdk_only/5.Develop-feature-set-custom-source.ipynb?name=display-features)]
 
 ### Export as a feature set specification
-To register the feature set specification with the feature store, first save that specification in a specific format. Review the generated `transactions_custom_source` feature set specification. Open this file from the file tree to see the specification: `featurestore/featuresets/transactions_custom_source/spec/FeaturesetSpec.yaml`.
+To register the feature set specification with the feature store, first save that specification in a specific format. Review the generated `transactions_custom_source` feature set specification. Open this file from the file tree to view the specification: `featurestore/featuresets/transactions_custom_source/spec/FeaturesetSpec.yaml`.
 
 The specification has these elements:
 
 - `features`: A list of features and their datatypes.
 - `index_columns`: The join keys required to access values from the feature set.
 
-To learn more about the specification, see [Understanding top-level entities in managed feature store](./concept-top-level-entities-in-managed-feature-store.md) and [CLI (v2) feature set YAML schema](./reference-yaml-feature-set.md).
+For more information about the specification, visit the [Understanding top-level entities in managed feature store](./concept-top-level-entities-in-managed-feature-store.md) and [CLI (v2) feature set YAML schema](./reference-yaml-feature-set.md) resources.
 
 Feature set specification persistence offers another benefit: the feature set specification can be source controlled.
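To inspect the two elements listed above in the generated file, a small hedged sketch such as the following can help; it assumes PyYAML is available in the session and that the top-level keys in `FeaturesetSpec.yaml` match the names the article uses.

```python
import yaml  # assumes PyYAML is available in the Spark session environment

# Path referenced in the article's file tree.
spec_path = "featurestore/featuresets/transactions_custom_source/spec/FeaturesetSpec.yaml"

with open(spec_path) as f:
    spec = yaml.safe_load(f)

# The two elements the article calls out (key names assumed to match its description):
print(spec.get("features"))       # list of features and their datatypes
print(spec.get("index_columns"))  # join keys required to access values from the feature set
```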