
Commit 044d86d

Merge pull request #298478 from MicrosoftDocs/main
Publish to live, Friday 4 AM PST, 4/18
2 parents 48ae31c + bab5056 commit 044d86d

21 files changed (+43, −127 lines)

articles/data-factory/connector-snowflake.md

Lines changed: 18 additions & 15 deletions
@@ -29,7 +29,7 @@ This Snowflake connector is supported for the following capabilities:
 |[Copy activity](copy-activity-overview.md) (source/sink)|① ②|
 |[Mapping data flow](concepts-data-flow-overview.md) (source/sink)|① |
 |[Lookup activity](control-flow-lookup-activity.md)|① ②|
-|[Script activity](transform-data-using-script.md)|① ②|
+|[Script activity](transform-data-using-script.md) (Use version 1.1 (Preview) when you use script parameters)|① ②|

 *① Azure integration runtime ② Self-hosted integration runtime*

@@ -93,6 +93,7 @@ These generic properties are supported for the Snowflake linked service:
 | Property | Description | Required |
 | :--------------- | :----------------------------------------------------------- | :------- |
 | type | The type property must be set to **SnowflakeV2**. | Yes |
+| version | The version that you specify. We recommend upgrading to the latest version to take advantage of the newest enhancements. | Yes for version 1.1 (Preview) |
 | accountIdentifier | The name of the account along with its organization. For example, myorg-account123. | Yes |
 | database | The default database used for the session after connecting. | Yes |
 | warehouse | The default virtual warehouse used for the session after connecting. | Yes |
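To see how these properties fit together, here is a minimal illustrative sketch of a SnowflakeV2 linked service payload. The names and values are placeholders, authentication properties are omitted, and the exact nesting is an assumption to be confirmed against the connector article:

```json
{
    "name": "SnowflakeV2LinkedService",
    "properties": {
        "type": "SnowflakeV2",
        "version": "1.1",
        "typeProperties": {
            "accountIdentifier": "myorg-account123",
            "database": "mydatabase",
            "warehouse": "mywarehouse"
        }
    }
}
```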
@@ -684,19 +685,29 @@ By setting the pipeline Logging Level to None, we exclude the transmission of in
 
 For more information about the properties, see [Lookup activity](control-flow-lookup-activity.md).
 
-## <a name="upgrade-the-snowflake-linked-service"></a> Upgrade the Snowflake connector
+## <a name="differences-between-snowflake-and-snowflake-legacy"></a> Snowflake connector lifecycle and upgrade
 
-To upgrade the Snowflake connector, you can do a side-by-side upgrade, or an in-place upgrade.
+The following table shows the release stage and change logs for different versions of the Snowflake connector:
 
-### Side-by-side upgrade
+| Version | Release stage | Change log |
+| :----------- | :------- | :------- |
+| Snowflake V1 | GA version available | / |
+| Snowflake V2 (version 1.0) | GA version available | • Add support for Key pair authentication.<br><br>• The `accountIdentifier`, `warehouse`, `database`, `schema`, and `role` properties are used to establish a connection instead of the `connectionstring` property.<br><br>• Add support for BigDecimal in Lookup activity. The NUMBER type, as defined in Snowflake, will be displayed as a string in Lookup activity. If you want to convert it to a numeric type in V2, you can use the pipeline parameter with the [int function](control-flow-expression-language-functions.md#int) or the [float function](control-flow-expression-language-functions.md#float). For example, `int(activity('lookup').output.firstRow.VALUE)`, `float(activity('lookup').output.firstRow.VALUE)`<br><br>• The timestamp data type in Snowflake is read as the DateTimeOffset data type in Lookup and Script activity. If you still need to use the Datetime value as a parameter in your pipeline after upgrading to V2, you can convert the DateTimeOffset type to the DateTime type by using the [formatDateTime function](control-flow-expression-language-functions.md#formatdatetime) (recommended) or the [concat function](control-flow-expression-language-functions.md#concat). For example: `formatDateTime(activity('lookup').output.firstRow.DATETIMETYPE)`, `concat(substring(activity('lookup').output.firstRow.DATETIMETYPE, 0, 19), 'Z')`<br><br>• Script parameters are not supported in Script activity. As an alternative, use dynamic expressions for script parameters. For more information, see [Expressions and functions in Azure Data Factory and Azure Synapse Analytics](control-flow-expression-language-functions.md).<br><br>• Execution of multiple SQL statements in Script activity is not supported. |
+| Snowflake V2 (version 1.1) | Preview version available | • Add support for script parameters.<br><br>• Add support for multiple statement execution in Script activity. |
+
+### <a name="upgrade-the-snowflake-linked-service"></a> Upgrade the Snowflake connector from V1 to V2
+
+To upgrade the Snowflake connector from V1 to V2, you can do a side-by-side upgrade or an in-place upgrade.
+
+#### Side-by-side upgrade
 
 To perform a side-by-side upgrade, complete the following steps:
 
 1. Create a new Snowflake linked service and configure it by referring to the V2 linked service properties.
 1. Create a dataset based on the newly created Snowflake linked service.
 1. In the pipelines that target the V1 objects, replace the existing linked service and dataset with the new ones.
 
-### In-place upgrade
+#### In-place upgrade
 
 To perform an in-place upgrade, you need to edit the existing linked service payload and update the dataset to use the new linked service.

@@ -756,17 +767,9 @@ To perform an in-place upgrade, you need to edit the existing linked service pay
 
 1. Update dataset to use the new linked service. You can either create a new dataset based on the newly created linked service, or update an existing dataset's type property from **SnowflakeTable** to **SnowflakeV2Table**.
 
-## <a name="differences-between-snowflake-and-snowflake-legacy"></a> Differences between Snowflake V2 and V1
-
-The Snowflake V2 connector offers new functionalities and is compatible with most features of Snowflake V1 connector. The table below shows the feature differences between V2 and V1.
+### Upgrade the Snowflake V2 connector from version 1.0 to version 1.1 (Preview)
 
-| Snowflake V2 | Snowflake V1 |
-| :----------- | :------- |
-| Support Basic and Key pair authentication. | Support Basic authentication. |
-| Script parameters are not supported in Script activity currently. As an alternative, utilize dynamic expressions for script parameters. For more information, see [Expressions and functions in Azure Data Factory and Azure Synapse Analytics](control-flow-expression-language-functions.md). | Support script parameters in Script activity. |
-| Support BigDecimal in Lookup activity. The NUMBER type, as defined in Snowflake, will be displayed as a string in Lookup activity. If you want to covert it to numeric type, you can use the pipeline parameter with [int function](control-flow-expression-language-functions.md#int) or [float function](control-flow-expression-language-functions.md#float). For example, `int(activity('lookup').output.firstRow.VALUE)`, `float(activity('lookup').output.firstRow.VALUE)`| BigDecimal is not supported in Lookup activity. |
-| The `accountIdentifier`, `warehouse`, `database`, `schema` and `role` properties are used to establish a connection. | The `connectionstring` property is used to establish a connection. |
-| timestamp data type in Snowflake is read as DateTimeOffset data type in Lookup and Script activity. | timestamp data type in Snowflake is read as DateTime data type in Lookup and Script activity.<br> If you still need to use the Datetime value as a parameter in your pipeline after upgrading the connector, you can convert DateTimeOffset type to DateTime type by using [formatDateTime function](control-flow-expression-language-functions.md#formatdatetime) (recommended) or [concat function](control-flow-expression-language-functions.md#concat). For example: `formatDateTime(activity('lookup').output.firstRow.DATETIMETYPE)`, `concat(substring(activity('lookup').output.firstRow.DATETIMETYPE, 0, 19), 'Z')`|
+On the **Edit linked service** page, select **1.1** for the version. For more information, see [Linked service properties](#linked-service-properties).
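The in-place upgrade's dataset step amounts to switching the dataset's type property from **SnowflakeTable** to **SnowflakeV2Table**. A minimal illustrative sketch follows; the names and the typeProperties values are placeholders, not the exact schema:

```json
{
    "name": "SnowflakeV2Dataset",
    "properties": {
        "type": "SnowflakeV2Table",
        "linkedServiceName": {
            "referenceName": "MySnowflakeV2LinkedService",
            "type": "LinkedServiceReference"
        },
        "typeProperties": {
            "schema": "PUBLIC",
            "table": "MYTABLE"
        }
    }
}
```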
 
## Related content

articles/digital-twins/concepts-3d-scenes-studio.md

Lines changed: 2 additions & 2 deletions
@@ -19,7 +19,7 @@ This article gives an overview of 3D Scenes Studio and its key features. For com
 
 ## Studio overview
 
-Work in 3D Scenes Studio is built around the concept of *scenes*. A scene is a view of a single business environment, and is comprised of 3D content, custom business logic, and references to an Azure Digital Twins instance. You can have multiple scenes for a single digital twin instance.
+Work in 3D Scenes Studio is built around the concept of *scenes*. A scene is a view of a single business environment, and is composed of 3D content, custom business logic, and references to an Azure Digital Twins instance. You can have multiple scenes for a single digital twin instance.
 
 Scenes are configured in the [builder](#builder) inside the 3D Scenes Studio. Then, you can view your finished scenes in the studio's [built-in view experience](#viewer), or [embedded in custom web applications](#embeddable-viewer-component). You can extend the built-in viewer or create your own viewers that access the 3D Scenes files and your Azure Digital Twins graph.
 
@@ -43,7 +43,7 @@ To share your scenes with someone else, the recipient will need at least *Reader
 ## Set up
 
 To work with 3D Scenes Studio, you'll need the following required resources:
-* An [Azure Digital Twins instance](how-to-set-up-instance-portal.md)
+* An [Azure Digital Twins instance](how-to-set-up-instance-cli.md)
 * You'll need *Azure Digital Twins Data Owner* or *Azure Digital Twins Data Reader* access to the instance
 * The instance should be populated with [models](concepts-models.md) and [twins](concepts-twins-graph.md)

articles/digital-twins/concepts-cli.md

Lines changed: 0 additions & 2 deletions
@@ -24,8 +24,6 @@ Some of the actions you can do using the command set include:
 
 The command set is called `az dt`, and is part of the [Azure IoT extension for Azure CLI](https://github.com/Azure/azure-iot-cli-extension). You can view the full list of commands and their usage as part of the reference documentation for the `az iot` command set: [az dt command reference](/cli/azure/dt).
 
-[!INCLUDE [digital-twins-cli-issue](includes/digital-twins-cli-issue.md)]
-
 ## Uses (deploy and validate)
 
 Apart from generally managing your instance, the CLI is also a useful tool for deployment and validation.

articles/digital-twins/how-to-create-app-registration.md

Lines changed: 0 additions & 4 deletions
@@ -13,8 +13,6 @@ ms.service: azure-digital-twins
 
 This article describes how to create a [Microsoft Entra ID](../active-directory/fundamentals/active-directory-whatis.md) *app registration* that can access Azure Digital Twins. This article includes steps for the [Azure portal](https://portal.azure.com) and the [Azure CLI](/cli/azure/what-is-azure-cli).
 
-[!INCLUDE [digital-twins-cli-issue](includes/digital-twins-cli-issue.md)]
-
 When working with Azure Digital Twins, it's common to interact with your instance through client applications. Those applications need to authenticate with Azure Digital Twins, and some of the [authentication mechanisms](how-to-authenticate-client.md) that apps can use involve an app registration.
 
 The app registration isn't required for all authentication scenarios. However, if you're using an authentication strategy or code sample that does require an app registration, this article shows you how to set one up and grant it permissions to the Azure Digital Twins APIs. It also covers how to collect important values that you need to use the app registration when authenticating.
@@ -236,8 +234,6 @@ The app registration should show up in the list along with the role you assigned
 
 # [CLI](#tab/cli)
 
-[!INCLUDE [digital-twins-cli-issue](includes/digital-twins-cli-issue.md)]
-
 Use the [az dt role-assignment create](/cli/azure/dt/role-assignment#az-dt-role-assignment-create) command to assign the role (you must have [sufficient permissions](how-to-set-up-instance-cli.md#prerequisites-permission-requirements) in the Azure subscription). The command requires you to pass in the name of the role you want to assign, the name of your Azure Digital Twins instance, and either the name or the object ID of the app registration.
 
```azurecli-interactive

articles/digital-twins/how-to-create-data-history-connection.md

Lines changed: 1 addition & 3 deletions
@@ -165,9 +165,7 @@ Now that you created the required resources, use the command in this section to
 
 This command also creates three tables in your Azure Data Explorer database to store twin property updates, relationship lifecycle events, and twin lifecycle events, respectively. For more information about these types of historized data and their corresponding Azure Data Explorer tables, see [Data types and schemas](concepts-data-history.md#data-types-and-schemas).
 
-# [CLI](#tab/cli)
-
-[!INCLUDE [digital-twins-cli-issue](includes/digital-twins-cli-issue.md)]
+# [CLI](#tab/cli)
 
Use the command in this section to create a data history connection and the tables in Azure Data Explorer. The command always creates a table for historized twin property updates, and it includes parameters to create the tables for relationship lifecycle and twin lifecycle events.
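As an illustrative sketch only, the creation command might look like the following. The subcommand and flag names here (`az dt data-history connection create adx`, `--cn`, `--adx-cluster-name`, `--adx-database-name`, `--eventhub`, `--eventhub-namespace`) are assumptions to verify against the current `az dt` reference, and all values are placeholders:

```azurecli-interactive
az dt data-history connection create adx --dt-name <instance-name> --cn <connection-name> --adx-cluster-name <cluster-name> --adx-database-name <database-name> --eventhub <event-hub-name> --eventhub-namespace <event-hub-namespace>
```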

articles/digital-twins/how-to-create-endpoints.md

Lines changed: 0 additions & 4 deletions
@@ -16,8 +16,6 @@ This article explains how to create an *endpoint* for Azure Digital Twin events
 
 Routing [event notifications](concepts-event-notifications.md) from Azure Digital Twins to downstream services or connected compute resources is a two-step process: create endpoints, then create event routes that send data to those endpoints. This article covers the first step, setting up endpoints that can receive the events. Later, you can create [event routes](how-to-create-routes.md) that specify which events generated by Azure Digital Twins are delivered to which endpoints.
 
-[!INCLUDE [digital-twins-cli-issue](includes/digital-twins-cli-issue.md)]
-
 ## Prerequisites
 
 * An Azure account, which you can [set up for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
@@ -84,8 +82,6 @@ Now the Event Grid topic, event hub, or Service Bus topic is available as an end
 
 # [CLI](#tab/cli)
 
-[!INCLUDE [digital-twins-cli-issue](includes/digital-twins-cli-issue.md)]
-
 The following examples show how to create endpoints using the [az dt endpoint create](/cli/azure/dt/endpoint/create) command for the [Azure Digital Twins CLI](/cli/azure/dt). Replace the placeholders in the commands with the details of your own resources.
 
 To create an Event Grid endpoint:

articles/digital-twins/how-to-create-routes.md

Lines changed: 0 additions & 4 deletions
@@ -33,8 +33,6 @@ Next, follow the instructions below if you intend to use the Azure CLI while fol
 
 [!INCLUDE [azure-cli-prepare-your-environment-h3.md](~/reusable-content/azure-cli/azure-cli-prepare-your-environment-h3.md)]
 
-[!INCLUDE [digital-twins-cli-issue](includes/digital-twins-cli-issue.md)]
-
 ## Create an event route
 
 After [creating an endpoint](how-to-create-endpoints.md), you'll need to define an *event route* to actually send data to the endpoint. These routes let developers wire up event flow throughout the system and to downstream services. A single route can allow multiple notifications and event types to be selected. Read more about event routes in [Endpoints and event routes](concepts-route-events.md).
@@ -78,8 +76,6 @@ When finished, select the **Save** button to create your event route.
 
 # [CLI](#tab/cli2)
 
-[!INCLUDE [digital-twins-cli-issue](includes/digital-twins-cli-issue.md)]
-
 Routes can be managed using the [az dt route](/cli/azure/dt/route) commands for the Azure Digital Twins CLI.
 
 For more information about using the CLI and what commands are available, see [Azure Digital Twins CLI command set](concepts-cli.md).
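As an illustrative sketch, creating a route that forwards twin update events to an existing endpoint might look like the following. The route name and filter value are placeholders, and the exact parameters should be confirmed against the `az dt route create` reference:

```azurecli-interactive
az dt route create --dt-name <instance-name> --endpoint-name <endpoint-name> --route-name <route-name> --filter "type = 'Microsoft.DigitalTwins.Twin.Update'"
```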

articles/digital-twins/how-to-enable-private-link.md

Lines changed: 0 additions & 4 deletions
@@ -168,8 +168,6 @@ Once a private endpoint has been created for your Azure Digital Twins instance,
 
 For more information and examples, see the [az dt network private-endpoint reference documentation](/cli/azure/dt/network/private-endpoint).
 
-[!INCLUDE [digital-twins-cli-issue](includes/digital-twins-cli-issue.md)]
-
 ### Get more Private Link information
 
 You can get more information about the Private Link status of your instance with the [az dt network private-link](/cli/azure/dt/network/private-link) commands. Operations include:
@@ -202,8 +200,6 @@ To disable or enable public network access in the [Azure portal](https://portal.
 
 # [CLI](#tab/cli-2)
 
-[!INCLUDE [digital-twins-cli-issue.md](includes/digital-twins-cli-issue.md)]
-
 In the Azure CLI, you can disable or enable public network access by adding a `--public-network-access` parameter to the `az dt create` command. While this command can also be used to create a new instance, you can use it to edit the properties of an existing instance by providing it the name of an instance that already exists. (For more information about this command, see its [reference documentation](/cli/azure/dt#az-dt-create) or the [general instructions for setting up an Azure Digital Twins instance](how-to-set-up-instance-cli.md#create-the-azure-digital-twins-instance)).
 
 To disable public network access for an Azure Digital Twins instance, use the `--public-network-access` parameter like this:

articles/digital-twins/how-to-ingest-iot-hub-data.md

Lines changed: 0 additions & 2 deletions
@@ -53,8 +53,6 @@ You then need to create one twin using this model. Use the following command to
 az dt twin create --dt-name <instance-hostname-or-name> --dtmi "dtmi:contosocom:DigitalTwins:Thermostat;1" --twin-id thermostat67 --properties '{"Temperature": 0.0}'
 ```
 
-[!INCLUDE [digital-twins-cli-issue](includes/digital-twins-cli-issue.md)]
-
 When the twin is created successfully, the CLI output from the command should look something like this:
 ```json
 {

articles/digital-twins/how-to-set-up-instance-cli.md

Lines changed: 0 additions & 2 deletions
@@ -44,8 +44,6 @@ Use these values in the following [az dt command](/cli/azure/dt) to create the i
 az dt create --dt-name <name-for-your-Azure-Digital-Twins-instance> --resource-group <your-resource-group> --location <region>
 ```
 
-[!INCLUDE [digital-twins-cli-issue](includes/digital-twins-cli-issue.md)]
-
 There are several optional parameters that can be added to the command to specify additional things about your resource during creation, including creating a managed identity for the instance or enabling/disabling public network access. For a full list of supported parameters, see the [az dt create](/cli/azure/dt#az-dt-create) reference documentation.
 
### Create the instance with a managed identity
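For example, a sketch of the creation command with a system-assigned managed identity added. The `--mi-system-assigned` flag name is an assumption drawn from the az dt reference, so verify it against your installed CLI version:

```azurecli-interactive
az dt create --dt-name <name-for-your-Azure-Digital-Twins-instance> --resource-group <your-resource-group> --location <region> --mi-system-assigned
```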
