| type | The type property must be set to **SnowflakeV2**. | Yes |
| version | The version that you specify. We recommend upgrading to the latest version to take advantage of the newest enhancements. | Yes for version 1.1 (Preview) |
| accountIdentifier | The name of the account along with its organization. For example, myorg-account123. | Yes |
| database | The default database used for the session after connecting. | Yes |
| warehouse | The default virtual warehouse used for the session after connecting. | Yes |
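Putting the properties above together, a linked service payload might look like the following sketch. Only the properties listed in the table come from this article; the authentication properties (`user` and the `SecureString` password) are illustrative assumptions for Basic authentication and may differ in your setup.

```json
{
    "name": "SnowflakeV2LinkedService",
    "properties": {
        "type": "SnowflakeV2",
        "version": "1.1",
        "typeProperties": {
            "accountIdentifier": "myorg-account123",
            "database": "MyDatabase",
            "warehouse": "MyWarehouse",
            "user": "myuser",
            "password": {
                "type": "SecureString",
                "value": "<password>"
            }
        }
    }
}
```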
For more information about the properties, see [Lookup activity](control-flow-lookup-activity.md).
## <a name="differences-between-snowflake-and-snowflake-legacy"></a> Snowflake connector lifecycle and upgrade
The following table shows the release stage and change logs for different versions of the Snowflake connector:
| Version | Release stage | Change log |
| :----------- | :------- | :------- |
| Snowflake V1 | GA version available | / |
| Snowflake V2 (version 1.0) | GA version available | • Added support for key pair authentication.<br><br>• The `accountIdentifier`, `warehouse`, `database`, `schema`, and `role` properties are used to establish a connection instead of the `connectionstring` property.<br><br>• Added support for BigDecimal in the Lookup activity. The NUMBER type, as defined in Snowflake, is displayed as a string in the Lookup activity. If you want to convert it to a numeric type in V2, you can use the pipeline parameter with the [int function](control-flow-expression-language-functions.md#int) or the [float function](control-flow-expression-language-functions.md#float). For example, `int(activity('lookup').output.firstRow.VALUE)`, `float(activity('lookup').output.firstRow.VALUE)`<br><br>• The timestamp data type in Snowflake is read as the DateTimeOffset data type in the Lookup and Script activities. If you still need to use the Datetime value as a parameter in your pipeline after upgrading to V2, you can convert the DateTimeOffset type to the DateTime type by using the [formatDateTime function](control-flow-expression-language-functions.md#formatdatetime) (recommended) or the [concat function](control-flow-expression-language-functions.md#concat). For example: `formatDateTime(activity('lookup').output.firstRow.DATETIMETYPE)`, `concat(substring(activity('lookup').output.firstRow.DATETIMETYPE, 0, 19), 'Z')`<br><br>• Script parameters are not supported in the Script activity. As an alternative, use dynamic expressions for script parameters. For more information, see [Expressions and functions in Azure Data Factory and Azure Synapse Analytics](control-flow-expression-language-functions.md).<br><br>• Execution of multiple SQL statements in the Script activity is not supported. |
| Snowflake V2 (version 1.1) | Preview version available | • Added support for script parameters.<br><br>• Added support for multiple statement execution in the Script activity. |
### <a name="upgrade-the-snowflake-linked-service"></a> Upgrade the Snowflake connector from V1 to V2
To upgrade the Snowflake connector from V1 to V2, you can perform either a side-by-side upgrade or an in-place upgrade.
#### Side-by-side upgrade
To perform a side-by-side upgrade, complete the following steps:
1. Create a new Snowflake linked service and configure it by referring to the V2 linked service properties.
1. Create a dataset based on the newly created Snowflake linked service.
1. In the pipelines that target the V1 objects, replace the existing linked service and dataset with the new ones.
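As a sketch of step 2, a dataset that points at the new linked service could look like the following. Names are placeholders, and the `schema` and `table` properties follow the usual Data Factory dataset shape; they're assumptions, not confirmed by this article.

```json
{
    "name": "MySnowflakeV2Dataset",
    "properties": {
        "type": "SnowflakeV2Table",
        "linkedServiceName": {
            "referenceName": "<your new Snowflake linked service>",
            "type": "LinkedServiceReference"
        },
        "typeProperties": {
            "schema": "PUBLIC",
            "table": "MYTABLE"
        }
    }
}
```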
#### In-place upgrade
To perform an in-place upgrade, you need to edit the existing linked service payload and update the dataset to use the new linked service.
1. Update the dataset to use the new linked service. You can either create a new dataset based on the newly created linked service, or update an existing dataset's type property from **SnowflakeTable** to **SnowflakeV2Table**.
### Upgrade the Snowflake V2 connector from version 1.0 to version 1.1 (Preview)
On the **Edit linked service** page, select **1.1** for the version. For more information, see [Linked service properties](#linked-service-properties).
articles/digital-twins/concepts-3d-scenes-studio.md
This article gives an overview of 3D Scenes Studio and its key features.
## Studio overview
Work in 3D Scenes Studio is built around the concept of *scenes*. A scene is a view of a single business environment, and is composed of 3D content, custom business logic, and references to an Azure Digital Twins instance. You can have multiple scenes for a single digital twin instance.
Scenes are configured in the [builder](#builder) inside the 3D Scenes Studio. Then, you can view your finished scenes in the studio's [built-in view experience](#viewer), or [embedded in custom web applications](#embeddable-viewer-component). You can extend the built-in viewer or create your own viewers that access the 3D Scenes files and your Azure Digital Twins graph.
## Set up
To work with 3D Scenes Studio, you'll need the following resources:
* An [Azure Digital Twins instance](how-to-set-up-instance-cli.md)
* You'll need *Azure Digital Twins Data Owner* or *Azure Digital Twins Data Reader* access to the instance
* The instance should be populated with [models](concepts-models.md) and [twins](concepts-twins-graph.md)
articles/digital-twins/concepts-cli.md
The command set is called `az dt`, and is part of the [Azure IoT extension for Azure CLI](https://github.com/Azure/azure-iot-cli-extension). You can view the full list of commands and their usage as part of the reference documentation for the `az iot` command set: [az dt command reference](/cli/azure/dt).
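For example, a minimal session with the command set might look like this (the instance name is a placeholder):

```azurecli
# Install or update the Azure IoT extension that provides the az dt command set
az extension add --upgrade --name azure-iot

# List the Azure Digital Twins instances in the current subscription
az dt list

# Show the details of a single instance
az dt show --dt-name <your-instance-name>
```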
articles/digital-twins/how-to-create-app-registration.md
This article describes how to create a [Microsoft Entra ID](../active-directory/fundamentals/active-directory-whatis.md) *app registration* that can access Azure Digital Twins. This article includes steps for the [Azure portal](https://portal.azure.com) and the [Azure CLI](/cli/azure/what-is-azure-cli).
When working with Azure Digital Twins, it's common to interact with your instance through client applications. Those applications need to authenticate with Azure Digital Twins, and some of the [authentication mechanisms](how-to-authenticate-client.md) that apps can use involve an app registration.
The app registration isn't required for all authentication scenarios. However, if you're using an authentication strategy or code sample that does require an app registration, this article shows you how to set one up and grant it permissions to the Azure Digital Twins APIs. It also covers how to collect important values that you need to use the app registration when authenticating.
The app registration should show up in the list along with the role you assigned.
Use the [az dt role-assignment create](/cli/azure/dt/role-assignment#az-dt-role-assignment-create) command to assign the role (you must have [sufficient permissions](how-to-set-up-instance-cli.md#prerequisites-permission-requirements) in the Azure subscription). The command requires you to pass in the name of the role you want to assign, the name of your Azure Digital Twins instance, and either the name or the object ID of the app registration.
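A sketch of that command, with placeholder values, might look like this:

```azurecli
az dt role-assignment create --dt-name <your-instance-name> --assignee "<app-registration-name-or-object-id>" --role "Azure Digital Twins Data Owner"
```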
articles/digital-twins/how-to-create-data-history-connection.md
Use the command in this section to create a data history connection and the tables in Azure Data Explorer. The command always creates a table for historized twin property updates, and it includes parameters to create the tables for relationship lifecycle and twin lifecycle events.

This command also creates three tables in your Azure Data Explorer database to store twin property updates, relationship lifecycle events, and twin lifecycle events, respectively. For more information about these types of historized data and their corresponding Azure Data Explorer tables, see [Data types and schemas](concepts-data-history.md#data-types-and-schemas).
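A hedged sketch of the command follows; the exact parameter names here are assumptions and should be checked against the `az dt` command reference before use:

```azurecli
az dt data-history connection create adx --dt-name <your-instance-name> --cn <connection-name> --adx-cluster-name <cluster-name> --adx-database-name <database-name> --eventhub <event-hub-name> --eventhub-namespace <event-hub-namespace>
```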
articles/digital-twins/how-to-create-endpoints.md
This article explains how to create an *endpoint* for Azure Digital Twins events.
Routing [event notifications](concepts-event-notifications.md) from Azure Digital Twins to downstream services or connected compute resources is a two-step process: create endpoints, then create event routes that send data to those endpoints. This article covers the first step, setting up endpoints that can receive the events. Later, you can create [event routes](how-to-create-routes.md) that specify which events generated by Azure Digital Twins are delivered to which endpoints.
The following examples show how to create endpoints using the [az dt endpoint create](/cli/azure/dt/endpoint/create) command for the [Azure Digital Twins CLI](/cli/azure/dt). Replace the placeholders in the commands with the details of your own resources.
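For example, an Event Grid endpoint might be created like this (a sketch with placeholder values; check the [az dt endpoint create](/cli/azure/dt/endpoint/create) reference for the full parameter list):

```azurecli
az dt endpoint create eventgrid --dt-name <your-instance-name> --endpoint-name <endpoint-name> --eventgrid-resource-group <resource-group> --eventgrid-topic <topic-name>
```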
After [creating an endpoint](how-to-create-endpoints.md), you'll need to define an *event route* to actually send data to the endpoint. These routes let developers wire up event flow, throughout the system and to downstream services. A single route can allow multiple notifications and event types to be selected. Read more about event routes in [Endpoints and event routes](concepts-route-events.md).
When finished, select the **Save** button to create your event route.
You can get more information about the Private Link status of your instance with the [az dt network private-link](/cli/azure/dt/network/private-link) commands. Operations include:
In the Azure CLI, you can disable or enable public network access by adding a `--public-network-access` parameter to the `az dt create` command. While this command can also be used to create a new instance, you can use it to edit the properties of an existing instance by providing it the name of an instance that already exists. (For more information about this command, see its [reference documentation](/cli/azure/dt#az-dt-create) or the [general instructions for setting up an Azure Digital Twins instance](how-to-set-up-instance-cli.md#create-the-azure-digital-twins-instance)).
To disable public network access for an Azure Digital Twins instance, use the `--public-network-access` parameter like this:
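For example (a sketch; the instance and resource group names are placeholders):

```azurecli
az dt create --dt-name <your-instance-name> --resource-group <your-resource-group> --public-network-access Disabled
```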
There are several optional parameters that can be added to the command to specify additional things about your resource during creation, including creating a managed identity for the instance or enabling/disabling public network access. For a full list of supported parameters, see the [az dt create](/cli/azure/dt#az-dt-create) reference documentation.