Commit a5daa3d

Merge pull request #209465 from MicrosoftDocs/main
Publish to Live, Monday 4AM PST, 8/29
2 parents defbffd + f6ae142 commit a5daa3d

25 files changed (+42 −44 lines)

articles/cognitive-services/Speech-Service/record-custom-voice-samples.md

Lines changed: 1 addition & 1 deletion
@@ -267,7 +267,7 @@ Use a stand to hold the script. Avoid angling the stand so that it can reflect s

 The person operating the recording equipment — the recording engineer — should be in a separate room from the talent, with some way to talk to the talent in the recording booth (a *talkback circuit*).

-The recording should contain as little noise as possible, with a goal of an 80-dB signal-to-noise ratio or better.
+The recording should contain as little noise as possible, with a goal of -80 dB.

 Listen closely to a recording of silence in your "booth," figure out where any noise is coming from, and eliminate the cause. Common sources of noise are air vents, fluorescent light ballasts, traffic on nearby roads, and equipment fans (even notebook PCs might have fans). Microphones and cables can pick up electrical noise from nearby AC wiring, usually a hum or buzz. A buzz can also be caused by a *ground loop*, which is caused by having equipment plugged into more than one electrical circuit.
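As an illustrative aside (not part of the docs change), and assuming the revised -80 dB figure refers to a noise floor measured relative to full scale (dBFS), the level of a recorded stretch of "silence" can be estimated from normalized samples:

```python
import math

def dbfs(samples):
    """Return the RMS level of normalized samples (range [-1.0, 1.0]) in dB
    relative to full scale. Returns -inf for pure digital silence."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return 20 * math.log10(rms) if rms > 0 else float("-inf")

# A residual noise amplitude of 0.0001 corresponds to -80 dBFS:
print(round(dbfs([0.0001, -0.0001] * 100), 1))
```

Recording several seconds of room tone and checking that this value stays at or below -80 dB is one practical way to apply the guideline; the function name and the dBFS interpretation are assumptions for illustration.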

articles/data-factory/connector-azure-blob-storage.md

Lines changed: 3 additions & 2 deletions
@@ -8,7 +8,7 @@ ms.service: data-factory
 ms.subservice: data-movement
 ms.topic: conceptual
 ms.custom: synapse
-ms.date: 07/04/2022
+ms.date: 08/24/2022
 ---

 # Copy and transform data in Azure Blob storage by using Azure Data Factory or Azure Synapse Analytics
@@ -257,7 +257,8 @@ These properties are supported for an Azure Blob storage linked service:
 | serviceEndpoint | Specify the Azure Blob storage service endpoint with the pattern of `https://<accountName>.blob.core.windows.net/`. | Yes |
 | accountKind | Specify the kind of your storage account. Allowed values are: **Storage** (general purpose v1), **StorageV2** (general purpose v2), **BlobStorage**, or **BlockBlobStorage**. <br/><br/>When using Azure Blob linked service in data flow, managed identity or service principal authentication is not supported when account kind as empty or "Storage". Specify the proper account kind, choose a different authentication, or upgrade your storage account to general purpose v2. | No |
 | servicePrincipalId | Specify the application's client ID. | Yes |
-| servicePrincipalKey | Specify the application's key. Mark this field as **SecureString** to store it securelyFactory, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | Yes |
+| servicePrincipalCredentialType | The credential type to use for service principal authentication. Allowed values are **ServicePrincipalKey** and **ServicePrincipalCert**. | Yes |
+| servicePrincipalCredential | The service principal credential. <br/> When you use **ServicePrincipalKey** as the credential type, specify the application's key. Mark this field as **SecureString** to store it securely, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). <br/> When you use **ServicePrincipalCert** as the credential, reference a certificate in Azure Key Vault, and ensure the certificate content type is **PKCS #12**.| Yes |
 | tenant | Specify the tenant information (domain name or tenant ID) under which your application resides. Retrieve it by hovering over the upper-right corner of the Azure portal. | Yes |
 | azureCloudType | For service principal authentication, specify the type of Azure cloud environment, to which your Azure Active Directory application is registered. <br/> Allowed values are **AzurePublic**, **AzureChina**, **AzureUsGovernment**, and **AzureGermany**. By default, the data factory or Synapse pipeline's cloud environment is used. | No |
 | connectVia | The [integration runtime](concepts-integration-runtime.md) to be used to connect to the data store. You can use the Azure integration runtime or the self-hosted integration runtime (if your data store is in a private network). If this property isn't specified, the service uses the default Azure integration runtime. | No |
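To see how the two new rows fit together, here is a hedged sketch of a linked-service payload built as a Python dict. The overall JSON shape and the angle-bracket placeholders are assumptions for illustration, not copied from the connector docs:

```python
import json

# Hypothetical linked-service payload illustrating the new
# servicePrincipalCredentialType / servicePrincipalCredential pair.
linked_service = {
    "name": "AzureBlobStorageLinkedService",
    "properties": {
        "type": "AzureBlobStorage",
        "typeProperties": {
            "serviceEndpoint": "https://<accountName>.blob.core.windows.net/",
            "accountKind": "StorageV2",
            "servicePrincipalId": "<application client ID>",
            # With ServicePrincipalKey, the credential holds the application's key;
            # with ServicePrincipalCert, it would reference a Key Vault certificate.
            "servicePrincipalCredentialType": "ServicePrincipalKey",
            "servicePrincipalCredential": {
                "type": "SecureString",
                "value": "<application key>"
            },
            "tenant": "<tenant ID>"
        }
    }
}

print(json.dumps(linked_service, indent=2))
```

The dict form makes the dependency explicit: the meaning of `servicePrincipalCredential` changes with the value of `servicePrincipalCredentialType`.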

articles/data-factory/connector-snowflake.md

Lines changed: 14 additions & 8 deletions
@@ -8,7 +8,7 @@ ms.service: data-factory
 ms.subservice: data-movement
 ms.topic: conceptual
 ms.custom: synapse
-ms.date: 07/28/2022
+ms.date: 08/24/2022
 ---

 # Copy and transform data in Snowflake using Azure Data Factory or Azure Synapse Analytics
@@ -185,7 +185,7 @@ To copy data from Snowflake, the following properties are supported in the Copy
 | :--------------------------- | :----------------------------------------------------------- | :------- |
 | type | The type property of the Copy activity source must be set to **SnowflakeSource**. | Yes |
 | query | Specifies the SQL query to read data from Snowflake. If the names of the schema, table and columns contain lower case, quote the object identifier in query e.g. `select * from "schema"."myTable"`.<br>Executing stored procedure is not supported. | No |
-| exportSettings | Advanced settings used to retrieve data from Snowflake. You can configure the ones supported by the COPY into command that the service will pass through when you invoke the statement. | No |
+| exportSettings | Advanced settings used to retrieve data from Snowflake. You can configure the ones supported by the COPY into command that the service will pass through when you invoke the statement. | Yes |
 | ***Under `exportSettings`:*** | | |
 | type | The type of export command, set to **SnowflakeExportCopyCommand**. | Yes |
 | additionalCopyOptions | Additional copy options, provided as a dictionary of key-value pairs. Examples: MAX_FILE_SIZE, OVERWRITE. For more information, see [Snowflake Copy Options](https://docs.snowflake.com/en/sql-reference/sql/copy-into-location.html#copy-options-copyoptions). | No |
@@ -289,8 +289,11 @@ To use this feature, create an [Azure Blob storage linked service](connector-azu
     ],
     "typeProperties": {
         "source": {
-            "type": "SnowflakeSource",
-            "sqlReaderQuery": "SELECT * FROM MyTable"
+            "type": "SnowflakeSource",
+            "sqlReaderQuery": "SELECT * FROM MyTable",
+            "exportSettings": {
+                "type": "SnowflakeExportCopyCommand"
+            }
         },
         "sink": {
             "type": "<sink type>"
@@ -320,7 +323,7 @@ To copy data to Snowflake, the following properties are supported in the Copy ac
 | :---------------- | :----------------------------------------------------------- | :-------------------------------------------- |
 | type | The type property of the Copy activity sink, set to **SnowflakeSink**. | Yes |
 | preCopyScript | Specify a SQL query for the Copy activity to run before writing data into Snowflake in each run. Use this property to clean up the preloaded data. | No |
-| importSettings | Advanced settings used to write data into Snowflake. You can configure the ones supported by the COPY into command that the service will pass through when you invoke the statement. | No |
+| importSettings | Advanced settings used to write data into Snowflake. You can configure the ones supported by the COPY into command that the service will pass through when you invoke the statement. | Yes |
 | ***Under `importSettings`:*** | | |
 | type | The type of import command, set to **SnowflakeImportCopyCommand**. | Yes |
 | additionalCopyOptions | Additional copy options, provided as a dictionary of key-value pairs. Examples: ON_ERROR, FORCE, LOAD_UNCERTAIN_FILES. For more information, see [Snowflake Copy Options](https://docs.snowflake.com/en/sql-reference/sql/copy-into-table.html#copy-options-copyoptions). | No |
@@ -390,10 +393,10 @@ If your source data store and format meet the criteria described in this section
     "type": "SnowflakeImportCopyCommand",
     "copyOptions": {
         "FORCE": "TRUE",
-        "ON_ERROR": "SKIP_FILE",
+        "ON_ERROR": "SKIP_FILE"
     },
     "fileFormatOptions": {
-        "DATE_FORMAT": "YYYY-MM-DD",
+        "DATE_FORMAT": "YYYY-MM-DD"
     }
 }
 }
@@ -435,7 +438,10 @@ To use this feature, create an [Azure Blob storage linked service](connector-azu
         "type": "<source type>"
     },
     "sink": {
-        "type": "SnowflakeSink"
+        "type": "SnowflakeSink",
+        "importSettings": {
+            "type": "SnowflakeImportCopyCommand"
+        }
     },
     "enableStaging": true,
     "stagingSettings": {

articles/databox/data-box-deploy-copy-data-via-nfs.md

Lines changed: 5 additions & 5 deletions
@@ -7,7 +7,7 @@ author: alkohli
 ms.service: databox
 ms.subservice: pod
 ms.topic: tutorial
-ms.date: 03/11/2022
+ms.date: 08/26/2022
 ms.author: alkohli
 #Customer intent: As an IT admin, I need to be able to copy data to Data Box to upload on-premises data from my server onto Azure.
 ---
@@ -46,10 +46,10 @@ The following table shows the UNC path to the shares on your Data Box and Azure

 | Azure Storage type| Data Box shares |
 |-------------------|--------------------------------------------------------------------------------|
-| Azure Block blobs | <li>UNC path to shares: `//<DeviceIPAddress>/<StorageAccountName_BlockBlob>/<ContainerName>/files/a.txt`</li><li>Azure Storage URL: `https://<StorageAccountName>.blob.core.windows.net/<ContainerName>/files/a.txt`</li> |
-| Azure Page blobs | <li>UNC path to shares: `//<DeviceIPAddres>/<StorageAccountName_PageBlob>/<ContainerName>/files/a.txt`</li><li>Azure Storage URL: `https://<StorageAccountName>.blob.core.windows.net/<ContainerName>/files/a.txt`</li> |
-| Azure Files |<li>UNC path to shares: `//<DeviceIPAddres>/<StorageAccountName_AzFile>/<ShareName>/files/a.txt`</li><li>Azure Storage URL: `https://<StorageAccountName>.file.core.windows.net/<ShareName>/files/a.txt`</li> |
-| Azure Block blobs (Archive) | <li>UNC path to shares: `//<DeviceIPAddres>/<StorageAccountName_BlockBlobArchive>/<ContainerName>/files/a.txt`</li><li>Azure Storage URL: `https://<StorageAccountName>.blob.core.windows.net/<ContainerName>/files/a.txt`</li> |
+| Azure Block blobs | <li>UNC path to shares: `//<DeviceIPAddress>/<storageaccountname_BlockBlob>/<ContainerName>/files/a.txt`</li><li>Azure Storage URL: `https://<storageaccountname>.blob.core.windows.net/<ContainerName>/files/a.txt`</li> |
+| Azure Page blobs | <li>UNC path to shares: `//<DeviceIPAddress>/<storageaccountname_PageBlob>/<ContainerName>/files/a.txt`</li><li>Azure Storage URL: `https://<storageaccountname>.blob.core.windows.net/<ContainerName>/files/a.txt`</li> |
+| Azure Files |<li>UNC path to shares: `//<DeviceIPAddress>/<storageaccountname_AzFile>/<ShareName>/files/a.txt`</li><li>Azure Storage URL: `https://<storageaccountname>.file.core.windows.net/<ShareName>/files/a.txt`</li> |
+| Azure Block blobs (Archive) | <li>UNC path to shares: `//<DeviceIPAddress>/<storageaccountname_BlockBlobArchive>/<ContainerName>/files/a.txt`</li><li>Azure Storage URL: `https://<storageaccountname>.blob.core.windows.net/<ContainerName>/files/a.txt`</li> |

 If you are using a Linux host computer, perform the following steps to configure Data Box to allow access to NFS clients.
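The pattern in the updated table (lowercased storage-account segment) can be sketched as a small helper. This function and its name are illustrative only; they are not part of any Data Box tooling:

```python
def databox_nfs_path(device_ip, account_name, share_type, container, filename):
    """Assemble a Data Box NFS share path following the table's pattern:
    //<DeviceIPAddress>/<storageaccountname_ShareType>/<ContainerName>/files/<file>.
    The storage-account segment is lowercased, matching the updated examples."""
    return f"//{device_ip}/{account_name.lower()}_{share_type}/{container}/files/{filename}"

print(databox_nfs_path("10.126.76.138", "utsac1", "BlockBlob", "mycontainer", "a.txt"))
# → //10.126.76.138/utsac1_BlockBlob/mycontainer/files/a.txt
```

The commit's main effect is visible here: the share name no longer embeds a timestamp suffix, and the account name is lowercase.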
articles/databox/data-box-deploy-copy-data-via-rest.md

Lines changed: 1 addition & 1 deletion
@@ -8,7 +8,7 @@ author: alkohli
 ms.service: databox
 ms.subservice: pod
 ms.topic: tutorial
-ms.date: 07/02/2020
+ms.date: 08/26/2022
 ms.author: alkohli
 #Customer intent: As an IT admin, I need to be able to copy data to Data Box to upload on-premises data from my server onto Azure.
 ---

articles/databox/data-box-deploy-copy-data.md

Lines changed: 10 additions & 10 deletions
@@ -7,7 +7,7 @@ author: alkohli
 ms.service: databox
 ms.subservice: pod
 ms.topic: tutorial
-ms.date: 03/17/2022
+ms.date: 08/26/2022
 ms.author: alkohli

 # Customer intent: As an IT admin, I need to be able to copy data to Data Box to upload on-premises data from my server onto Azure.
@@ -61,10 +61,10 @@ The following table shows the UNC path to the shares on your Data Box and Azure

 |Azure Storage types | Data Box shares |
 |-------------------|--------------------------------------------------------------------------------|
-| Azure Block blobs | <li>UNC path to shares: `\\<DeviceIPAddress>\<StorageAccountName_BlockBlob>\<ContainerName>\files\a.txt`</li><li>Azure Storage URL: `https://<StorageAccountName>.blob.core.windows.net/<ContainerName>/files/a.txt`</li> |
-| Azure Page blobs | <li>UNC path to shares: `\\<DeviceIPAddres>\<StorageAccountName_PageBlob>\<ContainerName>\files\a.txt`</li><li>Azure Storage URL: `https://<StorageAccountName>.blob.core.windows.net/<ContainerName>/files/a.txt`</li> |
-| Azure Files |<li>UNC path to shares: `\\<DeviceIPAddres>\<StorageAccountName_AzFile>\<ShareName>\files\a.txt`</li><li>Azure Storage URL: `https://<StorageAccountName>.file.core.windows.net/<ShareName>/files/a.txt`</li> |
-| Azure Block blobs (Archive) | <li>UNC path to shares: `\\<DeviceIPAddres>\<StorageAccountName_BlockBlobArchive>\<ContainerName>\files\a.txt`</li><li>Azure Storage URL: `https://<StorageAccountName>.blob.core.windows.net/<ContainerName>/files/a.txt`</li> |
+| Azure Block blobs | <li>UNC path to shares: `\\<DeviceIPAddress>\<storageaccountname_BlockBlob>\<ContainerName>\files\a.txt`</li><li>Azure Storage URL: `https://<storageaccountname>.blob.core.windows.net/<ContainerName>/files/a.txt`</li> |
+| Azure Page blobs | <li>UNC path to shares: `\\<DeviceIPAddress>\<storageaccountname_PageBlob>\<ContainerName>\files\a.txt`</li><li>Azure Storage URL: `https://<storageaccountname>.blob.core.windows.net/<ContainerName>/files/a.txt`</li> |
+| Azure Files |<li>UNC path to shares: `\\<DeviceIPAddress>\<storageaccountname_AzFile>\<ShareName>\files\a.txt`</li><li>Azure Storage URL: `https://<storageaccountname>.file.core.windows.net/<ShareName>/files/a.txt`</li> |
+| Azure Block blobs (Archive) | <li>UNC path to shares: `\\<DeviceIPAddress>\<storageaccountname_BlockBlobArchive>\<ContainerName>\files\a.txt`</li><li>Azure Storage URL: `https://<storageaccountname>.blob.core.windows.net/<ContainerName>/files/a.txt`</li> |

 If using a Windows Server host computer, follow these steps to connect to the Data Box.
@@ -81,10 +81,10 @@ If using a Windows Server host computer, follow these steps to connect to the Da
    `net use \\<IP address of the device>\<share name> /u:<IP address of the device>\<user name for the share>`

    Depending upon your data format, the share paths are as follows:
-   - Azure Block blob - `\\10.126.76.138\utSAC1_202006051000_BlockBlob`
-   - Azure Page blob - `\\10.126.76.138\utSAC1_202006051000_PageBlob`
-   - Azure Files - `\\10.126.76.138\utSAC1_202006051000_AzFile`
-   - Azure Blob blob (Archive) - `\\10.126.76.138\utSAC0_202202241054_BlockBlobArchive`
+   - Azure Block blob - `\\10.126.76.138\utsac1_BlockBlob`
+   - Azure Page blob - `\\10.126.76.138\utsac1_PageBlob`
+   - Azure Files - `\\10.126.76.138\utsac1_AzFile`
+   - Azure Blob blob (Archive) - `\\10.126.76.138\utsac0_BlockBlobArchive`

 4. Enter the password for the share when prompted. If the password has special characters, add double quotation marks before and after it. The following sample shows connecting to a share via the preceding command.
@@ -107,7 +107,7 @@ If using a Windows Server host computer, follow these steps to connect to the Da
    If using a Linux client, use the following command to mount the SMB share. The "vers" parameter below is the version of SMB that your Linux host supports. Plug in the appropriate version in the command below. For versions of SMB that the Data Box supports see [Supported file systems for Linux clients](./data-box-system-requirements.md#supported-file-transfer-protocols-for-clients)

    ```console
-   sudo mount -t nfs -o vers=2.1 10.126.76.138:/utSAC1_202006051000_BlockBlob /home/databoxubuntuhost/databox
+   sudo mount -t nfs -o vers=2.1 10.126.76.138:/utsac1_BlockBlob /home/databoxubuntuhost/databox
    ```

 ## Copy data to Data Box
Four additional binary files changed (sizes −736 bytes, −1.47 KB, −8.44 KB, −6.34 KB); their contents are not shown.

0 commit comments