articles/cognitive-services/Speech-Service/record-custom-voice-samples.md (+1 -1)
@@ -267,7 +267,7 @@ Use a stand to hold the script. Avoid angling the stand so that it can reflect s
  The person operating the recording equipment — the recording engineer — should be in a separate room from the talent, with some way to talk to the talent in the recording booth (a *talkback circuit*).

- The recording should contain as little noise as possible, with a goal of an 80-dB signal-to-noise ratio or better.
+ The recording should contain as little noise as possible, with a goal of -80 dB.

  Listen closely to a recording of silence in your "booth," figure out where any noise is coming from, and eliminate the cause. Common sources of noise are air vents, fluorescent light ballasts, traffic on nearby roads, and equipment fans (even notebook PCs might have fans). Microphones and cables can pick up electrical noise from nearby AC wiring, usually a hum or buzz. A buzz can also be caused by a *ground loop*, the result of plugging equipment into more than one electrical circuit.
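As a quick aside on the numbers (a back-of-the-envelope check, not part of the source article): decibels compare amplitudes on a logarithmic scale, so the original 80 dB signal-to-noise figure means the noise amplitude sits four orders of magnitude below the signal:

$$\mathrm{SNR}_{\text{dB}} = 20\log_{10}\frac{A_{\text{signal}}}{A_{\text{noise}}} = 80 \quad\Rightarrow\quad \frac{A_{\text{signal}}}{A_{\text{noise}}} = 10^{80/20} = 10{,}000.$$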
articles/data-factory/connector-azure-blob-storage.md (+3 -2)
@@ -8,7 +8,7 @@ ms.service: data-factory
  ms.subservice: data-movement
  ms.topic: conceptual
  ms.custom: synapse
- ms.date: 07/04/2022
+ ms.date: 08/24/2022
  ---

  # Copy and transform data in Azure Blob storage by using Azure Data Factory or Azure Synapse Analytics
@@ -257,7 +257,8 @@ These properties are supported for an Azure Blob storage linked service:
  | serviceEndpoint | Specify the Azure Blob storage service endpoint with the pattern of `https://<accountName>.blob.core.windows.net/`. | Yes |
  | accountKind | Specify the kind of your storage account. Allowed values are: **Storage** (general purpose v1), **StorageV2** (general purpose v2), **BlobStorage**, or **BlockBlobStorage**. <br/><br/>When using the Azure Blob linked service in data flow, managed identity or service principal authentication is not supported when the account kind is empty or "Storage". Specify the proper account kind, choose a different authentication, or upgrade your storage account to general purpose v2. | No |
- | servicePrincipalKey | Specify the application's key. Mark this field as **SecureString** to store it securely in Data Factory, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | Yes |
+ | servicePrincipalCredentialType | The credential type to use for service principal authentication. Allowed values are **ServicePrincipalKey** and **ServicePrincipalCert**. | Yes |
+ | servicePrincipalCredential | The service principal credential. <br/> When you use **ServicePrincipalKey** as the credential type, specify the application's key. Mark this field as **SecureString** to store it securely, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). <br/> When you use **ServicePrincipalCert** as the credential, reference a certificate in Azure Key Vault, and ensure the certificate content type is **PKCS #12**. | Yes |
  | tenant | Specify the tenant information (domain name or tenant ID) under which your application resides. Retrieve it by hovering over the upper-right corner of the Azure portal. | Yes |
  | azureCloudType | For service principal authentication, specify the type of Azure cloud environment to which your Azure Active Directory application is registered. <br/> Allowed values are **AzurePublic**, **AzureChina**, **AzureUsGovernment**, and **AzureGermany**. By default, the data factory or Synapse pipeline's cloud environment is used. | No |
  | connectVia | The [integration runtime](concepts-integration-runtime.md) to be used to connect to the data store. You can use the Azure integration runtime or the self-hosted integration runtime (if your data store is in a private network). If this property isn't specified, the service uses the default Azure integration runtime. | No |
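For orientation, the rows above compose into a linked service definition like the following sketch. This is a hedged illustration rather than the article's own sample: `servicePrincipalId` and the Key Vault reference shape are assumed from the connector's other examples, and all bracketed values are placeholders.

```json
{
    "name": "AzureBlobStorageLinkedService",
    "properties": {
        "type": "AzureBlobStorage",
        "typeProperties": {
            "serviceEndpoint": "https://<accountName>.blob.core.windows.net/",
            "accountKind": "StorageV2",
            "servicePrincipalId": "<service principal id>",
            "servicePrincipalCredentialType": "ServicePrincipalCert",
            "servicePrincipalCredential": {
                "type": "AzureKeyVaultSecret",
                "store": {
                    "referenceName": "<Key Vault linked service name>",
                    "type": "LinkedServiceReference"
                },
                "secretName": "<name of the certificate in Key Vault>"
            },
            "tenant": "<tenant info>",
            "azureCloudType": "AzurePublic"
        },
        "connectVia": {
            "referenceName": "<integration runtime name>",
            "type": "IntegrationRuntimeReference"
        }
    }
}
```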
articles/data-factory/connector-snowflake.md

  | type | The type property of the Copy activity source must be set to **SnowflakeSource**. | Yes |
  | query | Specifies the SQL query to read data from Snowflake. If the names of the schema, table, and columns contain lowercase characters, quote the object identifier in the query, for example `select * from "schema"."myTable"`.<br>Executing a stored procedure is not supported. | No |
- | exportSettings | Advanced settings used to retrieve data from Snowflake. You can configure the ones supported by the COPY into command that the service will pass through when you invoke the statement. | No |
+ | exportSettings | Advanced settings used to retrieve data from Snowflake. You can configure the ones supported by the COPY into command that the service will pass through when you invoke the statement. | Yes |
  |***Under `exportSettings`:***|||
  | type | The type of export command, set to **SnowflakeExportCopyCommand**. | Yes |
  | additionalCopyOptions | Additional copy options, provided as a dictionary of key-value pairs. Examples: MAX_FILE_SIZE, OVERWRITE. For more information, see [Snowflake Copy Options](https://docs.snowflake.com/en/sql-reference/sql/copy-into-location.html#copy-options-copyoptions). | No |
@@ -289,8 +289,11 @@ To use this feature, create an [Azure Blob storage linked service](connector-azu
          ],
          "typeProperties": {
              "source": {
-                 "type": "SnowflakeSource",
-                 "sqlReaderQuery": "SELECT * FROM MyTable"
+                 "type": "SnowflakeSource",
+                 "sqlReaderQuery": "SELECT * FROM MyTable",
+                 "exportSettings": {
+                     "type": "SnowflakeExportCopyCommand"
+                 }
              },
              "sink": {
                  "type": "<sink type>"
@@ -320,7 +323,7 @@ To copy data to Snowflake, the following properties are supported in the Copy ac
  | type | The type property of the Copy activity sink, set to **SnowflakeSink**. | Yes |
  | preCopyScript | Specify a SQL query for the Copy activity to run before writing data into Snowflake in each run. Use this property to clean up the preloaded data. | No |
- | importSettings | Advanced settings used to write data into Snowflake. You can configure the ones supported by the COPY into command that the service will pass through when you invoke the statement. | No |
+ | importSettings | Advanced settings used to write data into Snowflake. You can configure the ones supported by the COPY into command that the service will pass through when you invoke the statement. | Yes |
  |***Under `importSettings`:***|||
  | type | The type of import command, set to **SnowflakeImportCopyCommand**. | Yes |
  | additionalCopyOptions | Additional copy options, provided as a dictionary of key-value pairs. Examples: ON_ERROR, FORCE, LOAD_UNCERTAIN_FILES. For more information, see [Snowflake Copy Options](https://docs.snowflake.com/en/sql-reference/sql/copy-into-table.html#copy-options-copyoptions). | No |
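Mirroring the source-side JSON above, a Copy activity sink that satisfies the now-required `importSettings` could look like this sketch; the pre-copy script and copy options are illustrative placeholders, not values from the article.

```json
"sink": {
    "type": "SnowflakeSink",
    "preCopyScript": "DELETE FROM MyTable",
    "importSettings": {
        "type": "SnowflakeImportCopyCommand",
        "additionalCopyOptions": {
            "FORCE": "TRUE",
            "ON_ERROR": "SKIP_FILE"
        }
    }
}
```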
@@ -390,10 +393,10 @@ If your source data store and format meet the criteria described in this section
                  "type": "SnowflakeImportCopyCommand",
                  "copyOptions": {
                      "FORCE": "TRUE",
-                     "ON_ERROR": "SKIP_FILE",
+                     "ON_ERROR": "SKIP_FILE"
                  },
                  "fileFormatOptions": {
-                     "DATE_FORMAT": "YYYY-MM-DD",
+                     "DATE_FORMAT": "YYYY-MM-DD"
                  }
              }
          }
@@ -435,7 +438,10 @@ To use this feature, create an [Azure Blob storage linked service](connector-azu
articles/databox/data-box-deploy-copy-data.md

  4. Enter the password for the share when prompted. If the password has special characters, add double quotation marks before and after it. The following sample shows connecting to a share via the preceding command.
@@ -107,7 +107,7 @@ If using a Windows Server host computer, follow these steps to connect to the Da
  If using a Linux client, use the following command to mount the SMB share. The "vers" parameter below is the version of SMB that your Linux host supports. Plug the appropriate version into the command below. For the versions of SMB that the Data Box supports, see [Supported file systems for Linux clients](./data-box-system-requirements.md#supported-file-transfer-protocols-for-clients).

  ```console
- sudo mount -t nfs -o vers=2.1 10.126.76.138:/utSAC1_202006051000_BlockBlob /home/databoxubuntuhost/databox
+ sudo mount -t nfs -o vers=2.1 10.126.76.138:/utsac1_BlockBlob /home/databoxubuntuhost/databox
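One aside on the command itself: the surrounding text describes mounting an SMB share, yet both versions of the command pass `-t nfs`. If SMB is the intent, a CIFS mount of the same share would look roughly like the sketch below; the UNC-style source path and `vers=2.1` are assumptions, and credential options (for example `username=`) are omitted.

```console
# Hypothetical SMB/CIFS variant of the mount above; share path and SMB version assumed.
sudo mount -t cifs -o vers=2.1 //10.126.76.138/utsac1_BlockBlob /home/databoxubuntuhost/databox
```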