
Commit b1deaa8

Merge pull request #291380 from MicrosoftDocs/main

Publish to Live Wednesday 4AM PST, 12/4

2 parents 9a0dc75 + f2f1669

29 files changed: +136 −78 lines

articles/azure-functions/functions-bindings-signalr-service-input.md

Lines changed: 5 additions & 0 deletions

@@ -134,6 +134,11 @@ public SignalRConnectionInfo negotiate(
 
 :::zone-end
 
+> [!WARNING]
+> For simplicity, this sample omits authentication and authorization. As a result, the endpoint is publicly accessible without any restrictions. To keep your negotiation endpoint secure, implement authentication and authorization mechanisms that fit your specific requirements. For guidance on protecting your HTTP endpoints, see the following articles:
+> * [Secure HTTP endpoints](../azure-functions/security-concepts.md#secure-http-endpoints)
+> * [Authentication and authorization in Azure App Service and Azure Functions](../app-service/overview-authentication-authorization.md)
+
 ## Usage
 
 ### Authenticated tokens
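
Not part of the commit, but to make the new warning concrete: a minimal sketch of a hardened negotiate function in the C# isolated worker model. The hub name `chat` and the `ValidateAndGetUserId` helper are hypothetical placeholders for whatever hub and token validation your app actually uses.

```csharp
using System.Net;
using Microsoft.Azure.Functions.Worker;
using Microsoft.Azure.Functions.Worker.Http;

public class SecuredNegotiate
{
    [Function("negotiate")]
    public static HttpResponseData Negotiate(
        [HttpTrigger(AuthorizationLevel.Anonymous, "post")] HttpRequestData req,
        [SignalRConnectionInfoInput(HubName = "chat")] string connectionInfo)
    {
        // Reject unauthenticated callers before handing out the service URL and token.
        if (ValidateAndGetUserId(req) is null)
        {
            return req.CreateResponse(HttpStatusCode.Unauthorized);
        }

        var response = req.CreateResponse(HttpStatusCode.OK);
        response.WriteString(connectionInfo); // negotiation payload: service URL + access token
        return response;
    }

    // Hypothetical helper: validate the Authorization header (for example, a JWT)
    // and return the authenticated user's ID, or null if validation fails.
    private static string? ValidateAndGetUserId(HttpRequestData req)
    {
        return null; // placeholder: plug in your token validation here
    }
}
```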

articles/azure-signalr/signalr-concept-client-negotiation.md

Lines changed: 9 additions & 12 deletions

@@ -200,22 +200,19 @@ You can find a full sample on how to use the Management SDK to redirect SignalR
 
 ### Azure SignalR Service function extension
 
-When you use an Azure function app, you can work with the function extension. Here's a sample of using `SignalRConnectionInfo` to help you build the negotiation response:
+When you use an Azure Functions app, you can work with the function extension. Here's a sample that uses `SignalRConnectionInfo` in the C# isolated worker model to help you build the negotiation response:
 
-```cs
-[FunctionName("negotiate")]
-public SignalRConnectionInfo Negotiate([HttpTrigger(AuthorizationLevel.Anonymous)]HttpRequest req)
-{
-    var claims = GetClaims(req.Headers["Authorization"]);
-    return Negotiate(
-        claims.First(c => c.Type == ClaimTypes.NameIdentifier).Value,
-        claims
-    );
-}
-```
+:::code language="csharp" source="~/azure-functions-dotnet-worker/samples/Extensions/SignalR/SignalRNegotiationFunctions.cs" id="snippet_negotiate":::
+
+> [!WARNING]
+> For simplicity, this sample omits authentication and authorization. As a result, the endpoint is publicly accessible without any restrictions. To keep your negotiation endpoint secure, implement authentication and authorization mechanisms that fit your specific requirements. For guidance on protecting your HTTP endpoints, see the following articles:
+> * [Secure HTTP endpoints](../azure-functions/security-concepts.md#secure-http-endpoints)
+> * [Authentication and authorization in Azure App Service and Azure Functions](../app-service/overview-authentication-authorization.md)
 
 Then your clients can request the function endpoint `https://<Your Function App Name>.azurewebsites.net/api/negotiate` to get the service URL and access token. You can find a full sample on [GitHub](https://github.com/aspnet/AzureSignalR-samples/tree/main/samples/BidirectionChat).
 
+For `SignalRConnectionInfo` input binding samples in other languages, see [Azure Functions SignalR Service input binding](../azure-functions/functions-bindings-signalr-service-input.md).
+
 ### Self-exposing `/negotiate` endpoint
 
 You could also expose the negotiation endpoint on your own server and return the negotiation response yourself if you're using other languages.
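
For that self-exposed case, here's a sketch of what such an endpoint might look like using the Management SDK mentioned in the hunk header above. This is an illustration, not the docs sample: it assumes an ASP.NET Core minimal API host, a hub named `chat`, and a connection string supplied via the `AZURE_SIGNALR_CONNECTION_STRING` environment variable.

```csharp
using Microsoft.Azure.SignalR.Management;

var builder = WebApplication.CreateBuilder(args);
var app = builder.Build();

// Build a hub context once at startup; it can mint access tokens for clients.
var serviceManager = new ServiceManagerBuilder()
    .WithOptions(o => o.ConnectionString =
        Environment.GetEnvironmentVariable("AZURE_SIGNALR_CONNECTION_STRING"))
    .BuildServiceManager();
var hubContext = await serviceManager.CreateHubContextAsync("chat", CancellationToken.None);

// Return the negotiation response a SignalR client expects:
// the service URL and an access token scoped to the given user.
app.MapPost("/negotiate", async (string userId) =>
{
    var negotiation = await hubContext.NegotiateAsync(new NegotiationOptions { UserId = userId });
    return Results.Json(new { url = negotiation.Url, accessToken = negotiation.AccessToken });
});

app.Run();
```

The same authentication caveat from the warning applies here: validate the caller before issuing a token for a user ID.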

articles/confidential-computing/confidential-clean-rooms.md

Lines changed: 34 additions & 40 deletions
Large diffs are not rendered by default.
(Two binary image files changed, about 231 KB and 318 KB; previews not rendered.)

articles/data-factory/format-excel.md

Lines changed: 9 additions & 6 deletions

@@ -15,10 +15,10 @@ ms.author: jianleishen
 
 Follow this article when you want to **parse the Excel files**. The service supports both ".xls" and ".xlsx".
 
-Excel format is supported for the following connectors: [Amazon S3](connector-amazon-simple-storage-service.md), [Amazon S3 Compatible Storage](connector-amazon-s3-compatible-storage.md), [Azure Blob](connector-azure-blob-storage.md), [Azure Data Lake Storage Gen1](connector-azure-data-lake-store.md), [Azure Data Lake Storage Gen2](connector-azure-data-lake-storage.md), [Azure Files](connector-azure-file-storage.md), [File System](connector-file-system.md), [FTP](connector-ftp.md), [Google Cloud Storage](connector-google-cloud-storage.md), [HDFS](connector-hdfs.md), [HTTP](connector-http.md), [Oracle Cloud Storage](connector-oracle-cloud-storage.md) and [SFTP](connector-sftp.md). It is supported as source but not sink.
+Excel format is supported for the following connectors: [Amazon S3](connector-amazon-simple-storage-service.md), [Amazon S3 Compatible Storage](connector-amazon-s3-compatible-storage.md), [Azure Blob](connector-azure-blob-storage.md), [Azure Data Lake Storage Gen1](connector-azure-data-lake-store.md), [Azure Data Lake Storage Gen2](connector-azure-data-lake-storage.md), [Azure Files](connector-azure-file-storage.md), [File System](connector-file-system.md), [FTP](connector-ftp.md), [Google Cloud Storage](connector-google-cloud-storage.md), [HDFS](connector-hdfs.md), [HTTP](connector-http.md), [Oracle Cloud Storage](connector-oracle-cloud-storage.md) and [SFTP](connector-sftp.md). It's supported as source but not sink.
 
 >[!NOTE]
->".xls" format is not supported while using [HTTP](connector-http.md).
+>".xls" format isn't supported while using [HTTP](connector-http.md).
 
 ## Dataset properties
 
@@ -35,7 +35,7 @@ For a full list of sections and properties available for defining datasets, see
 | nullValue | Specifies the string representation of null value. <br>The default value is **empty string**. | No |
 | compression | Group of properties to configure file compression. Configure this section when you want to do compression/decompression during activity execution. | No |
 | type<br/>(*under `compression`*) | The compression codec used to read/write JSON files. <br>Allowed values are **bzip2**, **gzip**, **deflate**, **ZipDeflate**, **TarGzip**, **Tar**, **snappy**, or **lz4**. Default is not compressed.<br>**Note** currently Copy activity doesn't support "snappy" & "lz4", and mapping data flow doesn't support "ZipDeflate", "TarGzip" and "Tar".<br>**Note** when using copy activity to decompress **ZipDeflate** file(s) and write to file-based sink data store, files are extracted to the folder: `<path specified in dataset>/<folder named as source zip file>/`. | No. |
-| level<br/>(*under `compression`*) | The compression ratio. <br>Allowed values are **Optimal** or **Fastest**.<br>- **Fastest:** The compression operation should complete as quickly as possible, even if the resulting file is not optimally compressed.<br>- **Optimal**: The compression operation should be optimally compressed, even if the operation takes a longer time to complete. For more information, see [Compression Level](/dotnet/api/system.io.compression.compressionlevel) topic. | No |
+| level<br/>(*under `compression`*) | The compression ratio. <br>Allowed values are **Optimal** or **Fastest**.<br>- **Fastest:** The compression operation should complete as quickly as possible, even if the resulting file isn't optimally compressed.<br>- **Optimal**: The compression operation should be optimally compressed, even if the operation takes a longer time to complete. For more information, see the [Compression Level](/dotnet/api/system.io.compression.compressionlevel) topic. | No |
 
 Below is an example of Excel dataset on Azure Blob Storage:
 
@@ -102,7 +102,7 @@ In mapping data flows, you can read Excel format in the following data stores: [
 
 ### Source properties
 
-The below table lists the properties supported by an Excel source. You can edit these properties in the **Source options** tab. When using inline dataset, you will see additional file settings, which are the same as the properties described in [dataset properties](#dataset-properties) section.
+The table below lists the properties supported by an Excel source. You can edit these properties in the **Source options** tab. When you use an inline dataset, you'll see additional file settings, which are the same as the properties described in the [dataset properties](#dataset-properties) section.
 
 | Name | Description | Required | Allowed values | Data flow script property |
 | ------------------------- | ------------------------------------------------------------ | -------- | --------------------------------------------------------- | --------------------------------- |
@@ -112,7 +112,7 @@ The below table lists the properties supported by an Excel source. You can edit
 | Column to store file name | Create a new column with the source file name and path | no | String | rowUrlColumn |
 | After completion | Delete or move the files after processing. File path starts from the container root | no | Delete: `true` or `false` <br> Move: `['<from>', '<to>']` | purgeFiles <br> moveFiles |
 | Filter by last modified | Choose to filter files based upon when they were last altered | no | Timestamp | modifiedAfter <br> modifiedBefore |
-| Allow no files found | If true, an error is not thrown if no files are found | no | `true` or `false` | ignoreNoFilesFound |
+| Allow no files found | If true, an error isn't thrown if no files are found | no | `true` or `false` | ignoreNoFilesFound |
 
 ### Source example
 
@@ -145,9 +145,12 @@ source(allowSchemaDrift: true,
     firstRowAsHeader: true) ~> ExcelSourceInlineDataset
 ```
 
+>[!NOTE]
+> Mapping data flow doesn't support reading protected Excel files, as these files may contain confidentiality notices or enforce specific access restrictions that limit access to their contents.
+
 ## Handling very large Excel files
 
-The Excel connector does not support streaming read for the Copy activity and must load the entire file into memory before data can be read. To import schema, preview data, or refresh an Excel dataset, the data must be returned before the http request timeout (100s). For large Excel files, these operations may not finish within that timeframe, causing a timeout error. If you want to move large Excel files (>100MB) into another data store, you can use one of following options to work around this limitation:
+The Excel connector doesn't support streaming read for the Copy activity and must load the entire file into memory before data can be read. To import schema, preview data, or refresh an Excel dataset, the data must be returned before the HTTP request timeout (100 seconds). For large Excel files, these operations may not finish within that window, causing a timeout error. If you want to move large Excel files (>100 MB) into another data store, you can use one of the following options to work around this limitation:
 
 - Use the self-hosted integration runtime (SHIR), then use the Copy activity to move the large Excel file into another data store with the SHIR.
 - Split the large Excel file into several smaller ones, then use the Copy activity to move the folder containing the files.
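
To make that second workaround concrete, here's a hedged sketch of a local splitting script, not part of the article: it assumes the ClosedXML library, a single header row, and illustrative names (`large.xlsx`, `part_000.xlsx`, 100,000 rows per part). It runs wherever there's enough memory to open the file once; the goal is only to produce pieces small enough for the service to read within the timeout.

```csharp
using System;
using ClosedXML.Excel;

const int rowsPerFile = 100_000; // tune so each output file stays well under 100 MB

using var source = new XLWorkbook("large.xlsx");
var sheet = source.Worksheet(1);
int lastRow = sheet.LastRowUsed().RowNumber();
int lastCol = sheet.LastColumnUsed().ColumnNumber();

for (int part = 0, firstDataRow = 2; firstDataRow <= lastRow; part++, firstDataRow += rowsPerFile)
{
    using var chunk = new XLWorkbook();
    var target = chunk.AddWorksheet("Sheet1");

    // Repeat the header row so each part parses on its own with firstRowAsHeader.
    for (int c = 1; c <= lastCol; c++)
        target.Cell(1, c).Value = sheet.Cell(1, c).Value;

    // Copy up to rowsPerFile data rows into this part, cell by cell.
    int written = 0;
    for (int r = firstDataRow; r <= lastRow && written < rowsPerFile; r++, written++)
        for (int c = 1; c <= lastCol; c++)
            target.Cell(written + 2, c).Value = sheet.Cell(r, c).Value;

    chunk.SaveAs($"part_{part:D3}.xlsx");
}
```

After splitting, point the Copy activity at the folder containing the parts rather than at a single file.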
(One binary image file changed, about 155 KB; preview not rendered.)
