
Commit ad31896

Merge pull request #286635 from Albertyang0/2024_09-Monthly-broken-links-fix-jianleishen
2024_09 - Fix monthly broken links - jianleishen
2 parents 92f0fb7 + a6cc245 commit ad31896

3 files changed, +3 −3 lines changed

articles/data-factory/data-flow-troubleshoot-connector-format.md

Lines changed: 1 addition & 1 deletion
```diff
@@ -240,7 +240,7 @@ You use Azure PostgreSQL as a source or sink in the data flow such as previewing
 If you use the flexible server or Hyperscale (Citus) for your Azure PostgreSQL server, since the system is built via Spark upon Azure Databricks cluster, there's a limitation in Azure Databricks blocks our system to connect to the Flexible server or Hyperscale (Citus). You can review the following two links as references.
 - [Handshake fails trying to connect from Azure Databricks to Azure PostgreSQL with SSL](/answers/questions/170730/handshake-fails-trying-to-connect-from-azure-datab.html)
 
-- [MCW-Real-time-data-with-Azure-Database-for-PostgreSQL-Hyperscale](https://github.com/microsoft/MCW-Real-time-data-with-Azure-Database-for-PostgreSQL-Hyperscale/blob/master/Hands-on%20lab/HOL%20step-by%20step%20-%20Real-time%20data%20with%20Azure%20Database%20for%20PostgreSQL%20Hyperscale.md)<br/>
+- MCW-Real-time-data-with-Azure-Database-for-PostgreSQL-Hyperscale<br/>
 Refer to the content in the following picture in this article:<br/>
 
 :::image type="content" source="./media/data-flow-troubleshoot-connector-format/handshake-failure-cause-2.png" alt-text="Screenshot that shows the referring content in the article above.":::
```

articles/data-factory/format-avro.md

Lines changed: 1 addition & 1 deletion
```diff
@@ -117,7 +117,7 @@ The below table lists the properties supported by an avro sink. You can edit the
 ## Data type support
 
 ### Copy activity
-Avro [complex data types](https://avro.apache.org/docs/current/spec.html#schema_complex) are not supported (records, enums, arrays, maps, unions, and fixed) in Copy Activity.
+Avro complex data types are not supported (records, enums, arrays, maps, unions, and fixed) in Copy Activity.
 
 ### Data flows
 When working with Avro files in data flows, you can read and write complex data types, but be sure to clear the physical schema from the dataset first. In data flows, you can set your logical projection and derive columns that are complex structures, then auto-map those fields to an Avro file.
```
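For context on the line this change unlinks: the six Avro complex types it names (record, enum, array, map, union, fixed) can all appear in one schema. The sketch below is illustrative only, with invented field and type names, not taken from the docs; an Avro schema is plain JSON, so it can be built and round-tripped with the standard library alone:

```python
import json

# Hypothetical Avro schema exercising each complex type the note lists:
# record, enum, array, map, union, and fixed. All names are illustrative.
schema = {
    "type": "record",                                    # record
    "name": "Event",
    "fields": [
        {"name": "kind",
         "type": {"type": "enum", "name": "Kind",
                  "symbols": ["CREATE", "DELETE"]}},     # enum
        {"name": "tags",
         "type": {"type": "array", "items": "string"}},  # array
        {"name": "attrs",
         "type": {"type": "map", "values": "long"}},     # map
        {"name": "note",
         "type": ["null", "string"]},                    # union (nullable)
        {"name": "digest",
         "type": {"type": "fixed", "name": "Md5",
                  "size": 16}},                          # fixed
    ],
}

# Avro schemas are JSON documents, so they round-trip through json as-is.
assert json.loads(json.dumps(schema)) == schema
```

Per the paragraphs in the diff above, Copy Activity rejects columns of these types, while mapping data flows can read and write them once the dataset's physical schema is cleared.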

articles/data-factory/supported-file-formats-and-compression-codecs-legacy.md

Lines changed: 1 addition & 1 deletion
```diff
@@ -530,7 +530,7 @@ To use Avro format in a Hive table, you can refer to [Apache Hive's tutorial](ht
 
 Note the following points:
 
-* [Complex data types](https://avro.apache.org/docs/current/spec.html#schema_complex) are not supported (records, enums, arrays, maps, unions, and fixed).
+* Complex data types are not supported (records, enums, arrays, maps, unions, and fixed).
 
 ## <a name="compression-support"></a> Compression support (legacy)
 
```
