Commit 8db768f

Rework Import and connectors docs for Azure launch (#4473)

1 parent: a979962

File tree: 5 files changed, +25 -2 lines changed

_partials/_not-supported-for-azure.mdx

Lines changed: 5 additions & 0 deletions
@@ -0,0 +1,5 @@
+<Highlight type="note">
+
+This feature is on our roadmap for $CLOUD_LONG on Microsoft Azure. Stay tuned!
+
+</Highlight>
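
For reference, every page touched in this commit consumes the new partial the same way — a minimal MDX sketch of the pattern, using the import path and component name exactly as they appear in the diffs below:

import NotSupportedAzure from "versionContent/_partials/_not-supported-for-azure.mdx";

<NotSupportedAzure />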

migrate/livesync-for-kafka.md

Lines changed: 4 additions & 1 deletion
@@ -9,12 +9,13 @@ tags: [stream, connector]
 
 import PrereqCloud from "versionContent/_partials/_prereqs-cloud-only.mdx";
 import EarlyAccessNoRelease from "versionContent/_partials/_early_access.mdx";
+import NotSupportedAzure from "versionContent/_partials/_not-supported-for-azure.mdx";
 
 # Stream data from Kafka
 
 You use the Kafka source connector in $CLOUD_LONG to stream events from Kafka into your $SERVICE_SHORT. $CLOUD_LONG connects to your Confluent Cloud Kafka cluster and Schema Registry using SASL/SCRAM authentication and service account–based API keys. Only the Avro format is currently supported [with some limitations][limitations].
 
-This page explains how to connect $CLOUD_LONG to your Confluence Cloud Kafka cluster.
+This page explains how to connect $CLOUD_LONG to your Confluent Cloud Kafka cluster.
 
 <EarlyAccessNoRelease />: the Kafka source connector is not yet supported for production use.
 
@@ -25,6 +26,8 @@ This page explains how to connect $CLOUD_LONG to your Confluence Cloud Kafka clu
 - [Sign up][confluence-signup] for Confluence Cloud.
 - [Create][create-kafka-cluster] a Kafka cluster in Confluence Cloud.
 
+<NotSupportedAzure />
+
 ## Access your Kafka cluster in Confluent Cloud
 
 Take the following steps to prepare your Kafka cluster for connection to $CLOUD_LONG:

migrate/livesync-for-s3.md

Lines changed: 5 additions & 0 deletions
@@ -9,6 +9,7 @@ tags: [recovery, logical backup, replication]
 
 import PrereqCloud from "versionContent/_partials/_prereqs-cloud-only.mdx";
 import EarlyAccessNoRelease from "versionContent/_partials/_early_access.mdx";
+import NotSupportedAzure from "versionContent/_partials/_not-supported-for-azure.mdx";
 
 # Sync data from S3
 
@@ -63,6 +64,8 @@ The $S3_CONNECTOR continuously imports data from an Amazon S3 bucket into your d
 
 - [Public anonymous user][credentials-public].
 
+<NotSupportedAzure />
+
 ## Limitations
 
 - **File naming**:
@@ -162,6 +165,8 @@ To sync data from your S3 bucket to your $SERVICE_LONG using $CONSOLE:
 And that is it, you are using the $S3_CONNECTOR to synchronize all the data, or specific files, from an S3 bucket to your
 $SERVICE_LONG in real time.
 
+
+
 [about-hypertables]: /use-timescale/:currentVersion:/hypertables/
 [lives-sync-specify-tables]: /migrate/:currentVersion:/livesync-for-postgresql/#specify-the-tables-to-synchronize
 [compression]: /use-timescale/:currentVersion:/compression/about-compression

migrate/upload-file-using-console.md

Lines changed: 5 additions & 1 deletion
@@ -8,6 +8,7 @@ keywords: [import]
 import ImportPrerequisitesCloudNoConnection from "versionContent/_partials/_prereqs-cloud-no-connection.mdx";
 import EarlyAccessGeneral from "versionContent/_partials/_early_access.mdx";
 import NotAvailableFreePlan from "versionContent/_partials/_not-available-in-free-plan.mdx";
+import NotSupportedAzure from "versionContent/_partials/_not-supported-for-azure.mdx";
 
 # Upload a file into your $SERVICE_SHORT using $CONSOLE_LONG
 
@@ -25,6 +26,8 @@ $CONSOLE_LONG enables you to drag and drop files to upload from your local machi
 
 <ImportPrerequisitesCloudNoConnection />
 
+<NotSupportedAzure />
+
 <Tabs label="Upload files from a local machine" persistKey="file-import">
 
 <Tab title="From CSV" label="import-csv">
@@ -127,6 +130,8 @@ $CONSOLE_LONG enables you to upload CSV and Parquet files, including archives co
 - [IAM Role][credentials-iam].
 - [Public anonymous user][credentials-public].
 
+<NotSupportedAzure />
+
 <Tabs label="Import files from S3" persistKey="file-import">
 
 <Tab title="From CSV" label="import-csv">
@@ -205,7 +210,6 @@ To import a Parquet file from an S3 bucket:
 
 </Tabs>
 
-
 And that is it, you have imported your data to your $SERVICE_LONG.
 
 [credentials-iam]: https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_create_for-user.html#roles-creatingrole-user-console

use-timescale/tigerlake.md

Lines changed: 6 additions & 0 deletions
@@ -8,6 +8,7 @@ keywords: [data lake, lakehouse, s3, iceberg]
 
 import IntegrationPrereqsCloud from "versionContent/_partials/_integration-prereqs-cloud-only.mdx";
 import EarlyAccessGeneral from "versionContent/_partials/_early_access.mdx";
+import NotSupportedAzure from "versionContent/_partials/_not-supported-for-azure.mdx";
 
 # Integrate data lakes with $CLOUD_LONG
 
@@ -29,6 +30,8 @@ Tiger Lake is currently in private beta. Please contact us to request access.
 
 <IntegrationPrereqsCloud/>
 
+<NotSupportedAzure />
+
 ## Integrate a data lake with your $SERVICE_LONG
 
 To connect a $SERVICE_LONG to your data lake:
@@ -361,6 +364,9 @@ data lake:
 * Writing to the same S3 table bucket from multiple services is not supported, bucket-to-service mapping is one-to-one.
 * Iceberg snapshots are pruned automatically if the amount exceeds 2500.
 
+
+
+
 [cmc]: https://console.aws.amazon.com/cloudformation/
 [aws-athena]: https://aws.amazon.com/athena/
 [apache-spark]: https://spark.apache.org/
