
Commit b27c6e1

Removing noisy log (Azure#29498)
* Fixing analytics violations found in release pipeline
* Removing noisy log
1 parent e04187b commit b27c6e1

File tree: 9 files changed, +16 −25 lines

eng/versioning/version_client.txt

Lines changed: 2 additions & 2 deletions
```diff
@@ -86,8 +86,8 @@ com.azure:azure-cosmos;4.31.0;4.32.0-beta.1
 com.azure:azure-cosmos-benchmark;4.0.1-beta.1;4.0.1-beta.1
 com.azure:azure-cosmos-dotnet-benchmark;4.0.1-beta.1;4.0.1-beta.1
 com.azure.cosmos.spark:azure-cosmos-spark_3_2-12;1.0.0-beta.1;1.0.0-beta.1
-com.azure.cosmos.spark:azure-cosmos-spark_3-1_2-12;4.11.1;4.12.0-beta.1
-com.azure.cosmos.spark:azure-cosmos-spark_3-2_2-12;4.11.1;4.12.0-beta.1
+com.azure.cosmos.spark:azure-cosmos-spark_3-1_2-12;4.11.1;4.11.2
+com.azure.cosmos.spark:azure-cosmos-spark_3-2_2-12;4.11.1;4.11.2
 com.azure:azure-cosmos-encryption;1.3.0;1.4.0-beta.1
 com.azure:azure-data-appconfiguration;1.3.4;1.4.0-beta.1
 com.azure:azure-data-appconfiguration-perf;1.0.0-beta.1;1.0.0-beta.1
```

sdk/cosmos/azure-cosmos-spark_3-1_2-12/CHANGELOG.md

Lines changed: 1 addition & 7 deletions
```diff
@@ -1,16 +1,10 @@
 ## Release History
 
-### 4.12.0-beta.1 (Unreleased)
-
-#### Features Added
-
-#### Breaking Changes
+### 4.11.2 (2022-06-17)
 
 #### Bugs Fixed
 * Fixed a regression introduced in [PR 29152](https://github.com/Azure/azure-sdk-for-java/pull/29152) that can lead to `IllegalStateException: Latest LSN xxx must not be smaller than start LSN yyy`. - See [PR 29485](https://github.com/Azure/azure-sdk-for-java/pull/29485)
 
-#### Other Changes
-
 ### 4.11.1 (2022-06-09)
 
 #### Bugs Fixed
```
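The fixed regression surfaced as an ordering violation between change-feed log sequence numbers (LSNs): the latest LSN observed for a range must never fall below the LSN the feed started from. Purely as an illustration of that invariant (hypothetical function name, not the SDK's code, and using Python's `ValueError` in place of Java's `IllegalStateException`):

```python
# Illustrative only: a check mirroring the invariant in the exception message
# "Latest LSN xxx must not be smaller than start LSN yyy".
def validate_lsn_range(start_lsn: int, latest_lsn: int) -> None:
    if latest_lsn < start_lsn:
        raise ValueError(
            f"Latest LSN {latest_lsn} must not be smaller than start LSN {start_lsn}"
        )

validate_lsn_range(100, 150)  # ok: the feed has progressed past its start
```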

sdk/cosmos/azure-cosmos-spark_3-1_2-12/README.md

Lines changed: 4 additions & 2 deletions
````diff
@@ -29,6 +29,7 @@ https://github.com/Azure/azure-sdk-for-java/issues/new
 #### azure-cosmos-spark_3-1_2-12
 | Connector | Supported Spark Versions | Minimum Java Version | Supported Scala Versions | Supported Databricks Runtimes |
 | ------------- | ------------------------ | -------------------- | ----------------------- | ----------------------------- |
+| 4.11.2 | 3.1.1 - 3.1.2 | 8 | 2.12 | 8.\*, 9.\* |
 | 4.11.1 | 3.1.1 - 3.1.2 | 8 | 2.12 | 8.\*, 9.\* |
 | 4.11.0 | 3.1.1 - 3.1.2 | 8 | 2.12 | 8.\*, 9.\* |
 | 4.10.1 | 3.1.1 - 3.1.2 | 8 | 2.12 | 8.\*, 9.\* |
@@ -58,6 +59,7 @@ https://github.com/Azure/azure-sdk-for-java/issues/new
 #### azure-cosmos-spark_3-2_2-12
 | Connector | Supported Spark Versions | Minimum Java Version | Supported Scala Versions | Supported Databricks Runtimes |
 | ------------- | ------------------------ | -------------------- | ----------------------- | ----------------------------- |
+| 4.11.2 | 3.2.0 - 3.2.1 | 8 | 2.12 | 10.\* |
 | 4.11.1 | 3.2.0 - 3.2.1 | 8 | 2.12 | 10.\* |
 | 4.11.0 | 3.2.0 - 3.2.1 | 8 | 2.12 | 10.\* |
 | 4.10.1 | 3.2.0 - 3.2.1 | 8 | 2.12 | 10.\* |
@@ -72,11 +74,11 @@ https://github.com/Azure/azure-sdk-for-java/issues/new
 ### Download
 
 You can use the maven coordinate of the jar to auto install the Spark Connector to your Databricks Runtime 8 from Maven:
-`com.azure.cosmos.spark:azure-cosmos-spark_3-1_2-12:4.11.1`
+`com.azure.cosmos.spark:azure-cosmos-spark_3-1_2-12:4.11.2`
 
 You can also integrate against Cosmos DB Spark Connector in your SBT project:
 ```scala
-libraryDependencies += "com.azure.cosmos.spark" % "azure-cosmos-spark_3-1_2-12" % "4.11.1"
+libraryDependencies += "com.azure.cosmos.spark" % "azure-cosmos-spark_3-1_2-12" % "4.11.2"
 ```
 
 Cosmos DB Spark Connector is available on [Maven Central Repo](https://search.maven.org/search?q=g:com.azure.cosmos.spark).
````
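The artifact ids updated throughout this commit encode their compatibility targets: `azure-cosmos-spark_3-1_2-12` means the build for Spark 3.1 and Scala 2.12, matching the tables above. A small illustrative decoder (hypothetical helper, not part of the SDK) makes the naming scheme explicit:

```python
# Illustrative only: decode the Spark/Scala versions encoded in a
# connector artifact id such as "azure-cosmos-spark_3-1_2-12".
def parse_artifact_id(artifact_id: str) -> dict:
    # Format: <name>_<sparkMajor>-<sparkMinor>_<scalaMajor>-<scalaMinor>
    name, spark, scala = artifact_id.split("_")
    return {
        "name": name,                      # "azure-cosmos-spark"
        "spark": spark.replace("-", "."),  # e.g. "3.1"
        "scala": scala.replace("-", "."),  # e.g. "2.12"
    }

print(parse_artifact_id("azure-cosmos-spark_3-1_2-12"))
```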

sdk/cosmos/azure-cosmos-spark_3-1_2-12/pom.xml

Lines changed: 1 addition & 1 deletion
```diff
@@ -11,7 +11,7 @@
   </parent>
   <groupId>com.azure.cosmos.spark</groupId>
   <artifactId>azure-cosmos-spark_3-1_2-12</artifactId>
-  <version>4.12.0-beta.1</version> <!-- {x-version-update;com.azure.cosmos.spark:azure-cosmos-spark_3-1_2-12;current} -->
+  <version>4.11.2</version> <!-- {x-version-update;com.azure.cosmos.spark:azure-cosmos-spark_3-1_2-12;current} -->
   <packaging>jar</packaging>
   <url>https://github.com/Azure/azure-sdk-for-java/tree/main/sdk/cosmos/azure-cosmos-spark_3-1_2-12</url>
   <name>OLTP Spark 3.1 Connector for Azure Cosmos DB SQL API</name>
```

sdk/cosmos/azure-cosmos-spark_3-2_2-12/CHANGELOG.md

Lines changed: 1 addition & 7 deletions
```diff
@@ -1,16 +1,10 @@
 ## Release History
 
-### 4.12.0-beta.1 (Unreleased)
-
-#### Features Added
-
-#### Breaking Changes
+### 4.11.2 (2022-06-17)
 
 #### Bugs Fixed
 * Fixed a regression introduced in [PR 29152](https://github.com/Azure/azure-sdk-for-java/pull/29152) that can lead to `IllegalStateException: Latest LSN xxx must not be smaller than start LSN yyy`. - See [PR 29485](https://github.com/Azure/azure-sdk-for-java/pull/29485)
 
-#### Other Changes
-
 ### 4.11.1 (2022-06-09)
 
 #### Bugs Fixed
```

sdk/cosmos/azure-cosmos-spark_3-2_2-12/README.md

Lines changed: 4 additions & 2 deletions
````diff
@@ -28,6 +28,7 @@ https://github.com/Azure/azure-sdk-for-java/issues/new
 #### azure-cosmos-spark_3-2_2-12
 | Connector | Supported Spark Versions | Minimum Java Version | Supported Scala Versions | Supported Databricks Runtimes |
 | ------------- | ------------------------ | -------------------- | ----------------------- | ----------------------------- |
+| 4.11.2 | 3.2.0 - 3.2.1 | 8 | 2.12 | 10.\* |
 | 4.11.1 | 3.2.0 - 3.2.1 | 8 | 2.12 | 10.\* |
 | 4.11.0 | 3.2.0 - 3.2.1 | 8 | 2.12 | 10.\* |
 | 4.10.1 | 3.2.0 - 3.2.1 | 8 | 2.12 | 10.\* |
@@ -42,6 +43,7 @@ https://github.com/Azure/azure-sdk-for-java/issues/new
 #### azure-cosmos-spark_3-1_2-12
 | Connector | Supported Spark Versions | Minimum Java Version | Supported Scala Versions | Supported Databricks Runtimes |
 | ------------- | ------------------------ | -------------------- | ----------------------- | ----------------------------- |
+| 4.11.2 | 3.1.1 - 3.1.2 | 8 | 2.12 | 8.\*, 9.\* |
 | 4.11.1 | 3.1.1 - 3.1.2 | 8 | 2.12 | 8.\*, 9.\* |
 | 4.11.0 | 3.1.1 - 3.1.2 | 8 | 2.12 | 8.\*, 9.\* |
 | 4.10.1 | 3.1.1 - 3.1.2 | 8 | 2.12 | 8.\*, 9.\* |
@@ -71,11 +73,11 @@ https://github.com/Azure/azure-sdk-for-java/issues/new
 ### Download
 
 You can use the maven coordinate of the jar to auto install the Spark Connector to your Databricks Runtime 8 from Maven:
-`com.azure.cosmos.spark:azure-cosmos-spark_3-2_2-12:4.11.1`
+`com.azure.cosmos.spark:azure-cosmos-spark_3-2_2-12:4.11.2`
 
 You can also integrate against Cosmos DB Spark Connector in your SBT project:
 ```scala
-libraryDependencies += "com.azure.cosmos.spark" % "azure-cosmos-spark_3-2_2-12" % "4.11.1"
+libraryDependencies += "com.azure.cosmos.spark" % "azure-cosmos-spark_3-2_2-12" % "4.11.2"
 ```
 
 Cosmos DB Spark Connector is available on [Maven Central Repo](https://search.maven.org/search?q=g:com.azure.cosmos.spark).
````

sdk/cosmos/azure-cosmos-spark_3-2_2-12/pom.xml

Lines changed: 1 addition & 1 deletion
```diff
@@ -11,7 +11,7 @@
   </parent>
   <groupId>com.azure.cosmos.spark</groupId>
   <artifactId>azure-cosmos-spark_3-2_2-12</artifactId>
-  <version>4.12.0-beta.1</version> <!-- {x-version-update;com.azure.cosmos.spark:azure-cosmos-spark_3-2_2-12;current} -->
+  <version>4.11.2</version> <!-- {x-version-update;com.azure.cosmos.spark:azure-cosmos-spark_3-2_2-12;current} -->
   <packaging>jar</packaging>
   <url>https://github.com/Azure/azure-sdk-for-java/tree/main/sdk/cosmos/azure-cosmos-spark_3-2_2-12</url>
   <name>OLTP Spark 3.2 Connector for Azure Cosmos DB SQL API</name>
```

sdk/cosmos/azure-cosmos-spark_3_2-12/docs/quick-start.md

Lines changed: 2 additions & 2 deletions
```diff
@@ -23,10 +23,10 @@ You can use any other Spark 3.1.1 spark offering as well, also you should be abl
 SLF4J is only needed if you plan to use logging, please also download an SLF4J binding which will link the SLF4J API with the logging implementation of your choice. See the [SLF4J user manual](https://www.slf4j.org/manual.html) for more information.
 
 For Spark 3.1:
-- Install Cosmos DB Spark Connector, in your spark Cluster [com.azure.cosmos.spark:azure-cosmos-spark_3-1_2-12:4.11.1](https://search.maven.org/artifact/com.azure.cosmos.spark/azure-cosmos-spark_3-1_2-12/4.11.1/jar)
+- Install Cosmos DB Spark Connector, in your spark Cluster [com.azure.cosmos.spark:azure-cosmos-spark_3-1_2-12:4.11.2](https://search.maven.org/artifact/com.azure.cosmos.spark/azure-cosmos-spark_3-1_2-12/4.11.2/jar)
 
 For Spark 3.2:
-- Install Cosmos DB Spark Connector, in your spark Cluster [com.azure.cosmos.spark:azure-cosmos-spark_3-2_2-12:4.11.1](https://search.maven.org/artifact/com.azure.cosmos.spark/azure-cosmos-spark_3-2_2-12/4.11.1/jar)
+- Install Cosmos DB Spark Connector, in your spark Cluster [com.azure.cosmos.spark:azure-cosmos-spark_3-2_2-12:4.11.2](https://search.maven.org/artifact/com.azure.cosmos.spark/azure-cosmos-spark_3-2_2-12/4.11.2/jar)
 
 
 The getting started guide is based on PySpark however you can use the equivalent scala version as well, and you can run the following code snippet in an Azure Databricks PySpark notebook.
```

sdk/cosmos/azure-cosmos-spark_3_2-12/src/main/scala/com/azure/cosmos/spark/CosmosPartitionPlanner.scala

Lines changed: 0 additions & 1 deletion
```diff
@@ -399,7 +399,6 @@ private object CosmosPartitionPlanner extends BasicLoggingTrait {
     val result = new ArrayBuffer[PartitionMetadata]
     orderedRanges
       .foreach(range => {
-        logInfo(s"merging range $range")
         val initialStartTokensIndex = startTokensIndex
         val initialLatestTokensIndex = latestTokensIndex
         while (startTokensIndex < startTokens.length &&
```
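The removed `logInfo` sat inside the per-range merge loop, so every planning pass emitted one info-level line per feed range, which is the "noisy log" the commit title refers to. An illustrative Python sketch (not the connector's code) of the usual remedy, keeping per-iteration detail at debug level and summarizing once at info level:

```python
import logging

logging.basicConfig(level=logging.INFO, format="%(message)s")
log = logging.getLogger("partition-planner")

def merge_ranges(ranges):
    merged = []
    for r in ranges:
        # Noisy variant (analogous to the removed logInfo): an info-level
        # message here would emit one line per range on every planning pass.
        log.debug("merging range %s", r)  # per-range detail stays at debug
        merged.append(r)
    # A single summary at info level conveys the same operational signal.
    log.info("merged %d ranges", len(merged))
    return merged

merge_ranges(["0-7F", "80-FF"])
```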

0 commit comments
