Commit 232d210 (parent 7e91ca4)

Release for Cosmos spark connector 4.6.0 (Azure#26691)

* Release for Cosmos spark connector 4.6.0
* Create CHANGELOG.md

File tree: 9 files changed (+30 additions, -18 deletions)


eng/versioning/version_client.txt

Lines changed: 2 additions & 2 deletions

@@ -82,8 +82,8 @@ com.azure:azure-cosmos;4.25.0;4.26.0-beta.1
 com.azure:azure-cosmos-benchmark;4.0.1-beta.1;4.0.1-beta.1
 com.azure:azure-cosmos-dotnet-benchmark;4.0.1-beta.1;4.0.1-beta.1
 com.azure.cosmos.spark:azure-cosmos-spark_3_2-12;1.0.0-beta.1;1.0.0-beta.1
-com.azure.cosmos.spark:azure-cosmos-spark_3-1_2-12;4.6.0-beta.1;4.6.0-beta.1
-com.azure.cosmos.spark:azure-cosmos-spark_3-2_2-12;4.6.0-beta.1;4.6.0-beta.1
+com.azure.cosmos.spark:azure-cosmos-spark_3-1_2-12;4.6.0;4.6.0
+com.azure.cosmos.spark:azure-cosmos-spark_3-2_2-12;4.6.0;4.6.0
 com.azure:azure-cosmos-encryption;1.0.0-beta.9;1.0.0-beta.10
 com.azure:azure-data-appconfiguration;1.2.5;1.3.0-beta.1
 com.azure:azure-data-appconfiguration-perf;1.0.0-beta.1;1.0.0-beta.1

sdk/cosmos/azure-cosmos-spark_3-1_2-12/CHANGELOG.md

Lines changed: 8 additions & 1 deletion

@@ -1,5 +1,12 @@
 ## Release History
-### 4.6.0-beta.1 (Unreleased)
+### 4.6.0 (2022-01-25)
+#### Key Bug Fixes
+* Fixed an issue in the schema inference logic that caused only the first element of an array to be used to derive the schema. - See [PR 26568](https://github.com/Azure/azure-sdk-for-java/pull/26568)
+
+#### New Features
+* Added support for Spark 3.2. Two different Maven packages will be published - but versions with further feature updates and fixes will be kept in sync between both.
+  - Spark 3.1: com.azure.cosmos.spark:azure-cosmos-spark_3-1_2-12:4.6.0
+  - Spark 3.2: com.azure.cosmos.spark:azure-cosmos-spark_3-2_2-12:4.6.0
 
 ### 4.5.3 (2022-01-06)
 #### Key Bug Fixes
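The key bug fix in this changelog concerns schema inference over arrays: deriving a schema from only the first array element silently drops any field that first appears in a later element. The standalone Python sketch below (hypothetical helper names, not the connector's actual Scala implementation) illustrates why the fix has to merge fields across all elements:

```python
def infer_type(value):
    """Map a sample value to a simple type name (illustrative mapping only)."""
    if isinstance(value, bool):
        return "boolean"
    if isinstance(value, int):
        return "integer"
    if isinstance(value, float):
        return "double"
    return "string"

def infer_array_schema(array):
    """Merge field types across ALL array elements, not just array[0]."""
    merged = {}
    for element in array:
        for field, value in element.items():
            # First occurrence of a field wins; later elements can only add fields.
            merged.setdefault(field, infer_type(value))
    return merged

docs = [{"id": "1"}, {"id": "2", "quantity": 5}]

# Pre-fix behavior (first element only) never sees "quantity":
first_only = infer_array_schema(docs[:1])   # {"id": "string"}
# Post-fix behavior sees fields from every element:
full = infer_array_schema(docs)             # {"id": "string", "quantity": "integer"}
```

In the real connector the inferred schema feeds the Spark DataFrame schema, so a dropped field would surface as missing columns rather than an error.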

sdk/cosmos/azure-cosmos-spark_3-1_2-12/README.md

Lines changed: 2 additions & 2 deletions

@@ -52,11 +52,11 @@ https://github.com/Azure/azure-sdk-for-java/issues/new
 ### Download
 
 You can use the maven coordinate of the jar to auto install the Spark Connector to your Databricks Runtime 8 from Maven:
-`com.azure.cosmos.spark:azure-cosmos-spark_3-1_2-12:4.5.3`
+`com.azure.cosmos.spark:azure-cosmos-spark_3-1_2-12:4.6.0`
 
 You can also integrate against Cosmos DB Spark Connector in your SBT project:
 ```scala
-libraryDependencies += "com.azure.cosmos.spark" % "azure-cosmos-spark_3-1_2-12" % "4.5.3"
+libraryDependencies += "com.azure.cosmos.spark" % "azure-cosmos-spark_3-1_2-12" % "4.6.0"
 ```
 
 Cosmos DB Spark Connector is available on [Maven Central Repo](https://search.maven.org/search?q=g:com.azure.cosmos.spark).

sdk/cosmos/azure-cosmos-spark_3-1_2-12/pom.xml

Lines changed: 1 addition & 1 deletion

@@ -11,7 +11,7 @@
   </parent>
   <groupId>com.azure.cosmos.spark</groupId>
   <artifactId>azure-cosmos-spark_3-1_2-12</artifactId>
-  <version>4.6.0-beta.1</version> <!-- {x-version-update;com.azure.cosmos.spark:azure-cosmos-spark_3-1_2-12;current} -->
+  <version>4.6.0</version> <!-- {x-version-update;com.azure.cosmos.spark:azure-cosmos-spark_3-1_2-12;current} -->
   <packaging>jar</packaging>
   <url>https://github.com/Azure/azure-sdk-for-java/tree/main/sdk/cosmos/azure-cosmos-spark_3-1_2-12</url>
   <name>OLTP Spark 3.1 Connector for Azure Cosmos DB SQL API</name>

sdk/cosmos/azure-cosmos-spark_3-2_2-12/CHANGELOG.md

Lines changed: 8 additions & 3 deletions

@@ -1,10 +1,15 @@
 ## Release History
-### 4.6.0-beta.1 (2022-01-18)
+### 4.6.0 (2022-01-25)
+#### Key Bug Fixes
+* Fixed an issue in the schema inference logic that caused only the first element of an array to be used to derive the schema. - See [PR 26568](https://github.com/Azure/azure-sdk-for-java/pull/26568)
+
 #### New Features
-* Added beta support for Spark 3.2
+* Added support for Spark 3.2. Two different Maven packages will be published - but versions with further feature updates and fixes will be kept in sync between both.
+  - Spark 3.1: com.azure.cosmos.spark:azure-cosmos-spark_3-1_2-12:4.6.0
+  - Spark 3.2: com.azure.cosmos.spark:azure-cosmos-spark_3-2_2-12:4.6.0
 
 ### NOTE: Below versions only exist for Spark 3.1 -
-***keeping the changelog here as reference only***
+***keeping the changelog here for reference only***
 
 ### 4.5.3 (2022-01-06)
 #### Key Bug Fixes

sdk/cosmos/azure-cosmos-spark_3-2_2-12/README.md

Lines changed: 2 additions & 2 deletions

@@ -52,11 +52,11 @@ https://github.com/Azure/azure-sdk-for-java/issues/new
 ### Download
 
 You can use the maven coordinate of the jar to auto install the Spark Connector to your Databricks Runtime 8 from Maven:
-`com.azure.cosmos.spark:azure-cosmos-spark_3-2_2-12:4.5.3`
+`com.azure.cosmos.spark:azure-cosmos-spark_3-2_2-12:4.6.0`
 
 You can also integrate against Cosmos DB Spark Connector in your SBT project:
 ```scala
-libraryDependencies += "com.azure.cosmos.spark" % "azure-cosmos-spark_3-2_2-12" % "4.5.3"
+libraryDependencies += "com.azure.cosmos.spark" % "azure-cosmos-spark_3-2_2-12" % "4.6.0"
 ```
 
 Cosmos DB Spark Connector is available on [Maven Central Repo](https://search.maven.org/search?q=g:com.azure.cosmos.spark).

sdk/cosmos/azure-cosmos-spark_3-2_2-12/pom.xml

Lines changed: 1 addition & 1 deletion

@@ -11,7 +11,7 @@
   </parent>
   <groupId>com.azure.cosmos.spark</groupId>
   <artifactId>azure-cosmos-spark_3-2_2-12</artifactId>
-  <version>4.6.0-beta.1</version> <!-- {x-version-update;com.azure.cosmos.spark:azure-cosmos-spark_3-2_2-12;current} -->
+  <version>4.6.0</version> <!-- {x-version-update;com.azure.cosmos.spark:azure-cosmos-spark_3-2_2-12;current} -->
   <packaging>jar</packaging>
   <url>https://github.com/Azure/azure-sdk-for-java/tree/main/sdk/cosmos/azure-cosmos-spark_3-2_2-12</url>
   <name>OLTP Spark 3.2 Connector for Azure Cosmos DB SQL API</name>
Lines changed: 4 additions & 4 deletions

@@ -1,11 +1,11 @@
 ## Release History
+### 1.0.0-beta.1 (Unreleased)
+This Maven package will never be released - it is only used to share code between
+the child projects for specific Spark versions.
 
-### 4.6.0-beta.1 (Unreleased)
+See the changelog of the sibling projects for the changes in the connector targeting a specific Spark version.
 
 #### Features Added
-
 #### Breaking Changes
-
 #### Bugs Fixed
-
 #### Other Changes

sdk/cosmos/azure-cosmos-spark_3_2-12/docs/quick-start.md

Lines changed: 2 additions & 2 deletions

@@ -23,10 +23,10 @@ You can use any other Spark 3.1.1 spark offering as well, also you should be abl
 SLF4J is only needed if you plan to use logging; please also download an SLF4J binding, which will link the SLF4J API with the logging implementation of your choice. See the [SLF4J user manual](https://www.slf4j.org/manual.html) for more information.
 
 For Spark 3.1:
-- Install the Cosmos DB Spark Connector in your Spark cluster: [azure-cosmos-spark_3-1_2-12-4.6.0.jar](https://search.maven.org/artifact/com.azure.cosmos.spark/azure-cosmos-spark_3-1_2-12/4.6.0/jar)
+- Install the Cosmos DB Spark Connector in your Spark cluster: [com.azure.cosmos.spark:azure-cosmos-spark_3-1_2-12:4.6.0](https://search.maven.org/artifact/com.azure.cosmos.spark/azure-cosmos-spark_3-1_2-12/4.6.0/jar)
 
 For Spark 3.2:
-- Install the Cosmos DB Spark Connector in your Spark cluster: [azure-cosmos-spark_3-2_2-12-4.6.0.jar](https://search.maven.org/artifact/com.azure.cosmos.spark/azure-cosmos-spark_3-2_2-12/4.6.0/jar)
+- Install the Cosmos DB Spark Connector in your Spark cluster: [com.azure.cosmos.spark:azure-cosmos-spark_3-2_2-12:4.6.0](https://search.maven.org/artifact/com.azure.cosmos.spark/azure-cosmos-spark_3-2_2-12/4.6.0/jar)
 
 The getting started guide is based on PySpark; however, you can use the equivalent Scala version as well, and you can run the following code snippet in an Azure Databricks PySpark notebook.
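Once the connector jar is installed on the cluster, reading a container boils down to passing the account coordinates as options. A minimal PySpark configuration sketch, with placeholder endpoint, key, database, and container values (option keys follow the connector's quick-start configuration; it requires a live Cosmos DB account, so it is shown as a fragment only):

```python
# Configuration sketch - replace the <...> placeholders with real values.
cfg = {
    "spark.cosmos.accountEndpoint": "https://<cosmos-account>.documents.azure.com:443/",
    "spark.cosmos.accountKey": "<account-key>",
    "spark.cosmos.database": "<database-name>",
    "spark.cosmos.container": "<container-name>",
    # Let the connector sample documents to infer the DataFrame schema.
    "spark.cosmos.read.inferSchema.enabled": "true",
}

# `spark` is the SparkSession provided by the Databricks notebook.
df = spark.read.format("cosmos.oltp").options(**cfg).load()
df.printSchema()
```

The same options dictionary works for writes via `df.write.format("cosmos.oltp")`, which keeps read and write configuration in one place.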
