Commit 8db33fd

Cosmos Spark release - version bump for documentation (Azure#27044)

1 parent 3b531fa

File tree: 3 files changed (+6, −6 lines)

sdk/cosmos/azure-cosmos-spark_3-1_2-12/README.md (2 additions, 2 deletions)

````diff
@@ -52,11 +52,11 @@ https://github.com/Azure/azure-sdk-for-java/issues/new
 ### Download

 You can use the maven coordinate of the jar to auto install the Spark Connector to your Databricks Runtime 8 from Maven:
-`com.azure.cosmos.spark:azure-cosmos-spark_3-1_2-12:4.6.0`
+`com.azure.cosmos.spark:azure-cosmos-spark_3-1_2-12:4.6.1`

 You can also integrate against Cosmos DB Spark Connector in your SBT project:
 ```scala
-libraryDependencies += "com.azure.cosmos.spark" % "azure-cosmos-spark_3-1_2-12" % "4.6.0"
+libraryDependencies += "com.azure.cosmos.spark" % "azure-cosmos-spark_3-1_2-12" % "4.6.1"
 ```

 Cosmos DB Spark Connector is available on [Maven Central Repo](https://search.maven.org/search?q=g:com.azure.cosmos.spark).
````
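For readers picking up this bump in their own SBT build, a minimal `build.sbt` sketch is shown below; the project name and exact Scala patch version are illustrative assumptions, while the dependency coordinate matches the 4.6.1 line in the diff above:

```scala
// build.sbt -- minimal sketch, not from this repo.
// The _2-12 suffix in the artifact id means the connector is built for Scala 2.12,
// so scalaVersion must be a 2.12.x release.
name := "cosmos-spark-demo"        // illustrative project name (assumption)
scalaVersion := "2.12.15"          // any 2.12.x patch release should work

// Cosmos DB Spark Connector for Spark 3.1, at the version introduced by this commit
libraryDependencies += "com.azure.cosmos.spark" % "azure-cosmos-spark_3-1_2-12" % "4.6.1"
```

Note the single `%` between group and artifact id: the Scala-version suffix (`_2-12`) is already spelled out in the artifact name, so the usual `%%` cross-building shorthand does not apply here.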

sdk/cosmos/azure-cosmos-spark_3-2_2-12/README.md (2 additions, 2 deletions)

````diff
@@ -52,11 +52,11 @@ https://github.com/Azure/azure-sdk-for-java/issues/new
 ### Download

 You can use the maven coordinate of the jar to auto install the Spark Connector to your Databricks Runtime 8 from Maven:
-`com.azure.cosmos.spark:azure-cosmos-spark_3-2_2-12:4.6.0`
+`com.azure.cosmos.spark:azure-cosmos-spark_3-2_2-12:4.6.1`

 You can also integrate against Cosmos DB Spark Connector in your SBT project:
 ```scala
-libraryDependencies += "com.azure.cosmos.spark" % "azure-cosmos-spark_3-2_2-12" % "4.6.0"
+libraryDependencies += "com.azure.cosmos.spark" % "azure-cosmos-spark_3-2_2-12" % "4.6.1"
 ```

 Cosmos DB Spark Connector is available on [Maven Central Repo](https://search.maven.org/search?q=g:com.azure.cosmos.spark).
````

sdk/cosmos/azure-cosmos-spark_3_2-12/docs/quick-start.md (2 additions, 2 deletions)

```diff
@@ -23,10 +23,10 @@ You can use any other Spark 3.1.1 spark offering as well, also you should be abl
 SLF4J is only needed if you plan to use logging, please also download an SLF4J binding which will link the SLF4J API with the logging implementation of your choice. See the [SLF4J user manual](https://www.slf4j.org/manual.html) for more information.

 For Spark 3.1:
-- Install Cosmos DB Spark Connector, in your spark Cluster [com.azure.cosmos.spark:azure-cosmos-spark_3-1_2-12:4.6.0](https://search.maven.org/artifact/com.azure.cosmos.spark/azure-cosmos-spark_3-1_2-12/4.6.0/jar)
+- Install Cosmos DB Spark Connector, in your spark Cluster [com.azure.cosmos.spark:azure-cosmos-spark_3-1_2-12:4.6.1](https://search.maven.org/artifact/com.azure.cosmos.spark/azure-cosmos-spark_3-1_2-12/4.6.1/jar)

 For Spark 3.2:
-- Install Cosmos DB Spark Connector, in your spark Cluster [com.azure.cosmos.spark:azure-cosmos-spark_3-2_2-12:4.6.0](https://search.maven.org/artifact/com.azure.cosmos.spark/azure-cosmos-spark_3-2_2-12/4.6.0/jar)
+- Install Cosmos DB Spark Connector, in your spark Cluster [com.azure.cosmos.spark:azure-cosmos-spark_3-2_2-12:4.6.1](https://search.maven.org/artifact/com.azure.cosmos.spark/azure-cosmos-spark_3-2_2-12/4.6.1/jar)

 The getting started guide is based on PySpark however you can use the equivalent scala version as well, and you can run the following code snippet in an Azure Databricks PySpark notebook.
```

0 commit comments
