Commit 53150a7 (parent 6382e14)

Fixing MD files with latest versions (Azure#29020)

File tree: 3 files changed, +22 -6 lines


sdk/cosmos/azure-cosmos-spark_3-1_2-12/README.md (10 additions, 2 deletions)

@@ -27,6 +27,10 @@ https://github.com/Azure/azure-sdk-for-java/issues/new
 #### azure-cosmos-spark_3-1_2-12
 | Connector | Supported Spark Versions | Minimum Java Version | Supported Scala Versions | Supported Databricks Runtimes |
 | ------------- | ------------------------ | -------------------- | ----------------------- | ----------------------------- |
+| 4.10.0 | 3.1.1 - 3.1.2 | 8 | 2.12 | 8.\*, 9.\* |
+| 4.9.0 | 3.1.1 - 3.1.2 | 8 | 2.12 | 8.\*, 9.\* |
+| 4.8.0 | 3.1.1 - 3.1.2 | 8 | 2.12 | 8.\*, 9.\* |
+| 4.7.0 | 3.1.1 - 3.1.2 | 8 | 2.12 | 8.\*, 9.\* |
 | 4.6.2 | 3.1.1 - 3.1.2 | 8 | 2.12 | 8.\*, 9.\* |
 | 4.6.1 | 3.1.1 - 3.1.2 | 8 | 2.12 | 8.\*, 9.\* |
 | 4.6.0 | 3.1.1 - 3.1.2 | 8 | 2.12 | 8.\*, 9.\* |
@@ -49,18 +53,22 @@ https://github.com/Azure/azure-sdk-for-java/issues/new
 #### azure-cosmos-spark_3-2_2-12
 | Connector | Supported Spark Versions | Minimum Java Version | Supported Scala Versions | Supported Databricks Runtimes |
 | ------------- | ------------------------ | -------------------- | ----------------------- | ----------------------------- |
+| 4.10.0 | 3.2.0 - 3.2.1 | 8 | 2.12 | 10.\* |
+| 4.9.0 | 3.2.0 - 3.2.1 | 8 | 2.12 | 10.\* |
+| 4.8.0 | 3.2.0 - 3.2.1 | 8 | 2.12 | 10.\* |
+| 4.7.0 | 3.2.0 - 3.2.1 | 8 | 2.12 | 10.\* |
 | 4.6.2 | 3.2.0 - 3.2.1 | 8 | 2.12 | 10.\* |
 | 4.6.1 | 3.2.0 - 3.2.1 | 8 | 2.12 | 10.\* |
 | 4.6.0 | 3.2.0 - 3.2.1 | 8 | 2.12 | 10.\* |
 
 ### Download
 
 You can use the maven coordinate of the jar to auto install the Spark Connector to your Databricks Runtime 8 from Maven:
-`com.azure.cosmos.spark:azure-cosmos-spark_3-1_2-12:4.6.2`
+`com.azure.cosmos.spark:azure-cosmos-spark_3-1_2-12:4.10.0`
 
 You can also integrate against Cosmos DB Spark Connector in your SBT project:
 ```scala
-libraryDependencies += "com.azure.cosmos.spark" % "azure-cosmos-spark_3-1_2-12" % "4.6.2"
+libraryDependencies += "com.azure.cosmos.spark" % "azure-cosmos-spark_3-1_2-12" % "4.10.0"
 ```
 
 Cosmos DB Spark Connector is available on [Maven Central Repo](https://search.maven.org/search?q=g:com.azure.cosmos.spark).
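Outside Databricks, the same Maven coordinate can be pulled in from a plain PySpark session via Spark's `spark.jars.packages` config. A minimal sketch (the local PySpark 3.1.x install and the session-builder call are assumptions for illustration, not part of this commit):

```python
# Maven coordinate taken from the README change above; setting
# spark.jars.packages makes Spark resolve it from Maven Central at startup.
coordinate = "com.azure.cosmos.spark:azure-cosmos-spark_3-1_2-12:4.10.0"

# Requires a local PySpark 3.1.x install, which this commit does not provide:
# from pyspark.sql import SparkSession
# spark = (SparkSession.builder
#          .appName("cosmos-quickstart")
#          .config("spark.jars.packages", coordinate)
#          .getOrCreate())

# Split the coordinate into its group:artifact:version parts.
artifact, version = coordinate.rsplit(":", 2)[-2:]
print(artifact, version)  # → azure-cosmos-spark_3-1_2-12 4.10.0
```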

sdk/cosmos/azure-cosmos-spark_3-2_2-12/README.md (10 additions, 2 deletions)

@@ -27,13 +27,21 @@ https://github.com/Azure/azure-sdk-for-java/issues/new
 #### azure-cosmos-spark_3-2_2-12
 | Connector | Supported Spark Versions | Minimum Java Version | Supported Scala Versions | Supported Databricks Runtimes |
 | ------------- | ------------------------ | -------------------- | ----------------------- | ----------------------------- |
+| 4.10.0 | 3.2.0 - 3.2.1 | 8 | 2.12 | 10.\* |
+| 4.9.0 | 3.2.0 - 3.2.1 | 8 | 2.12 | 10.\* |
+| 4.8.0 | 3.2.0 - 3.2.1 | 8 | 2.12 | 10.\* |
+| 4.7.0 | 3.2.0 - 3.2.1 | 8 | 2.12 | 10.\* |
 | 4.6.2 | 3.2.0 - 3.2.1 | 8 | 2.12 | 10.\* |
 | 4.6.1 | 3.2.0 - 3.2.1 | 8 | 2.12 | 10.\* |
 | 4.6.0 | 3.2.0 - 3.2.1 | 8 | 2.12 | 10.\* |
 
 #### azure-cosmos-spark_3-1_2-12
 | Connector | Supported Spark Versions | Minimum Java Version | Supported Scala Versions | Supported Databricks Runtimes |
 | ------------- | ------------------------ | -------------------- | ----------------------- | ----------------------------- |
+| 4.10.0 | 3.1.1 - 3.1.2 | 8 | 2.12 | 8.\*, 9.\* |
+| 4.9.0 | 3.1.1 - 3.1.2 | 8 | 2.12 | 8.\*, 9.\* |
+| 4.8.0 | 3.1.1 - 3.1.2 | 8 | 2.12 | 8.\*, 9.\* |
+| 4.7.0 | 3.1.1 - 3.1.2 | 8 | 2.12 | 8.\*, 9.\* |
 | 4.6.2 | 3.1.1 - 3.1.2 | 8 | 2.12 | 8.\*, 9.\* |
 | 4.6.1 | 3.1.1 - 3.1.2 | 8 | 2.12 | 8.\*, 9.\* |
 | 4.6.0 | 3.1.1 - 3.1.2 | 8 | 2.12 | 8.\*, 9.\* |
@@ -56,11 +64,11 @@ https://github.com/Azure/azure-sdk-for-java/issues/new
 ### Download
 
 You can use the maven coordinate of the jar to auto install the Spark Connector to your Databricks Runtime 8 from Maven:
-`com.azure.cosmos.spark:azure-cosmos-spark_3-2_2-12:4.6.2`
+`com.azure.cosmos.spark:azure-cosmos-spark_3-2_2-12:4.10.0`
 
 You can also integrate against Cosmos DB Spark Connector in your SBT project:
 ```scala
-libraryDependencies += "com.azure.cosmos.spark" % "azure-cosmos-spark_3-2_2-12" % "4.6.2"
+libraryDependencies += "com.azure.cosmos.spark" % "azure-cosmos-spark_3-2_2-12" % "4.10.0"
 ```
 
 Cosmos DB Spark Connector is available on [Maven Central Repo](https://search.maven.org/search?q=g:com.azure.cosmos.spark).

sdk/cosmos/azure-cosmos-spark_3_2-12/docs/quick-start.md (2 additions, 2 deletions)

@@ -23,10 +23,10 @@ You can use any other Spark 3.1.1 spark offering as well, also you should be abl
 SLF4J is only needed if you plan to use logging, please also download an SLF4J binding which will link the SLF4J API with the logging implementation of your choice. See the [SLF4J user manual](https://www.slf4j.org/manual.html) for more information.
 
 For Spark 3.1:
-- Install Cosmos DB Spark Connector, in your spark Cluster [com.azure.cosmos.spark:azure-cosmos-spark_3-1_2-12:4.6.2](https://search.maven.org/artifact/com.azure.cosmos.spark/azure-cosmos-spark_3-1_2-12/4.6.2/jar)
+- Install Cosmos DB Spark Connector, in your spark Cluster [com.azure.cosmos.spark:azure-cosmos-spark_3-1_2-12:4.10.0](https://search.maven.org/artifact/com.azure.cosmos.spark/azure-cosmos-spark_3-1_2-12/4.10.0/jar)
 
 For Spark 3.2:
-- Install Cosmos DB Spark Connector, in your spark Cluster [com.azure.cosmos.spark:azure-cosmos-spark_3-2_2-12:4.6.2](https://search.maven.org/artifact/com.azure.cosmos.spark/azure-cosmos-spark_3-2_2-12/4.6.2/jar)
+- Install Cosmos DB Spark Connector, in your spark Cluster [com.azure.cosmos.spark:azure-cosmos-spark_3-2_2-12:4.10.0](https://search.maven.org/artifact/com.azure.cosmos.spark/azure-cosmos-spark_3-2_2-12/4.10.0/jar)
 
 The getting started guide is based on PySpark however you can use the equivalent scala version as well, and you can run the following code snippet in an Azure Databricks PySpark notebook.
