
Commit 40979eb

fix example code in README.md
1 parent b848bcb commit 40979eb


README.md

Lines changed: 6 additions & 6 deletions
@@ -2,11 +2,10 @@
 <center>
 <img src="https://github.com/open-datastudio/datastudio/raw/master/docs/_static/open-datastudio-logo.png" width="110px"/>
 </center>
-<br />
 
 # Open data studio
 
-[Open data studio](https://open-datastudio.io) is managed computing computing service on Staroid cloud. Run your machine learning and large scale data processing workload without managing cluster and servers.
+[Open data studio](https://open-datastudio.io) is a managed computing computing service on Staroid cloud. Run your machine learning and large scale data processing workloads without managing clusters and servers.
 
 Supported computing frameworks are
 
@@ -34,7 +33,8 @@ pip install ods
 
 ```python
 import ods
-# 'ske' is the name of kubernetes cluster created from staroid.com. Alternatively, you can export 'STAROID_SKE' environment variable.
+# 'ske' is the name of kubernetes cluster created from staroid.com.
+# Alternatively, you can export 'STAROID_SKE' environment variable.
 ods.init(ske="kube-cluster-1")
 ```
 
@@ -45,7 +45,7 @@ Create spark session with default configuration
 
 ```python
 import ods
-spark = ods.spark("spark-1") # 'spark-1' is name of spark-serverless instance to create.
+spark = ods.spark("spark-1").session() # 'spark-1' is name of spark-serverless instance to create.
 df = spark.createDataFrame(....)
 ```
 
@@ -61,7 +61,7 @@ df = spark.createDataFrame(....)
 
 ```python
 import ods
-spark = ods.spark("spark-delta", delta=True)
+spark = ods.spark("spark-delta", delta=True).session()
 spark.read.format("delta").load(....)
 ```
 
@@ -72,7 +72,7 @@ import ods
 spark = ods.spark(spark_conf = {
     "spark.hadoop.fs.s3a.access.key": "...",
     "spark.hadoop.fs.s3a.secret.key" : "..."
-})
+}).session()
 ```
 
 Check [tests/test_spark.py](https://github.com/open-datastudio/ods/blob/master/tests/test_spark.py) as well.
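
For reference, the corrected snippets above compose into the following end-to-end usage. This is a minimal sketch assembled from the README hunks in this diff; the SKE name and instance name are the README's placeholders, and the DataFrame contents are illustrative only.

```python
import ods

# 'ske' is the name of the Kubernetes cluster created from staroid.com.
# Alternatively, you can export the 'STAROID_SKE' environment variable.
ods.init(ske="kube-cluster-1")

# As of this commit, call .session() on the object returned by ods.spark()
# to obtain the Spark session. 'spark-1' is the name of the spark-serverless
# instance to create.
spark = ods.spark("spark-1").session()

# Illustrative DataFrame; rows and column names are placeholders.
df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "value"])
df.show()
```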
