v0.4.0
- Added catalog and schema parameters to execute and fetch (#90). In this release, we have added optional `catalog` and `schema` parameters to the `execute` and `fetch` methods in the `SqlBackend` abstract base class, allowing SQL statements to be executed against a specific catalog and schema. The update includes new method signatures and their respective implementations in the `SparkSqlBackend` and `DatabricksSqlBackend` classes: the new parameters control the catalog and schema used by the `SparkSession` instance in `SparkSqlBackend` and by the `SqlClient` instance in `DatabricksSqlBackend`. This enhancement improves support for multi-catalog and multi-schema environments, and the change ships with unit and integration tests to verify the behaviour. For example, with a `SparkSqlBackend` instance `spark_backend`, you can execute a SQL statement in a specific catalog and schema with: `spark_backend.execute("SELECT * FROM my_table", catalog="my_catalog", schema="my_schema")`. The `fetch` method accepts the same parameters.
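A minimal sketch of the updated call shape, assuming keyword parameters with `None` defaults as described above. The `FakeSqlBackend` class here is purely hypothetical, used only to illustrate how the new `catalog` and `schema` arguments flow through the `execute` and `fetch` interface; it is not part of the library.

```python
from abc import ABC, abstractmethod
from typing import Iterator, Optional


class SqlBackend(ABC):
    """Abstract base class; signatures are illustrative assumptions."""

    @abstractmethod
    def execute(self, sql: str, catalog: Optional[str] = None, schema: Optional[str] = None) -> None:
        ...

    @abstractmethod
    def fetch(self, sql: str, catalog: Optional[str] = None, schema: Optional[str] = None) -> Iterator:
        ...


class FakeSqlBackend(SqlBackend):
    """Hypothetical in-memory backend that records calls, for demonstration only."""

    def __init__(self) -> None:
        self.calls = []

    def execute(self, sql, catalog=None, schema=None):
        # A real backend would switch the session's catalog/schema before running the statement.
        self.calls.append(("execute", sql, catalog, schema))

    def fetch(self, sql, catalog=None, schema=None):
        self.calls.append(("fetch", sql, catalog, schema))
        return iter([])


backend = FakeSqlBackend()
backend.execute("SELECT * FROM my_table", catalog="my_catalog", schema="my_schema")
rows = list(backend.fetch("SELECT 1", schema="my_schema"))
```

Leaving both parameters optional keeps existing call sites working unchanged, which matches the backward-compatible framing of the release note.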
Contributors: @FastLee