Add Snowflake JDBC Connector

Update CI and delete unused lines (#20)

Approved
Update according to reviews 11/01/2024

Various style fixes and cleanup (#15) (#17)

Co-authored-by: Martin Traverso <mtraverso@gmail.com>
Various style fixes and cleanup (#15)

Update the GitHub CI (#12)

* Add Snowflake JDBC Connector

* Add Snowflake to the CI
Add Snowflake JDBC Connector (#11)

Had to redo the connector because all the rebases caused havoc
yuuteng authored and dprophet committed Mar 4, 2024
1 parent acc40c9 commit dafaa5d
Showing 26 changed files with 2,722 additions and 0 deletions.
22 changes: 22 additions & 0 deletions .github/workflows/ci.yml
@@ -373,6 +373,7 @@ jobs:
!:trino-server,
!:trino-server-rpm,
!:trino-singlestore,
!:trino-snowflake,
!:trino-sqlserver,
!:trino-test-jdbc-compatibility-old-server,
!:trino-tests,
@@ -475,6 +476,8 @@ jobs:
- { modules: plugin/trino-redshift, profile: fte-tests }
- { modules: plugin/trino-resource-group-managers }
- { modules: plugin/trino-singlestore }
- { modules: plugin/trino-snowflake }
- { modules: plugin/trino-snowflake, profile: cloud-tests }
- { modules: plugin/trino-sqlserver }
- { modules: testing/trino-faulttolerant-tests, profile: default }
- { modules: testing/trino-faulttolerant-tests, profile: test-fault-tolerant-delta }
@@ -651,6 +654,24 @@ jobs:
if: matrix.modules == 'plugin/trino-bigquery' && !contains(matrix.profile, 'cloud-tests-2') && (env.CI_SKIP_SECRETS_PRESENCE_CHECKS != '' || env.BIGQUERY_CASE_INSENSITIVE_CREDENTIALS_KEY != '')
run: |
$MAVEN test ${MAVEN_TEST} -pl :trino-bigquery -Pcloud-tests-case-insensitive-mapping -Dbigquery.credentials-key="${BIGQUERY_CASE_INSENSITIVE_CREDENTIALS_KEY}"
- name: Cloud Snowflake Tests
env:
SNOWFLAKE_URL: ${{ secrets.SNOWFLAKE_URL }}
SNOWFLAKE_USER: ${{ secrets.SNOWFLAKE_USER }}
SNOWFLAKE_PASSWORD: ${{ secrets.SNOWFLAKE_PASSWORD }}
SNOWFLAKE_DATABASE: ${{ secrets.SNOWFLAKE_DATABASE }}
SNOWFLAKE_ROLE: ${{ secrets.SNOWFLAKE_ROLE }}
SNOWFLAKE_WAREHOUSE: ${{ secrets.SNOWFLAKE_WAREHOUSE }}
if: matrix.modules == 'plugin/trino-snowflake' && !contains(matrix.profile, 'cloud-tests') && (env.SNOWFLAKE_URL != '' && env.SNOWFLAKE_USER != '' && env.SNOWFLAKE_PASSWORD != '')
run: |
$MAVEN test ${MAVEN_TEST} -pl :trino-snowflake -Pcloud-tests \
-Dconnector.name="snowflake" \
-Dsnowflake.test.server.url="${SNOWFLAKE_URL}" \
-Dsnowflake.test.server.user="${SNOWFLAKE_USER}" \
-Dsnowflake.test.server.password="${SNOWFLAKE_PASSWORD}" \
-Dsnowflake.test.server.database="${SNOWFLAKE_DATABASE}" \
-Dsnowflake.test.server.role="${SNOWFLAKE_ROLE}" \
-Dsnowflake.test.server.warehouse="${SNOWFLAKE_WAREHOUSE}"
- name: Iceberg Cloud Tests
id: tests-iceberg
env:
@@ -842,6 +863,7 @@ jobs:
- suite-clickhouse
- suite-mysql
- suite-iceberg
- suite-snowflake
- suite-hudi
- suite-ignite
exclude:
6 changes: 6 additions & 0 deletions core/trino-server/src/main/provisio/trino.xml
@@ -296,6 +296,12 @@
</artifact>
</artifactSet>

<artifactSet to="plugin/snowflake">
<artifact id="${project.groupId}:trino-snowflake:zip:${project.version}">
<unpack />
</artifact>
</artifactSet>

<artifactSet to="plugin/sqlserver">
<artifact id="${project.groupId}:trino-sqlserver:zip:${project.version}">
<unpack />
1 change: 1 addition & 0 deletions docs/src/main/sphinx/connector.md
@@ -38,6 +38,7 @@ Prometheus <connector/prometheus>
Redis <connector/redis>
Redshift <connector/redshift>
SingleStore <connector/singlestore>
Snowflake <connector/snowflake>
SQL Server <connector/sqlserver>
System <connector/system>
Thrift <connector/thrift>
94 changes: 94 additions & 0 deletions docs/src/main/sphinx/connector/snowflake.md
@@ -0,0 +1,94 @@
# Snowflake connector

```{raw} html
<img src="../_static/img/snowflake.png" class="connector-logo">
```

The Snowflake connector allows querying and creating tables in an
external [Snowflake](https://www.snowflake.com/) account. This can be used to join data between
different systems like Snowflake and Hive, or between two different
Snowflake accounts.

## Configuration

To configure the Snowflake connector, create a catalog properties file
in `etc/catalog` named, for example, `example.properties`, to
mount the Snowflake connector as the `example` catalog.
Create the file with the following contents, replacing the
connection properties as appropriate for your setup:

```none
connector.name=snowflake
connection-url=jdbc:snowflake://<account>.snowflakecomputing.com
connection-user=root
connection-password=secret
snowflake.account=account
snowflake.database=database
snowflake.role=role
snowflake.warehouse=warehouse
```
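
Once the catalog is mounted, tables in the configured Snowflake database are
addressable as `example.<schema>.<table>`. A minimal usage sketch, assuming a
hypothetical `public.orders` table exists in the configured database:

```sql
-- List the schemas of the Snowflake database behind the catalog
SHOW SCHEMAS FROM example;

-- Query a hypothetical table through the catalog
SELECT * FROM example.public.orders LIMIT 10;
```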

### Arrow serialization support

This is an experimental feature that adds support for using Apache Arrow
as the serialization format when reading from Snowflake. Note the
following caveats:

- Using Apache Arrow serialization is disabled by default. To enable it,
  add `--add-opens=java.base/java.nio=ALL-UNNAMED` to the Trino
  {ref}`jvm-config`, as shown below.
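
For example, the option as it appears on its own line in `etc/jvm.config`
(a sketch; a real `jvm.config` contains many other JVM options as well):

```none
--add-opens=java.base/java.nio=ALL-UNNAMED
```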

### Multiple Snowflake databases or accounts

The Snowflake connector can only access a single database within
a Snowflake account. Thus, if you have multiple Snowflake databases,
or want to connect to multiple Snowflake accounts, you must configure
multiple instances of the Snowflake connector.
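
For example, to expose two databases from the same account as separate
catalogs, create two catalog files. The file names and database names below
are hypothetical; the remaining connection properties follow the example
above:

```none
# etc/catalog/sales.properties (hypothetical)
connector.name=snowflake
connection-url=jdbc:snowflake://<account>.snowflakecomputing.com
snowflake.database=SALES

# etc/catalog/marketing.properties (hypothetical)
connector.name=snowflake
connection-url=jdbc:snowflake://<account>.snowflakecomputing.com
snowflake.database=MARKETING
```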

% snowflake-type-mapping:

## Type mapping

Trino supports the following Snowflake data types:

| Snowflake Type | Trino Type |
| -------------- | -------------- |
| `boolean` | `boolean` |
| `tinyint` | `bigint` |
| `smallint` | `bigint` |
| `byteint` | `bigint` |
| `int` | `bigint` |
| `integer` | `bigint` |
| `bigint` | `bigint` |
| `float` | `real` |
| `real` | `real` |
| `double` | `double` |
| `decimal` | `decimal(P,S)` |
| `varchar(n)` | `varchar(n)` |
| `char(n)` | `varchar(n)` |
| `binary(n)` | `varbinary` |
| `varbinary` | `varbinary` |
| `date` | `date` |
| `time` | `time` |
| `timestampntz` | `timestamp` |

See the [Snowflake data types](https://docs.snowflake.com/en/sql-reference/intro-summary-data-types.html) documentation for a complete list.
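
As an illustration of the mappings above, a hypothetical Snowflake table
declared with `INTEGER`, `VARCHAR`, and `DECIMAL` columns surfaces in Trino
with the corresponding Trino types (output abbreviated):

```sql
-- Hypothetical table defined in Snowflake:
--   CREATE TABLE items (id INTEGER, name VARCHAR(100), price DECIMAL(10,2));

-- In Trino, the columns map per the table above:
DESCRIBE example.public.items;
-- id     bigint
-- name   varchar(100)
-- price  decimal(10,2)
```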

(snowflake-sql-support)=

## SQL support

The connector provides read access and write access to data and metadata in
a Snowflake database. In addition to the {ref}`globally available
<sql-globally-available>` and {ref}`read operation <sql-read-operations>`
statements, the connector supports the following features:

- {doc}`/sql/insert`
- {doc}`/sql/delete`
- {doc}`/sql/truncate`
- {doc}`/sql/create-table`
- {doc}`/sql/create-table-as`
- {doc}`/sql/drop-table`
- {doc}`/sql/alter-table`
- {doc}`/sql/create-schema`
- {doc}`/sql/drop-schema`
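
A short write-path sketch combining several of the supported statements (the
schema and table names are hypothetical):

```sql
-- Create a schema in the configured Snowflake database
CREATE SCHEMA example.reporting;

-- Create a table and insert a row through the connector
CREATE TABLE example.reporting.daily_totals (
    day DATE,
    total BIGINT
);

INSERT INTO example.reporting.daily_totals
VALUES (DATE '2024-03-04', 42);
```
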
Binary file added docs/src/main/sphinx/static/img/snowflake.png
