Snowflake connector#

The Snowflake connector allows querying and creating tables in an external Snowflake account. This can be used to join data between different systems like Snowflake and Hive, or between two different Snowflake accounts.

Configuration#

To configure the Snowflake connector, create a catalog properties file in etc/catalog named, for example, example.properties, to mount the Snowflake connector as the example catalog. Create the file with the following contents, replacing the connection properties as appropriate for your setup:

connector.name=snowflake
connection-url=jdbc:snowflake://<account>.snowflakecomputing.com
connection-user=root
connection-password=secret
snowflake.account=account
snowflake.database=database
snowflake.role=role
snowflake.warehouse=warehouse
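
Once the catalog is mounted, Snowflake tables can be queried with standard Trino SQL. The following sketch assumes the catalog is named example (matching the properties file above) and uses an illustrative schema and table; replace public and orders with names that exist in your Snowflake database:

SHOW SCHEMAS FROM example;

-- List tables in an assumed schema named public
SHOW TABLES FROM example.public;

-- Query an assumed table named orders through the Snowflake catalog
SELECT *
FROM example.public.orders
LIMIT 10;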

Arrow serialization support#

This is an experimental feature which introduces support for using Apache Arrow as the serialization format when reading from Snowflake. Please note there are a few caveats:

  • Using Apache Arrow serialization is disabled by default. To enable it, add --add-opens=java.base/java.nio=ALL-UNNAMED to the Trino JVM config, as shown below.
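
For instance, assuming a standard installation layout, the option goes on its own line in etc/jvm.config alongside the existing JVM options:

--add-opens=java.base/java.nio=ALL-UNNAMED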

Multiple Snowflake databases or accounts#

The Snowflake connector can only access a single database within a Snowflake account. Thus, if you have multiple Snowflake databases, or want to connect to multiple Snowflake accounts, you must configure multiple instances of the Snowflake connector.
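
For example, to expose two different Snowflake databases as separate catalogs, create two properties files in etc/catalog, each with its own connection settings. The file names, account identifiers, and database names below are placeholders, and each file also needs the remaining connection properties shown earlier:

etc/catalog/sales.properties:

connector.name=snowflake
connection-url=jdbc:snowflake://<account1>.snowflakecomputing.com
snowflake.database=sales_db

etc/catalog/analytics.properties:

connector.name=snowflake
connection-url=jdbc:snowflake://<account2>.snowflakecomputing.com
snowflake.database=analytics_db

Each catalog is then addressed by its file name in queries, for example sales.schema_name.table_name or analytics.schema_name.table_name.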

Type mapping#

Trino supports the following Snowflake data types:

Snowflake Type    Trino Type
boolean           boolean
tinyint           bigint
smallint          bigint
byteint           bigint
int               bigint
integer           bigint
bigint            bigint
float             real
real              real
double            double
decimal           decimal(P,S)
varchar(n)        varchar(n)
char(n)           varchar(n)
binary(n)         varbinary
varbinary         varbinary
date              date
time(n)           time(n)
timestampntz      timestamp

See the Snowflake documentation for a complete list of Snowflake data types.
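
To see how these mappings surface in practice, you can inspect the Trino view of a Snowflake table's schema. The catalog, schema, and table names below are assumptions:

-- Snowflake int and integer columns appear as bigint,
-- and decimal columns keep their precision and scale as decimal(P,S)
DESCRIBE example.public.orders;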

SQL support#

The connector provides read access and write access to data and metadata in a Snowflake database. In addition to the globally available and read operation statements, the connector supports the following features:
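
As a hedged illustration of that read and write access, the following statements run against assumed catalog, schema, and table names; the write succeeds only where the connector's write support covers the statement:

-- Read data from an assumed orders table
SELECT order_id, total_price
FROM example.public.orders
WHERE total_price > 100;

-- Write data back through the connector's write access
INSERT INTO example.public.orders_archive
SELECT *
FROM example.public.orders;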

Performance#

The connector includes a number of performance improvements, detailed in the following sections.

Pushdown#

The connector supports pushdown for a number of operations:

  • Aggregate pushdown for the following functions:

Note

The connector performs pushdown where performance may be improved, but in order to preserve correctness an operation may not be pushed down. When pushdown of an operation may result in better performance but risks correctness, the connector prioritizes correctness.
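
One way to check whether a given operation is pushed down is to inspect the query plan with EXPLAIN. If aggregate pushdown applies (and count() is among the supported functions), the aggregation is absorbed into the connector's table scan instead of appearing as a separate Trino operator. The schema and table names below are assumptions:

EXPLAIN
SELECT count(*)
FROM example.public.orders;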