Snowflake

Learn the required and optional properties for creating a Snowflake Connection, Credential, Read Connector, and Write Connector.

Prerequisites

  • Access credentials
  • Snowflake account
  • Warehouse and Database name

Connection Properties

The following table describes the fields available when creating a new Snowflake Connection. Create a connection using the information below and these step-by-step instructions.

| Field | Required | Description |
| --- | --- | --- |
| Access Type | Required | This connection type is Read-Only, Write-Only, or Read-Write. |
| Connection Name | Required | Input your desired name. |
| Description | Optional | Add a description of this Connection. |
| Account | Required | Specifies your account identifier. It is common to use the "Account Locator in a Region" format, a more concise form `<account_locator>.<region_id>.<cloud>`, such as `companyaccount.europe-west4.gcp`. DO NOT add the ".snowflakecomputing.com" suffix. |
| Warehouse | Required | Name of the warehouse to use. Each individual Read and Write Connector can override which warehouse to use. |
| Database Name | Required | Name of the default database to use. Each individual Read and Write Connector can override which database to use. |
| Optional Role Name | Optional | A role to use for the connection to Snowflake. If not specified, the user's default role is used. |
| Requires Credentials | Optional | When the Requires Credentials checkbox is selected, choose from existing credentials or create a new credential for connecting to Snowflake. |
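As a minimal sketch of the Account field's rules above, the helper below strips a pasted ".snowflakecomputing.com" suffix and checks the `<account_locator>.<region_id>.<cloud>` shape. The function name and regex are illustrative assumptions, not part of any Ascend or Snowflake API.

```python
import re

# Illustrative pattern for an account identifier: an account locator
# optionally followed by dot-separated region/cloud segments.
ACCOUNT_RE = re.compile(r"^[A-Za-z0-9_]+(\.[A-Za-z0-9-]+)*$")

def normalize_account(value: str) -> str:
    """Return the bare account identifier, rejecting obviously bad input.

    Hypothetical helper: strips the ".snowflakecomputing.com" suffix if a
    full hostname was pasted in, then validates the remaining identifier.
    """
    account = value.strip().lower()
    suffix = ".snowflakecomputing.com"
    if account.endswith(suffix):
        account = account[: -len(suffix)]
    if not ACCOUNT_RE.match(account):
        raise ValueError(f"not a valid account identifier: {value!r}")
    return account
```

For example, `normalize_account("companyaccount.europe-west4.gcp.snowflakecomputing.com")` yields the bare identifier the Account field expects.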

Credential Properties

The following table describes the fields available when creating a new Snowflake credential.

| Field | Required | Description |
| --- | --- | --- |
| Credential Name | Required | The name to identify this credential. This credential will be available as a selection for future use. |
| Credential Type | Required | This field will automatically populate with Snowflake. |
| User | Required | Snowflake username to connect with. |
| Password | Required | Snowflake password to connect with. |

📘

Warehouse, Database, and Role Name Overrides

If you have multiple Warehouse, Database, and Role Names within Snowflake that share a single credential, you can either

  • Create a new Connection and reuse the Credential, or
  • Use a single Connection and override the warehouse, database, and/or role name within each Read Connector and Write Connector.

Read Connector Properties

The following table describes the fields available when creating a new Snowflake Read Connector. Create a new Read Connector using the information below and these step-by-step instructions.

| Field | Required | Description |
| --- | --- | --- |
| Name | Required | Provide a name for your connector. We recommend using lowercase with underscores in place of spaces. |
| Description | Optional | Describes the connector. We recommend providing a description if you are ingesting information from the same source multiple times for different reasons. |
| Override Warehouse Name | Optional | Controls which warehouse Ascend will use for ingestion of the source table. This will override the Connection warehouse. |
| Override Database Name | Optional | Changes which database is used when reading the table. |
| Override Optional Role Name | Optional | Controls which database role is used when making the connection to read a specific table. |
| Table Name | Required | Name of the table to ingest. The table name is case-sensitive. Enclose it in double quotes if the table name is not all upper case. |
| Schema Name | Optional | The name of the Snowflake Schema. |
| Replication Strategy | Optional | Full Resync, Filter by column range, or Incremental column name. See Database Reading Strategies for more information. |
| Data Version | Optional | A change to Data Version discards data previously ingested by this Connector and triggers a complete ingest of new data. |
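The Table Name rule above follows from Snowflake folding unquoted identifiers to upper case, so any name that is not already all upper case must be double-quoted. A minimal sketch of that rule, with an illustrative helper name:

```python
def quote_table_name(name: str) -> str:
    """Quote a Snowflake table name only when quoting is needed.

    Hypothetical helper: an all-upper-case name can be used as-is, since
    Snowflake folds unquoted identifiers to upper case anyway; anything
    else must be wrapped in double quotes to preserve its case.
    """
    if name.isupper():
        return name
    # Double any embedded quotes, then wrap the whole identifier.
    return '"' + name.replace('"', '""') + '"'
```

So `MY_TABLE` passes through unchanged, while `MyTable` becomes `"MyTable"`.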

Write Connector Properties

The following table describes the fields available when creating a new Snowflake Write Connector. Create a new Write Connector using the information below and these step-by-step instructions.

| Field | Required | Description |
| --- | --- | --- |
| Name | Required | Provide a name for your connector. We recommend using lowercase with underscores in place of spaces. |
| Description | Optional | Describes the connector. We recommend providing a description if you are ingesting information from the same source multiple times for different reasons. |
| Upstream | Required | The name of the previous connector to pull data from. |
| Override Warehouse Name | Optional | Controls which warehouse Ascend will use when writing the table. This will override the Connection warehouse. |
| Override Database Name | Optional | Changes which database is used when writing the table. |
| Override Optional Role Name | Optional | Controls which database role is used when making the connection to write a specific table. |
| Table Name | Required | Name of the table to write to. The table name is case-sensitive. Enclose it in double quotes if the table name is not all upper case. |
| Schema Name | Optional | The name of the Snowflake Schema. |
| On Schema Mismatch | Optional | Select how you want Ascend to handle writing data if the schema indicated above does not match the schema of the table in Snowflake. Options: Skip schema check, Stop and display error, Recreate table, Alter table. |
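To make the four On Schema Mismatch options concrete, the sketch below maps each option to the kind of action it implies. The option strings follow the table above; the function and the SQL comments are illustrative assumptions, not Ascend's actual implementation.

```python
def plan_on_mismatch(option: str, table: str) -> str:
    """Describe the action implied by each On Schema Mismatch option.

    Hypothetical dispatcher; the returned strings only illustrate the
    behavior each option selects.
    """
    plans = {
        "Skip schema check": f"-- write to {table} without comparing schemas",
        "Stop and display error": f"-- abort the write to {table} and surface an error",
        "Recreate table": f"DROP TABLE IF EXISTS {table}; -- then CREATE with the new schema",
        "Alter table": f"-- issue ALTER TABLE {table} statements to reconcile columns",
    }
    try:
        return plans[option]
    except KeyError:
        raise ValueError(f"unknown option: {option!r}") from None
```

Note that "Recreate table" is destructive (the existing table and its data are replaced), while "Alter table" tries to evolve the table in place.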

Schema Overrides

The following fields are used when overriding the schema of an existing table within Snowflake.

| Field | Required | Description |
| --- | --- | --- |
| Clustering Keys | Optional | Columns to use as the clustering key of the written table in Snowflake. |
| Keep Column Case | Optional | Do not modify the column names to upper case. Column case will change to upper case by default. |
| Max Number of Parallel Ascend Partitions | Optional | The maximum number of partitions Ascend will write in parallel. |
| Load Timestamp Column Name | Optional | When this field is provided, adds a column containing the load timestamp. |
| Disable Transactional Write | Optional | Transactional Write is enabled by default. |
| A SQL Statement for Ascend to Execute Before Writing | Optional | Execute a pre-processing statement before writing to the final table. |
| A SQL Statement for Ascend to Execute After Writing | Optional | Execute a post-processing statement after writing to the final table. |
| Data Version | Optional | A change to Data Version discards data previously ingested by this Connector and triggers a complete ingest of new data. |
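The last two fields accept arbitrary SQL to run around the write. As a hedged sketch of the kind of statements they might hold, the helper below pairs a pre-write cleanup with a post-write grant; the table name, column name, and SQL are entirely illustrative, and Ascend simply executes whatever statement you supply.

```python
def write_hooks(table: str, ts_column: str = "load_ts") -> dict:
    """Return example before/after SQL statements for a write connector.

    Hypothetical example values only: prune rows older than 30 days before
    the write, then grant read access after it completes.
    """
    return {
        "before": (
            f"DELETE FROM {table} "
            f"WHERE {ts_column} < DATEADD(day, -30, CURRENT_TIMESTAMP())"
        ),
        "after": f"GRANT SELECT ON TABLE {table} TO ROLE ANALYST",
    }
```

Combined with the Load Timestamp Column Name field above, a pattern like this keeps a rolling window of data in the target table.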