Databricks

Learn the required and optional properties for creating a Databricks Connection, Credential, Read Connector, and Write Connector.

Prerequisites

  • Access to create a Connection
  • Databricks account

Connection Properties

The following table describes the fields available when creating a new Databricks Connection. Create a connection using the information below and these step-by-step instructions.

| Field | Required | Description |
| --- | --- | --- |
| Access Type | Required | Whether this connection is Read-Only, Write-Only, or Read-Write. |
| Connection Name | Required | Input your desired name. |
| Server Host Name | Required | Your Databricks instance name. Format: dbc-01234567-89ab.cloud.databricks.com. |
| Execution Context for SQL Work | Required | Select the context for SQL work. For now, the only option is Use Databricks SQL Endpoint. |
| Endpoint ID | Required | The Databricks endpoint ID. Format: 0123456789abcdef. This is the last part of the HTTP Path: /sql/1.0/endpoints/0123456789abcdef. |
| Execution Context for Non-SQL Work | Required | The current default option is Use an all-purpose Databricks Cluster. |
| Cluster ID | Required | The Databricks Cluster ID for non-SQL work. Format: 0123-012345-zyxwvuts. See the Databricks documentation to learn more. |
| Catalog Name | Optional | Name of the catalog in the Unity Catalog metastore. |
| Requires Credentials | Optional | Enable "Requires Credentials" if this connection is to a private location that requires credentials to be passed through as part of authentication. Generally, this option should be selected for a Databricks Connection. |

❗️NOTE: To use Unity Catalog, the all-purpose cluster configured for the Databricks Connection must be set up with Single user access mode.
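
Before creating the Connection, it can help to confirm that the Server Host Name, Endpoint ID, and your access token line up. Below is a minimal sketch that does this directly from Python using the open-source databricks-sql-connector package (not part of Ascend); the hostname, endpoint ID, and token shown are placeholders.

```python
# Minimal connectivity check with the open-source databricks-sql-connector package
# (pip install databricks-sql-connector). All values below are placeholders.
from databricks import sql

SERVER_HOSTNAME = "dbc-01234567-89ab.cloud.databricks.com"  # Server Host Name
ENDPOINT_ID = "0123456789abcdef"                            # Endpoint ID
HTTP_PATH = f"/sql/1.0/endpoints/{ENDPOINT_ID}"             # the HTTP Path the Endpoint ID comes from
ACCESS_TOKEN = "dapi-example-token"                         # Databricks access token

with sql.connect(
    server_hostname=SERVER_HOSTNAME,
    http_path=HTTP_PATH,
    access_token=ACCESS_TOKEN,
) as connection:
    with connection.cursor() as cursor:
        # If this returns, the host, endpoint, and token are all usable.
        cursor.execute("SELECT current_catalog(), current_schema()")
        print(cursor.fetchone())
```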

Credential Properties

The following table describes the fields available when creating a new Databricks credential.

| Field | Required | Description |
| --- | --- | --- |
| Credential Name | Required | The name used to identify this credential. It will be available as a selection for future use. |
| Credential Type | Required | This field automatically populates with Databricks. |
| Access Token | Required | The Databricks access token. Manage access tokens with Databricks. |
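
If you want to confirm a token is valid before storing it as a Credential, a quick check against the Databricks REST API works. The sketch below uses the requests library; the hostname and token are placeholders.

```python
# Optional token check against the Databricks REST API before saving the Credential.
# Hostname and token are placeholders; requires the requests package.
import requests

SERVER_HOSTNAME = "dbc-01234567-89ab.cloud.databricks.com"
ACCESS_TOKEN = "dapi-example-token"  # the token you plan to store

# The SCIM "Me" endpoint returns the identity the token authenticates as.
response = requests.get(
    f"https://{SERVER_HOSTNAME}/api/2.0/preview/scim/v2/Me",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    timeout=30,
)
response.raise_for_status()  # a 401/403 here means the token is invalid or expired
print("Token authenticates as:", response.json().get("userName"))
```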

Read Connector Properties

The following table describes the fields available when creating a new Databricks Read Connector. Create a new Read Connector using the information below and these step-by-step instructions.

| Field | Required | Description |
| --- | --- | --- |
| Name | Required | Provide a name for your connector. We recommend using lowercase with underscores in place of spaces. |
| Description | Optional | Describes the connector. We recommend providing a description if you are ingesting information from the same source multiple times for different reasons. |
| Catalog Name | Optional | Name of the catalog in the Unity Catalog metastore. This overrides the catalog name specified at the Connection level. |
| Database/Schema Name | Required | Name of the schema (database) containing the data to ingest. |
| Table Name | Required | Name of the table containing the data to ingest. |
| Pagination Size | Optional | Pagination allows Ascend to ingest records by pages. If left unspecified, the default is 100,000 records per page. |
| Replication Strategy | Optional | Full Resync, Filter by column range, or Incremental column name. See Database Reading Strategies for more information. |
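
For intuition, the settings above roughly correspond to the queries issued against the source table: the Replication Strategy controls which rows are selected, and the Pagination Size controls how many records are pulled back at a time. The sketch below is illustrative only (Ascend issues its own queries internally); it reuses the databricks-sql-connector package, and every name and value is a placeholder.

```python
# Illustrative only: what an incremental, paginated read conceptually looks like.
# Ascend performs the real ingest; catalog/schema/table/column names are placeholders.
from databricks import sql

PAGE_SIZE = 100_000  # Pagination Size (the default shown above)

with sql.connect(
    server_hostname="dbc-01234567-89ab.cloud.databricks.com",
    http_path="/sql/1.0/endpoints/0123456789abcdef",
    access_token="dapi-example-token",
) as connection:
    with connection.cursor() as cursor:
        # "Incremental column name" strategy: only rows newer than the last high-water mark.
        cursor.execute(
            "SELECT * FROM my_catalog.my_schema.my_table "
            "WHERE updated_at > '2024-01-01' ORDER BY updated_at"
        )
        # Pagination: pull the result set back PAGE_SIZE records at a time.
        while True:
            page = cursor.fetchmany(PAGE_SIZE)
            if not page:
                break
            print(f"fetched {len(page)} records")
```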

Write Connector Properties

The following table describes the fields available when creating a new Databricks Write Connector. Create a new Write Connector using the information below and these step-by-step instructions.

| Field | Required | Description |
| --- | --- | --- |
| Name | Required | Provide a name for your connector. We recommend using lowercase with underscores in place of spaces. |
| Description | Optional | Describes the connector. We recommend providing a description if you are ingesting information from the same source multiple times for different reasons. |
| Upstream | Required | The name of the previous connector to pull data from. |
| Catalog Name | Optional | Name of the catalog in the Unity Catalog metastore. This overrides the catalog name specified at the Connection level. |
| Database/Schema Name | Required | Name of the schema (database) the data will be written to. |
| Table Name | Required | Name of the table to write the data to. |
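
The Catalog Name, Database/Schema Name, and Table Name fields together resolve to a fully qualified Databricks table (catalog.schema.table). The sketch below shows that target being created and queried directly, purely to illustrate where the data lands; Ascend manages the actual writes, and all names and values are placeholders.

```python
# Illustrative only: the Write Connector's fields resolve to catalog.schema.table.
# Ascend manages the real writes; every name and value here is a placeholder.
from databricks import sql

CATALOG = "my_catalog"  # Catalog Name (or the Connection-level catalog if omitted)
SCHEMA = "my_schema"    # Database/Schema Name
TABLE = "my_table"      # Table Name
FQN = f"{CATALOG}.{SCHEMA}.{TABLE}"

with sql.connect(
    server_hostname="dbc-01234567-89ab.cloud.databricks.com",
    http_path="/sql/1.0/endpoints/0123456789abcdef",
    access_token="dapi-example-token",
) as connection:
    with connection.cursor() as cursor:
        cursor.execute(f"CREATE TABLE IF NOT EXISTS {FQN} (id BIGINT, name STRING)")
        cursor.execute(f"INSERT INTO {FQN} VALUES (1, 'example')")
        cursor.execute(f"SELECT COUNT(*) FROM {FQN}")
        print(f"Rows now in {FQN}:", cursor.fetchone()[0])
```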
