Databricks
Learn the required and optional properties of creating a Databricks Connection, Credential, Read Connector, and Write Connector.
Prerequisites
- Access to create a Connection
- Databricks account
Connection Properties
The following table describes the fields available when creating a new Databricks Connection. Create a connection using the information below and these step-by-step instructions.
Field | Required | Description |
---|---|---|
Access Type | Required | Specify whether this connection is Read-Only, Write-Only, or Read-Write. |
Connection Name | Required | Input your desired name. |
Server Host Name | Required | Your Databricks instance name. Format: dbc-01234567-89ab.cloud.databricks.com. |
Execution Context for SQL Work | Required | Select the context for SQL work. Currently, the only option is Use Databricks SQL Endpoint. |
Endpoint ID | Required | The Databricks endpoint ID. Format: 0123456789abcdef. This is the last segment of the HTTP Path: /sql/1.0/endpoints/0123456789abcdef. |
Execution Context for Non-SQL Work | Required | The current default option is Use an all-purpose Databricks Cluster. |
Cluster ID | Required | The Databricks Cluster ID for non-SQL work. Format: 0123-012345-zyxwvuts. See the Databricks documentation to learn more. ❗️NOTE: To use Unity Catalog, the all-purpose cluster configured for a Databricks Connection must be set up in Single user access mode. |
Catalog Name | Optional | Name of catalog in the Unity Catalog metastore. |
Requires Credentials | Optional | Enable "Requires Credentials" if this connection is to a private location that requires credentials to be passed through as part of authentication. Generally, this option should be selected for a Databricks Connection. |
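As a sanity check before filling in the form, the Endpoint ID can be derived from the HTTP Path shown in the SQL endpoint's Connection Details in the Databricks UI. The sketch below is illustrative only; the helper name and dictionary keys are not part of any Ascend or Databricks API:

```python
# Sketch: derive the Endpoint ID field from the HTTP Path shown in the
# Databricks SQL endpoint's Connection Details tab.
# Assumption: the HTTP Path has the form /sql/1.0/endpoints/<endpoint-id>.

def endpoint_id_from_http_path(http_path: str) -> str:
    """Return the last path segment, e.g. '0123456789abcdef'."""
    return http_path.rstrip("/").rsplit("/", 1)[-1]

# Illustrative collection of the required Connection fields from the table above.
connection = {
    "connection_name": "my_databricks",  # any name you choose
    "server_host_name": "dbc-01234567-89ab.cloud.databricks.com",
    "endpoint_id": endpoint_id_from_http_path("/sql/1.0/endpoints/0123456789abcdef"),
    "cluster_id": "0123-012345-zyxwvuts",  # all-purpose cluster for non-SQL work
    "requires_credentials": True,
}

print(connection["endpoint_id"])  # -> 0123456789abcdef
```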
Credential Properties
The following table describes the fields available when creating a new Databricks credential.
Field | Required | Description |
---|---|---|
Credential Name | Required | The name used to identify this credential. Once created, the credential is available for selection in future use. |
Credential Type | Required | This field automatically populates with Databricks. |
Access Token | Required | Your Databricks personal access token. See Manage access tokens in the Databricks documentation. |
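For context on what this token is used for: Databricks personal access tokens are passed as a Bearer token in the Authorization header of workspace REST API calls. A minimal sketch (the host, token value, and helper name below are placeholders, not values from this document):

```python
# Sketch: how a Databricks personal access token is typically used -- as a
# Bearer token in the Authorization header. Host and token are placeholders.
import urllib.request

host = "dbc-01234567-89ab.cloud.databricks.com"
token = "dapi0123456789abcdef"  # placeholder; generate one in your Databricks user settings

def authed_request(path: str) -> urllib.request.Request:
    """Build an authenticated request against the workspace REST API."""
    return urllib.request.Request(
        f"https://{host}{path}",
        headers={"Authorization": f"Bearer {token}"},
    )

req = authed_request("/api/2.0/clusters/list")
print(req.get_header("Authorization"))  # -> Bearer dapi0123456789abcdef
```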
Read Connector Properties
The following table describes the fields available when creating a new Databricks Read Connector. Create a new Read Connector using the information below and these step-by-step instructions.
Field | Required | Description |
---|---|---|
Name | Required | Provide a name for your connector. We recommend using lowercase with underscores in place of spaces. |
Description | Optional | Describes the connector. We recommend providing a description if you are ingesting information from the same source multiple times for different reasons. |
Catalog Name | Optional | Name of catalog in the Unity Catalog metastore. This overrides the catalog name specified at the Connection level. |
Database/Schema Name | Required | Name of the schema (database) containing the table to ingest. |
Table Name | Required | Name of the table containing the data to ingest. |
Pagination Size | Optional | Pagination allows Ascend to ingest records by pages. If left unspecified, the default is 100,000 records per page. |
Replication Strategy | Optional | Full Resync, Filter by column range, or Incremental column name. See Database Reading Strategies for more information. |
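To illustrate how the Pagination Size and Replication Strategy fields interact with reads, here is a hedged sketch of the kind of SQL a paged or incremental read could translate into. The function names and query shapes are illustrative only; Ascend generates its own queries internally:

```python
# Sketch: how paged reads and an incremental-column replication strategy
# might translate into SQL. Illustrative only -- not Ascend's actual queries.

def paged_query(schema: str, table: str, page: int, page_size: int = 100_000) -> str:
    """One page of a full-table read (default page size 100,000 records)."""
    offset = page * page_size
    return f"SELECT * FROM {schema}.{table} LIMIT {page_size} OFFSET {offset}"

def incremental_query(schema: str, table: str, column: str, last_seen: str) -> str:
    """Incremental read: only rows whose column value exceeds the last ingested one."""
    return f"SELECT * FROM {schema}.{table} WHERE {column} > '{last_seen}'"

print(paged_query("analytics", "events", page=1))
# -> SELECT * FROM analytics.events LIMIT 100000 OFFSET 100000

print(incremental_query("analytics", "events", "updated_at", "2023-01-01"))
# -> SELECT * FROM analytics.events WHERE updated_at > '2023-01-01'
```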
Write Connector Properties
The following table describes the fields available when creating a new Databricks Write Connector. Create a new Write Connector using the information below and these step-by-step instructions.
Field | Required | Description |
---|---|---|
Name | Required | Provide a name for your connector. We recommend using lowercase with underscores in place of spaces. |
Description | Optional | Describes the connector. We recommend providing a description if you are writing to the same destination multiple times for different reasons. |
Upstream | Required | The name of the previous connector to pull data from. |
Catalog Name | Optional | Name of catalog in the Unity Catalog metastore. This overrides the catalog name specified at the Connection level. |
Database/Schema Name | Required | Name of the schema (database) the data will be written to. |
Table Name | Required | Name of the table to write the data to. |