Kafka
Learn the required and optional properties of creating a Kafka Connection, Credential, Read Connector, and Write Connector.
If your Kafka cluster is in a private subnet or VPC and is not accessible from the public internet, please contact [email protected] to set up networking.
Prerequisites
- Access credentials
- Kafka topic
- Consumer configurations
Connection Properties
The following table describes the fields available when creating a new Kafka Connection. Create a connection using the information below and these step-by-step instructions.
Field | Required | Description |
---|---|---|
Access Type | Required | Select whether this Connection is Read-Only, Write-Only, or Read-Write. |
Connection Name | Required | Input your desired name. |
Kafka Cluster Identifier | Required | Must be unique across the entire Ascend environment. Ascend uses this identifier to store data read from a specific Kafka cluster (e.g., a dev cluster) in the correct location in Ascend. |
Bootstrap Servers | Required | A comma-separated list of host and port pairs addressing the brokers in a "bootstrap" Kafka cluster that a Kafka client connects to initially. These initial hosts are the starting point for the client to discover the full set of live servers in the cluster. Each host and port pair uses : as the separator, for example: localhost:9092 or localhost:9092,another.host:9092 |
Consumer Group ID | Optional | Consumers join a consumer group via a group ID. Ascend assigns a default one if none is provided. |
Skip Data Loss Errors | Optional | By default, Ascend displays an error and pauses processing if it encounters data loss. Selecting "Skip Data Loss Errors" lets Ascend ignore data loss and continue processing. |
Consumer Configs | Optional | An open set of configs in Kafka property style to pass down to the consumer, such as security.protocol=SASL_SSL. See Kafka Consumer Configurations to learn more. |
Requires Credentials | Optional | Check this box to create a new credential or select an existing credential. |
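To illustrate the expected shapes of the Bootstrap Servers and Consumer Configs fields, here is a minimal sketch in Python. The helper names (`parse_bootstrap_servers`, `parse_consumer_configs`) are hypothetical and exist only for illustration; they parse the same `host:port` list and `key=value` property style described above, without connecting to any broker.

```python
def parse_bootstrap_servers(value):
    """Split a comma-separated host:port list into (host, port) pairs.

    Hypothetical helper: mirrors the Bootstrap Servers field format,
    e.g. "localhost:9092,another.host:9092".
    """
    pairs = []
    for entry in value.split(","):
        # rpartition tolerates hostnames; the port follows the last colon.
        host, _, port = entry.strip().rpartition(":")
        pairs.append((host, int(port)))
    return pairs


def parse_consumer_configs(text):
    """Parse Kafka property-style lines (key=value) into a dict.

    Hypothetical helper: mirrors the Consumer Configs field format,
    e.g. "security.protocol=SASL_SSL".
    """
    configs = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):  # skip blanks and comments
            continue
        key, _, value = line.partition("=")
        configs[key.strip()] = value.strip()
    return configs


print(parse_bootstrap_servers("localhost:9092,another.host:9092"))
print(parse_consumer_configs("security.protocol=SASL_SSL\nsasl.mechanism=PLAIN"))
```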
Credential Properties
The following table describes the fields available when creating a new Kafka credential.
Field | Required | Description |
---|---|---|
Credential Name | Required | The name to identify this credential with. This credential will be available as a selection for future use. |
Credential Type | Required | This field automatically populates with Kafka. |
SASL.JAAS.CONFIG | Required | The sasl.jaas.config value from your Kafka configuration. See the format notes below. |
Format of the SASL.JAAS.CONFIG Credential
The sample below shows how the sasl.jaas.config key/value pair looks in an actual Kafka config file.
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
username="alice" \
password="alice-secret";
When entering the credential into the Ascend SASL.JAAS.CONFIG credential entry box (Figure 3), you must enter it without the preceding sasl.jaas.config= text. Only enter what comes after that text as the actual value in the credential (seen as obfuscated dots in Figure 3).
If you fail to format the credential correctly, you will likely see error messages about a missing LoginModule or other similar errors when testing the connection.
The correct format should look like this:
org.apache.kafka.common.security.plain.PlainLoginModule required
username="alice"
password="alice-secret";
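As a sanity check on the formatting rules above, here is a small Python sketch. The function name `check_jaas_value` is hypothetical; it only encodes the two requirements stated in this section: drop the leading `sasl.jaas.config=` prefix, and keep the trailing semicolon.

```python
def check_jaas_value(value):
    """Return a list of formatting problems with a SASL.JAAS.CONFIG value.

    Hypothetical helper that encodes the rules described above; an empty
    list means the value looks correctly formatted.
    """
    problems = []
    v = value.strip()
    if v.startswith("sasl.jaas.config="):
        problems.append("remove the leading 'sasl.jaas.config=' prefix")
    if not v.endswith(";"):
        problems.append("the value must end with a semicolon")
    if "LoginModule" not in v:
        problems.append("expected a LoginModule class name")
    return problems


good = ('org.apache.kafka.common.security.plain.PlainLoginModule required '
        'username="alice" password="alice-secret";')
bad = "sasl.jaas.config=" + good

print(check_jaas_value(good))  # no problems
print(check_jaas_value(bad))   # flags the leading prefix
```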
Read Connector Properties
The following table describes the fields available when creating a new Kafka Read Connector. Create a new Read Connector using the information below and these step-by-step instructions.
Field | Required | Description |
---|---|---|
Name | Required | Provide a name for your connector. We recommend using lowercase with underscores in place of spaces. |
Description | Optional | Describes the connector. We recommend providing a description if you are ingesting information from the same source multiple times for different reasons. |
Subscription | Required | Single Topic: the topic to subscribe to. Multi-Topic: provide a topic pattern (a regex) to subscribe to all matching topics. |
Key Deserializer | Required | Producers use serializers to convert a message into a byte array before transmitting it to the broker; consumers use deserializers to convert the byte array back into an object. String Deserializer: converts a byte array containing a string into a string. Binary Deserializer (B64 Encoded): converts a byte array containing non-string formats (such as Avro) to binary and then base64-encodes it. |
Value Deserializer | Required | Options are the same as Key Deserializer. String Deserializer: converts a byte array containing a string into a string. Binary Deserializer (B64 Encoded): converts a byte array containing non-string formats (such as Avro) to binary and then base64-encodes it. |
Starting Position | Optional | Choose to read all messages from the beginning (Earliest) or only messages after the last one (Latest). |
Consumer Configs | Optional | Override the configs at the connection level for this connector. |
Data Version | Optional | Assign a Data Version. A change to Data Version triggers the component to reprocess. It results in no longer using data previously ingested by this Connector, and a complete ingest of new data. |
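The two deserializer options in the table above can be sketched in Python as follows. The function names are hypothetical illustrations of the described behavior, not Ascend internals: a String Deserializer decodes the byte array as text, while a Binary Deserializer (B64 Encoded) base64-encodes the raw bytes so non-string payloads survive as text.

```python
import base64


def string_deserializer(raw: bytes) -> str:
    # Byte array containing UTF-8 text -> string
    return raw.decode("utf-8")


def binary_deserializer_b64(raw: bytes) -> str:
    # Byte array with non-string payloads (e.g. Avro) -> base64 text
    return base64.b64encode(raw).decode("ascii")


print(string_deserializer(b"hello"))         # hello
print(binary_deserializer_b64(b"\x00\x01"))  # AAE=
```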
Write Connector Properties
The following table describes the fields available when creating a new Kafka Write Connector. Create a new Write Connector using the information below and these step-by-step instructions.
Field | Required | Description |
---|---|---|
Name | Required | Provide a name for your connector. We recommend using lowercase with underscores in place of spaces. |
Description | Optional | Describes the connector. We recommend providing a description if you are ingesting information from the same source multiple times for different reasons. |
Upstream | Required | The name of the previous connector the Write Connector will pull data from. |
Subscription | Required | Single Topic: the topic to subscribe to. Multi-Topic: provide a topic pattern (a regex) to subscribe to all matching topics. |
Key Deserializer | Required | Producers use serializers to convert a message into a byte array before transmitting it to the broker; consumers use deserializers to convert the byte array back into an object. String Deserializer: converts a byte array containing a string into a string. Binary Deserializer (B64 Encoded): converts a byte array containing non-string formats (such as Avro) to binary and then base64-encodes it. |
Value Deserializer | Required | Options are the same as Key Deserializer. String Deserializer: converts a byte array containing a string into a string. Binary Deserializer (B64 Encoded): converts a byte array containing non-string formats (such as Avro) to binary and then base64-encodes it. |
Starting Position | Optional | Choose to read all messages from the beginning (Earliest) or only messages after the last one (Latest). |
Producer Configs | Optional | Override the configs at the connection level for this connector. |
Data Version | Optional | Assign a Data Version. A change to Data Version triggers the component to reprocess. It results in no longer using data previously ingested by this Connector, and a complete ingest of new data. |
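The Multi-Topic subscription option above matches topic names against a regular expression. Here is a minimal Python sketch of that matching behavior; the helper name and sample topic names are hypothetical.

```python
import re


def matching_topics(pattern, topics):
    """Return the topics whose full name matches the subscription regex.

    Hypothetical helper: fullmatch is used so the pattern must cover the
    whole topic name, not just a prefix.
    """
    rx = re.compile(pattern)
    return [t for t in topics if rx.fullmatch(t)]


topics = ["orders.us", "orders.eu", "audit.log"]
print(matching_topics(r"orders\..*", topics))  # ['orders.us', 'orders.eu']
```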