# Kafka

## Overview
The Airbyte Kafka destination allows you to sync data to Kafka. Each stream is written to the corresponding Kafka topic.
## Prerequisites
- For Airbyte Open Source users using the Postgres source connector, upgrade your Airbyte platform to version `v0.40.0-alpha` or newer and upgrade your Kafka connector to version `0.1.10` or newer.
## Sync overview

### Output schema
Each stream will be output into a Kafka topic.
Currently, this connector writes data in JSON format only. More formats (e.g., Apache Avro) will be supported in the future.
Each record's key contains the UUID assigned by Airbyte, and its value contains these 4 fields:

- `_airbyte_ab_id`: a UUID assigned by Airbyte to each event that is processed.
- `_airbyte_emitted_at`: a timestamp representing when the event was pulled from the data source.
- `_airbyte_data`: a JSON blob representing the event data.
- `_airbyte_stream`: the name of each record's stream.
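The value layout described above can be sketched as follows. This is an illustrative reconstruction, not connector code; the helper name, sample stream, and payload are invented for the example:

```python
import json
import time
import uuid

def build_record_value(stream: str, data: dict) -> str:
    """Build a JSON value in the shape Airbyte writes to Kafka (sketch)."""
    return json.dumps({
        "_airbyte_ab_id": str(uuid.uuid4()),             # UUID assigned to this event
        "_airbyte_emitted_at": int(time.time() * 1000),  # timestamp (epoch millis)
        "_airbyte_data": data,                           # the event payload itself
        "_airbyte_stream": stream,                       # name of the record's stream
    })

# Hypothetical "users" stream record:
value = build_record_value("users", {"id": 1, "email": "a@example.com"})
print(sorted(json.loads(value).keys()))
```

A consumer reading the topic would parse the value as JSON and pick the payload out of `_airbyte_data`.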
### Features
| Feature | Supported? (Yes/No) | Notes |
| --- | --- | --- |
| Full Refresh Sync | No | |
| Incremental - Append Sync | Yes | |
| Incremental - Append + Deduped | No | |
| Namespaces | Yes | |
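Because namespaces are supported, a stream's destination topic is typically derived from both its namespace and its name. A minimal sketch of such a mapping, assuming a topic pattern with `{namespace}` and `{stream}` placeholders (the pattern syntax and function name here are illustrative assumptions, not the connector's exact configuration):

```python
def resolve_topic(pattern: str, namespace: str, stream: str) -> str:
    """Substitute namespace/stream placeholders into a topic pattern (sketch)."""
    return pattern.replace("{namespace}", namespace).replace("{stream}", stream)

# A "users" stream in the "public" namespace lands in topic "public.users":
print(resolve_topic("{namespace}.{stream}", "public", "users"))
```

Keeping the namespace in the topic name prevents collisions when two namespaces contain streams with the same name.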
## Getting started

### Requirements
To use the Kafka destination, you'll need:
- A Kafka cluster running version 1.0 or above.