Sink connector for writing data to Redis
This connector expects to receive data with a key of bytes and a value of bytes. If your data is structured, you must use a Transformation to convert it from structured data, such as a Struct, to an array of bytes for the key and value.
This connector supports deletes. It will issue a delete to the Redis cluster for any key that does not have a corresponding value. In Kafka, a record that contains a key and a null value is considered a delete.
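If the records in your topics are already serialized as raw bytes, the standard ByteArrayConverter that ships with Kafka Connect can be used instead of a Transformation. A sketch of the relevant worker or connector properties:

```properties
key.converter=org.apache.kafka.connect.converters.ByteArrayConverter
value.converter=org.apache.kafka.connect.converters.ByteArrayConverter
```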
Importance: High
Type: List
Default Value: [localhost:6379]

Importance: Medium
Type: String
Default Value: Standalone
Validator: ValidEnum{enum=ClientMode, allowed=[Standalone, Cluster]}

Importance: Medium
Type: Int
Default Value: 1

redis.operation.timeout.ms
Importance: Medium
Type: Long
Default Value: 10000
Validator: [100,...]

Importance: Medium
Type: Password
Default Value: [hidden]

Importance: Medium
Type: Boolean
Default Value: false

Importance: Medium
Type: Password
Default Value: [hidden]

Importance: Medium
Type: String

Importance: Medium
Type: Password
Default Value: [hidden]

Importance: Medium
Type: String

Importance: Low
Type: Boolean
Default Value: true

Importance: Low
Type: Int
Default Value: 2147483647

Importance: Low
Type: Int
Default Value: 10000

Importance: Low
Type: Boolean
Default Value: false

Importance: Low
Type: Boolean
Default Value: true

Importance: Low
Type: String
Default Value: JDK
Validator: ValidEnum{enum=RedisSslProvider, allowed=[OPENSSL, JDK]}
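As an example of overriding one of the properties above, the operation timeout (redis.operation.timeout.ms) could be raised from its default of 10000 ms; note the validator requires a value of at least 100:

```properties
redis.operation.timeout.ms=30000
```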
This configuration is typically used with standalone mode.
name=RedisSinkConnector1
connector.class=com.github.jcustenborder.kafka.connect.redis.RedisSinkConnector
tasks.max=1
topics=< Required Configuration >
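As a sketch, a filled-in standalone properties file might look like the following. The topic name is a hypothetical placeholder, and the converter settings assume the records are already serialized as bytes:

```properties
name=RedisSinkConnector1
connector.class=com.github.jcustenborder.kafka.connect.redis.RedisSinkConnector
tasks.max=1
# hypothetical topic name; replace with your own
topics=redis-sink-topic
key.converter=org.apache.kafka.connect.converters.ByteArrayConverter
value.converter=org.apache.kafka.connect.converters.ByteArrayConverter
```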
This configuration is typically used with distributed mode.
Write the following JSON to connector.json, configure all of the required values, and use the command below to post the configuration to one of the distributed Connect worker(s).
{
  "name" : "RedisSinkConnector1",
  "config" : {
    "connector.class" : "com.github.jcustenborder.kafka.connect.redis.RedisSinkConnector",
    "tasks.max" : "1",
    "topics" : "< Required Configuration >"
  }
}
Use curl to post the configuration to one of the Kafka Connect Workers. Change http://localhost:8083/ to the endpoint of one of your Kafka Connect worker(s).
Create a new instance.
curl -s -X POST -H 'Content-Type: application/json' --data @connector.json http://localhost:8083/connectors
Update an existing instance.
curl -s -X PUT -H 'Content-Type: application/json' --data @connector.json http://localhost:8083/connectors/RedisSinkConnector1/config
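After posting the configuration, the standard Kafka Connect REST API can be used to confirm the connector and its task are running; this assumes a worker listening at http://localhost:8083/:

```shell
curl -s http://localhost:8083/connectors/RedisSinkConnector1/status
```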