How to produce Confluent's Kafka dummy data generator (datagen) messages to Elasticsearch?
Date : March 29 2020, 07:55 AM
In your datagen you've specified format=json, so you're producing JSON data to a Kafka topic. You've not provided your connector properties file, but since you say the connector does work when you use the Avro console producer, I assume you're using Avro deserialisation in the connector. Therefore, either use Avro in datagen, or configure your connector to deserialise the data as JSON.
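To illustrate the second option, here is a sketch of Elasticsearch sink connector properties that deserialise JSON instead of Avro (the connector name, topic name, and Elasticsearch URL are assumptions for illustration):

    name=elasticsearch-sink
    connector.class=io.confluent.connect.elasticsearch.ElasticsearchSinkConnector
    topics=my-datagen-topic
    connection.url=http://localhost:9200
    key.converter=org.apache.kafka.connect.storage.StringConverter
    value.converter=org.apache.kafka.connect.json.JsonConverter
    value.converter.schemas.enable=false

schemas.enable=false is set because plain JSON from datagen carries no schema/payload envelope; with it set to true, the JsonConverter would expect each record to be wrapped in a {"schema": ..., "payload": ...} structure.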
|
Kafka Connect Sink (GCS) only reading from latest offset, configure to read from earliest?
Date : December 23 2020, 07:01 AM
When you create a connector for the first time, it will by default start from the earliest offset. You should see this in the Connect worker log: [2019-08-05 23:31:35,405] INFO ConsumerConfig values:
allow.auto.create.topics = true
auto.commit.interval.ms = 5000
auto.offset.reset = earliest
…
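If the connector has already committed offsets, auto.offset.reset no longer has any effect. Assuming Kafka Connect 2.3 or later with client overrides enabled, one option is a per-connector consumer override (a sketch, not a complete configuration):

    # Connect worker properties (enables per-connector client overrides):
    connector.client.config.override.policy=All

    # Sink connector configuration:
    consumer.override.auto.offset.reset=earliest

Note that auto.offset.reset only applies when the consumer group has no committed offsets, so renaming the connector (its consumer group is connect-<name>, which changes with the name) is a common way to force a fresh read from the beginning.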
|
Build a combined docker image for snowflake-kafka-connector with cp-kafka-connect-base to deploy on kafka connect cluster
Date : March 29 2020, 07:55 AM
Have you tried mounting an external volume with Docker and mapping it to the location where the Snowflake Connector JAR is stored? See https://docs.confluent.io/current/installation/docker/operations/external-volumes.html. For example:

    connect:
      image: confluentinc/kafka-connect-datagen:latest
      build:
        context: .
        dockerfile: Dockerfile
      hostname: connect
      container_name: connect
      depends_on:
        - zookeeper
        - broker
        - schema-registry
      ports:
        - "8083:8083"
      volumes:
        - ~/my-location:/etc/kafka-connect/jars
      environment:
        CONNECT_PLUGIN_PATH: "/usr/share/java,/usr/share/confluent-hub-components,/etc/kafka-connect/jars"
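Alternatively, rather than mounting a volume, you could bake the connector into a combined image with a Dockerfile. A sketch, assuming the connector is published on Confluent Hub under the id snowflakeinc/snowflake-kafka-connector:

    FROM confluentinc/cp-kafka-connect-base:latest
    RUN confluent-hub install --no-prompt snowflakeinc/snowflake-kafka-connector:latest

confluent-hub installs the plugin into /usr/share/confluent-hub-components, which the base image already includes in its plugin path, so no extra volume mapping is needed.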
|
Kafka Connect error : java.util.concurrent.ExecutionException: org.apache.kafka.connect.runtime.rest.errors.BadRequestException
Date : March 29 2020, 07:55 AM
When I changed test.config as below, it worked (JSON format -> regular properties format):

test.config
name=mysql-source-demo-customers
tasks.max=1
connector.class=io.debezium.connector.mysql.MySqlConnector
database.hostname=localhost
database.port=3306
database.user=root
database.password=dsm1234
database.server.id=1234
database.server.name=jin
table.whitelist=demo.customers
database.history.kafka.bootstrap.servers=localhost:9092
database.history.kafka.topic=dbhistory.demo
include.schema.changes=true
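For completeness, the JSON form is still usable, but only when submitted to the Connect REST API of a distributed worker rather than passed as a file to connect-standalone. A sketch, assuming the worker's REST interface listens on localhost:8083:

    curl -X POST -H "Content-Type: application/json" http://localhost:8083/connectors \
      -d '{
        "name": "mysql-source-demo-customers",
        "config": {
          "connector.class": "io.debezium.connector.mysql.MySqlConnector",
          "tasks.max": "1",
          "database.hostname": "localhost",
          "database.port": "3306",
          "database.user": "root",
          "database.password": "dsm1234",
          "database.server.id": "1234",
          "database.server.name": "jin",
          "table.whitelist": "demo.customers",
          "database.history.kafka.bootstrap.servers": "localhost:9092",
          "database.history.kafka.topic": "dbhistory.demo",
          "include.schema.changes": "true"
        }
      }'

In the REST form, the connector name sits at the top level and all other settings go under "config", which is why a raw JSON file handed to connect-standalone is rejected with a BadRequestException.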
|
Why should I use the Docker image "confluentinc/kafka" for a Kafka cluster?
Tag : docker, By : marocchino
Date : March 29 2020, 07:55 AM
If you are running Kafka (and Zookeeper) on your host, then no, you don't need Docker at all. If you do containerise, you should not use localhost in any connection string from a container; instead, use the external service names of the containers and remove network_mode from the configurations.
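As a sketch of that advice (the service name and port here are assumptions), a client running in another container on the same Docker network would address the broker by its service name, not localhost:

    # docker-compose.yml fragment
    services:
      broker:
        image: confluentinc/cp-kafka:latest
        environment:
          KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://broker:9092

    # From another container on the same network:
    #   bootstrap.servers=broker:9092    (not localhost:9092)

Inside a container, localhost resolves to the container itself, which is why connection strings must use the broker's service name as advertised in its listeners.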
|