Debezium with Postgres | Kafka consumer not able to consume any messages

Here is my docker-compose file:

version: '3.7'

services:

  postgres:
    image: debezium/postgres:12
    container_name: postgres
    networks: 
      - broker-kafka
    environment:
      POSTGRES_PASSWORD: admin
      POSTGRES_USER: antriksh
    ports:
      - 5499:5432
      
  zookeeper:
    image: confluentinc/cp-zookeeper:latest
    container_name: zookeeper
    networks: 
      - broker-kafka
    ports:
      - 2181:2181
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
      ZOOKEEPER_TICK_TIME: 2000

  kafka:
    image: confluentinc/cp-kafka:latest
    container_name: kafka
    networks: 
      - broker-kafka
    depends_on:
      - zookeeper
    ports:
      - 9092:9092
    environment:
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:29092,PLAINTEXT_HOST://localhost:9092
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
      KAFKA_INTER_BROKER_LISTENER_NAME: PLAINTEXT
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
      KAFKA_LOG_CLEANER_DELETE_RETENTION_MS: 5000
      KAFKA_BROKER_ID: 1
      KAFKA_MIN_INSYNC_REPLICAS: 1

  connector:
    image: debezium/connect:latest
    container_name: kafka_connect_with_debezium
    networks: 
      - broker-kafka
    ports:
      - "8083:8083"
    environment:
      GROUP_ID: 1
      CONFIG_STORAGE_TOPIC: my_connect_configs
      OFFSET_STORAGE_TOPIC: my_connect_offsets
      BOOTSTRAP_SERVERS: kafka:29092
    depends_on:
      - zookeeper
      - kafka

networks: 
  broker-kafka:
    driver: bridge  
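Once the stack is up (`docker-compose up -d`), it is worth confirming that Postgres is actually configured for logical decoding. The debezium/postgres image sets `wal_level=logical` by default, but a quick check rules it out. A hedged sketch, reusing the container name and credentials from the compose file above (it assumes the stack is running; otherwise it prints a fallback message):

```shell
# Sanity-check logical decoding inside the postgres container.
# Debezium's pgoutput-based streaming requires wal_level=logical.
wal_level=$(docker exec postgres \
  psql -U antriksh -d payment -t -A -c "SHOW wal_level;" 2>/dev/null || true)
echo "wal_level: ${wal_level:-container not reachable}"
```

If this prints anything other than `logical`, the connector cannot stream changes regardless of its own configuration.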

I am able to create a table and insert data into it. I am also able to initialise the connector using the following config:

curl -X POST -H "Accept:application/json" -H "Content-Type:application/json" localhost:8083/connectors/ -d '
{                                                
 "name": "payment-connector",
 "config": {
 "connector.class": "io.debezium.connector.postgresql.PostgresConnector",
 "tasks.max": "1",
 "database.hostname": "postgres",
 "database.port": "5432",
 "database.user": "antriksh",
 "database.password": "admin",
 "database.dbname" : "payment",
 "database.server.name": "dbserver1",
 "database.whitelist": "payment",
 "database.history.kafka.bootstrap.servers": "localhost:9092",
 "database.history.kafka.topic": "schema-changes.payment",
 "publication.name": "mytestpub",
 "publication.autocreate.mode": "all_tables"
 }
}'
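After registering the connector, the Kafka Connect REST API reports whether the connector and its task are actually RUNNING; a FAILED task (with a stack trace in its `trace` field) is a common reason no messages ever appear. A hedged status check, assuming the Connect container is reachable on localhost:8083 as in the compose file:

```shell
# Query the connector's status; a healthy connector reports
# "state":"RUNNING" for the connector and each of its tasks.
status=$(curl -s --max-time 5 \
  http://localhost:8083/connectors/payment-connector/status 2>/dev/null || true)
msg="${status:-connect endpoint not reachable}"
echo "$msg"
```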

I start my Kafka consumer like this:

kafka-console-consumer --bootstrap-server kafka:29092 --from-beginning --topic dbserver1.public.transaction --property print.key=true --property key.separator="-"
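If the consumer shows nothing, one thing to check is whether the topic exists at all: Debezium only creates `dbserver1.public.transaction` once it has emitted at least one event (from the snapshot or a change) for that table. A hedged check from inside the broker container, assuming the container name `kafka` from the compose file:

```shell
# List topics on the broker; dbserver1.public.transaction should
# appear once Debezium has produced at least one event for the table.
topics=$(docker exec kafka \
  kafka-topics --bootstrap-server kafka:29092 --list 2>/dev/null || true)
echo "${topics:-broker not reachable}"
```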

But whenever I insert or update rows in my database, I don't see the messages being relayed to the Kafka consumer.

I have set the config property "publication.autocreate.mode": "all_tables", which should automatically create a publication covering all tables. But when I run select * from pg_publication I see nothing. It's an empty table.

There is a replication slot named debezium, so I don't know which config or step I am missing that is preventing the Kafka consumer from consuming the messages.
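The two server-side artefacts worth inspecting together are the publication (which publication.autocreate.mode set to all_tables should create when the connector first connects) and the replication slot. A hedged pair of queries, again reusing the credentials from the compose file; an empty pg_publication alongside an existing slot suggests the connector connected but never reached the publication-creation step:

```shell
# Inspect the server-side pieces Debezium relies on: the publication
# it should auto-create, and the replication slot it streams from.
pubs=$(docker exec postgres psql -U antriksh -d payment -t -A \
  -c "SELECT pubname FROM pg_publication;" 2>/dev/null || true)
slots=$(docker exec postgres psql -U antriksh -d payment -t -A \
  -c "SELECT slot_name, plugin, active FROM pg_replication_slots;" \
  2>/dev/null || true)
echo "publications: ${pubs:-none or unreachable}"
echo "slots: ${slots:-none or unreachable}"
```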

Source: Docker Questions

