
How to rename primary key when using Debezium and Kafka Connect JDBC sink connector to synchronize databases?

I am attempting to synchronize a table in an upstream database to a downstream database using Debezium, following the approach described on the Debezium blog here.

In the downstream table, I only need certain columns from the upstream table. I also wish to change some of the column names (including the name of the primary key). If I do not attempt to rename the primary key, the synchronization works without any issues.

I am using:

  • SQL Server 2019 for both databases; and
  • Debezium 1.3 (but have also tried with Debezium 1.2 with the same results).

I have set out full details of my database and connector setup below.

(1) Database table definitions:

The DDL for the upstream table is:

CREATE TABLE [kafkatest.service1].dbo.Users (
    Id int IDENTITY(1,1) NOT NULL,
    Name nvarchar COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
    CONSTRAINT PK_Users PRIMARY KEY (Id)
)
GO

The DDL for the downstream table is:

CREATE TABLE [kafkatest.service2].dbo.Users (
    LocalId int IDENTITY(1,1) NOT NULL, -- added to avoid IDENTITY_INSERT issue with SQL Server
    ExternalId int NOT NULL,
    ExternalName nvarchar COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
    CONSTRAINT PK_Users PRIMARY KEY (LocalId)
)
GO

In particular, note that the 'Id' column in the upstream table (which is the primary key) should be mapped to the 'ExternalId' column in the downstream table.

(2) Kafka Connect connector definitions:

Source connector:

{
    "name": "users-connector",
    "config": {
        "connector.class": "io.debezium.connector.sqlserver.SqlServerConnector",
        "tasks.max": "1",
        "database.server.name": "sqlserver",
        "database.hostname": "sqlserver",
        "database.port": "1433",
        "database.user": "sa",
        "database.password": "Password!",
        "database.dbname": "kafkatest.service1",
        "database.history.kafka.bootstrap.servers": "kafka:9092",
        "database.history.kafka.topic": "schema-changes.users",
        "table.whitelist": "dbo.Users"
    }
}

Sink connector:

{
    "name": "jdbc-sink",
    "config": {
        "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
        "tasks.max": "1",
        "topics.regex": "sqlserver\\.dbo\\.(Users)",
        "connection.url": "jdbc:sqlserver://sqlserver:1433;databaseName=kafkatest.service2",
        "connection.user": "sa",
        "connection.password": "Password!",
        "transforms": "unwrap,route,RenameField",
        "transforms.unwrap.type": "io.debezium.transforms.ExtractNewRecordState",
        "transforms.unwrap.drop.tombstones": "false",
        "transforms.route.type": "org.apache.kafka.connect.transforms.RegexRouter",
        "transforms.route.regex": "(?:[^.]+)\\.(?:[^.]+)\\.([^.]+)",
        "transforms.route.replacement": "$1",
        "transforms.RenameField.type": "org.apache.kafka.connect.transforms.ReplaceField$Value",
        "transforms.RenameField.renames": "Id:ExternalId,Name:ExternalName",
        "auto.create": "false",
        "auto.evolve": "false",
        "insert.mode": "upsert",
        "delete.enabled": "true",
        "pk.fields": "Id",
        "pk.mode": "record_key"
    }
}

As far as I am aware, "pk.mode" needs to be "record_key" in order for "delete.enabled" to work. I have tried setting "pk.fields" to both "Id" and "ExternalId", and neither works.
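
For reference, the key of each change event on the sqlserver.dbo.Users topic contains only the primary key column of the upstream table. Simplified, and assuming the JSON converter with schemas enabled (the value 1 is illustrative), the record key looks roughly like this:

{
    "schema": {
        "type": "struct",
        "fields": [ { "type": "int32", "optional": false, "field": "Id" } ],
        "name": "sqlserver.dbo.Users.Key"
    },
    "payload": { "Id": 1 }
}

Since "record_key" mode resolves "pk.fields" against this key schema, "Id" is found in the key but has no matching column downstream, while "ExternalId" does not exist in the key at all.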

(3) Error messages:

In the first case (i.e. "pk.fields": "Id"), I get the following error:

2020-08-18 10:16:16,951 INFO   ||  Unable to find fields [SinkRecordField{schema=Schema{INT32}, name='Id', isPrimaryKey=true}] among column names [ExternalId, ExternalName, LocalId]   [io.confluent.connect.jdbc.sink.DbStructure]
2020-08-18 10:16:16,952 ERROR  ||  WorkerSinkTask{id=jdbc-sink-0} Task threw an uncaught and unrecoverable exception. Task is being killed and will not recover until manually restarted. Error: Cannot ALTER TABLE "Users" to add missing field SinkRecordField{schema=Schema{INT32}, name='Id', isPrimaryKey=true}, as the field is not optional and does not have a default value   [org.apache.kafka.connect.runtime.WorkerSinkTask]
org.apache.kafka.connect.errors.ConnectException: Cannot ALTER TABLE "Users" to add missing field SinkRecordField{schema=Schema{INT32}, name='Id', isPrimaryKey=true}, as the field is not optional and does not have a default value

In the second case (i.e. "pk.fields": "ExternalId"), I get the following error:

2020-08-18 10:17:50,192 ERROR  ||  WorkerSinkTask{id=jdbc-sink-0} Task threw an uncaught and unrecoverable exception. Task is being killed and will not recover until manually restarted. Error: PK mode for table 'Users' is RECORD_KEY with configured PK fields [ExternalId], but record key schema does not contain field: ExternalId   [org.apache.kafka.connect.runtime.WorkerSinkTask]
org.apache.kafka.connect.errors.ConnectException: PK mode for table 'Users' is RECORD_KEY with configured PK fields [ExternalId], but record key schema does not contain field: ExternalId

Is it possible to rename a primary key when using Debezium? Or do I always need to structure my database tables so that the primary key name matches in both the upstream and downstream databases?

Try renaming the key field as well, so that the key matches the downstream column name:

"transforms": "unwrap,route,RenameField,RenameKey",
...
"transforms.RenameKey.type": "org.apache.kafka.connect.transforms.ReplaceField$Key",
"transforms.RenameKey.renames": "Id:ExternalId",

When you use "pk.mode": "record_key", the primary key fields from the message key are used to build the upsert statement, so "pk.fields" should then be set to the renamed field, "ExternalId".
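
For completeness, here is a sketch of the full sink connector configuration with the key rename applied (connection details and the other transforms unchanged from the question); note that "pk.fields" now references the renamed field:

{
    "name": "jdbc-sink",
    "config": {
        "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
        "tasks.max": "1",
        "topics.regex": "sqlserver\\.dbo\\.(Users)",
        "connection.url": "jdbc:sqlserver://sqlserver:1433;databaseName=kafkatest.service2",
        "connection.user": "sa",
        "connection.password": "Password!",
        "transforms": "unwrap,route,RenameField,RenameKey",
        "transforms.unwrap.type": "io.debezium.transforms.ExtractNewRecordState",
        "transforms.unwrap.drop.tombstones": "false",
        "transforms.route.type": "org.apache.kafka.connect.transforms.RegexRouter",
        "transforms.route.regex": "(?:[^.]+)\\.(?:[^.]+)\\.([^.]+)",
        "transforms.route.replacement": "$1",
        "transforms.RenameField.type": "org.apache.kafka.connect.transforms.ReplaceField$Value",
        "transforms.RenameField.renames": "Id:ExternalId,Name:ExternalName",
        "transforms.RenameKey.type": "org.apache.kafka.connect.transforms.ReplaceField$Key",
        "transforms.RenameKey.renames": "Id:ExternalId",
        "auto.create": "false",
        "auto.evolve": "false",
        "insert.mode": "upsert",
        "delete.enabled": "true",
        "pk.fields": "ExternalId",
        "pk.mode": "record_key"
    }
}

With the key field renamed to ExternalId, the upsert and delete statements are built against the downstream ExternalId column, so neither of the two errors above should occur.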
