Avro backward schema evolution throws ClassCastException

I am running into a ClassCastException while trying to test Avro schema evolution with a simple Java program.

Avro version: 1.10.0

customer-v1.avsc

{
  "type": "record",
  "namespace": "com.practice.kafka",
  "name": "Customer",
  "doc": "Avro schema for Customer",
  "fields": [
    {"name":  "first_name", "type":  "string", "doc": "Customer first name"},
    {"name":  "last_name", "type":  "string", "doc": "Customer last name"},
    {"name":  "automated_email", "type":  "boolean", "default": true, "doc": "Receive marketing emails or not"}
  ]
}

customer-v2.avsc

{
  "type": "record",
  "namespace": "com.practice.kafka",
  "name": "CustomerV2",
  "doc": "Avro schema for Customer",
  "fields": [
    {"name":  "first_name", "type":  "string", "doc": "Customer first name"},
    {"name":  "last_name", "type":  "string", "doc": "Customer last name"},
    {"name":  "phone_number", "type":  ["null","boolean"], "default": null, "doc": "Optional phone number"},
    {"name":  "email", "type":  "string", "default":  "missing@example.com", "doc":  "Optional email address"}
  ]
}

Program that serializes a v1 record and deserializes it as v2

package com.practice.kafka;

import org.apache.avro.file.DataFileReader;
import org.apache.avro.file.DataFileWriter;
import org.apache.avro.io.DatumReader;
import org.apache.avro.io.DatumWriter;
import org.apache.avro.specific.SpecificDatumReader;
import org.apache.avro.specific.SpecificDatumWriter;

import java.io.File;
import java.io.IOException;

public class BackwardSchemaEvolutionSample {

    public static void main(String[] args) {

        // Step 1 - Create specific record
        Customer customer = Customer.newBuilder().setFirstName("John").setLastName("Doe").setAutomatedEmail(false).build();

        // Step 2 - Write specific record to a file
        final DatumWriter<Customer> datumWriter = new SpecificDatumWriter<>();
        try (DataFileWriter<Customer> dataFileWriter = new DataFileWriter<>(datumWriter)) {
            dataFileWriter.create(customer.getSchema(), new File("customer-v1.avro"));
            dataFileWriter.append(customer);
        } catch (IOException e) {
            e.printStackTrace();
        }

        // Step 3 - Read specific record from a file
        final File file = new File("customer-v1.avro");
        final DatumReader<CustomerV2> datumReader = new SpecificDatumReader<>();
        CustomerV2 customerRecord;
        try (DataFileReader<CustomerV2> dataFileReader = new DataFileReader<>(file, datumReader)) {
            customerRecord = dataFileReader.next();
            System.out.println(customerRecord.toString());
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}

Result

Exception in thread "main" java.lang.ClassCastException: class com.practice.kafka.Customer cannot be cast to class com.practice.kafka.CustomerV2 (com.practice.kafka.Customer and com.practice.kafka.CustomerV2 are in unnamed module of loader 'app')
    at com.practice.kafka.SchemaEvolutionSample.main(SchemaEvolutionSample.java:34)

Can you tell me how to fix this error?

You have defined two data types, Customer and CustomerV2, and no cast is possible between them because they have no inheritance relationship. Java therefore cannot perform the conversion, and you get the ClassCastException. With your code as it stands, the only solution is to read the record as a Customer and then convert it into a CustomerV2 yourself, as sketched below.
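A minimal sketch of that conversion, assuming the Customer and CustomerV2 classes generated from the two schemas above (the wrapper class name here is arbitrary, and the field-by-field copy is just one way to do it):

package com.practice.kafka;

import org.apache.avro.file.DataFileReader;
import org.apache.avro.io.DatumReader;
import org.apache.avro.specific.SpecificDatumReader;

import java.io.File;
import java.io.IOException;

public class CustomerUpgradeSample {

    public static void main(String[] args) throws IOException {
        // Read the file with the type it was actually written with.
        final File file = new File("customer-v1.avro");
        final DatumReader<Customer> datumReader = new SpecificDatumReader<>(Customer.class);
        try (DataFileReader<Customer> dataFileReader = new DataFileReader<>(file, datumReader)) {
            while (dataFileReader.hasNext()) {
                Customer v1 = dataFileReader.next();
                // Copy the fields into the new type by hand; phone_number and
                // email keep the defaults declared in customer-v2.avsc.
                CustomerV2 v2 = CustomerV2.newBuilder()
                        .setFirstName(v1.getFirstName())
                        .setLastName(v1.getLastName())
                        .build();
                System.out.println(v2);
            }
        }
    }
}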

I assume you are simulating a schema change in a Kafka environment. In that case you would evolve the existing Avro schema by adding new fields or removing old ones.

The Avro schema change will work as long as the class name stays the same.
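For example, a sketch under that assumption: keep "name": "Customer" (same namespace) in the v2 schema, only add the new fields with their defaults, and regenerate the class. The old file can then be read directly; the writer schema comes from the data file header, the reader schema from the regenerated class, and Avro resolves the two.

package com.practice.kafka;

import org.apache.avro.file.DataFileReader;
import org.apache.avro.io.DatumReader;
import org.apache.avro.specific.SpecificDatumReader;

import java.io.File;
import java.io.IOException;

public class BackwardReadSample {

    public static void main(String[] args) throws IOException {
        // Customer is now the class regenerated from the v2 schema, which kept
        // the record name "Customer" and only added fields with defaults.
        final File file = new File("customer-v1.avro");
        final DatumReader<Customer> datumReader = new SpecificDatumReader<>(Customer.class);
        try (DataFileReader<Customer> dataFileReader = new DataFileReader<>(file, datumReader)) {
            while (dataFileReader.hasNext()) {
                // The writer schema stored in the file header is resolved against
                // the reader schema, so the added fields (e.g. phone_number and
                // email) come back with their declared defaults.
                System.out.println(dataFileReader.next());
            }
        }
    }
}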
