
Apple M1 - Error opening store caused by RocksDBException: Column family not found when joining KStream to KTable

I am trying to leftJoin events from two streams. Initially I joined two KStreams and everything worked fine. However, when I try to convert the second stream to a KTable, I get an error. Here is the code that converts the second stream to a KTable:

@Bean
public KafkaStreams kafkaStreams() throws IOException {
        final Properties props = configureKafkaStreamsProperties();
            
        ObjectMapper mapper = new ObjectMapper();
        mapper.disable(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES);

        final StreamsBuilder builder = new StreamsBuilder();

        // 1st Structured stream
        KStream<String, String> firstStream = builder.stream("topic-1", Consumed.with(Serdes.String(), Serdes.String()));


        KStream<String, String> firstStreamTransformed = firstStream.map((k, v) -> {
                        try {
                                InputModelOne model = mapper.readValue(v, InputModelOne.class);
                                return new KeyValue<>(model.getId(), v);
                        } catch (Exception e) {
                                logger.error(e.toString());
                                return new KeyValue<>(k, v);
                        }

         });


        // Second stream
        KStream<String, String> secondStream = builder.stream("topic-2",
                                Consumed.with(Serdes.String(), Serdes.String()));

        KStream<String, String> secondStreamTransformed = secondStream.map((k, v) -> {
                        try {
                                InputModelTwo model = mapper.readValue(v, InputModelTwo.class);
                                return new KeyValue<>(model.getId(), v);

                        } catch (Exception e) {
                                logger.error(e.toString());
                                return new KeyValue<>(k, v);
                        }
        });

        // Build KTable from second topic
        KTable<String, String> secondTable = secondStreamTransformed.toTable(Materialized.as("topic-2-table"));


        // Valuejoiner
        ValueJoiner<String, String, String> joiner = (one, two) -> {

                try {
                                
                    InputModelOne modelOne = mapper.readValue(one, InputModelOne.class);
                    // In a leftJoin the right-hand value can be null when there is no match
                    InputModelTwo modelTwo = two != null
                            ? mapper.readValue(two, InputModelTwo.class)
                            : null;

                    // Create output object with properties
                    OutputModel out = new OutputModel(modelOne.getId());
                    out.setOneTimestamp(modelOne.getTimestamp());
                    if (modelTwo != null) {
                        out.setTwoTimestamp(modelTwo.getTimestamp());
                    }

                    return mapper.writeValueAsString(out);
                    } catch (JsonProcessingException e) {
                                logger.error(e.toString());
                                return null;
                    }
         };

         KStream<String, String> joined = firstStreamTransformed.leftJoin(secondTable, joiner);


         joined.to("joined-topics", Produced.with(Serdes.String(), Serdes.String()));

Here is the error:

org.apache.kafka.streams.errors.ProcessorStateException: Error opening store joined-topics at location /var/folders/lx/dz_x9j5d7lz4mfymgzkcn7wr0000gn/T/kafka-streams/streams-pipe/2_0/rocksdb/joined-topics
        at org.apache.kafka.streams.state.internals.RocksDBTimestampedStore.openRocksDB(RocksDBTimestampedStore.java:87) ~[kafka-streams-2.8.0.jar:na]
        at org.apache.kafka.streams.state.internals.RocksDBStore.openDB(RocksDBStore.java:186) ~[kafka-streams-2.8.0.jar:na]
        at org.apache.kafka.streams.state.internals.RocksDBStore.init(RocksDBStore.java:254) ~[kafka-streams-2.8.0.jar:na]
        at org.apache.kafka.streams.state.internals.WrappedStateStore.init(WrappedStateStore.java:55) ~[kafka-streams-2.8.0.jar:na]
        at org.apache.kafka.streams.state.internals.ChangeLoggingKeyValueBytesStore.init(ChangeLoggingKeyValueBytesStore.java:55) ~[kafka-streams-2.8.0.jar:na]
        at org.apache.kafka.streams.state.internals.WrappedStateStore.init(WrappedStateStore.java:55) ~[kafka-streams-2.8.0.jar:na]
        at org.apache.kafka.streams.state.internals.CachingKeyValueStore.init(CachingKeyValueStore.java:75) ~[kafka-streams-2.8.0.jar:na]
        at org.apache.kafka.streams.state.internals.WrappedStateStore.init(WrappedStateStore.java:55) ~[kafka-streams-2.8.0.jar:na]
        at org.apache.kafka.streams.state.internals.MeteredKeyValueStore.lambda$init$1(MeteredKeyValueStore.java:122) ~[kafka-streams-2.8.0.jar:na]
        at org.apache.kafka.streams.processor.internals.metrics.StreamsMetricsImpl.maybeMeasureLatency(StreamsMetricsImpl.java:884) ~[kafka-streams-2.8.0.jar:na]
        at org.apache.kafka.streams.state.internals.MeteredKeyValueStore.init(MeteredKeyValueStore.java:122) ~[kafka-streams-2.8.0.jar:na]
        at org.apache.kafka.streams.processor.internals.ProcessorStateManager.registerStateStores(ProcessorStateManager.java:201) ~[kafka-streams-2.8.0.jar:na]
        at org.apache.kafka.streams.processor.internals.StateManagerUtil.registerStateStores(StateManagerUtil.java:103) ~[kafka-streams-2.8.0.jar:na]
        at org.apache.kafka.streams.processor.internals.StreamTask.initializeIfNeeded(StreamTask.java:216) ~[kafka-streams-2.8.0.jar:na]
        at org.apache.kafka.streams.processor.internals.TaskManager.tryToCompleteRestoration(TaskManager.java:433) ~[kafka-streams-2.8.0.jar:na]
        at org.apache.kafka.streams.processor.internals.StreamThread.initializeAndRestorePhase(StreamThread.java:849) ~[kafka-streams-2.8.0.jar:na]
        at org.apache.kafka.streams.processor.internals.StreamThread.runOnce(StreamThread.java:731) ~[kafka-streams-2.8.0.jar:na]
        at org.apache.kafka.streams.processor.internals.StreamThread.runLoop(StreamThread.java:583) ~[kafka-streams-2.8.0.jar:na]
        at org.apache.kafka.streams.processor.internals.StreamThread.run(StreamThread.java:556) ~[kafka-streams-2.8.0.jar:na]
Caused by: org.rocksdb.RocksDBException: Column family not found: keyValueWithTimestamp
        at org.rocksdb.RocksDB.open(Native Method) ~[rocksdbjni-6.29.4.1.jar:na]
        at org.rocksdb.RocksDB.open(RocksDB.java:306) ~[rocksdbjni-6.29.4.1.jar:na]
        at org.apache.kafka.streams.state.internals.RocksDBTimestampedStore.openRocksDB(RocksDBTimestampedStore.java:75) ~[kafka-streams-2.8.0.jar:na]
        ... 18 common frames omitted

I am using Docker for Kafka and Zookeeper, and Kafka is running locally. Any help or suggestions would be greatly appreciated. I really hope I can keep using my Mac instead of switching to a lower-quality machine. Cheers, folks!

Your daily reminder to update all dependencies and restart the IDE:

<dependencies>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-web</artifactId>
        </dependency>
        <!-- https://mvnrepository.com/artifact/org.apache.kafka/kafka-streams -->
        <dependency>
            <groupId>org.apache.kafka</groupId>
            <artifactId>kafka-streams</artifactId>
            <!-- <version>2.8.0</version> -->
            <version>3.2.3</version>
        </dependency>
        <dependency>
            <groupId>org.apache.kafka</groupId>
            <artifactId>kafka-clients</artifactId>
            <!-- <version>2.8.0</version> -->
            <version>3.2.3</version>
        </dependency>

      
        <dependency>
            <groupId>io.confluent</groupId>
            <artifactId>kafka-streams-avro-serde</artifactId>
            <!-- <version>7.0.1</version> -->
            <version>7.2.1</version>
        </dependency>
        <dependency>
            <groupId>io.confluent</groupId>
            <artifactId>kafka-schema-serializer</artifactId>
            <!-- <version>7.0.1</version> -->
            <version>7.2.1</version>
        </dependency>
        <!-- https://mvnrepository.com/artifact/org.rocksdb/rocksdbjni -->
        <dependency>
            <groupId>org.rocksdb</groupId>
            <artifactId>rocksdbjni</artifactId>
            <!-- <version>6.29.4.1</version> -->
            <version>7.5.3</version>
        </dependency>
        <!-- https://mvnrepository.com/artifact/com.squareup.okhttp3/okhttp -->
        <dependency>
            <groupId>com.squareup.okhttp3</groupId>
            <artifactId>okhttp</artifactId>
            <version>4.10.0</version>
        </dependency>
         <dependency>
            <groupId>org.springframework.kafka</groupId>
            <artifactId>spring-kafka</artifactId>
            <version>2.9.0</version>
        </dependency>
    </dependencies>
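Besides the dependency upgrade (older rocksdbjni builds predate Apple Silicon support, and state files written by one RocksDB version may not reopen cleanly under another), it often also helps to wipe the stale local state directory so Kafka Streams rebuilds the store from its changelog topic on restart. As a sketch, with a temporary directory standing in for the real `state.dir` location (the `/var/folders/.../kafka-streams/streams-pipe` path in the stack trace above):

```shell
# Simulated walkthrough: create and then wipe a Kafka Streams local state
# directory. The real path comes from the state.dir config (by default under
# the JVM's java.io.tmpdir); the mktemp dir here is only a stand-in so the
# example is safe to run.
STATE_DIR="$(mktemp -d)/kafka-streams/streams-pipe"
mkdir -p "$STATE_DIR/2_0/rocksdb/joined-topics"   # mimic the on-disk layout

# Stop the application first, then remove the stale RocksDB files;
# Kafka Streams restores the store from its changelog topic on restart.
rm -rf "$STATE_DIR"
[ ! -d "$STATE_DIR" ] && echo "state cleared"
```

Equivalently, calling `KafkaStreams#cleanUp()` before `start()` removes the instance's task directories programmatically; it is only safe while the instance is not running.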
