Context:
I have multiple on-premise applications publishing real-time messages to enterprise Kafka topics and Solace queues. The volume and velocity of these messages are considerable. The messages are consumed by message processing modules and stored in a real-time data store, currently hosted on premise. We are planning to move the message processing modules and the real-time data store to GCP.
Problem statement:
As the message processing modules and the real-time data store move to GCP, there is a need to publish/push messages from the on-premise Kafka topics and Solace queues to GCP Pub/Sub topics.
For this I was planning to write a NiFi flow on the on-premise NiFi cluster. It would be great if somebody who has already attempted something similar could share their thoughts.
Kindly let me know if additional information is required.
I could not find a similar question already posted here; if there is one, kindly point me to it.
Thanks in advance!
I've worked briefly with both Kafka and GCP Pub/Sub. I haven't worked much with Solace, but from what I know you may have to make a small code change in the nifi-jms-bundle to customize the JMS controller service to use a standard JMS JNDI connection factory, after which you can leverage NiFi's ConsumeJMS and PublishJMS processors to read from and write to Solace queues.
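As a rough sketch of what that controller service configuration might look like for Solace (this is an assumption, not a tested setup — the host, port, JNDI names, and library path below are placeholders; the property names follow NiFi's JNDI-based connection factory provider, and `com.solacesystems.jndi.SolJNDIInitialContextFactory` is Solace's standard JNDI initial context factory class):

```
# Hypothetical JNDI connection factory provider settings (values are placeholders)
JNDI Initial Context Factory Class  = com.solacesystems.jndi.SolJNDIInitialContextFactory
JNDI Provider URL                   = smf://solace-host.example.com:55555
JNDI Name of the Connection Factory = /jms/cf/default
JNDI / JMS Client Libraries         = /opt/nifi/ext/solace   # directory with the sol-jms client JARs
```

The point of going through JNDI rather than instantiating a vendor class directly is that the connection factory is looked up by name on the broker, so the same controller service configuration works for any JMS provider that exposes a JNDI store.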
So in general, your flow would look like this: a ConsumeKafka processor configured with the correct topic, and a ConsumeJMS processor configured to use the custom JNDIConnectionFactoryProvider instead of the built-in JMSConnectionFactoryProvider. Connect the success relationship of both these processors to a PublishGCPPubSub processor. I would recommend using the record-based ConsumeKafkaRecord processors, and choosing the processor matching your Kafka API version.
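For context on what the bridge does per message, here is a minimal stdlib-only Python sketch (not NiFi code; the function and attribute names are illustrative assumptions) of how a consumed Kafka record can be wrapped into the JSON envelope that Pub/Sub's REST publish API expects: the payload is base64-encoded under "data", and Kafka metadata is carried as string "attributes":

```python
import base64
import json

def kafka_record_to_pubsub_message(topic: str, partition: int, offset: int,
                                   key: bytes, value: bytes) -> dict:
    """Wrap one Kafka record into a Pub/Sub-style message dict.

    Pub/Sub's publish API requires the payload base64-encoded under
    "data" and string key/value pairs under "attributes".
    """
    return {
        "data": base64.b64encode(value).decode("ascii"),
        "attributes": {
            # Carrying source metadata as attributes preserves provenance
            # for the message processing modules on the GCP side.
            "kafka.topic": topic,
            "kafka.partition": str(partition),
            "kafka.offset": str(offset),
            "kafka.key": key.decode("utf-8", errors="replace") if key else "",
        },
    }

# Example: one record consumed from a hypothetical on-premise "orders" topic
msg = kafka_record_to_pubsub_message(
    topic="orders", partition=3, offset=42,
    key=b"order-123", value=b'{"amount": 10.5}',
)
print(json.dumps({"messages": [msg]}, indent=2))
```

In the NiFi flow itself you would not write this by hand; PublishGCPPubSub handles the encoding, and FlowFile attributes (populated by ConsumeKafka/ConsumeJMS) can be mapped onto Pub/Sub message attributes.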