
Issue in data scheduling from MySQL to MongoDB

We are developing a SaaS system that initially used MySQL as its database, but as the data grew our listings slowed down. To resolve that, we introduced MongoDB, in which we store the prepared JSON we need to display (with all the MySQL joins already applied). For some time this worked well.

We wrote a scheduler in Java that runs every 2 minutes and copies the modified records from MySQL to MongoDB.
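The scheduler described above can be sketched with a `ScheduledExecutorService`; the job body and its name are placeholders for the real MySQL query and MongoDB upsert, and the period is shortened for demonstration:

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class SyncScheduler {
    static final AtomicInteger runs = new AtomicInteger();

    // Placeholder for the real job: query MySQL for rows modified since the
    // last run and upsert the prepared JSON documents into MongoDB.
    static void syncModifiedRecords() {
        runs.incrementAndGet();
    }

    public static void main(String[] args) throws InterruptedException {
        ScheduledExecutorService ses = Executors.newSingleThreadScheduledExecutor();
        // In production the period would be 2 minutes; milliseconds here for demo.
        ses.scheduleAtFixedRate(SyncScheduler::syncModifiedRecords, 0, 10, TimeUnit.MILLISECONDS);
        Thread.sleep(100);
        ses.shutdown();
        System.out.println("ran at least once: " + (runs.get() >= 1));
    }
}
```

A single-threaded executor keeps runs from overlapping if one sync takes longer than the period, which is one common failure mode of fixed-interval polling.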

Initially it worked well, but as time went on and both the data volume and its rate of change increased, it started failing frequently. So we decided to look for an alternative that can read from the MySQL binlogs, letting us merge MySQL tables according to our needs along the way and store the result in MongoDB.

Table 1
  Col11
  Col12
  Col13
  Col14
  Col15

Table 2
  Col21
  Col22
  Col23
  Col24
  Col25

Mongo Collection

  Col11
  Col12
  Col13
  Col14
  Col15 
  Col21
  Col22
  Col23
  Col24
  Col25
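The merge described above amounts to a union of one row from each table into a single document; a minimal sketch in Java (using the placeholder column names from the tables above, with a plain `Map` standing in for the Mongo document):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class DocumentMerge {
    // Combine one row from each MySQL table into the single document
    // stored in the Mongo collection.
    static Map<String, Object> merge(Map<String, Object> table1Row,
                                     Map<String, Object> table2Row) {
        Map<String, Object> doc = new LinkedHashMap<>(table1Row);
        doc.putAll(table2Row);
        return doc;
    }

    public static void main(String[] args) {
        Map<String, Object> t1 = new LinkedHashMap<>();
        for (int i = 1; i <= 5; i++) t1.put("Col1" + i, "v1" + i);
        Map<String, Object> t2 = new LinkedHashMap<>();
        for (int i = 1; i <= 5; i++) t2.put("Col2" + i, "v2" + i);
        // All ten columns end up in one document.
        System.out.println(merge(t1, t2).keySet());
    }
}
```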

One option could be Kafka Connect for moving data from MySQL to Kafka and then from Kafka to your MongoDB.

Step 1: Use JDBCSourceConnector to move data from MySQL to Kafka

The Kafka Connect JDBC source connector allows you to import data from any relational database with a JDBC driver into Apache Kafka® topics.
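A minimal source-connector configuration might look like the following sketch; the connection URL, credentials, table names, and the timestamp/incrementing column names are placeholders you would replace with your own:

```json
{
  "name": "mysql-source",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "connection.url": "jdbc:mysql://localhost:3306/mydb",
    "connection.user": "user",
    "connection.password": "password",
    "table.whitelist": "table1,table2",
    "mode": "timestamp+incrementing",
    "timestamp.column.name": "modified_at",
    "incrementing.column.name": "id",
    "topic.prefix": "mysql-"
  }
}
```

`timestamp+incrementing` mode detects both updates (via the timestamp column) and inserts (via the incrementing id), which matches the "modified records" polling your scheduler currently does.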

Step 2: Use MongoDB Connector to move data from Kafka to MongoDB

Map and persist events from Kafka topics directly to MongoDB collections, exposing the data to your services for efficient querying, enrichment, and analytics.

Note that the MongoDB connector can be used as a source or a sink connector. In your case, you'd need the sink connector, to move data from your Kafka topic(s) to your target collection(s) in MongoDB.
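A sink-connector configuration could be sketched as follows; the URI, database, collection, and topic names are placeholders:

```json
{
  "name": "mongo-sink",
  "config": {
    "connector.class": "com.mongodb.kafka.connect.MongoSinkConnector",
    "topics": "mysql-table1,mysql-table2",
    "connection.uri": "mongodb://localhost:27017",
    "database": "mydb",
    "collection": "merged_collection"
  }
}
```

Merging the two tables into one document on the way through would additionally need a join step (e.g. a stream-processing job or single message transforms between source and sink); the connectors themselves move records topic-by-topic.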
