
RethinkDB update transaction performance

I'm new here and also new to RethinkDB. Apologies for my English; I have a question about RethinkDB's update performance.

I'm using Node.js with RethinkDB's native JS driver. I read data from a file stream, filter it into batches of about 1,000 rows, and send each row as an update to RethinkDB. A Node.js server with Socket.IO then reacts via the changefeed.

It takes about 1 second per 1,000 updates (on an SSD).

r.table('mds')
  .getAll(data.symbol, { index: 'symbol' })
  .update({
    price: data.price,
    update_date: moment().format('YYYY-MM-DD HH:mm:ss')
  }, { returnChanges: false })
  .run(conn, function (err, result) {
    // ...
  });

Is this normal update performance for RethinkDB? Could it be faster, or am I doing something wrong in the query or configuration?

How many rows does each of those getAll calls match? Depending on the number of rows modified per update, 1,000 updates per second might or might not be reasonable.

If the number of rows in each transaction is small, you should probably be getting better performance. One thing you could try is turning on soft durability for the writes. If that doesn't help (or if you need hard durability), the only other thing to do would be to add more RethinkDB servers to your cluster and shard your table across them.
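A minimal sketch of both suggestions, assuming the official `rethinkdb` Node driver. The table name `mds` and the index `symbol` come from the question; the batched upsert via `insert` with `conflict: 'update'` additionally assumes `symbol` is (or can serve as) the table's primary key, which the question does not confirm:

```javascript
// Sketch only: the 'rethinkdb' driver and a running server are needed for
// the two query functions; chunk() is a plain helper with no dependencies.
let r;
try {
  r = require('rethinkdb');
} catch (e) {
  // Driver not installed; the query helpers below would not be usable.
}

// Soft durability: the server acknowledges the write once it is in memory,
// before it is flushed to disk. This trades some safety for throughput.
function updateSymbol(conn, data) {
  return r.table('mds')
    .getAll(data.symbol, { index: 'symbol' })
    .update({ price: data.price }, { durability: 'soft' })
    .run(conn);
}

// Batching: send many rows in one query (one network round trip) instead
// of 1,000 single-row updates. Assumes 'symbol' is the primary key so
// conflict: 'update' merges new fields into existing documents.
function upsertBatch(conn, rows) {
  return r.table('mds')
    .insert(rows, { conflict: 'update', durability: 'soft' })
    .run(conn);
}

// Pure helper: split the filtered stream data into fixed-size batches.
function chunk(rows, size) {
  const batches = [];
  for (let i = 0; i < rows.length; i += size) {
    batches.push(rows.slice(i, i + size));
  }
  return batches;
}
```

With this shape, the file-stream handler would call `chunk(filteredRows, 1000)` and pass each batch to `upsertBatch`, rather than issuing one `getAll(...).update(...)` per row.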

