
How can I make C++ code that performs calculations on large data sets faster?

I am writing code whose input is a 50000x20 matrix (50000 blocks, 20 properties each). I then run calculations over all blocks, in a loop, to decide which block to select and move. Next I make some changes to a characteristic value (a different variable) for that block and repeat the same process over and over. Currently I do the calculations for each block inside the loop; is calling a function a better and faster way to do it? The code runs for a long time without converging (based on some convergence criteria). How do I deal with this? Is this a good approach to dealing with large data sets? Can I do better? Please help.
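For reference, the loop structure described above is roughly the following (a hypothetical sketch: the score function, update rule, and convergence test are all invented placeholders, since the actual code was not posted):

```cpp
#include <array>
#include <cmath>
#include <vector>

constexpr int kBlocks = 50000;  // number of blocks
constexpr int kProps  = 20;     // properties per block

// Placeholder score: the real per-block calculation was not posted.
double score(const std::array<double, kProps>& props, double characteristic) {
    double s = 0.0;
    for (double p : props) s += p;
    return s * characteristic;
}

void run(std::vector<std::array<double, kProps>>& blocks,
         std::vector<double>& characteristic) {
    const int maxIters = 1000;  // guard so a non-converging run still stops
    for (int iter = 0; iter < maxIters; ++iter) {
        // Loop over all 50000 blocks to pick the one to move.
        int best = 0;
        double bestScore = score(blocks[0], characteristic[0]);
        for (int b = 1; b < kBlocks; ++b) {
            double s = score(blocks[b], characteristic[b]);
            if (s > bestScore) { bestScore = s; best = b; }
        }
        // Change the chosen block's characteristic value, then repeat.
        double before = characteristic[best];
        characteristic[best] *= 0.9;  // invented update rule
        if (std::abs(characteristic[best] - before) < 1e-9) break;  // invented test
    }
}
```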

Thanks, Surendra

Calling a function helps with readability, but it can actually slow down your code if it is called extensively and is not inlined.
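For example (a minimal sketch; the helper name blockScore is invented): a tiny function called 50000 times per pass is exactly the case where inlining matters, so define it where the hot loop can see its body and let the optimizer remove the call overhead:

```cpp
// A tiny per-block helper, defined inline in a header so its body is
// visible at every call site and the optimizer can inline it at -O2.
inline double blockScore(const double* props, int n) {
    double s = 0.0;
    for (int i = 0; i < n; ++i) s += props[i];
    return s;
}
```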

You need to benchmark your code and determine where the bottlenecks are. Also take a look at dynamic programming; it could be of help to you.
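A minimal way to time a suspect section with the standard library, as a first benchmarking step (the loop body here is just a stand-in for one pass over the blocks):

```cpp
#include <chrono>
#include <iostream>

int main() {
    auto t0 = std::chrono::steady_clock::now();

    // Code under test, e.g. one full pass over the 50000 blocks.
    volatile double sink = 0.0;  // volatile keeps the loop from being optimized away
    for (int i = 0; i < 50000; ++i) sink = sink + i * 0.5;

    auto t1 = std::chrono::steady_clock::now();
    auto us = std::chrono::duration_cast<std::chrono::microseconds>(t1 - t0).count();
    std::cout << "elapsed: " << us << " us\n";
    return 0;
}
```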

It would also help to post some code :).

Does changing the characteristic value for a block affect the other blocks?

If not, then there's probably no need to repeat the calculations for the other blocks.
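Under that assumption (blocks are independent), a common pattern is to cache each block's result and recompute only the block that changed; a sketch, with computeScore as a hypothetical stand-in for the question's per-block calculation:

```cpp
#include <algorithm>
#include <vector>

// Hypothetical stand-in for the per-block calculation described in the
// question; replace with the real computation.
double computeScore(int b) { return static_cast<double>(b % 97); }

std::vector<double> scores;  // cached score per block

// Compute every block's score once, up front.
void initScores(int nBlocks) {
    scores.resize(nBlocks);
    for (int b = 0; b < nBlocks; ++b) scores[b] = computeScore(b);
}

// After modifying block b, refresh only its cached entry instead of
// re-running the calculation for all 50000 blocks.
void onBlockChanged(int b) { scores[b] = computeScore(b); }

// Selecting the best block is now a scan over cached values.
int selectBest() {
    return static_cast<int>(std::max_element(scores.begin(), scores.end()) -
                            scores.begin());
}
```

This turns each iteration from 50000 score recomputations into a single recomputation plus a scan for the maximum.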
