
Running thousands of goroutines concurrently

I'm developing a simulator model for historical trading data. I have the below pseudo code where I'm using many goroutines.

layout := "2006-01-02 15:04:05"
start_day, _ := time.Parse(layout, "2015-01-01 09:15:00")
end_day, _ := time.Parse(layout, "2015-02-01 09:15:00")
ch := make(chan bool)
var value bool

// pair_map holds int pairs like {1,2}, {10,11} — approx. 20k pairs taken
// dynamically from a range of input
for start_day.Before(end_day) {
    // get_output sends true on ch at the end of each call
    go get_output(ch, start_day, pair_map)
    value = <-ch // blocks here until the goroutine above finishes
    start_day = start_day.Add(time.Hour * 24)
}

Now the problem is that each get_output() call covers approx. 20k pair combinations for a single day (which takes approx. 1 minute). I'm trying to run a month's span using goroutines (on a single-core machine), but it takes almost the same amount of time as running sequentially (approx. 30 minutes).

Is there anything wrong with my approach, or is it because I'm using a single core?

I'm trying to do it for a month time span using go routines (on single core machine), but it is taking almost same amount of time as running sequentially

What makes you think it should perform any better? Running multiple goroutines has its overhead: goroutines have to be managed and scheduled, and results must be communicated and gathered. If you have a single core, using multiple goroutines cannot yield a performance improvement, only a detriment.

Using goroutines may give a performance boost if you have multiple cores and the tasks you distribute are "large" enough that the goroutine-management overhead is smaller than the gain from parallel execution.

See related: Is golang good to use in multithreaded application?
