
Is it possible to install different graphic cards and use multi-GPU in pytorch?

I have a question.

Is it possible to install different graphics cards and use multi-GPU in PyTorch? Are there any other problems with doing so?

Ex> Is the data parallel function of PyTorch available with a combination of one 3070 and one 3080?

Thank you in advance for your response.

I believe it is possible on the technical level, but it would be sub-optimal. The code that divides the load between the different GPUs assumes they are identical, so with mismatched GPUs your effective throughput is bounded by the weakest card.
For example, if you have one card with 1GB of memory and another with 10GB, you will only be able to work with batches whose per-GPU share fits on the 1GB card, leaving roughly 9GB unused on the second.
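The memory argument above can be sketched in plain Python. This is a rough illustration, not PyTorch code: the function name and the per-sample memory figure are hypothetical, and it assumes `torch.nn.DataParallel`'s behavior of splitting each batch evenly across devices.

```python
def max_even_batch(gpu_mem_gb, sample_mem_gb):
    """Largest total batch size when every GPU receives an equal share.

    gpu_mem_gb: list of per-GPU memory capacities in GB
    sample_mem_gb: approximate memory needed per sample in GB (assumed)
    """
    # Each GPU gets batch_size / num_gpus samples, so the share
    # must fit on the smallest card.
    per_gpu = int(min(gpu_mem_gb) // sample_mem_gb)
    return per_gpu * len(gpu_mem_gb)

# 1GB card + 10GB card, assuming 0.5GB per sample:
# each GPU can hold only 2 samples, so the total batch is 4,
# and most of the 10GB card sits idle.
print(max_even_batch([1, 10], 0.5))   # total batch limited by the 1GB card
```

With two identical 10GB cards the same function would allow a far larger batch, which is why matched GPUs are preferred for data parallelism.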

