How to limit GPU memory use in TF Slim?

When training with TF Slim's train_image_classifier.py, I would like to tell Slim to allocate only the GPU memory it needs, rather than allocating all of it up front.

If I were using plain TF rather than Slim, I could write this:

config = tf.ConfigProto()
config.gpu_options.allow_growth=True
sess = tf.Session(config=config)

Or even just this, to put a hard cap on GPU memory use:

gpu_options = tf.GPUOptions(per_process_gpu_memory_fraction=0.333)
sess = tf.Session(config=tf.ConfigProto(gpu_options=gpu_options))

How can I tell Slim the same thing(s)?

Where my understanding breaks down is that Slim seems to run its own training loop, and I can't find documentation on the nitty-gritty of configuring that loop. So even if someone could just point me to good Slim docs, that would be fantastic.

Thanks in advance!

You can pass the allow_growth option via the session_config parameter of the train method, as follows:

session_config = tf.ConfigProto()
session_config.gpu_options.allow_growth = True
slim.learning.train(..., session_config=session_config)

See tensorflow/contrib/slim/python/slim/learning.py#L615 and tensorflow issue #5530.
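The same approach works for the hard-cap variant from the question: build the ConfigProto with a GPUOptions limiting the per-process memory fraction, and hand it to slim.learning.train via the same session_config parameter. A minimal sketch (TF 1.x / contrib-era API, assuming the rest of the training setup — train_op, logdir — is already defined as in train_image_classifier.py):

```python
import tensorflow as tf
slim = tf.contrib.slim

# Cap this process at roughly one third of GPU memory.
gpu_options = tf.GPUOptions(per_process_gpu_memory_fraction=0.333)
session_config = tf.ConfigProto(gpu_options=gpu_options)

# Slim passes session_config through to the tf.Session it creates
# internally for its training loop.
slim.learning.train(train_op, logdir, session_config=session_config)
```

Note that allow_growth and per_process_gpu_memory_fraction can also be combined on the same ConfigProto if you want both a cap and incremental allocation.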
