
How To Reduce Python Script Memory Usage

I have a very large Python script, 200K, and I would like it to use as little memory as possible. It looks something like:

# a lot of data structures
r = [34, 78, 43, 12, 99]

# a lot of functions that I use all the time
def func1(word):
    return len(word) + 2

# a lot of functions that I rarely use
def func2(word):
    return len(word) + 2


# my main loop
while 1:
   # lots of code
   # calls functions

If I put the functions that I rarely use in a module, and import them dynamically only if necessary, I can't access the data. That's as far as I've gotten.
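Roughly, the dynamic-import attempt looks like this (a sketch; rare_funcs is a hypothetical module name):

# rare_funcs.py -- the rarely used functions moved out of the main script
def rare_func(word):
    return len(word) + r[0]   # fails: this module cannot see my data

# main script
r = [34, 78, 43, 12, 99]

def process(word):
    import rare_funcs                    # imported only when needed
    return rare_funcs.rare_func(word)    # NameError: name 'r' is not defined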

I'm new at python.

Can anyone put me on the right path? How can I break this large script down so that it uses less memory? Is it worth putting rarely used code in modules and only calling them when needed?

Organizing:

Your Python script does indeed seem to be huge; maybe you should consider reorganizing your code first and splitting it into several modules or packages. That will probably make the code profiling and optimization tasks easier.


Optimizing:

There are a lot of things that can be done to optimize your code ...

For instance, regarding your data structures: if you make heavy use of lists or list comprehensions, try to figure out where you really need lists and where they could be replaced by immutable data structures like tuples, or by "lazy" containers like generator expressions.
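For example, here is a minimal sketch of trading a list comprehension for a generator expression when the values are only consumed once:

words = ["alpha", "beta", "gamma", "delta"]

# list comprehension: builds the whole list in memory before summing
total = sum([len(w) + 2 for w in words])

# generator expression: produces values one at a time, nothing is stored
total = sum(len(w) + 2 for w in words)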


Also, examine the way you do things and ask whether there is a less greedy way, a way that is better to do in Python (you will find some tips under the tag pythonic). That is especially true in Python, because there is often one "obvious" way (and only one) to do things that is better than the others (see The Zen of Python), which is said to be pythonic. This is not only about the shape of your code, but also, and above all, about performance. Unlike many languages that promote the idea that there should be many ways to do anything, Python prefers to focus on the best way only. So obviously there are many ways to do something, but often one of them is really better.
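As a small illustration (a sketch, not taken from your script), the "obvious" way is usually both clearer and faster than the manual one:

numbers = [34, 78, 43, 12, 99]

# works, but not pythonic
total = 0
for i in range(len(numbers)):
    total = total + numbers[i]

# the obvious way: let the built-in do the work
total = sum(numbers)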

Now, you should also verify whether you are using the best methods for doing things, because writing pythonic code won't fix your algorithms for you.

In the end, it depends very much on your code, and it's difficult to answer without having seen it.

Also, make sure to take into account the comments made by eumiro and Amr.

This video might give you some good ideas: http://pyvideo.org/video/451/pycon-2011---quot-dude--where--39-s-my-ram--quot-

The advice on generator expressions and making use of modules is good. Premature optimization causes problems, but you should always spend a few minutes thinking about your design before sitting down to write code. Particularly if that code is meant to be reused.

Incidentally, you mention that you have a lot of data structures defined at the top of your script, which implies that they're all loaded into memory at the start. If this is a very large dataset, consider moving specific datasets to separate files and loading them in only as needed (using the csv module, numpy.loadtxt(), etc.).
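A minimal sketch of that idea, assuming the data were moved to a hypothetical data.csv file:

import csv

def load_rows(path="data.csv"):
    # read the file lazily, one row at a time, instead of keeping
    # the whole dataset in a list at the top of the script
    with open(path, newline="") as f:
        for row in csv.reader(f):
            yield row

for row in load_rows():
    pass  # process each row, then let it be garbage collected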

Separate from using less memory, also look into ways to use memory more efficiently. For example, for large sets of numeric data, numpy arrays are a way of storing information that will provide better performance in your calculations. There is some slightly dated advice at http://wiki.python.org/moin/PythonSpeed/PerformanceTips
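For instance (a rough sketch; exact numbers vary by platform and Python version):

import sys
import numpy as np

data = list(range(1_000_000))
arr = np.arange(1_000_000, dtype=np.int64)

# the list holds a pointer to a separate Python int object for each element;
# the array stores a million packed 8-byte integers
print(sys.getsizeof(data))   # ~8 MB just for the list's pointers, not counting the int objects
print(arr.nbytes)            # 8 MB total for the packed int64 data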

Moving functions around won't change your memory usage. As soon as you import that other module, it will define all the functions in the module. But functions don't take up much memory. If they are extremely repetitive, perhaps you can have less code by refactoring them?
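For example (a hypothetical illustration), if many of the functions differ only by a constant, one parameterized function can replace them all:

# before: many near-identical functions
def length_plus_two(word):
    return len(word) + 2

def length_plus_five(word):
    return len(word) + 5

# after: one function, with the difference passed in as a parameter
def length_plus(word, offset):
    return len(word) + offset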

@eumiro's question is right: are you sure your script uses too much memory? How much memory does it use, and why is it too much?
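One quick way to answer that (a sketch using the standard library; the resource module is Unix-only):

import resource

# peak resident set size of this process so far;
# reported in kilobytes on Linux and in bytes on macOS
peak = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
print("peak memory so far:", peak)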

If you're taking advantage of OOP and have some objects, say:

class foo:
    def __init__(self, lorem, ipsum):
        self.lorem = lorem
        self.ipsum = ipsum
    # some happy little methods

You can have the object take up less memory by putting in:

__slots__ = ("lorem", "ipsum")

right before the __init__ function, as shown:

class foo:
    __slots__ = ("lorem", "ipsum")
    def __init__(self, lorem, ipsum):
        self.lorem = lorem
        self.ipsum = ipsum
    # some happy little methods

Of course, "premature optimization is the root of all evil". Also, profile memory usage before and after the addition to see if it actually does anything. Be aware that __slots__ can (shockingly) break code, so go in with the understanding that this might end up not working.
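A rough way to check (a sketch; the per-instance savings depend on the Python version):

import sys

class WithDict:
    def __init__(self, lorem, ipsum):
        self.lorem = lorem
        self.ipsum = ipsum

class WithSlots:
    __slots__ = ("lorem", "ipsum")
    def __init__(self, lorem, ipsum):
        self.lorem = lorem
        self.ipsum = ipsum

a = WithDict(1, 2)
b = WithSlots(1, 2)
# the slotted instance pays for no per-instance __dict__
print(sys.getsizeof(a) + sys.getsizeof(a.__dict__))   # regular instance plus its dict
print(sys.getsizeof(b))                               # smaller slotted instance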
