
Allocating memory for Enumerable in C#

I found this code in a library I am using:

public IEnumerable<BDDNode> Nodes {
    get {
        if (Low == null && High == null) {
            return new [] { this };
        } else {
            return new [] { this }.Union(Low.Nodes.Union(High.Nodes));
        }
    }
}

The problem is that with big models (tens of thousands of nodes), the code allocates gigabytes of memory.

Is there a way to change new [] { this } to something else that will not create a new object on every call of the Nodes getter?

Keeping it all inside LINQ territory, without materializing anything extra, looks better at first glance:

private IEnumerable<BDDNode> ThisNodes {
    get {
       yield return this;
    }
}

public IEnumerable<BDDNode> Nodes {
    get {
        if (Low == null && High == null) {
            return this.ThisNodes;
        } else {
            return this.ThisNodes.Union(Low.Nodes.Union(High.Nodes));
        }
    }
}

But this still does not solve your problem: a Union needs to gather all results in one place to perform its function. This is currently done with a set that sooner or later contains all (distinct) elements from both sequences. There is no way around that: if you want a union (meaning duplicate removal), you have to materialize your data. So maybe you are better served by Concat-ing the sequences and worrying about duplicates later? Concatenating two sequences is purely deferred LINQ execution and involves no materialization of the results.

But that is a design decision that only you can make based on what your algorithm does.
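
For illustration, here is a minimal sketch of what the Concat-based variant could look like, assuming a BDDNode shaped like the snippets above (the Low/High fields and the class layout are stand-ins, not the library's actual definition). No set is built at any level, but nodes reachable through shared sub-graphs will appear more than once in the sequence:

using System.Collections.Generic;
using System.Linq;

// Sketch only: a stand-in for the library's BDDNode.
public class BDDNode
{
    public BDDNode Low;
    public BDDNode High;

    // Yields just this node, without allocating an array per call.
    private IEnumerable<BDDNode> ThisNodes {
        get { yield return this; }
    }

    // Concat-based variant: purely deferred execution, no per-level set.
    // Duplicates from shared sub-graphs are NOT removed here.
    public IEnumerable<BDDNode> Nodes {
        get {
            if (Low == null && High == null) {
                return ThisNodes;
            }
            return ThisNodes.Concat(Low.Nodes).Concat(High.Nodes);
        }
    }
}

If duplicate removal is still required, the caller can apply Distinct() once at the end, e.g. root.Nodes.Distinct().ToList(), paying the set cost a single time at the top instead of once per level of the recursion.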
