
Python: can reduce be translated into list comprehensions like map, lambda and filter?

When programming in Python, I now avoid map, lambda and filter by using list comprehensions, because they are easier to read and faster in execution. But can reduce be replaced as well?

E.g. an object has a union() method that takes another object, a1.union(a2), and gives a third object of the same type.

I have a list of objects:

L = [a1, a2, a3, ...]

How can I get the union() of all these objects with a list comprehension, the equivalent of:

result = reduce(lambda a, b :a.union(b), L[1:], L[0])

It is no secret that reduce is not among the favored functions of the Pythonistas.

Generically, reduce is a left fold on a list.

It is conceptually easy to write a fold in Python that will fold left or right over an iterable:

def fold(func, iterable, initial=None, reverse=False):
    x = initial
    if reverse:
        iterable = reversed(iterable)
    for e in iterable:
        # Seed the accumulator with the first element when no initial value is given.
        x = func(x, e) if x is not None else e
    return x
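
For example, applied to the question's list it computes the running union (a minimal sketch, assuming the elements are plain sets so that union() is just set.union):

L = [{1, 2}, {2, 3}, {3, 4}]              # hypothetical example data
result = fold(lambda a, b: a.union(b), L)
print(result)                             # {1, 2, 3, 4}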

Without some atrocious hack, this cannot be replicated in a comprehension, because there is no accumulator available inside a comprehension.

Just use reduce -- or write one that makes more sense to you.

Since a list comprehension by definition generates another list, you can't use it to produce a single value. They aren't for that. (Well... there is a nasty trick that abuses a leaked implementation detail in old versions of Python which can do it. I'm not even going to copy the example code here. Don't do this.)

If you're worried about the stylistic aspects of reduce() and its ilk, don't be. Name your reductions and you'll be fine. So while:

all_union = reduce(lambda a, b: a.union(b), L[1:], L[0])

isn't great, this:

from functools import reduce

def full_union(input):
    """ Compute the union of a list of sets """
    return reduce(set.union, input[1:], input[0])

result = full_union(L)

is pretty clear.

If you're worried about speed, check out the toolz and cytoolz packages, which are 'fast' and 'insanely fast', respectively. On large datasets, they'll often let you avoid processing your data more than once or loading the whole dataset into memory at once, in contrast to list comprehensions.
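
For instance, toolz.concat chains the inner iterables lazily, so a union can be computed in one pass without ever materializing a flattened list (a minimal sketch, assuming toolz is installed and the elements are sets; cytoolz exposes the same function):

from toolz import concat   # cytoolz.concat is a drop-in, faster equivalent

L = [{1, 2, 3}, {2, 3, 4}, {3, 4, 5}]   # hypothetical example data

# concat yields the items of each inner iterable lazily; set() then
# deduplicates them in a single pass over the data.
result = set(concat(L))
print(result)   # {1, 2, 3, 4, 5}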

Not really. List comprehensions are more similar to map, and possibly filter.

A common use of reduce is to flatten a list of lists. You can use a list comprehension instead.

L = [[1, 2, 3], [2, 3, 4], [3, 4, 5]]

with reduce

from functools import reduce  # python 3
flattened = reduce(lambda x, y: x + y, L)

print(flattened)

[1, 2, 3, 2, 3, 4, 3, 4, 5]
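
As an aside, the lambda can be replaced with operator.add from the standard library, which spells the same + reduction without a lambda:

import operator
from functools import reduce

flattened = reduce(operator.add, L)
print(flattened)   # [1, 2, 3, 2, 3, 4, 3, 4, 5]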

with list comp

flattened = [item for sublist in L for item in sublist]

print(flattened)

[1, 2, 3, 2, 3, 4, 3, 4, 5]

If your problem can be solved by operating on the flattened list, this is an effective replacement. Contrast these one-liners for the given example:

all_union = reduce(lambda a, b: set(a).union(set(b)), L)

{1, 2, 3, 4, 5}

all_union = set([item for sublist in L for item in sublist])

{1, 2, 3, 4, 5}
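
If the temporary flattened list bothers you, the brackets can be dropped so that set() consumes a generator expression directly and no intermediate list is built:

all_union = set(item for sublist in L for item in sublist)
print(all_union)   # {1, 2, 3, 4, 5}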
