
Additional Unpacking Generalizations (PEP 448) with a variable number of elements

The acceptance of PEP 448 introduced Additional Unpacking Generalizations in Python 3.5.

For example:

>>> l1 = [1, 2, 3]
>>> l2 = [4, 5, 6]

# unpack both iterables in a list literal
>>> joinedList = [*l1, *l2]
>>> print(joinedList)
[1, 2, 3, 4, 5, 6]

QUESTION: Is there a way to do a similar thing with a list of lists?

This code does not work:

# List of variable size
list_of_lists = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
joined_list = [*l for l in list_of_lists]

SyntaxError: iterable unpacking cannot be used in comprehension

Of course, you could do the following, but it looks less elegant and does not seem efficient:

# List of variable size
list_of_lists = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
joined_list = list()
for l in list_of_lists:
    joined_list += l
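
For what it's worth, a flat comprehension with two for clauses is the usual way to spell this without unpacking; a minimal sketch:

# List of variable size
list_of_lists = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]

# Nested comprehension: the outer clause walks the sublists,
# the inner clause yields their elements one by one
joined_list = [x for l in list_of_lists for x in l]
print(joined_list)  # [1, 2, 3, 4, 5, 6, 7, 8, 9]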

How about going old school: sum()

Code:

joined_list = sum(list_of_lists, [])

Test Code:

# List of variable size
list_of_lists = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
joined_list = sum(list_of_lists, [])
print(joined_list)

Results:

[1, 2, 3, 4, 5, 6, 7, 8, 9]

I'm going to discourage using sum here, as it's a form of Schlemiel the Painter's algorithm. sum actually forbids this for str; the developers didn't try to block all sequence uses, since checking for every possible misuse would slow sum down, but it's still a bad idea.

The problem is that sum builds progressively larger temporary lists, throwing away the previous temporary after building the next one by copying everything seen so far, plus the new elements, over and over. If the first list has a million items and you have ten more lists to concatenate onto it, you're copying at least 10 million elements (even if the other ten lists are empty). Your original code was actually better, in that the += operator performs in-place extension, keeping the worst-case performance in the O(n) range (for n elements across all lists), rather than O(n*m) (for n elements across m lists).
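
A quick timing sketch makes the difference visible (the helper names and input sizes here are illustrative, and the exact figures vary by machine; the gap widens as the input grows):

import timeit

# Build many small lists; sum() recopies the entire growing result
# for every sublist it adds, while += extends one list in place.
list_of_lists = [[0] * 10 for _ in range(1000)]

def flatten_sum(lol):
    return sum(lol, [])

def flatten_inplace(lol):
    result = []
    for l in lol:
        result += l  # in-place extend, no full recopy
    return result

print("sum:", timeit.timeit(lambda: flatten_sum(list_of_lists), number=10))
print("+= :", timeit.timeit(lambda: flatten_inplace(list_of_lists), number=10))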

It also has the problem of only working for one consistent type; if some inputs are lists, some tuples, and some generators, sum won't work (because list.__add__ won't accept non-list operands on the other side).
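
To see that failure concretely, a small sketch (wrapped in try/except so it runs to completion):

# One list, one tuple, one generator
mixed = [[1, 2], (3, 4), (x for x in (5, 6))]

try:
    sum(mixed, [])
except TypeError as e:
    # list.__add__ rejects the tuple operand; message per CPython 3.x
    print(e)  # can only concatenate list (not "tuple") to list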

So don't do that. This is what itertools.chain and its alternate constructor, itertools.chain.from_iterable, were made for:

from itertools import chain

list_of_lists = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
joined_list = list(chain.from_iterable(list_of_lists))

It's guaranteed O(n), works with any iterables you throw at it, etc.

Yes, obviously if you've just got three lists of three elements apiece, it hardly matters. But if the size of the input iterables or the number of iterables is arbitrarily large, or the types aren't consistent, chain will work and sum won't.
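
To illustrate that last point, a small sketch mixing a list, a tuple, and a generator; chain.from_iterable only iterates its inputs, so all three work:

from itertools import chain

mixed = [[1, 2], (3, 4), (x for x in (5, 6))]

# chain.from_iterable never concatenates, it just iterates,
# so any mix of iterable types is fine
print(list(chain.from_iterable(mixed)))  # [1, 2, 3, 4, 5, 6]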
