
How to combine two RDDs into one RDD in Spark (Python)

For example, given two RDDs such as "rdd1 = [[1,2],[3,4]], rdd2 = [[5,6],[7,8]]", how can both be combined into this style: [[1,2,5,6],[3,4,7,8]]? Is there any function that can solve this problem?

You basically need to combine your RDDs using rdd.zip() and then perform a map operation on the resulting RDD to get your desired output:

rdd1 = sc.parallelize([[1,2],[3,4]])
rdd2 = sc.parallelize([[5,6],[7,8]])

# Zip the two RDDs together (zip requires that both RDDs have the same
# number of partitions and the same number of elements in each partition)
rdd_temp = rdd1.zip(rdd2)

# Map over the zipped RDD, flattening each pair of lists into one list
# Reference: https://stackoverflow.com/questions/952914/making-a-flat-list-out-of-list-of-lists-in-python
rdd_final = rdd_temp.map(lambda x: [item for sublist in x for item in sublist])

# rdd_final.collect()
# Output: [[1, 2, 5, 6], [3, 4, 7, 8]]

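Since zip plus a flattening map is the core of this answer, the flattening logic can be checked with plain Python (no Spark needed); the built-in zip pairs elements positionally, exactly as rdd.zip() does within partitions. This is an illustrative model, not PySpark code:

```python
# Plain-Python model of rdd1.zip(rdd2) followed by the flattening map.
# The list comprehension is identical to the one used in the Spark answer.
rdd1_local = [[1, 2], [3, 4]]
rdd2_local = [[5, 6], [7, 8]]

# zip() gives [([1, 2], [5, 6]), ([3, 4], [7, 8])]
zipped = list(zip(rdd1_local, rdd2_local))

# Flatten each pair of lists into a single list
flattened = [[item for sublist in pair for item in sublist] for pair in zipped]
print(flattened)  # [[1, 2, 5, 6], [3, 4, 7, 8]]
```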

Another (longer) way to achieve this, using an RDD join:

rdd1 = sc.parallelize([[1,2],[3,4]])
rdd2 = sc.parallelize([[5,6],[7,8]])

# create keys for the join (tuple-unpacking lambdas are Python 2 only,
# so index into the (value, index) pair instead)
rdd1 = rdd1.zipWithIndex().map(lambda pair: (pair[1], pair[0]))
rdd2 = rdd2.zipWithIndex().map(lambda pair: (pair[1], pair[0]))
# join on the key, then concatenate the two values and drop the key
rdd_joined = rdd1.join(rdd2).map(lambda kv: kv[1][0] + kv[1][1])

rdd_joined.take(2)
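The join route can likewise be modelled locally: number each element (as zipWithIndex does), match on that index (as join does), and concatenate the matched lists. A sketch in plain Python, not PySpark:

```python
# Local model of the zipWithIndex -> join -> concatenate pipeline.
rdd1_local = [[1, 2], [3, 4]]
rdd2_local = [[5, 6], [7, 8]]

# enumerate() plays the role of zipWithIndex: key each list by position
keyed1 = {key: val for key, val in enumerate(rdd1_local)}
keyed2 = {key: val for key, val in enumerate(rdd2_local)}

# "Join" on the shared keys, then concatenate val1 + val2
joined = [keyed1[k] + keyed2[k] for k in keyed1]
print(joined)  # [[1, 2, 5, 6], [3, 4, 7, 8]]
```

Unlike zip(), this approach does not require the two RDDs to be partitioned identically, which is why it is the safer (if longer) option.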

The technical post webpages of this site follow the CC BY-SA 4.0 protocol. If you need to reprint, please indicate the site URL or the original address. For any question, please contact: yoyou2525@163.com.

 
粤ICP备18138465号  © 2020-2024 STACKOOM.COM