Left Join errors out: org.apache.spark.sql.AnalysisException: Detected implicit cartesian product
The "left join" requires either "spark.sql.crossJoin.enabled=true" or calling "persist()" on one of the DataFrames.
SELECT * FROM LHS left join RHS on LHS.R = RHS.R
How do I make the "left join" work without setting "spark.sql.crossJoin.enabled=true" and without persisting a DataFrame?
The exception below occurs in both Spark 2.3.3 and 2.4.4.
Exception in thread "main" org.apache.spark.sql.AnalysisException: Detected implicit cartesian product for LEFT OUTER join between logical plans OneRowRelation and ... Join condition is missing or trivial. Either: use the CROSS JOIN syntax to allow cartesian products between these relations, or: enable implicit cartesian products by setting the configuration variable spark.sql.crossJoin.enabled=true;
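The "OneRowRelation" in the message suggests one side of the join is a single-row relation with no columns referenced in the join condition, which is why the analyzer treats the condition as trivial. The two workarounds the error message itself offers can be sketched as below (untested here; assumes a running SparkSession named `spark` and two DataFrames `lhs` and `rhs` with join columns `id` and `id1`):

```scala
import org.apache.spark.sql.functions.col

// Option 1: allow implicit cartesian products globally.
spark.conf.set("spark.sql.crossJoin.enabled", "true")

// Option 2: persist one side before joining; this reportedly changes the
// logical plan enough that the implicit-cartesian-product check no longer fires.
val lhsPersisted = lhs.persist()
val joined = lhsPersisted.join(rhs, col("id") === col("id1"), "left_outer")
```

Option 1 is a session-wide setting, so it also silences the check for joins where a cartesian product is a genuine mistake; Option 2 is more targeted but costs the memory/disk of caching one side.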
Spark 2.4.3, using the DataFrame API:
scala> var lhs = spark.createDataFrame(Seq((1,"sda"),(2,"abc"))).toDF("id","value")
scala> var rhs = spark.createDataFrame(Seq((2,"abc"),(3,"xyz"))).toDF("id1","value1")
scala> lhs.join(rhs,col("id")===col("id1"),"left_outer")
scala> lhs.join(rhs,col("id")===col("id1"),"left_outer").show
+---+-----+----+------+
| id|value| id1|value1|
+---+-----+----+------+
| 1| sda|null| null|
| 2| abc| 2| abc|
+---+-----+----+------+
No issue is observed here.
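The question used SQL syntax rather than the DataFrame API, so it may help to run the same join through Spark SQL by registering the DataFrames as temporary views (a sketch, reusing the `lhs`/`rhs` frames above; column names `id`/`id1` stand in for the question's `R` columns):

```scala
// Register both sides as temp views, then run the question's query shape.
lhs.createOrReplaceTempView("LHS")
rhs.createOrReplaceTempView("RHS")
spark.sql("SELECT * FROM LHS LEFT JOIN RHS ON LHS.id = RHS.id1").show()
```

If this form also triggers the AnalysisException, it would point to something about the real tables (e.g. a literal or empty one-row relation on one side) rather than the join syntax itself.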