How to add a column with a constant in a Spark Java DataFrame
I have imported
import org.apache.spark.sql.Column;
import org.apache.spark.sql.functions;
in my Java Spark driver.
But with
DataFrame inputDFTwo = hiveContext.sql("select * from sourcing_src_tbl");
inputDFTwo.withColumn("asofdate", lit("2016-10-2"));
here `lit` still shows an error in Eclipse (Windows). Which library should I include to make it work?
Either import the object, as you do now, and use it to access the method:
import org.apache.spark.sql.functions;
df.withColumn("foo", functions.lit(1));
or use a static import and call the method directly:
import static org.apache.spark.sql.functions.lit;
df.withColumn("foo", lit(1));
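Note also that `withColumn` does not modify the DataFrame in place; it returns a new one, so the result must be assigned to a variable. A minimal self-contained sketch using the static-import style (this uses `SQLContext.range` to build a tiny DataFrame instead of the question's Hive table, purely for illustration, and assumes the Spark 1.x Java API that the question's `DataFrame`/`HiveContext` code implies):

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.SQLContext;
import static org.apache.spark.sql.functions.lit;

public class AddConstantColumn {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
                .setAppName("AddConstantColumn")
                .setMaster("local[1]");
        JavaSparkContext sc = new JavaSparkContext(conf);
        SQLContext sqlContext = new SQLContext(sc);

        // A stand-in for the question's hiveContext.sql(...) result.
        DataFrame df = sqlContext.range(3);

        // withColumn returns a NEW DataFrame with the constant column
        // appended; the original df is left unchanged.
        DataFrame withDate = df.withColumn("asofdate", lit("2016-10-2"));
        withDate.printSchema();

        sc.stop();
    }
}
```

The same assignment applies to the question's code: `inputDFTwo.withColumn(...)` on its own line discards the result, so even once `lit` resolves, the new column would not appear unless the returned DataFrame is captured.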