
Using inherited Scala functions in Spark (Java/SparkSQL)

I'm trying to use an inherited Scala function (StructType.diff()) and I'm getting a NoSuchMethodError.


Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.sql.types.StructType.diff(Lscala/collection/GenSeq;)Lscala/collection/Seq;
    at TableNode.neighborNode(SparkSQLTest.java:112)
    at SparkSQLTest.main(SparkSQLTest.java:58)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
The code looks like:

public StructType foo(DataFrame df) {
    // diff() is inherited from Scala's Seq: keep the fields of this.schema
    // that are not present in df.schema(), and take the last one
    StructField sf = this.schema.diff(df.schema()).last();
    StructType schema_tmp = new StructType().add(sf);
    return schema_tmp;
}

Does anyone have any ideas? I'm using Spark 1.6.2 and Scala 2.10.
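One possible workaround is to compute the schema difference through the plain Java fields() accessor instead of the inherited Scala diff(), since methods that StructType only inherits from the Scala collection library are typically where a mismatch between the compile-time and runtime Scala/Spark versions surfaces as a NoSuchMethodError. This is only a minimal sketch; the class name SchemaDiffHelper is hypothetical and not from the original code:

import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.types.StructField;
import org.apache.spark.sql.types.StructType;

public class SchemaDiffHelper {
    // Builds a StructType containing the last field that is present in `schema`
    // but not in df.schema(), mirroring what foo() above tries to do with diff().
    public static StructType foo(StructType schema, DataFrame df) {
        Set<StructField> other = new HashSet<StructField>(Arrays.asList(df.schema().fields()));
        StructField last = null;
        for (StructField sf : schema.fields()) {    // fields() is plain Java API, no Scala collections involved
            if (!other.contains(sf)) {
                last = sf;                          // StructField is a case class, so equals() compares by value
            }
        }
        StructType result = new StructType();
        if (last != null) {
            result = result.add(last);              // add(StructField), as in the original snippet
        }
        return result;
    }
}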

Well, I'm using Spark 2.0 and the StructType method is supported. Please have a look at the code snippet below; I hope it's helpful:

val schema = new StructType(Array(
  StructField("STORE_ID", StringType, true),
  StructField("SALE_DATE", DateType, true)))
