Which jar has org.apache.spark.sql.types?
I am on Spark 1.x and attempting to read CSV files. If I need to specify some data types, then as per the documentation, I need to import the types defined in the package org.apache.spark.sql.types.
import org.apache.spark.sql.types.{StructType,StructField,StringType};
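For context, here is a minimal sketch of how these imports are typically used to declare a schema when reading a CSV on Spark 1.x. The column names, file path, and the com.databricks.spark.csv data source are illustrative assumptions, not from the original post:

```scala
import org.apache.spark.sql.types.{StructType, StructField, StringType}

// Hypothetical two-column schema; adjust names and types to your file.
val schema = StructType(Seq(
  StructField("name", StringType, nullable = true),
  StructField("city", StringType, nullable = true)
))

// On Spark 1.x, CSV reading usually goes through the external spark-csv package
// (requires a live SQLContext, so shown here as a comment):
// val df = sqlContext.read
//   .format("com.databricks.spark.csv")
//   .option("header", "true")
//   .schema(schema)
//   .load("people.csv")
```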
This works fine when I use it interactively in spark-shell, but since I want to run it through spark-submit, I wrote some Scala code to do this. However, when I attempt to compile my Scala code, it gives me an error saying it could NOT find org.apache.spark.sql.types. I looked at the contents of the spark-sql jar, but couldn't find these types defined in there.
So, which jar has org.apache.spark.sql.types?
I looked at the source code for spark-sql on GitHub and realized that these types can be found in the spark-catalyst jar. That didn't seem intuitive.
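One way to confirm this without reading the source is to ask the JVM which jar a class was actually loaded from. This is a generic JVM reflection technique, sketched here under the assumption that the Spark jars are on the classpath (for example, inside spark-shell):

```scala
// Prints the location (jar path) that the class was loaded from.
val loc = Class.forName("org.apache.spark.sql.types.StructType")
  .getProtectionDomain.getCodeSource.getLocation
println(loc)
// Should print a path ending in something like
// spark-catalyst_<scala-version>-<spark-version>.jar
```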
Also, since the StructType source contains

import org.json4s.JsonDSL._

we end up with another dependent jar, json4s-core.
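In practice you rarely need to track these jars by hand: declaring a dependency on spark-sql in your build pulls in spark-catalyst and json4s-core transitively. A sketch of a Spark 1.x build.sbt (the 1.6.3 and 2.10.6 version numbers are assumptions; substitute your own):

```scala
// build.sbt sketch; spark-catalyst and json4s-core arrive transitively.
scalaVersion := "2.10.6"

libraryDependencies ++= Seq(
  // "provided" because spark-submit supplies the Spark jars at runtime.
  "org.apache.spark" %% "spark-core" % "1.6.3" % "provided",
  "org.apache.spark" %% "spark-sql"  % "1.6.3" % "provided"
)
```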