Need to write a JSON DataFrame into Avro file format from spark-shell

Hi, I have to read JSON data into a Spark DataFrame, and then I need to write that DataFrame out in Avro file format from the spark-shell. I am getting the following error:
org.apache.spark.sql.AnalysisException: Failed to find data source: avro. Avro is built-in but external data source module since Spark 2.4. Please deploy the application as per the deployment section of "Apache Avro Data Source Guide".;
at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:647)
at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:245)
at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:229)
... 49 elided
Have you tried adding the spark-avro library? You can do the following when starting spark-shell:
spark-shell --packages org.apache.spark:spark-avro_2.11:2.4.6
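
Once the shell starts with the package loaded, writing the DataFrame should work. A minimal sketch (the input and output paths below are hypothetical placeholders):

```scala
// Inside spark-shell, with spark-avro already on the classpath
// via --packages. "spark" is the SparkSession the shell provides.

// Read the JSON data into a DataFrame (hypothetical path)
val df = spark.read.json("/path/to/input.json")

// Write the DataFrame out in Avro format (hypothetical path)
df.write.format("avro").save("/path/to/output-avro")
```

Note that since Spark 2.4 the short format name is `"avro"`; the fully qualified name `"org.apache.spark.sql.avro.AvroFileFormat"` is not needed once the package is loaded.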
spark-avro is an external library, so you must import the spark-avro package into spark-shell. Check below:

spark-shell --packages org.apache.spark:spark-avro_2.11:2.4.0