
Getting exception reading from avro table using Spark or Hive console - Failed to obtain maxLength value for varchar field from file schema: "string"

I have created 2 tables in Hive:

CREATE external TABLE avro1(id INT,name VARCHAR(64),dept VARCHAR(64)) PARTITIONED BY (yoj VARCHAR(64)) STORED AS avro;

CREATE external TABLE avro2(id INT,name VARCHAR(64),dept VARCHAR(64)) PARTITIONED BY (yoj VARCHAR(64)) STORED AS avro;

I entered data into table avro1 from the Hive console:

INSERT INTO TABLE avro1 PARTITION (yoj = 2015) (id,name,dept) VALUES (1,'Mohan','CS');
INSERT INTO TABLE avro1 PARTITION (yoj = 2015) (id,name,dept) VALUES (2,'Rahul','HR');
INSERT INTO TABLE avro1 PARTITION (yoj = 2016) (id,name,dept) VALUES (3,'Kuldeep','EE');

Then I ran a Spark Structured Streaming application to write data into table avro2. Now, when I read from table avro2, either from the Hive console or using Spark, I get this error:

Failed with exception java.io.IOException: org.apache.hadoop.hive.serde2.avro.AvroSerdeException: Failed to obtain maxLength value for varchar field from file schema: "string"

Could you please try the following commands to insert data into the Hive table from spark-shell:

spark.sql("INSERT INTO TABLE avro1 PARTITION (yoj = 2015) (id,name,dept) VALUES (1,'Mohan','CS')");
spark.sql("INSERT INTO TABLE avro1 PARTITION (yoj = 2015) (id,name,dept) VALUES (2,'Rahul','HR')");
spark.sql("INSERT INTO TABLE avro1 PARTITION (yoj = 2016) (id,name,dept) VALUES (3,'Kuldeep','EE')");
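For context (an assumption about the likely cause, not verified against the asker's environment): Hive's AvroSerDe records a varchar column in the Avro file schema together with a maxLength property, while Spark's Avro writer typically emits a plain "string" type, so the SerDe cannot recover the varchar length when reading the files that the streaming job wrote into avro2. A commonly suggested workaround is to redeclare the varchar data columns as STRING, which maps directly to Avro "string":

```sql
-- Workaround sketch (assumption: plain STRING is acceptable for these columns).
-- STRING maps straight to Avro "string", so the SerDe no longer looks for maxLength.
ALTER TABLE avro2 CHANGE name name STRING;
ALTER TABLE avro2 CHANGE dept dept STRING;
-- The partition column yoj is not stored inside the Avro data files,
-- so it should not need to change.
```

After altering the columns, re-run the failing SELECT from the Hive console or Spark to confirm the read succeeds.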
