[英]Hive on Spark is not working - Failed to create spark client
I am getting an error when running a Hive query with Spark as the execution engine.
Error:
Failed to execute spark task, with exception org.apache.hadoop.hive.ql.metadata.HiveException(Failed to create spark client.)'
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.spark.SparkTask
Hive Console:
hive> set hive.execution.engine=spark;
hive> set spark.master=spark://INBBRDSSVM15.example.com:7077;
hive> set spark.executor.memory=2g;
Hadoop - 2.7.0
Hive - 1.2.1
Spark - 1.6.1
The YARN container memory was smaller than what the Spark executor required. I set the YARN container memory and its maximum to a value greater than Spark executor memory + overhead. Check 'yarn.scheduler.maximum-allocation-mb' and/or 'yarn.nodemanager.resource.memory-mb'.
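With spark.executor.memory=2g, the total container request is 2 GB plus the executor memory overhead (by default the larger of 384 MB or 10% of executor memory in Spark 1.6), i.e. roughly 2432 MB. A sketch of a yarn-site.xml adjustment that would accommodate this; the 4096 MB values are illustrative assumptions, not required values:

```xml
<!-- yarn-site.xml (example values; adjust to your cluster's actual capacity) -->

<!-- Maximum memory YARN will grant to a single container request.
     Must be >= spark.executor.memory + executor memory overhead (~2432 MB here). -->
<property>
  <name>yarn.scheduler.maximum-allocation-mb</name>
  <value>4096</value>
</property>

<!-- Total memory available to containers on each NodeManager.
     Must also be large enough to hold the executor container. -->
<property>
  <name>yarn.nodemanager.resource.memory-mb</name>
  <value>4096</value>
</property>
```

After changing these properties, restart the ResourceManager and NodeManagers so the new limits take effect.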