Couldn't save table in Azure Databricks

I am getting the error below when saving a DataFrame to a table in Azure Databricks:

val employeesDf = Seq(
  ("Rafferty", Some(31)), ("Jones", Some(33)), ("Heisenberg", Some(33)),
  ("Robinson", Some(34)), ("Smith", Some(34)), ("Williams", None)
).toDF("LastName", "DepartmentID")

employeesDf.write.format("parquet").mode("overwrite").saveAsTable("employees_table")

org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:javax.jdo.JDOUserException: Table "partition_keys" has been specified with a primary-key to include column "TBL_ID" but this column is not found in the table. Please check your column specification.
  at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:549)
  at org.datanucleus.api.jdo.JDOPersistenceManager.jdoMakePersistent(JDOPersistenceManager.java:732)
  at org.datanucleus.api.jdo.JDOPersistenceManager.makePersistent(JDOPersistenceManager.java:752)
  at org.apache.hadoop.hive.metastore.ObjectStore.createTable(ObjectStore.java:719)
  at sun.reflect.GeneratedMethodAccessor441.invoke(Unknown Source)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:498)
  at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:108)
  at com.sun.proxy.$Proxy32.createTable(Unknown Source)
  at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_core(HiveMetaStore.java:1261)
  at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_with_environment_context(HiveMetaStore.java:1294)
  at sun.reflect.GeneratedMethodAccessor439.invoke(Unknown Source)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:498)
  at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:105)
  at com.sun.proxy.$Proxy33.create_table_with_environment_context(Unknown Source)
  ...
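The stack trace shows the failure inside the Hive metastore call that registers the table, not while writing the Parquet files. As a quick isolation check, here is a minimal sketch (the output path is an arbitrary example): writing the same DataFrame straight to a storage path with save() never touches the metastore, so it should succeed even while saveAsTable fails.

// Illustrative diagnostic sketch: save() writes the Parquet files to the given
// path without registering anything in the Hive metastore, so it should work
// even while saveAsTable("employees_table") fails with the error above.
employeesDf.write
  .format("parquet")
  .mode("overwrite")
  .save("/tmp/employees_parquet") // arbitrary example path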

I also get an error when running the sample notebook from Databricks that creates a table from a path on DBFS:

%sql
DROP TABLE IF EXISTS diamonds;

CREATE TABLE diamonds
USING csv
OPTIONS (path "/databricks-datasets/Rdatasets/data-001/csv/ggplot2/diamonds.csv", header "true")

Error in SQL statement: AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:javax.jdo.JDOUserException: Table "partition_keys" has been specified with a primary-key to include column "TBL_ID" but this column is not found in the table. Please check your column specification.
  at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:549)
  at org.datanucleus.api.jdo.JDOPersistenceManager.jdoMakePersistent(JDOPersistenceManager.java:732)
  at org.datanucleus.api.jdo.JDOPersistenceManager.makePersistent(JDOPersistenceManager.java:752)
  at org.apache.hadoop.hive.metastore.ObjectStore.createTable(ObjectStore.java:719)
  at sun.reflect.GeneratedMethodAccessor441.invoke(Unknown Source)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:498)
  at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:108)
  at com.sun.proxy.$Proxy32.createTable(Unknown Source)
  at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_core(HiveMetaStore.java:1261)
  at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_with_environment_context(HiveMetaStore.java:1294)
  at sun.reflect.GeneratedMethodAccessor439.invoke(Unknown Source)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:498)
  at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:105)
  at com.sun.proxy.$Proxy33.create_table_with_environment_context(Unknown Source)
  at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:558)
  at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:547)
  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:498)
  at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
  at com.sun.proxy.$Proxy34.createTable(Unknown Source)
  at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:613)
  at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$createTable$1.apply$mcV$sp(HiveClientImpl.scala:528)
  at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$createTable$1.apply(HiveClientImpl.scala:526)
  at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$createTable$1.apply(HiveClientImpl.scala:526)
  at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$withHiveState$1.apply(HiveClientImpl.scala:322)
  at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$retryLocked$1.apply(HiveClientImpl.scala:230)
  at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$retryLocked$1.apply(HiveClientImpl.scala:222)
  at org.apache.spark.sql.hive.client.HiveClientImpl.synchronizeOnObject(HiveClientImpl.scala:266)
  at org.apache.spark.sql.hive.client.HiveClientImpl.retryLocked(HiveClientImpl.scala:222)
  at org.apache.spark.sql.hive.client.HiveClientImpl.withHiveState(HiveClientImpl.scala:305)
  at org.apache.spark.sql.hive.client.HiveClientImpl.createTable(HiveClientImpl.scala:526)
  at org.apache.spark.sql.hive.client.PoolingHiveClient$$anonfun$createTable$1.apply(PoolingHiveClient.scala:286)
  at org.apache.spark.sql.hive.client.PoolingHiveClient$$anonfun$createTable$1.apply(PoolingHiveClient.scala:285)
  at org.apache.spark.sql.hive.client.PoolingHiveClient.withHiveClient(PoolingHiveClient.scala:112)
  at org.apache.spark.sql.hive.client.PoolingHiveClient.createTable(PoolingHiveClient.scala:285)
  at org.apache.spark.sql.hive.HiveExternalCatalog.saveTableIntoHive(HiveExternalCatalog.scala:554)
  at org.apache.spark.sql.hive.HiveExternalCatalog.org$apache$spark$sql$hive$HiveExternalCatalog$$createDataSourceTable(HiveExternalCatalog.scala:461)
  at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$createTable$1.apply$mcV$sp(HiveExternalCatalog.scala:325)
  at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$createTable$1.apply(HiveExternalCatalog.scala:298)
  at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$createTable$1.apply(HiveExternalCatalog.scala:298)
  at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$withClient$1$$anonfun$apply$1.apply(HiveExternalCatalog.scala:141)
  at org.apache.spark.sql.hive.HiveExternalCatalog.org$apache$spark$sql$hive$HiveExternalCatalog$$maybeSynchronized(HiveExternalCatalog.scala:104)
  at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$withClient$1.apply(HiveExternalCatalog.scala:139)
  at com.databricks.backend.daemon.driver.ProgressReporter$.withStatusCode(ProgressReporter.scala:345)
  at com.databricks.backend.daemon.driver.ProgressReporter$.withStatusCode(ProgressReporter.scala:331)
  at com.databricks.spark.util.SparkDatabricksProgressReporter$.withStatusCode(ProgressReporter.scala:34)
  at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:137)
  at org.apache.spark.sql.hive.HiveExternalCatalog.createTable(HiveExternalCatalog.scala:298)
  at org.apache.spark.sql.catalyst.catalog.ExternalCatalogWithListener.createTable(ExternalCatalogWithListener.scala:99)
  at org.apache.spark.sql.catalyst.catalog.SessionCatalog.createTable(SessionCatalog.scala:349)
  at com.databricks.sql.DatabricksSessionCatalog.createTable(DatabricksSessionCatalog.scala:144)
  at org.apache.spark.sql.execution.command.CreateDataSourceTableCommand.run(createDataSourceTables.scala:118)
  at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:72)
  at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:70)
  at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:81)
  at org.apache.spark.sql.Dataset$$anonfun$6.apply(Dataset.scala:205)
  at org.apache.spark.sql.Dataset$$anonfun$6.apply(Dataset.scala:205)
  at org.apache.spark.sql.Dataset$$anonfun$55.apply(Dataset.scala:3424)
  at org.apache.spark.sql.Dataset$$anonfun$55.apply(Dataset.scala:3419)
  at org.apache.spark.sql.execution.SQLExecution$$anonfun$withCustomExecutionEnv$1.apply(SQLExecution.scala:99)
  at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:228)
  at org.apache.spark.sql.execution.SQLExecution$.withCustomExecutionEnv(SQLExecution.scala:85)
  at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:158)
  at org.apache.spark.sql.Dataset.org$apache$spark$sql$Dataset$$withAction(Dataset.scala:3419)
  at org.apache.spark.sql.Dataset.<init>(Dataset.scala:205)
  at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:89)
  at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:696)
  at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:707)
  at com.databricks.backend.daemon.driver.SQLDriverLocal$$anonfun$1.apply(SQLDriverLocal.scala:87)
  at com.databricks.backend.daemon.driver.SQLDriverLocal$$anonfun$1.apply(SQLDriverLocal.scala:33)
  at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
  at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
  at scala.collection.immutable.List.foreach(List.scala:392)
  at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
  at scala.collection.immutable.List.map(List.scala:296)
  at com.databricks.backend.daemon.driver.SQLDriverLocal.executeSql(SQLDriverLocal.scala:33)
  at com.databricks.backend.daemon.driver.SQLDriverLocal.repl(SQLDriverLocal.scala:136)
  at com.databricks.backend.daemon.driver.DriverLocal$$anonfun$execute$8.apply(DriverLocal.scala:323)
  at com.databricks.backend.daemon.driver.DriverLocal$$anonfun$execute$8.apply(DriverLocal.scala:303)
  at com.databricks.logging.UsageLogging$$anonfun$withAttributionContext$1.apply(UsageLogging.scala:235)
  at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
  at com.databricks.logging.UsageLogging$class.withAttributionContext(UsageLogging.scala:230)
  at com.databricks.backend.daemon.driver.DriverLocal.withAttributionContext(DriverLocal.scala:47)
  at com.databricks.logging.UsageLogging$class.withAttributionTags(UsageLogging.scala:268)
  at com.databricks.backend.daemon.driver.DriverLocal.withAttributionTags(DriverLocal.scala:47)
  at com.databricks.backend.daemon.driver.DriverLocal.execute(DriverLocal.scala:303)
  at com.databricks.backend.daemon.driver.DriverWrapper$$anonfun$tryExecutingCommand$2.apply(DriverWrapper.scala:591)
  at com.databricks.backend.daemon.driver.DriverWrapper$$anonfun$tryExecutingCommand$2.apply(DriverWrapper.scala:591)
  at scala.util.Try$.apply(Try.scala:192)
  at com.databricks.backend.daemon.driver.DriverWrapper.tryExecutingCommand(DriverWrapper.scala:586)
  at com.databricks.backend.daemon.driver.DriverWrapper.getCommandOutputAndError(DriverWrapper.scala:477)
  at com.databricks.backend.daemon.driver.DriverWrapper.executeCommand(DriverWrapper.scala:544)
  at com.databricks.backend.daemon.driver.DriverWrapper.runInnerLoop(DriverWrapper.scala:383)
  at com.databricks.backend.daemon.driver.DriverWrapper.runInner(DriverWrapper.scala:330)
  at com.databricks.backend.daemon.driver.DriverWrapper.run(DriverWrapper.scala:216)
  at java.lang.Thread.run(Thread.java:748)
NestedThrowablesStackTrace:
org.datanucleus.exceptions.NucleusUserException: Table "partition_keys" has been specified with a primary-key to include column "TBL_ID" but this column is not found in the table. Please check your column specification.
  at org.datanucleus.store.rdbms.table.ElementContainerTable.applyUserPrimaryKeySpecification(ElementContainerTable.java:217)
  at org.datanucleus.store.rdbms.table.CollectionTable.initialize(CollectionTable.java:240)
  at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.initializeClassTables(RDBMSStoreManager.java:3283)
  at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.addClassTablesAndValidate(RDBMSStoreManager.java:3185)
  at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.run(RDBMSStoreManager.java:2841)
  at org.datanucleus.store.rdbms.AbstractSchemaTransaction.execute(AbstractSchemaTransaction.java:122)
  at org.datanucleus.store.rdbms.RDBMSStoreManager.addClasses(RDBMSStoreManager.java:1605)
  at org.datanucleus.store.AbstractStoreManager.addClass(AbstractStoreManager.java:954)
  at org.datanucleus.store.rdbms.RDBMSStoreManager.getDatastoreClass(RDBMSStoreManager.java:679)
  at org.datanucleus.store.rdbms.RDBMSStoreManager.getPropertiesForGenerator(RDBMSStoreManager.java:2045)
  at org.datanucleus.store.AbstractStoreManager.getStrategyValue(AbstractStoreManager.java:1365)
  at org.datanucleus.ExecutionContextImpl.newObjectId(ExecutionContextImpl.java:3827)
  at org.datanucleus.state.JDOStateManager.setIdentity(JDOStateManager.java:2571)
  at org.datanucleus.state.JDOStateManager.initialiseForPersistentNew(JDOStateManager.java:513)
  at org.datanucleus.state.ObjectProviderFactoryImpl.newForPersistentNew(ObjectProviderFactoryImpl.java:232)
  at org.datanucleus.ExecutionContextImpl.newObjectProviderForPersistentNew(ExecutionContextImpl.java:1414)
  at org.datanucleus.ExecutionContextImpl.persistObjectInternal(ExecutionContextImpl.java:2218)
  at org.datanucleus.ExecutionContextImpl.persistObjectWork(ExecutionContextImpl.java:2065)
  at org.datanucleus.ExecutionContextImpl.persistObject(ExecutionContextImpl.java:191)
  at org.datanucleus.ExecutionContextThreadedImpl.persistObject(ExecutionContextThreadedImpl.java:217)
  at org.datanucleus.api.jdo.JDOPersistenceManager.jdoMakePersistent(JDOPersistenceManager.java:727)
  at org.datanucleus.api.jdo.JDOPersistenceManager.makePersistent(JDOPersistenceManager.java:752)
  at org.apache.hadoop.hive.metastore.ObjectStore.createTable(ObjectStore.java:719)
  at sun.reflect.GeneratedMethodAccessor441.invoke(Unknown Source)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:498)
  at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:108)
  at com.sun.proxy.$Proxy32.createTable(Unknown Source)
  at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_core(HiveMetaStore.java:1261)
  at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_with_environment_context(HiveMetaStore.java:1294)
  at sun.reflect.GeneratedMethodAccessor439.invoke(Unknown Source)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:498)
  at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:105)
  at com.sun.proxy.$Proxy33.create_table_with_environment_context(Unknown Source)
  at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:558)
  at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:547)
  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:498)
  at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
  at com.sun.proxy.$Proxy34.createTable(Unknown Source)
  at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:613)
  at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$createTable$1.apply$mcV$sp(HiveClientImpl.scala:528)
  at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$createTable$1.apply(HiveClientImpl.scala:526)
  at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$createTable$1.apply(HiveClientImpl.scala:526)
  at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$withHiveState$1.apply(HiveClientImpl.scala:322)
  at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$retryLocked$1.apply(HiveClientImpl.scala:230)
  at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$retryLocked$1.apply(HiveClientImpl.scala:222)
  at org.apache.spark.sql.hive.client.HiveClientImpl.synchronizeOnObject(HiveClientImpl.scala:266)
  at org.apache.spark.sql.hive.client.HiveClientImpl.retryLocked(HiveClientImpl.scala:222)
  at org.apache.spark.sql.hive.client.HiveClientImpl.withHiveState(HiveClientImpl.scala:305)
  at org.apache.spark.sql.hive.client.HiveClientImpl.createTable(HiveClientImpl.scala:526)
  at org.apache.spark.sql.hive.client.PoolingHiveClient$$anonfun$createTable$1.apply(PoolingHiveClient.scala:286)
  at org.apache.spark.sql.hive.client.PoolingHiveClient$$anonfun$createTable$1.apply(PoolingHiveClient.scala:285)
  at org.apache.spark.sql.hive.client.PoolingHiveClient.withHiveClient(PoolingHiveClient.scala:112)
  at org.apache.spark.sql.hive.client.PoolingHiveClient.createTable(PoolingHiveClient.scala:285)
  at org.apache.spark.sql.hive.HiveExternalCatalog.saveTableIntoHive(HiveExternalCatalog.scala:554)
  at org.apache.spark.sql.hive.HiveExternalCatalog.org$apache$spark$sql$hive$HiveExternalCatalog$$createDataSourceTable(HiveExternalCatalog.scala:461)
  at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$createTable$1.apply$mcV$sp(HiveExternalCatalog.scala:325)
  at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$createTable$1.apply(HiveExternalCatalog.scala:298)
  at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$createTable$1.apply(HiveExternalCatalog.scala:298)
  at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$withClient$1$$anonfun$apply$1.apply(HiveExternalCatalog.scala:141)
  at org.apache.spark.sql.hive.HiveExternalCatalog.org$apache$spark$sql$hive$HiveExternalCatalog$$maybeSynchronized(HiveExternalCatalog.scala:104)
  at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$withClient$1.apply(HiveExternalCatalog.scala:139)
  at com.databricks.backend.daemon.driver.ProgressReporter$.withStatusCode(ProgressReporter.scala:345)
  at com.databricks.backend.daemon.driver.ProgressReporter$.withStatusCode(ProgressReporter.scala:331)
  at com.databricks.spark.util.SparkDatabricksProgressReporter$.withStatusCode(ProgressReporter.scala:34)
  at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:137)
  at org.apache.spark.sql.hive.HiveExternalCatalog.createTable(HiveExternalCatalog.scala:298)
  at org.apache.spark.sql.catalyst.catalog.ExternalCatalogWithListener.createTable(ExternalCatalogWithListener.scala:99)
  at org.apache.spark.sql.catalyst.catalog.SessionCatalog.createTable(SessionCatalog.scala:349)
  at com.databricks.sql.DatabricksSessionCatalog.createTable(DatabricksSessionCatalog.scala:144)
  at org.apache.spark.sql.execution.command.CreateDataSourceTableCommand.run(createDataSourceTables.scala:118)
  at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:72)
  at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:70)
  at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:81)
  at org.apache.spark.sql.Dataset$$anonfun$6.apply(Dataset.scala:205)
  at org.apache.spark.sql.Dataset$$anonfun$6.apply(Dataset.scala:205)
  at org.apache.spark.sql.Dataset$$anonfun$55.apply(Dataset.scala:3424)
  at org.apache.spark.sql.Dataset$$anonfun$55.apply(Dataset.scala:3419)
  at org.apache.spark.sql.execution.SQLExecution$$anonfun$withCustomExecutionEnv$1.apply(SQLExecution.scala:99)
  at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:228)
  at org.apache.spark.sql.execution.SQLExecution$.withCustomExecutionEnv(SQLExecution.scala:85)
  at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:158)
  at org.apache.spark.sql.Dataset.org$apache$spark$sql$Dataset$$withAction(Dataset.scala:3419)
  at org.apache.spark.sql.Dataset.<init>(Dataset.scala:205)
  at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:89)
  at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:696)
  at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:707)
  at com.databricks.backend.daemon.driver.SQLDriverLocal$$anonfun$1.apply(SQLDriverLocal.scala:87)
  at com.databricks.backend.daemon.driver.SQLDriverLocal$$anonfun$1.apply(SQLDriverLocal.scala:33)
  at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
  at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
  at scala.collection.immutable.List.foreach(List.scala:392)
  at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
  at scala.collection.immutable.List.map(List.scala:296)
  at com.databricks.backend.daemon.driver.SQLDriverLocal.executeSql(SQLDriverLocal.scala:33)
  at com.databricks.backend.daemon.driver.SQLDriverLocal.repl(SQLDriverLocal.scala:136)
  at com.databricks.backend.daemon.driver.DriverLocal$$anonfun$execute$8.apply(DriverLocal.scala:323)
  at com.databricks.backend.daemon.driver.DriverLocal$$anonfun$execute$8.apply(DriverLocal.scala:303)
  at com.databricks.logging.UsageLogging$$anonfun$withAttributionContext$1.apply(UsageLogging.scala:235)
  at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
  at com.databricks.logging.UsageLogging$class.withAttributionContext(UsageLogging.scala:230)
  at com.databricks.backend.daemon.driver.DriverLocal.withAttributionContext(DriverLocal.scala:47)
  at com.databricks.logging.UsageLogging$class.withAttributionTags(UsageLogging.scala:268)
  at com.databricks.backend.daemon.driver.DriverLocal.withAttributionTags(DriverLocal.scala:47)
  at com.databricks.backend.daemon.driver.DriverLocal.execute(DriverLocal.scala:303)
  at com.databricks.backend.daemon.driver.DriverWrapper$$anonfun$tryExecutingCommand$2.apply(DriverWrapper.scala:591)
  at com.databricks.backend.daemon.driver.DriverWrapper$$anonfun$tryExecutingCommand$2.apply(DriverWrapper.scala:591)
  at scala.util.Try$.apply(Try.scala:192)
  at com.databricks.backend.daemon.driver.DriverWrapper.tryExecutingCommand(DriverWrapper.scala:586)
  at com.databricks.backend.daemon.driver.DriverWrapper.getCommandOutputAndError(DriverWrapper.scala:477)
  at com.databricks.backend.daemon.driver.DriverWrapper.executeCommand(DriverWrapper.scala:544)
  at com.databricks.backend.daemon.driver.DriverWrapper.runInnerLoop(DriverWrapper.scala:383)
  at com.databricks.backend.daemon.driver.DriverWrapper.runInner(DriverWrapper.scala:330)
  at com.databricks.backend.daemon.driver.DriverWrapper.run(DriverWrapper.scala:216)
  at java.lang.Thread.run(Thread.java:748)
);
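The SQL version fails at the same point: CREATE TABLE ... USING csv writes no data at all, it only records the table definition, and it is that metastore registration that throws. As an illustrative sketch (assuming a Databricks notebook where spark is in scope), reading the file directly and registering it as a session-scoped temp view avoids the metastore entirely:

// Illustrative workaround sketch: a temp view lives in the session's in-memory
// catalog, never in the Hive metastore, so it sidesteps the failing call.
val diamonds = spark.read
  .format("csv")
  .option("header", "true")
  .load("/databricks-datasets/Rdatasets/data-001/csv/ggplot2/diamonds.csv")

diamonds.createOrReplaceTempView("diamonds") // queryable from %sql as diamonds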

I am able to run the same query without any problem:

[screenshot: the same commands completing successfully in a Databricks notebook]

This issue looks strange. For a deeper investigation and immediate assistance, you can file a support ticket if you have a support plan.

For more details, refer to the Azure Databricks Quickstart guide.

This error is caused by an incorrect configuration of the external Hive metastore.
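For context, an illustrative sketch (assuming a Databricks notebook where spark is in scope, and a cluster configured with the documented external-metastore keys): printing the metastore-related settings makes a mismatch visible. This particular DataNucleus complaint is typical of a spark.sql.hive.metastore.version that does not match the schema actually present in the metastore database.

// Illustrative check: print the external-metastore settings so a version or
// JDBC mismatch can be spotted. Assumes `spark` (SparkSession) is in scope.
val metastoreKeys = Seq(
  "spark.sql.hive.metastore.version",            // must match the metastore DB schema
  "spark.sql.hive.metastore.jars",
  "spark.hadoop.javax.jdo.option.ConnectionURL",
  "spark.hadoop.javax.jdo.option.ConnectionDriverName",
  "spark.hadoop.datanucleus.autoCreateSchema",   // whether a missing schema is created
  "spark.hadoop.datanucleus.fixedDatastore"
)
metastoreKeys.foreach { key =>
  // ConnectionUserName and ConnectionPassword are deliberately not printed.
  println(s"$key = ${spark.conf.getOption(key).getOrElse("<not set>")}")
}

If the configured Hive version and the metastore database schema disagree, or the schema was never initialized, aligning them (or pointing the cluster back at the default Databricks-managed metastore) is the usual fix.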
