
Databricks notebook hangs forever

I am doing some DataFrame work for ETL; the DataFrame is read from an Azure data warehouse. Somehow the notebook seems to hang forever, but I don't know where it is stuck or why it hangs so long.

(screenshot omitted)

Does anyone have an idea or experience with this?

There are various rare scenarios and corner cases that can cause a streaming or batch job to hang. It is also possible for a job to hang because the Databricks internal metastore has become corrupted.

Often restarting the cluster or creating a new one resolves the problem. In other cases, run the script from the article linked below to unhang the job and collect notebook information, which can be provided to Databricks Support.
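Before escalating, a quick first-pass diagnostic is to dump the stack of every live driver-side Python thread, which shows where the notebook's Python code is blocked. Below is a minimal sketch using only the standard library; `dump_python_threads` is a name chosen here for illustration, not a Databricks or Spark API (note it only covers the Python side, not JVM executor threads):

```python
import sys
import threading
import traceback

def dump_python_threads():
    """Return a formatted stack trace for every live Python thread.

    Useful when a notebook cell appears hung: the report shows the
    exact line where each driver-side Python thread is blocked.
    """
    # Map thread ids to human-readable names for the report.
    names = {t.ident: t.name for t in threading.enumerate()}
    lines = []
    # sys._current_frames() returns the current frame of each thread.
    for ident, frame in sys._current_frames().items():
        lines.append(f"--- Thread {names.get(ident, '?')} (id {ident}) ---")
        lines.extend(traceback.format_stack(frame))
    return "\n".join(lines)

# Run this in a separate cell (or a background thread) while the
# suspect cell is hanging, then attach the output to a support ticket.
print(dump_python_threads())
```

If the Python stacks look idle, the hang is more likely on the JVM side (e.g. a stuck Spark stage), in which case the executor thread dumps in the Spark UI are the next place to look.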

For more details, refer to "How to Resolve Job Hangs and Collect Diagnostic Information".

Hope this helps.

