
Pyspark error instantiating 'org.apache.spark.sql.hive.HiveSessionStateBuilder'

I was trying this code, which I also found on Stack Overflow:

from pyspark.mllib.linalg.distributed import RowMatrix
rows = sc.parallelize([(1, 2, 3), (4, 5, 6), (7, 8, 9), (10, 11, 12)])

# Convert to RowMatrix
mat = RowMatrix(rows)

# Calculate exact and approximate similarities
exact = mat.columnSimilarities()
approx = mat.columnSimilarities(0.05)

# Expected output (from the original post):
exact.entries.collect()
[MatrixEntry(0, 2, 0.991935352214),
 MatrixEntry(1, 2, 0.998441152599),
 MatrixEntry(0, 1, 0.997463284056)]

Then, when I ran exact.entries, I got this error:

---------------------------------------------------------------------------
IllegalArgumentException                  Traceback (most recent call last)
<ipython-input-3-912f64c8ec62> in <module>()
----> 1 print exact.entries

D:\opt\spark\spark-2.2.0-bin-hadoop2.7\python\pyspark\mllib\linalg\distributed.pyc in entries(self)
    824         # DataFrame on the Scala/Java side. Then we map each Row in
    825         # the DataFrame back to a MatrixEntry on this side.
--> 826         entries_df = callMLlibFunc("getMatrixEntries", self._java_matrix_wrapper._java_model)
    827         entries = entries_df.rdd.map(lambda row: MatrixEntry(row[0], row[1], row[2]))
    828         return entries

D:\opt\spark\spark-2.2.0-bin-hadoop2.7\python\pyspark\mllib\common.pyc in callMLlibFunc(name, *args)
    128     sc = SparkContext.getOrCreate()
    129     api = getattr(sc._jvm.PythonMLLibAPI(), name)
--> 130     return callJavaFunc(sc, api, *args)
    131 
    132 

D:\opt\spark\spark-2.2.0-bin-hadoop2.7\python\pyspark\mllib\common.pyc in callJavaFunc(sc, func, *args)
    121     """ Call Java Function """
    122     args = [_py2java(sc, a) for a in args]
--> 123     return _java2py(sc, func(*args))
    124 
    125 

D:\opt\spark\spark-2.2.0-bin-hadoop2.7\python\lib\py4j-0.10.4-src.zip\py4j\java_gateway.py in __call__(self, *args)
   1131         answer = self.gateway_client.send_command(command)
   1132         return_value = get_return_value(
-> 1133             answer, self.gateway_client, self.target_id, self.name)
   1134 
   1135         for temp_arg in temp_args:

D:\opt\spark\spark-2.2.0-bin-hadoop2.7\python\pyspark\sql\utils.pyc in deco(*a, **kw)
     77                 raise QueryExecutionException(s.split(': ', 1)[1], stackTrace)
     78             if s.startswith('java.lang.IllegalArgumentException: '):
---> 79                 raise IllegalArgumentException(s.split(': ', 1)[1], stackTrace)
     80             raise
     81     return deco

IllegalArgumentException: u"Error while instantiating 'org.apache.spark.sql.hive.HiveSessionStateBuilder':"

I installed Spark on Windows 10 using this guide: https://medium.com/@GalarnykMichael/install-spark-on-windows-pyspark-4498a5d8d66c

Any fix for this? Thanks.

*Update: here is an additional error from cmd. I don't understand it; there actually is a metastore_db in my folder.

18/02/04 16:48:22 ERROR Schema: Failed initialising database.
Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@700d488a, see the next exception for details.
        at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
        at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
        at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
        at org.apache.derby.impl.jdbc.EmbedConnection.bootDatabase(Unknown Source)
        at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
        at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
        at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
        at java.security.AccessController.doPrivileged(Native Method)
        at org.apache.derby.jdbc.InternalDriver.getNewEmbedConnection(Unknown Source)
        at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
        at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
        at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
        at java.sql.DriverManager.getConnection(Unknown Source)
        at java.sql.DriverManager.getConnection(Unknown Source)
        at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
        at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
        at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
        at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:501)
        at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:298)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
        at java.lang.reflect.Constructor.newInstance(Unknown Source)
        at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
        at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
        at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
        at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
        at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
        at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
        at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
        at java.lang.reflect.Method.invoke(Unknown Source)
        at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
        at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
        at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
        at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
        at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
        at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
        at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
        at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
        at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
        at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
        at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
        at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:620)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
        at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
        at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
        at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
        at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
        at java.lang.reflect.Constructor.newInstance(Unknown Source)
        at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
        at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
        at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
        at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1234)
        at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)
        at org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:166)
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
        at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:191)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
        at java.lang.reflect.Constructor.newInstance(Unknown Source)
        at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
        at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:362)
        at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:266)
        at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:66)
        at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:65)
        at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply$mcZ$sp(HiveExternalCatalog.scala:194)
        at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:194)
        at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:194)
        at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
        at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:193)
        at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:105)
        at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:93)
        at org.apache.spark.sql.hive.HiveSessionStateBuilder.externalCatalog(HiveSessionStateBuilder.scala:39)
        at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog$lzycompute(HiveSessionStateBuilder.scala:54)
        at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:52)
        at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:35)
        at org.apache.spark.sql.internal.BaseSessionStateBuilder.build(BaseSessionStateBuilder.scala:289)
        at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$instantiateSessionState(SparkSession.scala:1050)
        at org.apache.spark.sql.SparkSession$$anonfun$sessionState$2.apply(SparkSession.scala:130)
        at org.apache.spark.sql.SparkSession$$anonfun$sessionState$2.apply(SparkSession.scala:130)
        at scala.Option.getOrElse(Option.scala:121)
        at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:129)
        at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:126)
        at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:65)
        at org.apache.spark.sql.SparkSession.createDataFrame(SparkSession.scala:283)
        at org.apache.spark.mllib.api.python.PythonMLLibAPI.getMatrixEntries(PythonMLLibAPI.scala:1207)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
        at java.lang.reflect.Method.invoke(Unknown Source)
        at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
        at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
        at py4j.Gateway.invoke(Gateway.java:280)
        at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
        at py4j.commands.CallCommand.execute(CallCommand.java:79)
        at py4j.GatewayConnection.run(GatewayConnection.java:214)
        at java.lang.Thread.run(Unknown Source)
Caused by: ERROR XJ040: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@700d488a, see the next exception for details.
        at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
        at org.apache.derby.impl.jdbc.SQLExceptionFactory.wrapArgsForTransportAcrossDRDA(Unknown Source)
        ... 113 more
Caused by: ERROR XSDB6: Another instance of Derby may have already booted the database C:\Users\USER\Desktop\ColumnSimilarities\metastore_db.
        at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
        at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
        at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.privGetJBMSLockOnDB(Unknown Source)
        at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.run(Unknown Source)
        at java.security.AccessController.doPrivileged(Native Method)
        at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.getJBMSLockOnDB(Unknown Source)
        at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.boot(Unknown Source)
        at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
        at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
        at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
        at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
        at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
        at org.apache.derby.impl.store.raw.RawStore$6.run(Unknown Source)
        at java.security.AccessController.doPrivileged(Native Method)
        at org.apache.derby.impl.store.raw.RawStore.bootServiceModule(Unknown Source)
        at org.apache.derby.impl.store.raw.RawStore.boot(Unknown Source)
        at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
        at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
        at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
        at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
        at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
        at org.apache.derby.impl.store.access.RAMAccessManager$5.run(Unknown Source)
        at java.security.AccessController.doPrivileged(Native Method)
        at org.apache.derby.impl.store.access.RAMAccessManager.bootServiceModule(Unknown Source)
        at org.apache.derby.impl.store.access.RAMAccessManager.boot(Unknown Source)
        at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
        at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
        at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
        at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
        at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
        at org.apache.derby.impl.db.BasicDatabase$5.run(Unknown Source)
        at java.security.AccessController.doPrivileged(Native Method)
        at org.apache.derby.impl.db.BasicDatabase.bootServiceModule(Unknown Source)
        at org.apache.derby.impl.db.BasicDatabase.bootStore(Unknown Source)
        at org.apache.derby.impl.db.BasicDatabase.boot(Unknown Source)
        at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
        at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
        at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
        at org.apache.derby.impl.services.monitor.BaseMonitor.startProviderService(Unknown Source)
        at org.apache.derby.impl.services.monitor.BaseMonitor.findProviderAndStartService(Unknown Source)
        at org.apache.derby.impl.services.monitor.BaseMonitor.startPersistentService(Unknown Source)
        at org.apache.derby.iapi.services.monitor.Monitor.startPersistentService(Unknown Source)
        at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
        at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
        at java.security.AccessController.doPrivileged(Native Method)
        at org.apache.derby.impl.jdbc.EmbedConnection.startPersistentService(Unknown Source)
        ... 110 more

*Update

I used this portion of code, which I found in the documentation, and it was working:

from pyspark.mllib.linalg.distributed import RowMatrix, IndexedRowMatrix, CoordinateMatrix, MatrixEntry

#Create an RDD of coordinate entries.
#   - This can be done explicitly with the MatrixEntry class:
# entries = sc.parallelize([MatrixEntry(0, 0, 1.2), MatrixEntry(1, 0, 2.1), MatrixEntry(6, 1, 3.7)])
#   - or using (long, long, float) tuples:
entries = sc.parallelize([(1, 2, 3), (4, 5, 6), (7, 8, 9), (10, 11, 12)])
#([(0, 0, 1.2), (1, 0, 2.1), (2, 1, 3.7)])
mat1 = RowMatrix(entries)

exact = mat1.columnSimilarities()
exact.entries.collect()

tl;dr Close the other instance of the Spark application that is using C:\Users\USER\Desktop\ColumnSimilarities\metastore_db and start over.

The exception:

ERROR XSDB6: Another instance of Derby may have already booted the database C:\Users\USER\Desktop\ColumnSimilarities\metastore_db

says it all. You've got two instances of a Spark application up and running in C:\Users\USER\Desktop\ColumnSimilarities. That is not possible with the default Hive external catalog (aka metastore).

Spark SQL uses an external catalog (aka metastore) to manage the metadata of persistent tables, and by default it uses Derby as the underlying database, which allows only a single client to access it at a time.
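As a quick check, you can see whether another process currently has the metastore open before starting a new session. A minimal sketch, assuming metastore_db sits in the current working directory and relying on Derby's db.lck lock file, which exists inside the database folder while the database is booted:

import os

# Derby keeps a db.lck file inside the database directory while it is open.
lock_file = os.path.join(os.getcwd(), "metastore_db", "db.lck")
if os.path.exists(lock_file):
    print("metastore_db appears to be in use by another process")
else:
    print("metastore_db appears to be free")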

You can only have one Spark application running in a given directory (or you'd have to change the directory where the HiveExternalCatalog stores its metadata).
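If you genuinely need two applications side by side, a workaround is to give each one its own metastore location. A minimal sketch, assuming you build the SparkSession yourself (rather than relying on the shell's pre-created one) and using a hypothetical path C:/tmp/app1; javax.jdo.option.ConnectionURL is the Hive/DataNucleus property that controls where Derby creates metastore_db:

from pyspark.sql import SparkSession

spark = (SparkSession.builder
    .appName("ColumnSimilaritiesApp1")
    # Hypothetical per-application metastore path; adjust as needed.
    .config("javax.jdo.option.ConnectionURL",
            "jdbc:derby:;databaseName=C:/tmp/app1/metastore_db;create=true")
    .getOrCreate())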


Did another try using the code found in the documentation (the same snippet as in the update above); it worked.

I had the same error on Spark 2.2.0, so I checked the ports and discovered that some other Spark processes were holding the port. So all you have to do is go to your shell and run the jps command to see whether other Spark processes exist:

user@server:~$ jps

You will see a list of the Java processes that are running on your system, like:

78849 Jps
53409 RemoteInterpreterServer
78627 RemoteInterpreterServer
78515 RemoteInterpreterServer
76244 RemoteInterpreterServer
58566 SparkSubmit
77510 NameNode
74601 ZeppelinServer
17912 Master
77755 SecondaryNameNode
74684 LivyServer
77854 SparkSubmit


Then you have to kill the stray Spark sessions by their process IDs, using the command below:

sudo kill -9 <process id>
ex: sudo kill -9 58566

Or you can use this one:

 sudo killall -9 SparkSubmit

If there is no open Spark session in memory, you will receive "no process found".

Now try to run your code again!

Godspeed
