
Is there a way to force waiting for the end of a Java process in Scala or Spark?

In a Scala app, deployed through Spark, I have a line of code that calls a Java function which executes native C++ code through JNI. This call takes time, and if it is not the only one of its kind running, a resource usage conflict appears with a `*** stack smashing detected ***: <unknown> terminated`.

Here's the call, and its scope:

[spark RDD].mapPartitionsWithIndex(f = (index: Int, it: Iterator[Row]) => {
  val sourceData: String = it.mkString()

  val result: List[List[String]] = new WrapperClass(sourceData, [misc parameters]).getResult

  [writing result to a file]
}).take(1)

My WrapperClass.getResult is very simple and looks like this:

[java call related variables initialization]

UnitexJni.execUnitexTool("UnitexTool {InstallLingResourcePackage -p " + appliedPkg + " -x " + resDir + "} " + "{" + runScriptCmd + "} " + "{InstallLingResourcePackage -p " + appliedPkg + " -x " + resDir + " --uninstall}")

[retrieving, formatting and returning result]

UnitexJni.execUnitexTool() is the Java call.

So I would like to know if there is a way to force waiting for the end of this process before calling it again, using Scala, Java, or Spark functionality.
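Since the crash shows up only when several native calls run at once, one pattern worth trying (not from the original post, just a sketch) is to serialize the JNI calls through a single JVM-wide lock, so that at most one native execution runs per executor JVM. Here `NativeCallGuard` and the `Runnable` stand in for the real `UnitexJni.execUnitexTool` call; both names are hypothetical.

```java
import java.util.concurrent.locks.ReentrantLock;

public class NativeCallGuard {
    // One lock per JVM: at most one native call runs at a time in this process.
    private static final ReentrantLock LOCK = new ReentrantLock();

    // Hypothetical wrapper: the Runnable stands in for the real JNI call.
    public static void runSerialized(Runnable nativeCall) {
        LOCK.lock();
        try {
            nativeCall.run(); // only one thread at a time reaches this point
        } finally {
            LOCK.unlock();
        }
    }

    public static void main(String[] args) throws InterruptedException {
        StringBuilder log = new StringBuilder();
        Thread t1 = new Thread(() -> runSerialized(() -> log.append("A")));
        Thread t2 = new Thread(() -> runSerialized(() -> log.append("B")));
        t1.start(); t2.start();
        t1.join(); t2.join();
        System.out.println(log.length()); // prints 2: both calls ran, one at a time
    }
}
```

Note this only serializes calls inside one JVM; with several Spark executors per machine you would still need to cap concurrency at the cluster level (e.g. via partitioning or executor configuration).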

You could use sys.process._. Pass the shell script path to the process function below. Also, make sure the shell script returns a proper exit code: for example, 0 on success, non-zero on failure. Note the ! at the end of the line: it runs the process and blocks until it returns the exit code. You can also check this tutorial for more details on running quick command lines.

import scala.sys.process.Process

val externalShellScript = Process("sh", Seq(scriptPath)).!
if (externalShellScript != 0) {
  throw new Exception(
    "Error in executing external shell script from " + scriptPath)
}

The Spark job will not continue unless this process finishes. Below is a simple shell script and its output.

touch test.txt
echo "any message"

The output in the console will be:

any message
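If you would rather stay in plain Java (e.g. inside the wrapper class itself) than use Scala's sys.process, the same blocking behaviour is available from `ProcessBuilder`: `waitFor()` does not return until the child process terminates. A minimal sketch, with a hypothetical `runAndWait` helper:

```java
import java.io.IOException;

public class BlockingExec {
    // Runs a command and blocks until it exits, returning its exit code.
    public static int runAndWait(String... command)
            throws IOException, InterruptedException {
        ProcessBuilder pb = new ProcessBuilder(command);
        pb.inheritIO(); // forward the child's stdout/stderr to this console
        Process p = pb.start();
        return p.waitFor(); // blocks until the child process terminates
    }

    public static void main(String[] args) throws Exception {
        int code = runAndWait("sh", "-c", "echo \"any message\"");
        if (code != 0) {
            throw new RuntimeException("Script failed with exit code " + code);
        }
    }
}
```

As with the Scala version, the caller only moves on once the process has exited, and the exit code tells you whether the script succeeded.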
