
SBT gives java.lang.NullPointerException when trying to run spark

I'm trying to compile Spark with sbt 1.7.2 on a Linux machine running CentOS 6.

When I try to run the clean command: ./build/sbt clean

I get the following output:

java.lang.NullPointerException
    at sun.net.util.URLUtil.urlNoFragString(URLUtil.java:50)
    at sun.misc.URLClassPath.getLoader(URLClassPath.java:526)
    at sun.misc.URLClassPath.getNextLoader(URLClassPath.java:498)
    at sun.misc.URLClassPath.getResource(URLClassPath.java:252)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:363)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:362)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:419)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:406)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:406)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:406)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:406)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:352)
    at sbt.internal.XMainConfiguration.run(XMainConfiguration.java:51)
    at sbt.xMain.run(Main.scala:46)
    at xsbt.boot.Launch$.$anonfun$run$1(Launch.scala:149)
    at xsbt.boot.Launch$.withContextLoader(Launch.scala:176)
    at xsbt.boot.Launch$.run(Launch.scala:149)
    at xsbt.boot.Launch$.$anonfun$apply$1(Launch.scala:44)
    at xsbt.boot.Launch$.launch(Launch.scala:159)
    at xsbt.boot.Launch$.apply(Launch.scala:44)
    at xsbt.boot.Launch$.apply(Launch.scala:21)
    at xsbt.boot.Boot$.runImpl(Boot.scala:78)
    at xsbt.boot.Boot$.run(Boot.scala:73)
    at xsbt.boot.Boot$.main(Boot.scala:21)
    at xsbt.boot.Boot.main(Boot.scala)
[error] [launcher] error during sbt launcher: java.lang.NullPointerException

It also happens when I use sbt 1.7.3, but clean and compile succeed when I use sbt 1.6.2.

What should I check first? I'd really appreciate any advice anyone can offer.

Some advice on how to debug Spark and sbt.

How to build Spark in IntelliJ.

Clone https://github.com/apache/spark and open it in IntelliJ as an sbt project.

I had to execute sbt compile and re-open the project before I could run my code in IntelliJ (before that I had the error object SqlBaseParser is not a member of package org.apache.spark.sql.catalyst.parser). For example, I can put the following object in sql/core/src/main/scala and run/debug it in IntelliJ:

// scalastyle:off
import org.apache.spark.sql.{Dataset, SparkSession}

object MyMain extends App {
  val spark = SparkSession.builder()
    .master("local")
    .appName("SparkTestApp")
    .getOrCreate()

  case class Person(id: Long, name: String)

  import spark.implicits._

  val df: Dataset[Person] = spark.range(10).map(i => Person(i, i.toString))

  df.show()

//+---+----+
//| id|name|
//+---+----+
//|  0|   0|
//|  1|   1|
//|  2|   2|
//|  3|   3|
//|  4|   4|
//|  5|   5|
//|  6|   6|
//|  7|   7|
//|  8|   8|
//|  9|   9|
//+---+----+

}

I also clicked Run npm install and Load Maven project when those pop-up windows appeared, but I haven't noticed any difference.

Also, at one point I had to keep only one source root in Project Structure under sql/catalyst/target/scala-2.12/src_managed, namely sql/catalyst/target/scala-2.12/src_managed/main (and not sql/catalyst/target/scala-2.12/src_managed/main/antlr4). Before that I had errors like SqlBaseLexer is already defined as class SqlBaseLexer.

Build Apache Spark Source Code with IntelliJ IDEA: https://yujheli-wordpress-com.translate.goog/2020/03/26/build-apache-spark-source-code-with-intellij-idea/?_x_tr_sl=auto&_x_tr_tl=en&_x_tr_hl=uk&_x_tr_pto=wapp (original in Chinese: https://yujheli.wordpress.com/2020/03/26/build-apache-spark-source-code-with-intellij-idea/)

Why does building Spark sources give "object sbt is not a member of package com.typesafe"?

How to build sbt in IntelliJ.

sbt itself is tricky (https://www.lihaoyi.com/post/SowhatswrongwithSBT.html), and building it is a little tricky too.

Clone https://github.com/sbt/sbt and open it in IntelliJ. Let's try to run the previous Spark code using this cloned sbt.

sbt doesn't seem to be intended to run in a specified directory. I put the following object in client/src/main/scala:

object MyClient extends App {
  System.setProperty("user.dir", "../spark")
  sbt.client.Client.main(Array("sql/runMain MyMain"))
}

(Generally, mutating the system property user.dir is not recommended: How to use "cd" command using Java runtime?)
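For illustration, here's a minimal, self-contained sketch (the file name data.txt is arbitrary) of why this trick is fragile: java.io.File resolves relative paths against the user.dir property, so the change appears to work, but the JVM's OS-level working directory does not actually change, and code that uses the real cwd (native code, already-opened streams) won't see it.

```scala
import java.io.File

object UserDirDemo extends App {
  // Resolved against the directory the JVM was started in.
  println(new File("data.txt").getAbsolutePath)

  // Mutate the property, as the thin-client trick above does.
  System.setProperty("user.dir", "/tmp")

  // java.io.File re-reads user.dir, so this now resolves under /tmp,
  // even though the process's real working directory is unchanged.
  println(new File("data.txt").getAbsolutePath)
}
```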

I had to execute sbt compile first (this includes the command sbt generateContrabands; sbt uses the sbt plugin sbt-contraband (ContrabandPlugin, JsonCodecPlugin), formerly sbt-datatype, for code generation: https://github.com/sbt/contraband, https://www.scala-sbt.org/contraband/, https://www.scala-sbt.org/1.x/docs/Datatype.html, https://github.com/eed3si9n/gigahorse/tree/develop/core/src/main/contraband). Before that I had the error not found: value ScalaKeywords.

The next error is type ExcludeItem is not a member of package sbt.internal.bsp. You can simply remove the files ExcludeItemFormats.scala, ExcludesItemFormats.scala, ExcludesParamsFormats.scala, and ExcludesResultFormats.scala from protocol/src/main/contraband-scala/sbt/internal/bsp/codec. They are outdated auto-generated files. You can verify this: if you remove the contents of the directory protocol/src/main/contraband-scala (the root for auto-generated sources) and execute sbt generateContrabands, all the files except these four will be restored. For some reason these files didn't confuse sbt but did confuse IntelliJ.

Now, while running, MyClient produces

//[info] +---+----+
//[info] | id|name|
//[info] +---+----+
//[info] |  0|   0|
//[info] |  1|   1|
//[info] |  2|   2|
//[info] |  3|   3|
//[info] |  4|   4|
//[info] |  5|   5|
//[info] |  6|   6|
//[info] |  7|   7|
//[info] |  8|   8|
//[info] |  9|   9|
//[info] +---+----+

sbt.client.Client is called the thin client. Alternatively, you can publish it locally and use it as a dependency:

build.sbt (https://github.com/sbt/sbt/blob/v1.8.0/build.sbt#L1160):

lazy val sbtClientProj = (project in file("client"))
  .enablePlugins(NativeImagePlugin)
  .dependsOn(commandProj)
  .settings(
    commonBaseSettings,
    scalaVersion := "2.12.11",
    publish / skip := false, // change true to false
    name := "sbt-client",
    .......

sbt publishLocal

A new project:

build.sbt

scalaVersion := "2.12.17"

// ~/.ivy2/local/org.scala-sbt/sbt-client/1.8.1-SNAPSHOT/jars/sbt-client.jar
libraryDependencies += "org.scala-sbt" % "sbt-client" % "1.8.1-SNAPSHOT"

src/main/scala/Main.scala

object Main extends App {
  System.setProperty("user.dir", "../spark")
  sbt.client.Client.main(Array("sql/runMain MyMain"))
  
  //[info] +---+----+
  //[info] | id|name|
  //[info] +---+----+
  //[info] |  0|   0|
  //[info] |  1|   1|
  //[info] |  2|   2|
  //[info] |  3|   3|
  //[info] |  4|   4|
  //[info] |  5|   5|
  //[info] |  6|   6|
  //[info] |  7|   7|
  //[info] |  8|   8|
  //[info] |  9|   9|
  //[info] +---+----+
}

But the thin client is not how sbt normally runs. sbt.xMain from your stack trace is from https://github.com/sbt/sbt, namely https://github.com/sbt/sbt/blob/1.8.x/main/src/main/scala/sbt/Main.scala#L44. But xsbt.boot.Boot from the stack trace is not from this repo; it's from https://github.com/sbt/launcher, namely https://github.com/sbt/launcher/blob/1.x/launcher-implementation/src/main/scala/xsbt/boot/Boot.scala

The thing is that sbt runs in two steps. The sbt executable (usually downloaded from https://www.scala-sbt.org/download.html#universal-packages) is a shell script; first it runs sbt-launch.jar (the object xsbt.boot.Boot):

https://github.com/sbt/sbt/blob/v1.8.0/sbt#L507-L512

execRunner "$java_cmd" \
  "${java_args[@]}" \
  "${sbt_options[@]}" \
  -jar "$sbt_jar" \
  "${sbt_commands[@]}" \
  "${residual_args[@]}"

and second, the latter reflectively calls sbt (the class sbt.xMain):

https://github.com/sbt/launcher/blob/v1.4.1/launcher-implementation/src/main/scala/xsbt/boot/Launch.scala#L147-L149

val main = appProvider.newMain()
try {
  withContextLoader(appProvider.loader)(main.run(appConfig))

https://github.com/sbt/launcher/blob/v1.4.1/launcher-implementation/src/main/scala/xsbt/boot/Launch.scala#L496

// implementation of the above appProvider.newMain()
else if (AppMainClass.isAssignableFrom(entryPoint)) mainClass.newInstance

https://github.com/sbt/launcher/blob/v1.4.1/launcher-implementation/src/main/scala/xsbt/boot/PlainApplication.scala#L13

// implementation of the above main.run(appConfig)
mainMethod.invoke(null, configuration.arguments).asInstanceOf[xsbti.Exit]

Then xMain#run, via XMainConfiguration#run, reflectively calls xMain.run:

https://github.com/sbt/sbt/blob/v1.8.0/main/src/main/scala/sbt/Main.scala#L44-L47

class xMain extends xsbti.AppMain {
  def run(configuration: xsbti.AppConfiguration): xsbti.MainResult =
    new XMainConfiguration().run("xMain", configuration)
}

https://github.com/sbt/sbt/blob/v1.8.0/main/src/main/java/sbt/internal/XMainConfiguration.java#L51-L57

Class<?> clazz = loader.loadClass("sbt." + moduleName + "$");
Object instance = clazz.getField("MODULE$").get(null);
Method runMethod = clazz.getMethod("run", xsbti.AppConfiguration.class);
try {
  .....
  return (xsbti.MainResult) runMethod.invoke(instance, updatedConfiguration);

Then it downloads and runs the necessary version of Scala (specified in build.sbt) and the necessary version of the rest of sbt (specified in project/build.properties).
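For reference, project/build.properties is just a one-line properties file; pinning it to a version that works (for instance the 1.6.2 that succeeded for you, shown here as an illustration) is a common workaround while debugging:

```
sbt.version=1.6.2
```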

What the launcher is.

Let's consider a hello-world for the launcher.

The launcher consists of a library (interfaces):

https://mvnrepository.com/artifact/org.scala-sbt/launcher-interface

https://github.com/sbt/launcher/tree/1.x/launcher-interface

and the launcher runnable jar:

https://mvnrepository.com/artifact/org.scala-sbt/launcher

https://github.com/sbt/launcher/tree/1.x/launcher-implementation/src

Create a project (depending on the launcher interfaces at compile time):

build.sbt

lazy val root = (project in file("."))
  .settings(
    name := "scalademo",
    organization := "com.example",
    version := "0.1.0-SNAPSHOT",
    scalaVersion := "2.13.10",
    libraryDependencies ++= Seq(
      "org.scala-sbt" % "launcher-interface" % "1.4.1" % Provided,
    ),
  )

src/main/scala/mypackage/Main.scala (this class will be the entry point when working with the launcher):

package mypackage

import xsbti.{AppConfiguration, AppMain, Exit, MainResult}

class Main extends AppMain {
  def run(configuration: AppConfiguration): MainResult = {
    val scalaVersion = configuration.provider.scalaProvider.version

    println(s"Hello, World! Running Scala $scalaVersion")
    configuration.arguments.foreach(println)

    new Exit {
      override val code: Int = 0
    }
  }
}

Run sbt publishLocal. The project jar will be published at ~/.ivy2/local/com.example/scalademo_2.13/0.1.0-SNAPSHOT/jars/scalademo_2.13.jar

Download the launcher runnable jar: https://repo1.maven.org/maven2/org/scala-sbt/launcher/1.4.1/launcher-1.4.1.jar

Create a launcher configuration:

my.app.configuration

[scala]
  version: 2.13.10
[app]
  org: com.example
  name: scalademo
  version: 0.1.0-SNAPSHOT
  class: mypackage.Main
  cross-versioned: binary
[repositories]
  local
  maven-central
[boot]
  directory: ${user.home}/.myapp/boot

Then the command java -jar launcher-1.4.1.jar @my.app.configuration a b c produces

//Hello, World! Running Scala 2.13.10
//a
//b
//c

The following files appeared:

~/.myapp/boot/scala-2.13.10/com.example/scalademo/0.1.0-SNAPSHOT
  scalademo_2.13.jar
  scala-library-2.13.10.jar
~/.myapp/boot/scala-2.13.10/lib
  java-diff-utils-4.12.jar
  jna-5.9.0.jar
  jline-3.21.0.jar
  scala-library.jar
  scala-compiler.jar
  scala-reflect.jar

So the launcher helps to run an application in environments with only Java installed (Scala is not necessary); Ivy dependency resolution is used. There are features to handle return codes, reboot the application with a different Scala version, launch servers, etc.

Alternatively, either of the following commands can be used:

java -Dsbt.boot.properties=my.app.configuration -jar launcher-1.4.1.jar
java -jar launcher-repacked.jar      # put my.app.configuration at sbt/sbt.boot.properties inside the jar and repack it

https://www.scala-sbt.org/1.x/docs/Launcher-Getting-Started.html

How to run sbt with the launcher.

sbt (https://github.com/sbt/sbt) uses the sbt plugin SbtLauncherPlugin (https://github.com/sbt/sbt/blob/v1.8.0/project/SbtLauncherPlugin.scala) so that from the raw launcher

https://github.com/sbt/launcher/tree/1.x/launcher-implementation/src

https://mvnrepository.com/artifact/org.scala-sbt/launcher

it builds sbt-launch:

https://github.com/sbt/sbt/tree/v1.8.0/launch

https://mvnrepository.com/artifact/org.scala-sbt/sbt-launch

Basically, sbt-launch differs from launcher in having the default config sbt.boot.properties injected.

If we'd like to run sbt with the launcher, we should find a way to specify a working directory for sbt (similarly to what we did with the thin client).

The working directory can be set either 1) in sbt.xMain (sbt) or 2) in xsbt.boot.Boot (sbt-launcher).

1) Make sbt.xMain non-final so that it can be extended:

/*final*/ class xMain extends xsbti.AppMain { 
...........

https://github.com/sbt/sbt/blob/v1.8.0/main/src/main/scala/sbt/Main.scala#L44

Put a new class in main/src/main/scala (a launcher-style entry point):

import sbt.xMain
import xsbti.{ AppConfiguration, AppProvider, MainResult }
import java.io.File

class MyXMain extends xMain {
  override def run(configuration: AppConfiguration): MainResult = {
    val args = configuration.arguments

    val (dir, rest) =
      if (args.length >= 1 && args(0).startsWith("dir=")) {
        (
          Some(args(0).stripPrefix("dir=")),
          args.drop(1)
        )
      } else {
        (None, args)
      }

    dir.foreach { dir =>
      System.setProperty("user.dir", dir)
    }

    // xMain.run(new AppConfiguration { // not ok
    // new xMain().run(new AppConfiguration { // not ok
    super[xMain].run(new AppConfiguration { // ok
      override val arguments: Array[String] = rest
      override val baseDirectory: File =
        dir.map(new File(_)).getOrElse(configuration.baseDirectory)
      override val provider: AppProvider = configuration.provider
    })
  }
}

sbt publishLocal

my.sbt.configuration

[scala]
  version: auto
  #version: 2.12.17
[app]
  org: org.scala-sbt
  name: sbt
  #name: main  # not ok
  version: 1.8.1-SNAPSHOT
  class: MyXMain
  #class: sbt.xMain
  components: xsbti,extra
  cross-versioned: false
  #cross-versioned: binary
[repositories]
  local
  maven-central
[boot]
  directory: ${user.home}/.mysbt/boot
[ivy]
  ivy-home: ${user.home}/.ivy2

A command:

java -jar launcher-1.4.1.jar @my.sbt.configuration dir=/path_to_spark/spark "sql/runMain MyMain"

or

java -jar sbt-launch.jar @my.sbt.configuration dir=/path_to_spark/spark "sql/runMain MyMain"

//[info] +---+----+
//[info] | id|name|
//[info] +---+----+
//[info] |  0|   0|
//[info] |  1|   1|
//[info] |  2|   2|
//[info] |  3|   3|
//[info] |  4|   4|
//[info] |  5|   5|
//[info] |  6|   6|
//[info] |  7|   7|
//[info] |  8|   8|
//[info] |  9|   9|
//[info] +---+----+

(sbt-launch.jar is taken from ~/.ivy2/local/org.scala-sbt/sbt-launch/1.8.1-SNAPSHOT/jars, or just from https://mvnrepository.com/artifact/org.scala-sbt/sbt-launch since we haven't modified the launcher yet)

I had to copy scalastyle-config.xml from spark; otherwise it wasn't found.

I still get the warning fatal: Not a git repository (or any parent up to mount parent...) Stopping at filesystem boundary (GIT_DISCOVERY_ACROSS_FILESYSTEM not set).

2)

project/Dependencies.scala (https://github.com/sbt/sbt/blob/v1.8.0/project/Dependencies.scala#L25):

val launcherVersion = "1.4.2-SNAPSHOT" // modified

Clone https://github.com/sbt/launcher and make the following changes:

build.sbt (https://github.com/sbt/launcher/blob/v1.4.1/build.sbt#L11):

ThisBuild / version := {
  val orig = (ThisBuild / version).value
  if (orig.endsWith("-SNAPSHOT")) "1.4.2-SNAPSHOT" // modified
  else orig
}

launcher-implementation/src/main/scala/xsbt/boot/Launch.scala (https://github.com/sbt/launcher/blob/v1.4.1/launcher-implementation/src/main/scala/xsbt/boot/Launch.scala#L17, #L21):

class LauncherArguments(
    val args: List[String],
    val isLocate: Boolean,
    val isExportRt: Boolean,
    val dir: Option[String] = None // added
)

object Launch {
  def apply(arguments: LauncherArguments): Option[Int] =
    apply((new File(arguments.dir.getOrElse(""))).getAbsoluteFile, arguments) // modified

  .............

launcher-implementation/src/main/scala/xsbt/boot/Boot.scala (https://github.com/sbt/launcher/blob/v1.4.1/launcher-implementation/src/main/scala/xsbt/boot/Boot.scala#L41-L67):

  def parseArgs(args: Array[String]): LauncherArguments = {
    @annotation.tailrec
    def parse(
        args: List[String],
        isLocate: Boolean,
        isExportRt: Boolean,
        remaining: List[String],
        dir: Option[String] // added
    ): LauncherArguments =
      args match {
        ...................
        case "--locate" :: rest        => parse(rest, true, isExportRt, remaining, dir) // modified
        case "--export-rt" :: rest     => parse(rest, isLocate, true, remaining, dir) // modified

          // added
        case "--mydir" :: next :: rest => parse(rest, isLocate, isExportRt, remaining, Some(next))

        case next :: rest              => parse(rest, isLocate, isExportRt, next :: remaining, dir) // modified
        case Nil                       => new LauncherArguments(remaining.reverse, isLocate, isExportRt, dir) // modified
      }
    parse(args.toList, false, false, Nil, None)
  }

sbt-launcher: sbt publishLocal

sbt: sbt publishLocal

my.sbt.configuration

[scala]
  version: auto
[app]
  org: org.scala-sbt
  name: sbt
  version: 1.8.1-SNAPSHOT
  #class: MyXMain
  class: sbt.xMain
  components: xsbti,extra
  cross-versioned: false
[repositories]
  local
  maven-central
[boot]
  directory: ${user.home}/.mysbt/boot
[ivy]
  ivy-home: ${user.home}/.ivy2

A command:

java -jar launcher-1.4.2-SNAPSHOT.jar @my.sbt.configuration --mydir /path_to_spark/spark "sql/runMain MyMain"

or

java -jar sbt-launch.jar @my.sbt.configuration --mydir /path_to_spark/spark "sql/runMain MyMain"

or

java -jar sbt-launch.jar --mydir /path_to_spark/spark "sql/runMain MyMain" (using the default sbt.boot.properties rather than my.sbt.configuration)

(we're using the modified launcher, or the new sbt-launch built with this modified launcher).

Alternatively, we can specify "program arguments" in the "Run configuration" for xsbt.boot.Boot in IntelliJ:

@/path_to_sbt_config/my.sbt.configuration --mydir /path_to_spark/spark "sql/runMain MyMain"

It's also possible to specify the working directory /path_to_spark/spark in the "Run configuration" in IntelliJ. Then the remaining "program arguments" are

@/path_to_sbt_config/my.sbt.configuration "sql/runMain MyMain"

I tried to use "org.scala-sbt" % "launcher" % "1.4.2-SNAPSHOT" or "org.scala-sbt" % "sbt-launch" % "1.8.1-SNAPSHOT" as a dependency, but got No RuntimeVisibleAnnotations in classfile with ScalaSignature attribute: class Boot.

Your setup.

So we can run/debug sbt-launcher code in IntelliJ and/or with printlns, and run/debug sbt code with printlns (because there is no runnable object).

From your stack trace I suspect that one of the classloader urls is null:

https://github.com/openjdk/jdk/blob/jdk8-b120/jdk/src/share/classes/sun/misc/URLClassPath.java#L82

Maybe you can add to sbt.xMain#run or MyXMain#run something like

import java.net.URLClassLoader

// Walk the classloader hierarchy and print the URLs of every URLClassLoader.
var cl = getClass.getClassLoader
while (cl != null) {
  println(s"classloader: ${cl.getClass.getName}")
  cl match {
    case cl: URLClassLoader =>
      println("classloader urls:")
      cl.getURLs.foreach(println)
    case _ =>
      println("not URLClassLoader")
  }
  cl = cl.getParent
}

in order to see which url is null.

https://www.scala-sbt.org/1.x/docs/Developers-Guide.html

https://github.com/sbt/sbt/blob/1.8.x/DEVELOPING.md
