
Running Java Jar with included config via maven on flink yarn cluster

I am using flink in a maven/java project and need to bundle my configuration inside the jar that gets created.

So I have added the following to my pom file. It includes all of my yml config files (located in the src/main/resources folder) in the jar; the name of the one to use is passed as an argument at execution time.

    <resources>
        <resource>
            <directory>src/main/resources</directory>
            <includes>
                <include>**/*.yml</include>
            </includes>
        </resource>
    </resources>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-shade-plugin</artifactId>
            <version>2.4.3</version>
            <executions>
                <execution>
                    <phase>package</phase>
                    <goals>
                        <goal>shade</goal>
                    </goals>
                    <configuration>
                        <filters>
                            <filter>
                                <artifact>*:*</artifact>
                                <excludes>
                                    <exclude>META-INF/*.SF</exclude>
                                    <exclude>META-INF/*.DSA</exclude>
                                    <exclude>META-INF/*.RSA</exclude>
                                </excludes>
                            </filter>
                        </filters>
                        <finalName>${project.artifactId}-${project.version}</finalName>
                        <shadedArtifactAttached>true</shadedArtifactAttached>
                        <transformers>
                            <transformer
                                    implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                                <mainClass>com.example.MyApplication</mainClass>
                            </transformer>
                        </transformers>
                    </configuration>
                </execution>
            </executions>
        </plugin>
    </plugins>

The main class code below receives an arg, based on which I decide which config to pick from the resources, read it (using snakeyaml), and use it.

public static void main(String[] args) throws Exception {
    final ParameterTool parameterTool = ParameterTool.fromArgs(args);

    ClassLoader classLoader = MyApplication.class.getClassLoader();
    Yaml yaml = new Yaml();

    String filename = parameterTool.getRequired("configFilename");

    InputStream in  = classLoader.getSystemResourceAsStream(filename);
    MyConfigClass myConfig = yaml.loadAs(in, MyConfigClass.class);

    ...

}
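
For reference, here is a minimal sketch of what the config class could look like so that yaml.loadAs can populate it. MyConfigClass's fields (jobName, parallelism) and the example YAML are hypothetical, not taken from the original project; SnakeYAML only needs a no-arg constructor and getters/setters matching the YAML keys.

// Hypothetical config POJO (field names are illustrative).
// A matching src/main/resources/dev-config.yml could look like:
//   jobName: my-flink-job
//   parallelism: 4
public class MyConfigClass {
    private String jobName;
    private int parallelism;

    public MyConfigClass() {}  // no-arg constructor required by SnakeYAML

    public String getJobName() { return jobName; }
    public void setJobName(String jobName) { this.jobName = jobName; }

    public int getParallelism() { return parallelism; }
    public void setParallelism(int parallelism) { this.parallelism = parallelism; }
}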

Running mvn clean install creates "my-shaded-jar.jar".

I execute it with the command:

java -jar /path/to/my-shaded-jar.jar --configFilename filename

The jar works on multiple systems if I share it with others.

However, when I try to run the same jar in a YARN cluster on Hadoop with the following command, I run into a problem:

HADOOP_CLASSPATH=`hadoop classpath` HADOOP_CONF_DIR=/etc/hadoop/conf ./flink-1.6.2/bin/flink run -m yarn-cluster -yd -yn 5 -ys 30 -yjm 10240 -ytm 10240 -yst -ynm some-job-name -yqu queue-name ./my-shaded-jar.jar --configFilename filename

I get the following error message:

------------------------------------------------------------
 The program finished with the following exception:

org.apache.flink.client.program.ProgramInvocationException: The main method caused an error.
    at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:546)
    at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:421)
    at org.apache.flink.client.program.OptimizerPlanEnvironment.getOptimizedPlan(OptimizerPlanEnvironment.java:83)
    at org.apache.flink.client.program.PackagedProgramUtils.createJobGraph(PackagedProgramUtils.java:78)
    at org.apache.flink.client.program.PackagedProgramUtils.createJobGraph(PackagedProgramUtils.java:120)
    at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:238)
    at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:216)
    at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1053)
    at org.apache.flink.client.cli.CliFrontend.lambda$main$11(CliFrontend.java:1129)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1754)
    at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
    at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1129)
Caused by: org.yaml.snakeyaml.error.YAMLException: java.io.IOException: Stream closed
    at org.yaml.snakeyaml.reader.StreamReader.update(StreamReader.java:200)
    at org.yaml.snakeyaml.reader.StreamReader.<init>(StreamReader.java:60)
    at org.yaml.snakeyaml.Yaml.loadAs(Yaml.java:444)
    at com.example.MyApplication.main(MyApplication.java:53)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:529)
    ... 13 more
Caused by: java.io.IOException: Stream closed
    at java.io.PushbackInputStream.ensureOpen(PushbackInputStream.java:74)
    at java.io.PushbackInputStream.read(PushbackInputStream.java:166)
    at org.yaml.snakeyaml.reader.UnicodeReader.init(UnicodeReader.java:90)
    at org.yaml.snakeyaml.reader.UnicodeReader.read(UnicodeReader.java:122)
    at java.io.Reader.read(Reader.java:140)
    at org.yaml.snakeyaml.reader.StreamReader.update(StreamReader.java:184)

Why does my solution work on any normal linux/mac system, yet the same jar with the same args fails when run with the flink run command on a yarn cluster? Is there a difference between the way we normally execute jars and the way yarn executes them?

Any help is appreciated.

Replace classLoader.getSystemResourceAsStream(filename) with classLoader.getResourceAsStream(filename).

  1. java.lang.ClassLoader#getSystemResourceAsStream locates the resource through the system class loader, which is typically used to start the application.

  2. java.lang.ClassLoader#getResourceAsStream first searches the parent class loader. If that fails, it invokes findResource of the current class loader.

為了避免依賴性沖突,Flink應用程序中的類分為兩個域[1],這兩個域也適用於Flink客戶端,例如CliFrontend

Java類路徑包括Apache Flink的類及其核心依賴項。
動態用戶代碼包括用戶jar的類(和資源)。

因此,為了找到打包在jar文件中的“配置文件”,我們應該使用用戶代碼類加載器(您可以在org.apache.flink.client.program.PackagedProgram找到userCodeClassLoader的詳細信息)系統類加載器。
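
Applied to the question's main method, the fix is a one-line change. A sketch showing only the relevant lines, reusing the names from the question:

// Load the config through the class loader that loaded MyApplication,
// i.e. the user code class loader when the jar is submitted via `flink run`,
// instead of the system class loader.
ClassLoader classLoader = MyApplication.class.getClassLoader();
InputStream in = classLoader.getResourceAsStream(filename);
MyConfigClass myConfig = yaml.loadAs(in, MyConfigClass.class);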

  1. https://ci.apache.org/projects/flink/flink-docs-stable/monitoring/debugging_classloading.html
