
How to use Java runtime 11 in an EMR cluster (AWS)

I created a cluster in AWS EMR, and when Spark runs my application I get the following error:

Exception in thread "main" java.lang.UnsupportedClassVersionError: 
com/example/demodriver/MyClassFromJAR has been compiled by a more recent version of the Java Runtime (class file version 55.0), 
this version of the Java Runtime only recognizes class file versions up to 52.0

I'm using releaseLabel emr-6.5.0 on the cluster, and my driver jar is built with Java 11.

How can I run a Java 11 application in EMR? Or is this error about something else?
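
For context, class file version 55.0 corresponds to Java 11 and version 52.0 to Java 8, so the JAR was built for a newer runtime than the cluster's default JVM. A quick way to confirm what a class in the JAR was compiled for, as a sketch (the JAR path here is a placeholder):

# Check the class file version of the driver class (the JAR path is a placeholder).
# "major version: 55" means Java 11; "major version: 52" means Java 8.
javap -verbose -classpath my-driver.jar com.example.demodriver.MyClassFromJAR | grep "major version"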

In the latest EMR releases, Java 11 is installed. To enable it, you can provide the following configuration.

[
    {
        "Classification": "spark-env",
        "Configurations": [
            {
                "Classification": "export",
                "Properties": {
                    "JAVA_HOME": "/usr/lib/jvm/java-11-amazon-corretto.x86_64"
                }
            }
        ]
    },
    {
        "Classification": "spark-defaults",
        "Properties": {
            "spark.executorEnv.JAVA_HOME": "/usr/lib/jvm/java-11-amazon-corretto.x86_64"
        }
    }
]

This does not appear to be documented.
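
For reference, a rough sketch of supplying this configuration when creating the cluster with the AWS CLI (the cluster name, instance settings, and the configurations.json file name are placeholders):

# Save the classification JSON above as configurations.json, then create the
# cluster with it applied. Name, instance type, and count are placeholders.
aws emr create-cluster \
    --name "spark-java11" \
    --release-label emr-6.5.0 \
    --applications Name=Spark \
    --instance-type m5.xlarge \
    --instance-count 3 \
    --use-default-roles \
    --configurations file://configurations.json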

The EMR defaults for defaultJavaOptions and extraJavaOptions may contain options that are incompatible with Java 11, so you may still need to adjust/update them.

Here is the full configuration, including the necessary JVM options:

[
    {
        "Classification": "spark-env",
        "Configurations": [
            {
                "Classification": "export",
                "Properties": {
                    "JAVA_HOME": "/usr/lib/jvm/java-11-amazon-corretto.x86_64"
                }
            }
        ]
    },
    {
        "Classification": "spark-defaults",
        "Properties": {
            "spark.executorEnv.JAVA_HOME": "/usr/lib/jvm/java-11-amazon-corretto.x86_64",
            "spark.driver.defaultJavaOptions": "-XX:OnOutOfMemoryError='kill -9 %p' -XX:MaxHeapFreeRatio=70",
            "spark.executor.defaultJavaOptions": "-verbose:gc -Xlog:gc*::time -XX:+PrintGCDetails -XX:+PrintGCDateStamps -XX:OnOutOfMemoryError='kill -9 %p' -XX:MaxHeapFreeRatio=70 -XX:+IgnoreUnrecognizedVMOptions"
        }
    }
]
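
Since the configuration hardcodes the Corretto 11 path, it is worth confirming that the path actually exists on the cluster nodes (it may differ for other EMR releases or ARM instances). A quick check over SSH, for example:

# On a cluster node, list the installed JVMs and confirm that the path used
# for JAVA_HOME above reports a Java 11 runtime.
ls /usr/lib/jvm/
/usr/lib/jvm/java-11-amazon-corretto.x86_64/bin/java -version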

To run Spark (3.3.0) on EMR 6 (6.9.0), I had to provide the following (note the additional "hadoop-env" classification and the empty "Properties" elements):

[
    {
        "Classification": "hadoop-env",
        "Configurations": [
            {
                "Classification": "export",
                "Configurations": [],
                "Properties": {
                    "JAVA_HOME": "/usr/lib/jvm/java-11-amazon-corretto.x86_64"
                }
            }
        ],
        "Properties": {}
    },
    {
        "Classification": "spark-env",
        "Configurations": [
            {
                "Classification": "export",
                "Configurations": [],
                "Properties": {
                    "JAVA_HOME": "/usr/lib/jvm/java-11-amazon-corretto.x86_64"
                }
            }
        ],
        "Properties": {}
    },
    {
        "Classification": "spark-defaults",
        "Properties": {
            "spark.executorEnv.JAVA_HOME": "/usr/lib/jvm/java-11-amazon-corretto.x86_64"
        }
    }
]
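
With the cluster running under this configuration, resubmitting the Java 11-built driver JAR should no longer hit the UnsupportedClassVersionError. A hypothetical submission, assuming the class from the error message is the main class and the JAR is staged on S3 (both are placeholders):

# Hypothetical resubmission of the Java 11-built driver (main class and S3 path
# are placeholders based on the class name from the error message).
spark-submit \
    --class com.example.demodriver.MyClassFromJAR \
    --deploy-mode cluster \
    s3://my-bucket/my-driver.jar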
