Why can't I access a package-private class in another jar (NOT sealed)?
I've encountered a strange behaviour of the Java classloader:
Assuming that I submit an Apache Spark jar to a cluster, which contains an extension of HiveServer2:
package org.apache.hive.service.server;

public class MyOP2 extends HiveServer2.ServerOptionsProcessor {

    public MyOP2(String var) {
        super(var);
    }
    ...
The class HiveServer2.ServerOptionsProcessor is already pre-loaded on the cluster (as a Spark dependency), but is declared as package-private:
package org.apache.hive.service.server;

public class HiveServer2 extends CompositeService {
    ...
    static class ServerOptionsProcessor {
        ...
    }
}
This class is loaded into the JVM first, when the cluster is set up. Then my class (in another jar) is loaded by the same JVM when my application is submitted.
At this point I got the following error:
Exception in thread "main" java.lang.IllegalAccessError: class org.apache.hive.service.server.DPServerOptionsProcessor cannot access its superclass org.apache.hive.service.server.HiveServer2$ServerOptionsProcessor
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at org.apache.spark.sql.hive.thriftserver.DPHiveThriftServer2$.main(DPHiveThriftServer2.scala:26)
    at org.apache.spark.sql.hive.thriftserver.DPHiveThriftServer2.main(DPHiveThriftServer2.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
I'm under the impression that a package-private class can be accessed by any other class in the same package. And I have double-checked the manifest files in Spark's jars; none of them declares org.apache.hive.service.server as a sealed package. So why does the JVM classloader give me this error? What condition does the JVM use to trigger the exception?
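If I understand the JVM spec correctly, package-private access is checked against the *runtime* package, which is the package name plus the defining class loader, not the package name alone. Here is a minimal sketch of that rule as I understand it (the class and method names below are mine, for illustration only):

```java
// Hypothetical sketch of the runtime-package check I believe the JVM applies
// (JVMS §5.3): two classes are members of the same runtime package only if
// they have the same package name AND the same defining class loader.
public class RuntimePackageDemo {

    static String packageName(Class<?> c) {
        String name = c.getName();
        int dot = name.lastIndexOf('.');
        return dot < 0 ? "" : name.substring(0, dot);
    }

    static boolean sameRuntimePackage(Class<?> a, Class<?> b) {
        // Note the identity comparison on the loaders: an identical copy of
        // the package loaded by a *different* loader still fails this check.
        return a.getClassLoader() == b.getClassLoader()
                && packageName(a).equals(packageName(b));
    }

    public static void main(String[] args) {
        // Same package (java.lang), same bootstrap loader -> same runtime package.
        System.out.println(sameRuntimePackage(String.class, Runnable.class));           // true
        // Different defining loaders (bootstrap vs. application) -> different.
        System.out.println(sameRuntimePackage(String.class, RuntimePackageDemo.class)); // false
    }
}
```

If that rule is right, then even though both classes are declared in org.apache.hive.service.server, they would end up in different runtime packages whenever Spark loads my jar with a different class loader than the one that loaded the Hive dependency, and the package-private access would fail, which would explain the IllegalAccessError. Is that the actual condition?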