
NoSuchMethodError in Java when using forEach method

I have the below code snippet:

JSONArray processNodes = new JSONObject(customCon.geteOutput())
                        .getJSONArray("process-node");
processNodes.forEach(item -> {JSONObject node = (JSONObject) item;});

I added dependency in pom.xml as:

<dependency>
    <groupId>org.json</groupId>
    <artifactId>json</artifactId>
    <version>20160810</version>
</dependency>

But at runtime it fails with the error: java.lang.NoSuchMethodError: org.json.JSONArray.forEach(Ljava/util/function/Consumer;)

Any idea why I am having this error?

There is a mismatch between the version of JSONArray that you compiled against and the one that you are using at runtime. That causes the error.

According to the javadoc for the 20160810 version in your POM file, there is a forEach method on org.json.JSONArray that is defined by the Iterable interface.

However, it is clear from the exception that the version of JSONArray you are using at runtime does not have that method.

Note that the method is not present in the Android version of org.json (see https://developer.android.com/reference/org/json/JSONArray), and it won't be present when running on a JVM older than Java 8, because Java 7's Iterable interface does not declare a forEach method.
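A quick way to confirm which copy of the class wins at runtime is to ask the JVM where it loaded JSONArray from. A minimal sketch (the class name WhichJar and the locationOf helper are my own illustration, not part of org.json):

```java
import java.security.CodeSource;

public class WhichJar {

    // Return the jar or directory a class was loaded from, or a
    // placeholder for bootstrap classes that have no code source.
    static String locationOf(Class<?> clazz) {
        CodeSource src = clazz.getProtectionDomain().getCodeSource();
        return (src == null) ? "(bootstrap or unknown)"
                             : src.getLocation().toString();
    }

    public static void main(String[] args) {
        try {
            // Look the class up by name so this compiles even without org.json.
            Class<?> jsonArray = Class.forName("org.json.JSONArray");
            System.out.println("org.json.JSONArray loaded from: "
                    + locationOf(jsonArray));
        } catch (ClassNotFoundException e) {
            System.out.println("org.json.JSONArray is not on the classpath");
        }
    }
}
```

If this prints an Android framework location, or some jar other than json-20160810.jar, that is the copy shadowing the dependency from your POM.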

Check for the correct import statements. Your code seems fine; I did a quick check in Eclipse and it works fine.

import java.util.List;

import org.json.JSONArray;
import org.json.JSONObject;

public class Test {

    public static void main(String[] args) {
        JSONObject jo = new JSONObject();
        jo.put("red", new JSONArray(List.of(1, 2, 3, 4, 5)));
        jo.put("blue", "green");

        jo.getJSONArray("red").forEach(item -> {
            String var = item.toString();
        });
    }
}

This happens because, at run time, the jar is picked up from Spark's own jars folder. To override this, pass the jar via --jars in spark-submit and also add conf settings like this:

--conf spark.driver.extraClassPath=json-20200518.jar 
--conf spark.executor.extraClassPath=json-20200518.jar
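Put together, the full spark-submit invocation might look like the following sketch (the application class and jar names are placeholders for your own):

```shell
spark-submit \
  --class com.example.MyApp \
  --jars json-20200518.jar \
  --conf spark.driver.extraClassPath=json-20200518.jar \
  --conf spark.executor.extraClassPath=json-20200518.jar \
  my-app.jar
```

The extraClassPath settings prepend the jar to the driver and executor classpaths, so your json version is found before the one bundled with Spark.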

https://hadoopsters.com/2019/05/08/how-to-override-a-spark-dependency-in-client-or-cluster-mode/


 