
How to convert ArrayList into Scala Array in Spark

I want to build a StructType dynamically from a JSON file. I iterate over my fields, and I want to know how (if it is even possible) I can collect them into a list during the iteration and then create a StructType from that list.

The code I've tried:

List<StructField> structFieldList = new ArrayList<>();
for (String field : fields.values()) {
  StructField sf = DataTypes.createStructField(field, DataTypes.StringType, true);
  structFieldList.add(sf);
}
StructType structType = new StructType(structFieldList.toArray());

But this does not compile. Is there any way to make it work?

You don't need to convert the ArrayList to a Scala Array here, because the StructType constructor accepts a Java StructField[] array as its argument.

Your code will work if you pass a typed array to the .toArray() call on the last line of your snippet, so that it returns a StructField[] array instead of an Object[] array:

List<StructField> structFieldList = new ArrayList<>();
for (String field : fields.values()) {
  StructField sf = DataTypes.createStructField(field, DataTypes.StringType, true);
  structFieldList.add(sf);
}
StructType structType = new StructType(structFieldList.toArray(new StructField[0]));
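The root cause is the two overloads of List.toArray(): the no-arg version erases the element type and returns Object[], while the typed overload preserves it. A minimal sketch of the difference, using String in place of StructField so it runs without Spark on the classpath:

```java
import java.util.ArrayList;
import java.util.List;

public class ToArrayDemo {
    public static void main(String[] args) {
        List<String> names = new ArrayList<>();
        names.add("id");
        names.add("name");

        // No-arg toArray() loses the component type at compile time:
        // the result is Object[], which cannot be passed where a
        // String[] (or StructField[]) is expected.
        Object[] untyped = names.toArray();

        // The typed overload returns an array of the requested component
        // type; passing a zero-length array is the idiomatic way to do it.
        String[] typed = names.toArray(new String[0]);

        System.out.println(untyped.getClass().getComponentType()); // class java.lang.Object
        System.out.println(typed.getClass().getComponentType());   // class java.lang.String
    }
}
```

As an alternative to calling the StructType constructor yourself, Spark's DataTypes.createStructType(List&lt;StructField&gt;) factory method accepts the java.util.List directly, with no array conversion at all.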
