I have a CSV file which I need to parse. The schema for this file is the following:
name, contacts - where contacts is a list of strings for each person, and the number of contact columns varies from row to row.
For example:
john, john@wick.com, 123 123 123, fb/john.wick
mike, 123 0303 11
dave,
I'm trying to create a CsvSchema with Jackson CSV for my bean:

public class Person {
    private String name;
    private String[] contacts;
}
by creating a custom schema:

CsvSchema schema = CsvSchema.builder()
        .addColumn("name")
        .addArrayColumn("contacts", ",")
        .build();
But I am getting this:

com.fasterxml.jackson.dataformat.csv.CsvMappingException: Too many entries: expected at most 2

How can I solve a problem like that with Jackson CSV?
Java code:

CsvMapper mapper = new CsvMapper();
CsvSchema schema = CsvSchema.builder()
        .addColumn("name")
        .addArrayColumn("contacts", ",")
        .build();
MappingIterator<Person> it = mapper.readerFor(Person.class)
        .with(schema)
        .readValues(csvString);
List<Person> all = it.readAll();
You can enable the CsvParser.Feature.WRAP_AS_ARRAY feature and read each whole row as a List<String>. In the constructor you can then convert the List into a Person object. See the example below:
import com.fasterxml.jackson.databind.MappingIterator;
import com.fasterxml.jackson.dataformat.csv.CsvMapper;
import com.fasterxml.jackson.dataformat.csv.CsvParser;

import java.io.File;
import java.util.List;
import java.util.stream.Collectors;

public class CsvApp {

    public static void main(String[] args) throws Exception {
        File csvFile = new File("./resource/test.csv").getAbsoluteFile();

        CsvMapper csvMapper = new CsvMapper();
        // treat each CSV row as an array so it can be read as List<String>
        csvMapper.enable(CsvParser.Feature.WRAP_AS_ARRAY);

        MappingIterator<List<String>> rows =
                csvMapper.readerFor(List.class).readValues(csvFile);

        List<Person> persons = rows.readAll().stream()
                .filter(row -> !row.isEmpty())
                .map(Person::new)
                .collect(Collectors.toList());

        persons.forEach(System.out::println);
    }
}

class Person {

    private String name;
    private List<String> contacts;

    public Person(List<String> row) {
        // first column is the name; everything after it is a contact
        this.name = row.remove(0);
        this.contacts = row;
    }

    public String getName() {
        return name;
    }

    public void setName(String name) {
        this.name = name;
    }

    public List<String> getContacts() {
        return contacts;
    }

    public void setContacts(List<String> contacts) {
        this.contacts = contacts;
    }

    @Override
    public String toString() {
        return "Person{" +
                "name='" + name + '\'' +
                ", contacts=" + contacts +
                '}';
    }
}
For your input, the above code prints:
Person{name='john', contacts=[john@wick.com, 123 123 123, fb/john.wick]}
Person{name='mike', contacts=[123 0303 11]}
Person{name='dave', contacts=[]}
I think the problem is that the column separator is the same as the array-element separator. You can use ; instead of , as the contacts separator in your CSV. The row should look like this:

john,john@wick.com;123 123 123;fb/john.wick

This way you can keep using Jackson's schema features instead of having to manually instantiate Person from a List of Strings.
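Applied to the schema from the question, that suggestion might look like the sketch below (assuming jackson-dataformat-csv is on the classpath; the class name and sample data are illustrative):

```java
import com.fasterxml.jackson.databind.MappingIterator;
import com.fasterxml.jackson.dataformat.csv.CsvMapper;
import com.fasterxml.jackson.dataformat.csv.CsvSchema;

import java.util.Arrays;
import java.util.List;

public class SemicolonArrayDemo {

    public static void main(String[] args) throws Exception {
        // contacts now use ';' as the element separator,
        // so they no longer clash with the ',' column separator
        String csv = "john,john@wick.com;123 123 123;fb/john.wick\n"
                   + "mike,123 0303 11\n";

        CsvMapper mapper = new CsvMapper();
        CsvSchema schema = CsvSchema.builder()
                .addColumn("name")
                .addArrayColumn("contacts", ";")  // ';' separates array elements
                .build();

        MappingIterator<Person> it = mapper.readerFor(Person.class)
                .with(schema)
                .readValues(csv);

        List<Person> all = it.readAll();
        all.forEach(p -> System.out.println(
                p.getName() + " -> " + Arrays.toString(p.getContacts())));
    }
}

class Person {
    private String name;
    private String[] contacts;

    // default constructor plus getters/setters so Jackson can bind the bean
    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
    public String[] getContacts() { return contacts; }
    public void setContacts(String[] contacts) { this.contacts = contacts; }
}
```

With this layout the whole contacts string lands in a single CSV column, and Jackson splits it into the String[] itself, so no custom constructor is required.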