I want to convert a big CSV file (20,000 to 50,000 records) into a JSON array, but it takes nearly 1 minute. Is there any way to do it in less than 5 seconds?
ResourceBundle rb = ResourceBundle.getBundle("settings");
String path = rb.getString("fileandfolder.Path");
System.out.println(path + "ssdd");
String csvPath = request.getParameter("DP") != null
        ? request.getParameter("DP").toString() : "";
String orname = path + csvPath;
File file = new File(orname);
FileReader fin = new FileReader(file); // read the file one character at a time
BufferedReader bi = new BufferedReader(fin);
int res;
String csv = "";
while ((res = fin.read()) != -1) {
    csv = csv + ((char) res); // convert the int to a char and append it to csv
}
long start3 = System.nanoTime();
JSONArray array = CDL.toJSONArray(csv);
String Csvs = array.toString();
long time3 = System.nanoTime() - start3;
System.out.printf("Took %.3f seconds to convert a %d MB file, rate: %.1f MB/s%n",
        time3 / 1e9, file.length() >> 20, file.length() * 1000.0 / time3);
Try:

StringBuilder sb = new StringBuilder();
while ((res = fin.read()) != -1) {
    sb.append((char) res); // convert the int to a char and append it to the builder
}
String csv = sb.toString();

Concatenating strings with + is slow; you should use StringBuilder or StringBuffer.
There are two glaring performance problems in your code, both of them in this snippet:
while ((res = fin.read()) != -1) {
    csv = csv + ((char) res);
}
First problem: fin is an unbuffered FileReader, so each read() call is actually doing a system call. Each syscall is hundreds or even thousands of instructions, and you are doing that for each and every character in the input file.

Remedy: read from bi rather than from fin. (That's what you created it for ... presumably.)
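That remedy can be sketched as follows: read through the BufferedReader in chunks, so each underlying read hauls in many characters at once. The readAll helper and the in-memory demo below are illustrative, not from the question's code:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.StringReader;

public class BufferedRead {
    // Drain the reader in chunks: one read() call fills a whole
    // buffer instead of fetching a single character per syscall.
    static String readAll(BufferedReader bi) throws IOException {
        char[] buf = new char[8192];
        StringBuilder sb = new StringBuilder();
        int n;
        while ((n = bi.read(buf, 0, buf.length)) != -1) {
            sb.append(buf, 0, n);
        }
        return sb.toString();
    }

    public static void main(String[] args) throws IOException {
        // Demo with an in-memory reader; in the question's code this
        // would be the bi that wraps the FileReader.
        String csv = readAll(new BufferedReader(new StringReader("a,b\n1,2\n")));
        System.out.println(csv.length()); // prints 8
    }
}
```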
Second problem: each time you execute csv = csv + ((char) res); you are creating a new String that is one character longer than the previous one. If you have N characters in your input file, you end up copying roughly N^2 characters to build the string.
Remedy: instead of concatenating Strings, use a StringBuilder ... like this:

StringBuilder sb = new StringBuilder();
....
sb.append((char) res);
....
String csv = sb.toString();
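The roughly-N^2 figure is easy to check with a little arithmetic: step i copies the i characters already built, so the total is 0 + 1 + ... + (N-1) = N(N-1)/2. A small sketch (the copiesByConcat helper is illustrative) just counts those copies:

```java
public class ConcatCost {
    // Count the characters copied when building an n-char string by
    // repeated concatenation: each step copies the whole prefix again.
    static long copiesByConcat(int n) {
        long copied = 0;
        for (int i = 0; i < n; i++) {
            copied += i; // the new String copies the i chars built so far
        }
        return copied;
    }

    public static void main(String[] args) {
        // Even a modest 50,000-character input costs over a billion copies.
        System.out.println(copiesByConcat(50_000)); // prints 1249975000
    }
}
```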
At this point, it is not clear to me whether there is also a performance problem in converting the csv string to JSON, i.e. in this snippet:

JSONArray array = CDL.toJSONArray(csv);
String Csvs = array.toString();

Unfortunately, we don't know which JSONArray and CDL classes you are actually using, so it is difficult to say why they are slow, or whether there is a faster way to do the conversion. (But I suspect that the biggest performance problems are in the earlier snippet.)
This csv = csv + ((char) res) is very slow: you are reading one char at a time, then allocating a new string containing the old string plus the new char.
To load all text from a file into a string do this:
static String readFile(String path, Charset encoding) throws IOException {
    byte[] encoded = Files.readAllBytes(Paths.get(path));
    return new String(encoded, encoding);
}
(from https://stackoverflow.com/a/326440/360211 ; note there is a cleaner way if using Java 7)
Use like this instead of loop:
String csv = readFile(orname, StandardCharsets.UTF_8);
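If you are on Java 11 or later, Files.readString does the same thing in one call; a sketch using a temporary file (the file contents are illustrative):

```java
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;

public class ReadWhole {
    public static void main(String[] args) throws Exception {
        // Write a small CSV to a temp file, then read it back in one call.
        Path p = Files.createTempFile("demo", ".csv");
        Files.write(p, "a,b\n1,2\n".getBytes(StandardCharsets.UTF_8));
        String csv = Files.readString(p, StandardCharsets.UTF_8); // Java 11+
        System.out.println(csv.equals("a,b\n1,2\n")); // prints true
        Files.delete(p);
    }
}
```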