How can I skip bad records when loading a CSV file into BigQuery? I have tried using these methods:
configLoad.setMaxBadRecords(10);
configLoad.getMaxBadRecords();
The input CSV contains one bad record, but the code returns null when I use the methods above, and the load job fails rather than ignoring the bad record.
I suspect the problem is in how you check for success vs. error.
Jobs report their current status as:
state: PENDING|RUNNING|DONE
errorResult: { ... }
errors: [{...}, {...}, ...]
When a job is in the DONE state, the errorResult field determines whether the job was an overall success (no errorResult present) or a failure (a structured error in the errorResult field). The errors list contains all fatal and non-fatal errors encountered.
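That check can be sketched as follows. Note the JobStatus class here is a simplified, hypothetical stand-in for the API's status object, not the client library's own type; it only mirrors the state, errorResult, and errors fields shown above.

```java
import java.util.List;

public class JobStatusCheck {
    // Hypothetical stand-in for the job status object; mirrors the
    // "state", "errorResult", and "errors" fields of the API response.
    static class JobStatus {
        String state;         // PENDING | RUNNING | DONE
        String errorResult;   // null when the job succeeded
        List<String> errors;  // fatal and non-fatal errors (may be null)

        JobStatus(String state, String errorResult, List<String> errors) {
            this.state = state;
            this.errorResult = errorResult;
            this.errors = errors;
        }
    }

    // A DONE job succeeded if and only if errorResult is absent.
    // A non-empty errors list alone does NOT mean the job failed.
    static boolean isSuccess(JobStatus status) {
        return "DONE".equals(status.state) && status.errorResult == null;
    }

    public static void main(String[] args) {
        // Load job with one skipped bad row (within the maxBadRecords limit):
        JobStatus ok = new JobStatus("DONE", null,
                List.of("Too many columns: expected 2 column(s) but got 3 column(s)."));
        // Load job that exceeded the bad-record limit:
        JobStatus failed = new JobStatus("DONE",
                "Too many errors encountered. Limit is: 0.",
                List.of("Too many columns...", "Too many errors encountered."));

        System.out.println(isSuccess(ok));     // true
        System.out.println(isSuccess(failed)); // false
    }
}
```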
Here is an example status from a successfully completed load job that contained one bad row, with setMaxBadRecords(10) set on the load job configuration:
"status": {
"errors": [
{
"location": "File: 0 / Line:1",
"message": "Too many columns: expected 2 column(s) but got 3 column(s). For additional help: http://goo.gl/RWuPQ",
"reason": "invalid"
}
],
"state": "DONE"
},
Without setMaxBadRecords, it would be a failing job, as follows:
"status": {
"errorResult": {
"message": "Too many errors encountered. Limit is: 0.",
"reason": "invalid"
},
"errors": [
{
"location": "File: 0 / Line:1",
"message": "Too many columns: expected 2 column(s) but got 3 column(s). For additional help: http://goo.gl/RWuPQ",
"reason": "invalid"
},
{
"message": "Too many errors encountered. Limit is: 0.",
"reason": "invalid"
}
],
"state": "DONE"
},
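If you also want to know how many rows were skipped on a successful job, you can count the row-level entries in the errors list. This is a sketch under the assumption, consistent with the examples above, that row-level entries carry a "location" field (e.g. "File: 0 / Line:1") while summary entries like "Too many errors encountered" do not; the errors are modeled as plain maps rather than the client library's error type.

```java
import java.util.List;
import java.util.Map;

public class SkippedRowCount {
    // Count row-level entries in the "errors" list. Assumption: entries
    // with a "location" field correspond to skipped input rows, while
    // summary entries (no "location") do not.
    static long skippedRows(List<Map<String, String>> errors) {
        if (errors == null) return 0;
        return errors.stream().filter(e -> e.containsKey("location")).count();
    }

    public static void main(String[] args) {
        // Mirrors the successful-job status example above: one bad row.
        List<Map<String, String>> errors = List.of(
            Map.of("location", "File: 0 / Line:1",
                   "message", "Too many columns: expected 2 column(s) but got 3 column(s).",
                   "reason", "invalid"));
        System.out.println(skippedRows(errors)); // 1
    }
}
```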