
MongoDB to BigQuery Import Issues

When importing from MongoDB to BigQuery, the following errors occur. We have a script that prepares the data from a MongoDB dump on S3 (around 2.8 GB) and converts it to NEWLINE_DELIMITED_JSON.

This script was working fine until recently and has not been changed.

Does anybody know how to troubleshoot this issue and find the document causing the issues?

"status": {
    "errorResult": {
      "message": "Error while reading data, error message: JSON table encountered too many errors, giving up. Rows: 41081; errors: 1. Please look into the errors[] collection for mor
e details.",
      "reason": "invalid"
    },
    "errors": [
      {
        "message": "Error while reading data, error message: JSON table encountered too many errors, giving up. Rows: 41081; errors: 1. Please look into the errors[] collection for m
ore details.",
        "reason": "invalid"
      },
      {
        "message": "Error while reading data, error message: JSON processing encountered too many errors, giving up. Rows: 41081; errors: 1; max bad: 0; error percent: 0",
        "reason": "invalid"
      },
      {
        "message": "Error while reading data, error message: JSON parsing error in row starting at position 2890606042: Parser terminated before end of string",
        "reason": "invalid"
      }
    ],
    "state": "DONE"

Be careful with your data. I had the same issue; it turned out that one field had a NaN value, which was acceptable for our app (in Python/TS) but not for BigQuery.
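
To check an export for this before loading, each line can be parsed with a strict JSON parser. A minimal sketch, again assuming the hypothetical export.ndjson; Python's json module accepts NaN and Infinity by default, so parse_constant is used to reject them the way BigQuery does:

    import json

    def reject_constant(value):
        # json.loads calls this for NaN, Infinity and -Infinity, which
        # Python accepts by default but BigQuery treats as invalid JSON.
        raise ValueError(f"non-standard JSON constant: {value}")

    with open("export.ndjson", "r", encoding="utf-8") as f:
        for lineno, line in enumerate(f, start=1):
            try:
                json.loads(line, parse_constant=reject_constant)
            except ValueError as exc:  # also catches json.JSONDecodeError
                print(f"line {lineno}: {exc}")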
