
How to load a JSON file into Google BigQuery using Node.js

I'm using the fetch function to get the attached JSON object, and my Node.js backend loads this JSON data into BigQuery with the following code:

const { BigQuery } = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

await bigquery.dataset(datasetId).table(tableId).insert(JSON_obj);

Unfortunately, I get the following error:

Unhandled rejection PartialFailureError: A failure occurred during this request
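A `PartialFailureError` from the `@google-cloud/bigquery` client carries an `errors` array with per-row insert errors, which tells you exactly why each row was rejected. A minimal sketch of pulling those details out (the helper name `summarizeInsertErrors` and the sample error object are hypothetical, for illustration only):

```javascript
// Hypothetical helper: flatten the per-row details carried by a
// PartialFailureError-shaped object from the BigQuery Node.js client.
function summarizeInsertErrors(err) {
  return (err.errors || []).map((e) => ({
    row: e.row, // the row object that failed to insert
    reasons: (e.errors || []).map((x) => x.message), // why it failed
  }));
}

// Example with an object shaped like the client's PartialFailureError:
const sampleErr = {
  name: 'PartialFailureError',
  errors: [
    {
      row: { totalResults: 418 },
      errors: [{ message: 'no such field: totalResults.' }],
    },
  ],
};

const summary = summarizeInsertErrors(sampleErr);
// summary[0].reasons → [ 'no such field: totalResults.' ]
```

Logging this summary in a `catch` block around `insert()` is usually enough to see which field BigQuery is rejecting.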

JSON object:

{
  "totalResults": 418,
  "profileInfo": {
    "profileId": "104881487",
    "profileName": "All Mobile App Data",
    "accountId": "64812694",
    "tableId": "ga:105536427",
    "internalWebPropertyId": "100521715",
    "webPropertyId": "UA-648333494-1"
  },
  "totalsForAllResults": {
    "ga:users": "427",
    "ga:totalEvents": "682",
    "ga:eventValue": "0"
  },
  "query": {
    "max-results": 1000,
    "start-index": 1,
    "start-date": "today",
    "end-date": "today",
    "dimensions": "ga:eventCategory,ga:eventAction,ga:eventLabel,ga:dateHourMinute",
    "metrics": [
      "ga:users",
      "ga:totalEvents",
      "ga:eventValue"
    ],
    "ids": "ga:104831427",
    "sort": [
      "-ga:totalEvents"
    ]
  },
  "selfLink": "https://www.googleapis.com/analytics/v3/data/ga?ids=ga:10483467&dimensions=ga:eventCategory,ga:eventAction,ga:eventLabel,ga:dateHourMinute&metrics=ga:users,ga:totalEvents,ga:eventValue&sort=-ga:totalEvents&start-date=today&end-date=today",
  "columnHeaders": [
    {
      "name": "ga:eventCategory",
      "columnType": "DIMENSION",
      "dataType": "STRING"
    },
    {
      "name": "ga:eventAction",
      "columnType": "DIMENSION",
      "dataType": "STRING"
    },
    {
      "name": "ga:eventLabel",
      "columnType": "DIMENSION",
      "dataType": "STRING"
    },
    {
      "name": "ga:dateHourMinute",
      "columnType": "DIMENSION",
      "dataType": "STRING"
    },
    {
      "name": "ga:users",
      "columnType": "METRIC",
      "dataType": "INTEGER"
    },
    {
      "name": "ga:totalEvents",
      "columnType": "METRIC",
      "dataType": "INTEGER"
    },
    {
      "name": "ga:eventValue",
      "columnType": "METRIC",
      "dataType": "INTEGER"
    }
  ],
  "containsSampledData": false,
  "id": "https://www.googleapis.com/analytics/v3/data/ga?ids=ga:104831427&dimensions=ga:eventCategory,ga:eventAction,ga:eventLabel,ga:dateHourMinute&metrics=ga:users,ga:totalEvents,ga:eventValue&sort=-ga:totalEvents&start-date=today&end-date=today",
  "itemsPerPage": 1000,
  "kind": "analytics#gaData",
  "rows": [
    [
      "video_screen",
      "click_on_screen",
      "false",
      "202011190517",
      "1",
      "32",
      "0"
    ],
    [
      "video_screen",
      "click_on_screen",
      "false",
      "202011190730",
      "1",
      "17",
      "0"
    ],
    ...

When you submit JSON to BigQuery's table insert function, you should provide only the row data. Here you provide the entire API response, and the library has to guess which part of it contains the rows.

My guess is that the row data is in the rows array, but I'm not sure, and I'm also not sure about the field order!

So, extract the useful data from your JSON, format it as you want (CSV, newline-delimited JSON, ...), and submit that to BigQuery. It will work much better!
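Concretely, in the response above the row values live in `rows` and the column names in `columnHeaders`, so a small transform can turn each row array into a flat object whose keys match a BigQuery table schema. A sketch, assuming the table's columns are named after the headers with the `ga:` prefix stripped (the function name `toRows` is my own):

```javascript
// Turn a Google Analytics v3 response into flat row objects suitable
// for BigQuery's insert(). Assumes the table's column names match the
// GA header names without the "ga:" prefix.
function toRows(gaResponse) {
  const names = gaResponse.columnHeaders.map((h) => h.name.replace('ga:', ''));
  return gaResponse.rows.map((row) =>
    Object.fromEntries(row.map((value, i) => [names[i], value]))
  );
}

// Example with the first row from the response in the question:
const sample = {
  columnHeaders: [
    { name: 'ga:eventCategory' },
    { name: 'ga:eventAction' },
    { name: 'ga:eventLabel' },
    { name: 'ga:dateHourMinute' },
    { name: 'ga:users' },
    { name: 'ga:totalEvents' },
    { name: 'ga:eventValue' },
  ],
  rows: [
    ['video_screen', 'click_on_screen', 'false', '202011190517', '1', '32', '0'],
  ],
};

const rows = toRows(sample);
// rows[0].eventCategory → 'video_screen', rows[0].totalEvents → '32'
```

The flattened rows can then be passed to `insert()`, which accepts an array of row objects: `await bigquery.dataset(datasetId).table(tableId).insert(rows)`.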
