
Large data set exceeding maximum execution time

I have a script that loads the contents of a spreadsheet into ScriptDb, but the spreadsheet has around 15,000 rows and 9 columns, and the script keeps failing with an "Exceeded maximum execution time" error.

I'm using the function given in Google's documentation to load the data:

function loadDatabaseFromSheet() {
  var spreadsheet = SpreadsheetApp.openById(SPREADSHEET_KEY);
  var sheet = spreadsheet.getActiveSheet();
  // Read the whole sheet in one call; the first row holds the column names.
  var data = sheet.getDataRange().getValues();
  var keys = data[0];
  var db = ScriptDb.getMyDb();
  for (var row = 1; row < data.length; row++) {
    var rowData = data[row];
    var item = {};
    // Map each cell to a property named after its column header.
    for (var column = 0; column < keys.length; column++) {
      item[keys[column]] = rowData[column];
    }
    // One round trip to ScriptDb per row -- 15,000 saves in total.
    db.save(item);
  }
}

Is there any way to speed things up or am I just going to have to break it up into chunks of a few thousand?

Calling db.save(item) 15,000 times is what is causing the slowness. Instead, use a bulk operation when you're saving that much data.

  var db = ScriptDb.getMyDb();
  var items = [];
  // Build every object in memory first...
  for (var row = 1; row < data.length; row++) {
    var rowData = data[row];
    var item = {};
    for (var column = 0; column < keys.length; column++) {
      item[keys[column]] = rowData[column];
    }
    items.push(item);
  }
  // ...then write them all with a single batch call. The second argument
  // (atomic) is false, so one failed record won't roll back the whole batch.
  db.saveBatch(items, false);

Calling the save operation once at the end avoids all of those per-item round trips, so this should speed up your code dramatically and let it finish before it hits the maximum execution time.
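
If a single 15,000-record batch still runs into limits, a middle ground is to keep the batching but split it into chunks of a few thousand, as you suggested. Here is a minimal sketch, assuming items has already been built as in the snippet above; CHUNK_SIZE is an arbitrary illustrative value, not a documented limit, and the db.allOk(results) check follows the pattern used in the ScriptDb batch examples:

  // Sketch: save in chunks instead of one giant batch.
  var CHUNK_SIZE = 1000; // illustrative chunk size, not a documented limit
  for (var start = 0; start < items.length; start += CHUNK_SIZE) {
    var chunk = items.slice(start, start + CHUNK_SIZE);
    var results = db.saveBatch(chunk, false);
    // allOk() reports whether every record in this batch saved successfully.
    if (!db.allOk(results)) {
      Logger.log('Some records in the chunk starting at item ' + start + ' failed to save.');
    }
  }

This keeps the number of round trips down to a handful while never sending a batch so large that it fails outright.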
