Exceeding request rate limit and custom function max execution time when fetching data from Pipedrive API

I am trying to export my Pipedrive data to a Google Sheet, in particular to make the link between two of my queries. So I first wrote this script:

function GetPipedriveDeals2() {
  let ss = SpreadsheetApp.getActiveSpreadsheet();
  let sheets = ss.getSheets();
  let sheet = ss.getActiveSheet();

   // Build the URL; the next step is to iterate over `start`, because the API only returns a fixed number of records per call, so the sheet can be filled gradually.
  let url    = "https://laptop.pipedrive.com/v1/products:(id)?start=";
  let limit  = "&limit=500";
  //let filter = "&filter_id=64";
  let pipeline = 1; // put a pipeline id specific to your PipeDrive setup 
  let start  = 1;
  //let end  = start+50;
  let token  = "&api_token=XXXXXXXXXXXXXXX";
  let response = UrlFetchApp.fetch(url+start+limit+token); //
  let dataAll = JSON.parse(response.getContentText()); 
  let dataSet = dataAll;
  //let prices = prices;
  //create array where the data should be put
  let rows = [], data;
  for (let i = 0; i < dataSet.data.length; i++) {
  data = dataSet.data[i];
    rows.push([data.id,
               GetPipedriveDeals4(data.id)
               ]);
  }

  Logger.log( 'function2' ,JSON.stringify(rows,null,8) );   // Log transformed data

  return rows;
}

// Standard functions to call the spreadsheet sheet and activesheet
function GetPipedriveDeals4(idNew) {
  let ss = SpreadsheetApp.getActiveSpreadsheet();
  let sheets = ss.getSheets();
  let sheet = ss.getActiveSheet();

   // Build the URL; the next step is to iterate over `start`, because the API only returns a fixed number of records per call, so the sheet can be filled gradually.
  let url    = "https://laptop.pipedrive.com/v1/products/"+idNew+"/deals:(id,d93b458adf4bf84fefb6dbce477fe77cdf9de675)?start=";
  let limit  = "&limit=500";
  //let filter = "&filter_id=64";
  let pipeline = 1; // put a pipeline id specific to your PipeDrive setup 
  let start  = 1;
  //let end  = start+50;
  let token  = "&api_token=XXXXXXXXXXXXXXXXX";
  

  let response = UrlFetchApp.fetch(url+start+limit+token); //
  let dataAll = JSON.parse(response.getContentText()); 
  let dataSet = dataAll;
   //Logger.log(dataSet)
  //let prices = prices;
  //create array where the data should be put
  let rows = [], data;
  if (dataSet.data === null) return;
  else {
    for (let i = 0; i < dataSet.data.length; i++) {
      data = dataSet.data[i];
      let idNew = data.id; 
      rows.push([data.id, data['d93b458adf4bf84fefb6dbce477fe77cdf9de675']]);
    }
  
  Logger.log( 'function4', JSON.stringify(rows,null,2) );   // Log transformed data
  return rows;
  }
}

But it is not optimized at all and takes about 60 seconds to run, while Google Apps Script executes custom functions for only 30 seconds... With help, I got this second function:

function getPipedriveDeals(apiRequestLimit){
  //Make the initial request to get the ids you need for the details.
  var idsListRequest = "https://laptop.pipedrive.com/v1/products:(id)?start=";
  var start  = 0;
  var limit  = "&limit="+apiRequestLimit;
  var token  = "&api_token=XXXXXXXXXXX";
  var response = UrlFetchApp.fetch(idsListRequest+start+limit+token);
  var data = JSON.parse(response.getContentText()).data;
  
  //For every id in the response, construct a url (the detail url) and push to a list of requests
  var requests = [];
  data.forEach(function(product){
    var productDetailUrl = "https://laptop.pipedrive.com/v1/products/"+product.id+"/deals:(id,d93b458adf4bf84fefb6dbce477fe77cdf9de675)?start=";
    requests.push(productDetailUrl+start+limit+token)
  })
  
  //With the list of detail request urls, make one call to UrlFetchApp.fetchAll(requests)
  var allResponses = UrlFetchApp.fetchAll(requests);
 // logger.log(allResponses);
  return allResponses; 
}

But this time it's the opposite. I reach the request limit imposed by Pipedrive: https://pipedrive.readme.io/docs/core-api-concepts-rate-limiting (80 requests in 2 sec).

I confess I am out of ideas. I thought of adding OAuth2 to my script to increase my request limit, but it seems really long and complicated, and it's not at all my field.

In summary, I would just like a script that doesn't execute requests too fast, but without exceeding the 30 seconds imposed by Google Apps Script.
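For a sense of whether both limits can be satisfied at once, here is the arithmetic: with batches of at most 80 requests and a 2-second pause between batches, N detail requests need ceil(N/80) batches, so the pauses alone cost (batches − 1) × 2 seconds. A quick sanity check in plain JavaScript (the numbers are illustrative):

```javascript
// Estimate the minimum wall-clock time spent sleeping between batches,
// given Pipedrive's limit of 80 requests per 2 seconds.
const BATCH_SIZE = 80;
const PAUSE_MS = 2000;

function minSleepMs(totalRequests) {
  const batches = Math.ceil(totalRequests / BATCH_SIZE);
  // No pause is needed after the final batch.
  return Math.max(batches - 1, 0) * PAUSE_MS;
}

// e.g. 500 product-detail requests -> 7 batches -> 12 s of pauses,
// leaving ~18 s of the 30 s custom-function budget for the fetches themselves.
console.log(minSleepMs(500) / 1000); // seconds
```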

--------------------- EDIT --- TEST --- FOREACH80 ---------------------

 function getPipedriveProducts(){
  //Make the initial request to get the ids you need for the details.
  var idsListRequest = "https://laptop.pipedrive.com/v1/products:(id)?start=";
  var start  = 0;
  var limit  = "&limit=500";
  var token  = "&api_token=XXXXXXXXXXXXXXXXXXX";
  var response = UrlFetchApp.fetch(idsListRequest+start+limit+token);
  var data = JSON.parse(response.getContentText()).data;

  //For every id in the response, construct a url (the detail url) and push to a list of requests
  const batch = new Set();
  let requests = [];
  data.forEach(function(product){
    var productDetailUrl = "https://laptop.pipedrive.com/v1/products/" + product.id + "/deals:(id,d93b458adf4bf84fefb6dbce477fe77cdf9de675)?start=";
    requests.push(productDetailUrl+start+limit+token);
    if(requests.length === 80) {
      batch.add(requests);
      requests = [];
    }
  });
  if (requests.length) batch.add(requests); // don't drop the last partial batch
  const allResponses = [...batch].flatMap(requests => {
    Utilities.sleep(2000);
    return UrlFetchApp.fetchAll(requests);
  });
  Logger.log(allResponses);
  return allResponses;
}


  • Create a Set of batches of 80 requests each

  • Execute each batch in the Set using fetchAll

  const batch = new Set();
  let requests = [];
  data.forEach(function(product){
    var productDetailUrl = "https://example.com";
    requests.push(productDetailUrl+start+limit+token);
    if(requests.length === 80) {
      batch.add(requests);
      requests = [];
    }
  });
  if (requests.length) batch.add(requests); // flush the last partial batch
  const allResponses = [...batch].flatMap(requests => {
    Utilities.sleep(2000);
    return UrlFetchApp.fetchAll(requests);
  });

Chunking

One of the most important concepts in working with APIs is chunking, as you need to avoid rate-limiting, accommodate request scheduling, parallelize CPU-heavy calculations, etc. There are countless ways to split an array in chunks (see half a hundred answers in this canonical Q&A just for JavaScript).

Here is a small configurable utility tailored to the situation where one wants to split a flat array into an array of arrays of a certain size/pattern (which is usually the case with request chunking):

 /**
  * @typedef {object} ChunkifyConfig
  * @property {number} [size]
  * @property {number[]} [limits]
  *
  * @summary splits an array into chunks
  * @param {any[]} source
  * @param {ChunkifyConfig}
  * @returns {any[][]}
  */
 const chunkify = (source, { limits = [], size } = {}) => {
   const output = [];

   if (size) {
     const { length } = source;
     const maxNumChunks = Math.ceil((length || 1) / size);
     let numChunksLeft = maxNumChunks;

     while (numChunksLeft) {
       const chunksProcessed = maxNumChunks - numChunksLeft;
       const elemsProcessed = chunksProcessed * size;
       output.push(source.slice(elemsProcessed, elemsProcessed + size));
       numChunksLeft--;
     }

     return output;
   }

   const { length } = limits;

   if (!length) {
     return [Object.assign([], source)];
   }

   let lastSlicedElem = 0;

   limits.forEach((limit, i) => {
     const limitPosition = lastSlicedElem + limit;
     output[i] = source.slice(lastSlicedElem, limitPosition);
     lastSlicedElem = limitPosition;
   });

   const lastChunk = source.slice(lastSlicedElem);
   lastChunk.length && output.push(lastChunk);

   return output;
 };

 const sourceLimited = [1, 1, 2, 2, 2, 3];
 const outputLimited = chunkify(sourceLimited, { limits: [2, 1] });
 console.log({ source: sourceLimited, output: outputLimited });

 const sourceSized = ["ES5", "ES6", "ES7", "ES8", "ES9"];
 const outputSized = chunkify(sourceSized, { size: 2 });
 console.log({ source: sourceSized, output: outputSized });
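Applied to the situation above, the detail URLs can be split into rate-limit-sized chunks and paced with a pause between batches. A sketch of that pacing logic, with `fetchAll` and `sleep` injected as parameters so it can run and be tested outside Apps Script (in a real script you would pass `UrlFetchApp.fetchAll` and `Utilities.sleep`; the helper and URL names are illustrative):

```javascript
// Split an array into chunks of at most `size` elements.
const chunk = (arr, size) =>
  Array.from({ length: Math.ceil(arr.length / size) }, (_, i) =>
    arr.slice(i * size, (i + 1) * size)
  );

// Process request URLs batch by batch, pausing between batches
// to stay under 80 requests per 2 seconds.
const processInChunks = (urls, fetchAll, sleep, size = 80, pauseMs = 2000) =>
  chunk(urls, size).flatMap((batch, i) => {
    if (i > 0) sleep(pauseMs); // no pause before the first batch
    return fetchAll(batch);
  });

// Dry run with stand-ins for the Apps Script services:
const urls = Array.from({ length: 170 }, (_, i) => "https://example.com/" + i);
const calls = [];
const responses = processInChunks(
  urls,
  batch => { calls.push(batch.length); return batch; }, // fake fetchAll
  () => {}                                              // fake sleep
);
console.log(calls); // [80, 80, 10]
```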

From there, the only thing you need is to traverse the array while waiting for each chunk to complete. Please beware that requests can fail for any number of reasons - you should persist the last successfully processed chunk.
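One way to persist progress is to record the index of the last successfully processed chunk and skip past it on the next run. A minimal sketch, assuming a `store` with `getProperty`/`setProperty` methods (the shape of `PropertiesService.getScriptProperties()` in Apps Script; here a plain object stands in so the logic is testable, and all names are hypothetical):

```javascript
// Process chunks, resuming from the last successfully handled one.
const processResumable = (chunks, handle, store) => {
  const done = Number(store.getProperty("lastChunk") || 0);
  for (let i = done; i < chunks.length; i++) {
    try {
      handle(chunks[i]);
      store.setProperty("lastChunk", String(i + 1)); // persist progress
    } catch (e) {
      return i; // index of the failed chunk; a rerun resumes here
    }
  }
  return chunks.length;
};

// Dry run: the handler fails once on chunk "b", then a rerun resumes.
const mem = {};
const store = {
  getProperty: k => (k in mem ? mem[k] : null),
  setProperty: (k, v) => { mem[k] = v; },
};
let failOnce = true;
const handled = [];
const handle = c => {
  if (failOnce && c === "b") { failOnce = false; throw new Error("rate limited"); }
  handled.push(c);
};
processResumable(["a", "b", "c"], handle, store); // stops at "b"
processResumable(["a", "b", "c"], handle, store); // resumes and finishes
console.log(handled); // ["a", "b", "c"]
```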
