
Salesforce - Fire apex trigger only after complete data load

So here is the issue. We are loading data into CustomObject__c using Data Loader. Usually the number of records passed is 3. Also, if there is any issue with the data passed, they run Data Loader again and pass the corrected data. At that point the older data has to be deleted. So I am handling the delete in the before insert code and calling a batch in the after insert code.

Here is the code for my trigger:

trigger TriggerCustom on CustomObject__c (before insert, after insert) {
    // In before insert the incoming rows are not yet committed, so this
    // query only returns records from an earlier load made today
    List<CustomObject__c> customobjectlist = [SELECT Id FROM CustomObject__c WHERE CreatedDate = TODAY];

    if (Trigger.isBefore) {
        // Remove the earlier load before the corrected data is inserted
        delete customobjectlist;
    }

    if (Trigger.isAfter) {
        BatchApex b = new BatchApex();
        Database.executeBatch(b);
    }
}

This was designed keeping in mind that they pass only 3 records at a time. However, now they want to pass more than 200 records using Data Loader. How can I modify my trigger so that it fires only after a single data load is complete? (For example, if they pass 1,000 records at once, the trigger has to fire only after all 1,000 records are completely inserted.)

The trigger will not know when you are done, whether after 3, 203 or 10,000 records. (You can use the Bulk API to load large volumes; records will be chunked into 10K packets, but triggers will still run on 200 records at a time.)

If you have a scripted data load, maybe you can update something else as a next step: another object (a dummy object that has just one record) and put the trigger on that instead? A sketch of that idea follows below.
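A minimal sketch of the control-object idea (LoadControl__c, LoadComplete__c and LoadControlTrigger are hypothetical names, not from the question): the load script flips the flag on the single control record as its final step, and the trigger on that object starts the batch exactly once per load.

trigger LoadControlTrigger on LoadControl__c (after update) {
    for (LoadControl__c lc : Trigger.new) {
        // Fire only on the false -> true transition of the flag,
        // so re-saving the record without a change does nothing
        if (lc.LoadComplete__c == true && Trigger.oldMap.get(lc.Id).LoadComplete__c == false) {
            Database.executeBatch(new BatchApex());
        }
    }
}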

If you have a scripted data load, maybe you can query the Ids and then pass them to a delete operation that runs before the upload task. This becomes a bit too much for the poor little Data Loader, but proper ETL tools like Talend, Informatica, Azure Data Factory or Jitterbit could do it. (Although deleting beforehand is a bit brave... what if the load fails? You're screwed... Maybe the delete should happen after a successful load.)
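If the ETL tool can run anonymous Apex as a pre- (or post-) load task, the cleanup step itself is tiny. A sketch, assuming the old records from today fit within the 10,000-row DML limit of a single transaction:

// Anonymous Apex sketch: remove the previous load in one statement.
// Assumes the result set stays under the 10,000-row DML governor limit.
delete [SELECT Id FROM CustomObject__c WHERE CreatedDate = TODAY];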

Maybe you can guarantee that the last record in your daily load has some flag set, and look for that flag in the trigger?
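A sketch of the flag approach, assuming a hypothetical IsLastRecord__c checkbox that the load file sets to true only on its final row:

trigger TriggerCustom on CustomObject__c (after insert) {
    for (CustomObject__c rec : Trigger.new) {
        // IsLastRecord__c is a hypothetical checkbox set by the load file
        // on its final row; the batch starts only when that row arrives
        if (rec.IsLastRecord__c == true) {
            Database.executeBatch(new BatchApex());
            break;
        }
    }
}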

Maybe you can schedule the batch to run every hour. You can't do that easily from the UI, but you can write the cron expression and schedule it as a one-liner in the Developer Console. In the Schedulable's execute(), check whether anything was loaded today, and if there is even a single record, start the batch.
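A minimal sketch of that schedulable (the class name and job name are illustrative; BatchApex is the batch class from the question):

global class HourlyLoadCheck implements Schedulable {
    global void execute(SchedulableContext sc) {
        // Start the batch only if at least one record was loaded today
        if ([SELECT COUNT() FROM CustomObject__c WHERE CreatedDate = TODAY] > 0) {
            Database.executeBatch(new BatchApex());
        }
    }
}

And the Developer Console one-liner to schedule it at the top of every hour:

System.schedule('Hourly load check', '0 0 * * * ?', new HourlyLoadCheck());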
