
ColdFusion GC Overhead Limit while looping over API JSON Response

To start, I understand what the "GC Overhead limit" error means, in general. I received this message while running a script that does the following:

  1. Create an instance of an object that makes a CFHTTP GET request to an external API
  2. Store the JSON response (an array) as a property of the object instance (i.e. VARIABLES.data); a rough sketch of steps 1-2 follows this list
  3. Loop through the JSON response array using a for-in loop
  4. Create an instance of an object that calls a SQL Server stored procedure, passing in the properties of the JSON object (the stored procedure performs an UPDATE or INSERT based on the existence of a record for the object's key)

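Roughly, steps 1 and 2 look like this (a sketch, not my actual component; the endpoint URL and the httpResponse variable name are placeholders):

    // Sketch of steps 1-2: fetch the JSON from the API and keep it on the instance.
    // "https://api.example.com/widgets" and "httpResponse" are placeholder names.
    cfhttp(method="GET", url="https://api.example.com/widgets", result="httpResponse");

    // The body comes back as JSON text; deserialize it into a CFML array
    // so the for-in loop in step 3 can walk it.
    VARIABLES.data = DeserializeJSON(httpResponse.fileContent);
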
Debug output shows that the SP call takes between 3-12 milliseconds.

When I run this script with a limited dataset (~3,000 records), it runs to completion without throwing a GC exception.

When I run the script with the complete dataset (~14,000 records), the GC exception is thrown.

Here's my pseudo-code:

    for (LOCAL.WidgetJson in VARIABLES.data) {
        LOCAL.Widget=new Widget();
        LOCAL.Widget
            .save(argumentCollection=LOCAL.WidgetJson);
    }

Widget.cfc:

    private void function saveStoredProc() {
        cfstoredproc(procedure="SaveWidget") {
            cfprocparam(
                dbvarname="@id",
                type="in",
                cfsqltype="CF_SQL_INT",
                value=VARIABLES.id
            );
            // Rest of the cfprocparam() calls here
        }
    }

    public void function save() {
        for (LOCAL.Property in ARGUMENTS) {
            if (StructKeyExists(ARGUMENTS, LOCAL.Property)) {
                if (IsSimpleValue(ARGUMENTS[LOCAL.Property])) {
                    VARIABLES[LOCAL.Property] = Trim(ARGUMENTS[LOCAL.Property]);
                }
                else {
                    VARIABLES[LOCAL.Property] = ARGUMENTS[LOCAL.Property];
                }
            }
        }

        saveStoredProc();
    }

I'm wondering if the way that I'm creating objects or looping could be improved to prevent GC exceptions/memory leaks.

Any ideas for improvements?

I don't think garbage collection will happen during a single request, even when it's needed. You could either increase the JVM memory or split this into multiple threads that each process a smaller amount of data.
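
A minimal sketch of the "split into multiple threads" idea, assuming VARIABLES.data is the deserialized array and that Widget.save() is safe to run concurrently; the chunk size and the "widgetBatch" thread names are made up for illustration:

    // Process the records in smaller batches, each in its own thread, so no
    // single block of work has to walk the whole dataset at once.
    LOCAL.chunkSize   = 1000;   // arbitrary; tune against your heap and database
    LOCAL.total       = ArrayLen(VARIABLES.data);
    LOCAL.threadNames = [];

    for (LOCAL.start = 1; LOCAL.start <= LOCAL.total; LOCAL.start += LOCAL.chunkSize) {
        LOCAL.len   = Min(LOCAL.chunkSize, LOCAL.total - LOCAL.start + 1);
        LOCAL.chunk = ArraySlice(VARIABLES.data, LOCAL.start, LOCAL.len);
        ArrayAppend(LOCAL.threadNames, "widgetBatch#LOCAL.start#");

        // cfthread copies attribute values into the thread's Attributes scope,
        // so each thread works on its own chunk.
        thread name="widgetBatch#LOCAL.start#" action="run" chunk=LOCAL.chunk {
            // unscoped variables inside the thread body live in the thread-local scope
            for (widgetJson in attributes.chunk) {
                new Widget().save(argumentCollection=widgetJson);
            }
        }
    }

    // Wait for every batch thread to finish before the request ends.
    thread action="join" name=ArrayToList(LOCAL.threadNames);

Note that one thread per 1,000 records also means several concurrent calls into the stored procedure, so the batch size may need tuning against the database as well. The other route is simply giving the JVM more headroom, e.g. raising the maximum heap (-Xmx) in the ColdFusion Administrator's Java settings or jvm.config.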
