
How can I extract data (fieldOutputs) that is bigger than my RAM from an Abaqus odb file using the C++ API?

I am using the C++ API to access *.odb files. Reading a file is no problem, unless the file is bigger than my RAM.

There are two routines in the documentation to read the data (in my case fieldOutputs) from the odb-file.

1. Bulk data

// Bulk data access: the field is exposed as a sequence of contiguous blocks.
odb_FieldOutput& disp = lastFrame.fieldOutputs()["U"];
const odb_SequenceFieldBulkData& seqDispBulkData = disp.bulkDataBlocks();
int numDispBlocks = seqDispBulkData.size();
for (int iblock=0; iblock<numDispBlocks; iblock++) {
    const odb_FieldBulkData& bulkData = seqDispBulkData[iblock];
    int numNodes = bulkData.length();        // nodes in this block
    int numComp = bulkData.width();          // components per node, e.g. 3 for U
    float* data = bulkData.data();           // numNodes*numComp values
    int* nodeLabels = bulkData.nodeLabels();
    for (int node=0,pos=0; node<numNodes; node++) {
        int nodeLabel = nodeLabels[node];
        cout << "Node = " << nodeLabel;
        cout << " U = ";
        for (int comp=0;comp<numComp;comp++) {
            cout << data[pos++] << " ";
        }
        cout << endl;
    }
}

2. Value

// Value-by-value access: one odb_FieldValue object per node.
const odb_SequenceFieldValue& displacements = lastFrame.fieldOutputs()["U"].values();
int numValues = displacements.size();
int numComp = 0;
for (int i=0; i<numValues; i++) {
    const odb_FieldValue val = displacements[i];
    cout << "Node = " << val.nodeLabel();
    const float* const U = val.data(numComp);  // numComp is set by data()
    cout << ", U = ";
    for (int comp=0; comp<numComp; comp++) {
        cout << U[comp] << " ";
    }
    cout << endl;
}

What I would like to do is read the data from the file and save it into a mat file.
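For the saving step, one simple route (a sketch, not the only option) is to write each array in the old Level 4 MAT-file format, which MATLAB's `load` still reads and which needs no external library. The function name `writeMat4` and the column-major layout are my own choices here; for the newer v5/v7 formats a library such as matio would be needed instead:

```cpp
#include <cstdint>
#include <cstdio>
#include <string>
#include <vector>

// Writes one real double-precision matrix as a Level 4 (v4) MAT-file.
// MATLAB loads it with: load('U.mat')  -> a variable called `name`.
// Data must be column-major, as MATLAB expects.
bool writeMat4(const char* path, const std::string& name,
               const std::vector<double>& colMajor,
               int32_t rows, int32_t cols)
{
    if (static_cast<int32_t>(colMajor.size()) != rows * cols) return false;
    std::FILE* f = std::fopen(path, "wb");
    if (!f) return false;
    // v4 header: type, mrows, ncols, imagf, namlen.
    // type 0 = little-endian IEEE doubles, full numeric matrix.
    int32_t header[5] = {0, rows, cols, 0,
                         static_cast<int32_t>(name.size()) + 1};
    std::fwrite(header, sizeof(int32_t), 5, f);
    std::fwrite(name.c_str(), 1, name.size() + 1, f);  // NUL-terminated name
    std::fwrite(colMajor.data(), sizeof(double), colMajor.size(), f);
    std::fclose(f);
    return true;
}
```

A call like `writeMat4("U.mat", "U", data, numNodes, numComp)` would store the displacements of one chunk; writing one file (or one variable) per chunk keeps the in-memory footprint at one chunk's worth of data.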

Shape of the data:

An odb file is a database that can be represented as a tree structure.

It contains steps; each step contains frames, and each frame contains fieldOutputs. Those fieldOutputs can be matrices or vectors, and their dimensions depend on the number of nodes and the number of parameters per fieldOutput.
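The tree described above can be mirrored with plain structs; this is only an illustration of the data shape, not the Abaqus API, and all type names here are made up:

```cpp
#include <cstddef>
#include <map>
#include <string>
#include <vector>

// Illustrative in-memory mirror of the odb tree: steps -> frames -> fieldOutputs.
struct FieldOutput {
    int numNodes;               // rows
    int numComponents;          // columns, e.g. 3 for a displacement vector "U"
    std::vector<float> values;  // numNodes * numComponents entries
};
struct Frame { std::map<std::string, FieldOutput> fieldOutputs; };
struct Step  { std::vector<Frame> frames; };
struct Odb   { std::map<std::string, Step> steps; };

// Total number of stored values -- this is what exceeds RAM when
// all steps and frames are held in memory at once.
std::size_t totalValues(const Odb& odb) {
    std::size_t n = 0;
    for (const auto& s : odb.steps)
        for (const Frame& f : s.second.frames)
            for (const auto& fo : f.fieldOutputs)
                n += fo.second.values.size();
    return n;
}
```

With many frames, the sum over all frames is what overflows RAM even when a single fieldOutput is small.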

My question:

Is either of the mentioned routines capable of loading the data incrementally, so that files bigger than RAM can be processed? If yes, I would be happy to get some hints.

Additional information:

Documentation: http://abaqus.software.polimi.it/v6.12/books/ker/default.htm and http://xn--90ajn.xn--p1ai:2080/v6.12/pdf_books/SCRIPT_USER.pdf. I am using Abaqus 6.12 and the Visual Studio 2010 compiler.

Is one single fieldOutput really bigger than your RAM? Do you have more than a billion elements?

I think you are iterating over a large number of fieldOutputs and running out of memory while doing so.

That is where you can run out of memory, because the Abaqus odb API does not release memory correctly (by my observation). There are some undocumented functions for releasing memory in the C++ API, which I can provide if I find them.

Even with those I couldn't get the API to release the memory. I got around the issue by opening the odb, reading a chunk of data, closing the odb, then reopening it and reading the next chunk. My observation was that it helps to wait one or two seconds after each chunk so that the memory is actually released.

So reading the data chunk by chunk into Matlab (and saving it there) would be a way to get this to work.
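The open/read/close/reopen cycle can be sketched as follows; `readChunkFromOdb` is a hypothetical placeholder for whatever one chunk of your extraction looks like (in the real code it would open the odb, copy one frame's data out, and close the odb again):

```cpp
#include <chrono>
#include <thread>
#include <vector>

// Hypothetical stand-in: in the real workflow this would openOdb(), copy one
// frame's bulk data into the vector, and close the odb again, so that no
// handle (and no leaked API memory) survives between chunks.
std::vector<float> readChunkFromOdb(int chunkIndex) {
    return std::vector<float>(4, static_cast<float>(chunkIndex));
}

// Read all chunks, pausing between them; the observation above is that a
// pause of one or two seconds gives the OS time to reclaim released memory.
std::vector<std::vector<float>> readAllChunks(int numChunks,
                                              std::chrono::milliseconds pause)
{
    std::vector<std::vector<float>> saved;
    for (int i = 0; i < numChunks; ++i) {
        saved.push_back(readChunkFromOdb(i));  // odb opened and closed inside
        std::this_thread::sleep_for(pause);    // let the memory settle
    }
    return saved;  // in practice: write each chunk to the mat file instead
}
```

Accumulating all chunks in one vector, as done here for illustration, would of course defeat the purpose; the real loop should hand each chunk straight to the file writer and discard it.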

Of course the bulkData approach would be preferable if you read whole fieldOutputs.
