
memory allocated in assembly using malloc - want to convert it to a 3-D array in C++

I have an assembly segment of the program that does a huge malloc (typically on the order of 8 GB), populates it, and does computations on it.

For debugging purposes I want to be able to treat this allocated and pre-filled memory as a 3-D array in C/C++. I specifically do not want to allocate another 8 GB, because declaring unsigned char* debug_arr[crystal_size][crystal_size][crystal_size] and doing an element-by-element copy would result in a stack overflow.

I would ideally love to typecast the memory pointer to a 3-D array pointer... Is it possible?

Objective is to verify the computation results done in Assembly segment.

My C/C++ knowledge is average; I mostly use 64-bit assembly, so could you please give me the C++ typecasting in some detail?

Env: Intel Core i7 2600K @ 4.4 GHz with 16 GB RAM, 64-bit assembly programming on 64-bit Windows 7, Visual Studio Express 2012

Thanks...

If you want to access a single unsigned char entry as if from a 3D array, you obviously need the relevant dimensions (call them nXDim, nYDim, nZDim for the sake of argument) and you need to know what dimension order was assumed when the memory was written.

If we assume that z changes less frequently than y, and y less frequently than x, then you can access your array via a function such as this:

// pYourArray points at the malloc'd block; nXDim, nYDim, nZDim are the
// dimensions that were used when the block was written.
unsigned char* GetEntry(int nX, int nY, int nZ)
{
    // z varies slowest, y next, x fastest
    return &pYourArray[(nZ * nXDim * nYDim) + (nY * nXDim) + nX];
}
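
For instance, once the assembly routine has filled the block and pYourArray and the dimensions have been set up, a small corner of the cube can be dumped for comparison against the expected results (a minimal sketch; the 4x4x4 extent is arbitrary):

#include <cstdio>

// Print a 4x4x4 corner of the cube so it can be checked against the asm output.
void DumpCorner()
{
    for (int nZ = 0; nZ < 4; ++nZ)
        for (int nY = 0; nY < 4; ++nY)
            for (int nX = 0; nX < 4; ++nX)
                printf("(%d,%d,%d) = %u\n", nX, nY, nZ,
                       (unsigned)*GetEntry(nX, nY, nZ));
}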

First, check what ordering was used when the memory was written. There are two common types: row-major ordering and column-major ordering.

For row-major ordering:

    Address = Base + ((depthindex * col_size + colindex) * row_size + rowindex) * Element_Size

For column-major ordering:

    Address = Base + ((rowindex * col_size + colindex) * depth_size + depthindex) * Element_Size
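
To make the two formulas concrete, here is a direct C++ transcription (a sketch; the names mirror the formulas above, Base points at the malloc'd block, and Element_Size is 1 for unsigned char, so it drops out):

#include <cstddef>

// Row-major: rowindex varies fastest in memory.
unsigned char* AddrRowMajor(unsigned char* Base,
                            std::size_t depthindex, std::size_t colindex, std::size_t rowindex,
                            std::size_t col_size, std::size_t row_size)
{
    return Base + ((depthindex * col_size + colindex) * row_size + rowindex);
}

// Column-major: depthindex varies fastest in memory.
unsigned char* AddrColMajor(unsigned char* Base,
                            std::size_t rowindex, std::size_t colindex, std::size_t depthindex,
                            std::size_t col_size, std::size_t depth_size)
{
    return Base + ((rowindex * col_size + colindex) * depth_size + depthindex);
}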

Here is an example for you to expand on:

char array[10000];    // One dimensional array
char * mat[100];      // Matrix for 2D array
for ( int i = 0; i < 100; i++ )
    mat[i] = array + i * 100;

Now, you have the matrix as a 100x100 element 2D array in the same memory as the array.
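
The same pointer-table trick extends to three dimensions with one more level of indirection. A minimal sketch, using an illustrative 4x4x4 cube instead of the full 8 GB block:

char storage[4 * 4 * 4];   // contiguous block (stands in for the malloc'd memory)
char * rows[4 * 4];        // one pointer per (z, y) row
char ** planes[4];         // one pointer per z-plane

void BuildTables()
{
    for (int z = 0; z < 4; z++) {
        planes[z] = rows + z * 4;
        for (int y = 0; y < 4; y++)
            rows[z * 4 + y] = storage + (z * 4 + y) * 4;
    }
    // Now planes[z][y][x] aliases storage[(z * 4 + y) * 4 + x].
}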

If you know the dimensions at compile time, then something like this works (the cast must go through a pointer-to-array type, because C++ does not allow casting to an array type directly):

void * crystal_cube = 0; // set by asm magic
typedef unsigned char DEBUG_CUBE[2044][2044][2044];  // 2044^3 bytes, about 8 GB
DEBUG_CUBE & debug_cube = *(DEBUG_CUBE *) crystal_cube;
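
Indexing then looks like an ordinary array access; for example (assuming the assembly code wrote with z as the slowest-varying index):

unsigned char v = debug_cube[nZ][nY][nX];   // element at offset (nZ*2044 + nY)*2044 + nX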
