I'm writing an R package that calls C code. The C code uses a structure with a dynamically determined length: the length of the array forest->edges depends on the data passed from R.
typedef struct {
    unsigned int n_edge;
    ...
    unsigned int max_node;
    unsigned int edges[];   /* flexible array member */
} forest;

forest *forest_new (unsigned int *n_edge) {
    forest *f = malloc(sizeof(forest) + (2 * *n_edge * sizeof(unsigned int)));
    f->n_edge = *n_edge;
    ...
    f->max_node = 0;
    return f;
}
The code runs successfully in C, but crashes when an R call triggers forest_new. My hunch is that the crash results from memory allocation, and indeed the R manual mentions alternative means of allocating memory (e.g. R_alloc, Calloc), which threads elsewhere seem to suggest should be used in place of malloc/calloc.

So part 1 of the question is when calls to malloc/calloc should, or must, be replaced by R-safe equivalents (or perhaps they are irrelevant to my problem?). Part 2 of the question is how the R-safe functions can handle structures whose length is dynamically determined.
You can use malloc/calloc and free in packages to allocate and free memory, but then you have to handle out-of-memory errors yourself, as in any C application. Alternatively, you can use Calloc/Realloc/Free provided by R, in which case the error is handled the "R way": an R error is raised when memory runs out. Finally, the R_alloc function allocates temporary data that is freed automatically when your external function exits (i.e. when control returns to R); it is stack-based allocation.
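To make the difference concrete, here is a sketch of the question's forest_new under each scheme. The helper names forest_size and forest_new_malloc are introduced here for illustration; the R-managed variants are shown only as comments, since they require #include <R.h> from an R build environment:

```c
#include <stdlib.h>
#include <string.h>

typedef struct {
    unsigned int n_edge;
    unsigned int max_node;
    unsigned int edges[];   /* flexible array member */
} forest;

/* Total bytes for a forest holding 2 * n_edge edge entries. */
static size_t forest_size(unsigned int n_edge) {
    return sizeof(forest) + 2 * (size_t)n_edge * sizeof(unsigned int);
}

/* Plain C allocation: the caller must check for NULL and free() the result. */
forest *forest_new_malloc(unsigned int n_edge) {
    forest *f = malloc(forest_size(n_edge));
    if (f == NULL)
        return NULL;        /* out-of-memory handled by the caller */
    f->n_edge = n_edge;
    f->max_node = 0;
    memset(f->edges, 0, 2 * (size_t)n_edge * sizeof(unsigned int));
    return f;
}

/*
 * R-managed variants (sketch only; these need #include <R.h>):
 *
 *   forest *f = (forest *) R_alloc(forest_size(n_edge), 1);
 *       -- freed automatically when control returns to R; never free() it.
 *
 *   forest *f = (forest *) Calloc(forest_size(n_edge), char);
 *       -- raises an R error on failure; release with Free(f).
 */
```

In all three cases the size computation is identical; only the ownership and error handling differ.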
R does not care whether your C code uses structures with dynamically determined length (flexible array members); R never accesses your structure at all.
If you need more help, please post a self-contained example or provide more information about the error. It is also worth checking that n_edge has a correct/sane value inside forest_new.
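Such a check might look like the following (the helper name n_edge_is_sane and the overflow bound are illustrative, not from the question):

```c
#include <stddef.h>
#include <limits.h>

/* Illustrative guard: reject a null pointer, a zero count, or a count so
 * large that 2 * n * sizeof(unsigned int) could overflow the size
 * computation on a 32-bit platform. */
static int n_edge_is_sane(const unsigned int *n_edge) {
    if (n_edge == NULL)
        return 0;
    if (*n_edge == 0)
        return 0;
    if (*n_edge > UINT_MAX / 2 / sizeof(unsigned int))
        return 0;   /* conservative overflow guard */
    return 1;
}
```

A garbage value here (e.g. from a mismatched .C/.Call interface on the R side) is a classic cause of a crash that only appears when the function is driven from R.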