Access numpy array from separate C process using shared memory
I have a 1-D numpy array in memory:
>>> x = np.arange(5)
I want to share this data with a separate, independent (not forked) C process on the same computer using shared memory.
I expect to do something like the following:
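As one possible sketch of that workflow (this is an assumption, not something the question prescribes): the standard-library multiprocessing.shared_memory module (Python 3.8+) creates a named POSIX shared memory segment, which on Linux appears as /dev/shm/<name> and can therefore be opened by an unrelated C process with shm_open(). The segment name 'x_demo' is illustrative:

```python
# A sketch, not a definitive recipe: allocate a named POSIX shared
# memory segment from Python and copy the array's raw bytes into it.
from multiprocessing import shared_memory

import numpy as np

x = np.arange(5)

shm = shared_memory.SharedMemory(name='x_demo', create=True, size=x.nbytes)
shm.buf[:x.nbytes] = x.tobytes()   # copy the raw bytes into the segment

# A C process on the same machine could now shm_open("/x_demo") and mmap it.
print(shm.name)

shm.close()
shm.unlink()   # remove the segment once no process needs it
```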
What is the best way to achieve these steps? There appear to be several Python solutions for allocating array data in shared memory, but all the examples I can find involve sharing between two Python processes rather than between Python and another language.
Minimal examples are greatly appreciated.
Here is a minimal example:
import os

import numpy as np
import posix_ipc

x = np.arange(1000, dtype='i4')

# Create a POSIX shared memory segment large enough to hold the array.
f = posix_ipc.SharedMemory('test', flags=posix_ipc.O_CREAT,
                           size=x.nbytes, read_only=False)

# Wrap the file descriptor and copy the raw array bytes into the segment.
ff = os.fdopen(f.fd, mode='wb')
ff.write(x.data)
ff.close()  # flush doesn't work, but this does.
// shm.c
#include <stdlib.h>
#include <stdio.h>
#include <fcntl.h>
#include <sys/mman.h>

#define N 1000

int main(void)
{
    int i, fd;
    int *data;

    /* Open the existing segment created by the Python process. */
    fd = shm_open("test", O_RDONLY, 0);
    if (fd == -1) {
        perror("shm_open");
        return 1;
    }

    /* Map the whole array, not just a single int. */
    data = mmap(NULL, N * sizeof(int), PROT_READ, MAP_PRIVATE, fd, 0);
    if (data == MAP_FAILED) {
        perror("mmap");
        return 1;
    }

    for (i = 0; i < N; i++)
        printf("%d, ", data[i]);
    printf("\n");

    return 0;
}
$ gcc shm.c -lrt -o shm
$ ./shm
0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, ...