I am running a Jupyter notebook on a remote server. Part of this notebook calls a Cython .pyx file, cython_file, which contains a C++ function called cpp_function. It is imported from the notebook like this:

from clibs.cython_file import cpp_function
Inside the .pyx file I include a C++ header file, cpp_file.h, like this:

cdef extern from "/home/user/cpp_file.h":
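For reference, a minimal .pyx file along these lines might look like the sketch below. The signature of cpp_function and the wrapper arrangement are assumptions for illustration; the extern declaration is aliased so the Python-visible name can match the one imported in the notebook:

```cython
# distutils: language = c++

cdef extern from "/home/user/cpp_file.h":
    # hypothetical signature; the real one lives in cpp_file.h
    bint c_logic "cpp_function"(int a, int b)

def cpp_function(int a, int b):
    # thin Python-visible wrapper around the C++ function
    return c_logic(a, b)
```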
Inside this header file I have the function declared as in the Cython file; for argument's sake, it is just a simple logic function.
My issue is this: sometimes when I change the .h file, restart the notebook kernel, and re-run the code, nothing changes. It still uses the old version of the .h file, as if it were being cached somewhere.
I have deleted all .pyxbldc and .pyc files before restarting the kernel, to no avail.

My .pyxbld file looks like this:
def make_ext(modname, pyxfilename):
    from distutils.extension import Extension
    return Extension(name=modname,
                     sources=[pyxfilename],
                     extra_compile_args=['-fopenmp', '-w'],
                     extra_link_args=['-fopenmp'],
                     language='c++')
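One thing worth trying: the Extension class accepts a `depends` list, and distutils' build_ext includes those files in its staleness check, so listing the header there should make a header edit trigger a rebuild. A sketch of the .pyxbld along those lines (the header path is copied from the question; the setuptools fallback is because distutils was removed in Python 3.12):

```python
def make_ext(modname, pyxfilename):
    # setuptools ships a drop-in replacement for distutils' Extension
    try:
        from setuptools import Extension
    except ImportError:
        from distutils.extension import Extension
    return Extension(
        name=modname,
        sources=[pyxfilename],
        # Files listed in `depends` are included in the rebuild check:
        # if the header is newer than the built module, it is recompiled.
        depends=['/home/user/cpp_file.h'],
        extra_compile_args=['-fopenmp', '-w'],
        extra_link_args=['-fopenmp'],
        language='c++',
    )
```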
Any ideas on how to stop the C++ file from being cached?
!rm -rf ~/.cache/ipython/cython
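That path is where IPython's `%%cython` magic keeps its cache. If the module is built through pyximport instead, the compiled artifacts land in a separate directory, by default `~/.pyxbld` (this default is an assumption; it changes if you passed `build_dir` to `pyximport.install`). Clearing both forces a full rebuild:

```shell
# pyximport's default build directory
rm -rf ~/.pyxbld
# IPython %%cython magic cache
rm -rf ~/.cache/ipython/cython
```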