I've got a really bizarre case: my Chaco-based Python GUI plotting application either crashes or doesn't, depending on which of the lines below is uncommented, as noted in the comments:
# self.ideal_signal = x             # Causes crash.
# self.ideal_signal = x[:-70000]    # Fixes crash.
# self.ideal_signal = x[:-62500]    # Causes crash again.
self.ideal_signal = 0.6 * x         # Also fixes crash.
# self.ideal_signal = 0.7 * x       # Causes crash again.
x is a 256,000-element NumPy ndarray of type float64.
I have no idea what to do here. Does anyone have a suggestion?
It's almost as if crash versus no crash is determined by the cumulative sum of the amplitude (magnitude?) of the signal being plotted, but that seems awfully silly.
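One way to probe that hypothesis would be to compare the total absolute amplitude of each variant from the snippet above and see whether a single threshold separates the crashing cases from the non-crashing ones. A minimal sketch, using a synthetic stand-in for x since the real signal's values aren't given here:

```python
import numpy as np

# Hypothetical stand-in for the real signal: the original x is a
# 256,000-element float64 ndarray whose actual values we don't know.
rng = np.random.default_rng(0)
x = rng.normal(size=256_000)

# The five variants from the question, labeled with their observed behavior.
variants = {
    "x (crash)":          x,
    "x[:-70000] (ok)":    x[:-70000],
    "x[:-62500] (crash)": x[:-62500],
    "0.6 * x (ok)":       0.6 * x,
    "0.7 * x (crash)":    0.7 * x,
}

# If the cumulative-amplitude theory holds, the "ok" variants should all
# fall below some threshold and the "crash" variants above it.
for name, v in variants.items():
    print(f"{name:22s} sum|v| = {np.abs(v).sum():.1f}")
```

Note that scaling by 0.6 reduces the total amplitude proportionally, while truncating the array reduces it by an amount that depends on the dropped samples, so the two "fixes" would only line up under one threshold for particular signal shapes; running this against the real x would confirm or rule out the theory.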
Try making a minimal reproducible example, if possible. This might lead you to the answer. If not, post the minimal example at https://groups.google.com/forum/#!forum/ets-users