Jupyter Notebook kernel dies when I increase the number of samples
I am trying to execute the following Python code:
import matplotlib.pyplot as plt
import scipy.cluster.hierarchy as sch

plt.figure(figsize=(9, 6))
plt.title("Dendrograms for number of clusters")
# 'scaled' is my preprocessed sample matrix
dend = sch.dendrogram(sch.linkage(scaled, method='ward'))
When I execute the above code with 12000 samples it works fine. However, when I increase the samples to 24000, Jupyter Notebook shows

Kernel appears to be dead

and the log reports:

KernelRestarter: restarting kernel (1/5), keep random ports

Any help is really appreciated.
This was an issue with the scipy package. I downgraded scipy from 1.7.3 to 1.7.1 and it is working.
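The downgrade can be done with pip (assuming a pip-managed environment; use the equivalent conda command if you installed scipy through conda):

```shell
# Pin scipy to the older release that does not crash the kernel
pip install scipy==1.7.1
```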
,它正在运行。 However, the downgraded version of scipy
has an issue of maximum recursion depth exceeded while getting str of an object
.但是,scipy 的降级版本在获取
scipy
的maximum recursion depth exceeded while getting str of an object
的问题。 The second issue can be resolved by expanding the limit.第二个问题可以通过扩大限制来解决。
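Raising the limit can be done with `sys.setrecursionlimit` before calling `dendrogram`. The value 10000 below is an illustrative choice, not from the original answer; pick a value large enough for your sample count:

```python
import sys

# Python's default recursion limit is usually 1000, which the dendrogram
# labeling in the downgraded scipy can exceed on large inputs, raising
# "maximum recursion depth exceeded while getting str of an object".
# Raise it before building the dendrogram (10000 is an example value).
sys.setrecursionlimit(10000)
```

Note that this only raises Python's own guard; a very deep recursion can still exhaust the C stack, so increase the limit conservatively.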