Problem Description
I am implementing Kosaraju's Strongly Connected Components (SCC) graph search algorithm in Python.
The program runs fine on small data sets, but when I run it on a super-large graph (more than 800,000 nodes), it reports a "Segmentation Fault".
What might be the cause of it? Thank you!
Additional info: first I got this error when running on the super-large data set:
"RuntimeError: maximum recursion depth exceeded in cmp"
Then I reset the recursion limit using
sys.setrecursionlimit(50000)
but then the "Segmentation Fault" appeared.
Believe me, it's not an infinite loop; it runs correctly on relatively smaller data. Is it possible that the program exhausted the resources?
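For context, sys.setrecursionlimit only raises CPython's internal frame counter; the operating system stack of the main thread keeps its default size, so a deep enough recursion can still overflow it and kill the process with a segmentation fault instead of a RuntimeError. Below is a minimal sketch of one common workaround: run the recursive part in a worker thread whose stack is enlarged with threading.stack_size. The function deep_recursion and the concrete numbers are illustrative assumptions, not taken from the original code.

import sys
import threading

def deep_recursion(n):
    # Stand-in for the recursive DFS inside a Kosaraju implementation.
    if n == 0:
        return 0
    return 1 + deep_recursion(n - 1)

def main():
    print(deep_recursion(50_000))

if __name__ == "__main__":
    sys.setrecursionlimit(1_000_000)          # raise CPython's internal limit
    threading.stack_size(128 * 1024 * 1024)   # request a 128 MB stack; adjust per platform
    worker = threading.Thread(target=main)
    worker.start()
    worker.join()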
Recommended Answer
This happens when a Python extension (written in C) tries to access memory beyond its reach.
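For illustration only (not part of the original answer), the same kind of crash can be reproduced on purpose by asking C code to read an unmapped address, for example through the standard ctypes module:

import ctypes

# Read a C string at address 0 (a NULL pointer). The invalid access happens
# inside C code, so CPython cannot turn it into an exception; the process
# is killed with a segmentation fault instead of printing a traceback.
ctypes.string_at(0)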
You can trace it in the following ways:
- Add sys.settrace at the very first line of the code (a minimal tracer sketch follows the gdb transcript below).
- Use gdb as described by Mark in this answer. At the command prompt:
gdb python
(gdb) run /path/to/script.py
## wait for segfault ##
(gdb) backtrace
## stack trace of the c code
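A minimal sketch of the sys.settrace idea from the first bullet: install a trace function at the top of the script that prints every executed line, so the last line printed before the crash shows where the segfault happened. The function name trace_lines is an arbitrary choice.

import sys

def trace_lines(frame, event, arg):
    # Print each executed line to stderr; flush so the output survives the crash.
    if event == "line":
        print(f"{frame.f_code.co_filename}:{frame.f_lineno}", file=sys.stderr, flush=True)
    return trace_lines

sys.settrace(trace_lines)

# ... the rest of the original script runs below this point ...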