I have recently been trying to find a fast and efficient way to perform a cross-correlation between two arrays in Python. After some reading, I found the following two options:

numpy.correlate(), which is too slow for large arrays.
cv.MatchTemplate(), which looks much faster.

For obvious reasons I chose the second option. I tried to execute the following code:
import scipy
import cv
image = cv.fromarray(scipy.float32(scipy.asarray([1,2,2,1])),allowND=True)
template = cv.fromarray(scipy.float32(scipy.asarray([2,2])),allowND=True)
result = cv.fromarray(scipy.float32(scipy.asarray([0,0,0])),allowND=True)
cv.MatchTemplate(image,template,result,cv.CV_TM_CCORR)
Even though this code is supposed to be very simple, it raises the following error:
OpenCV Error: Bad flag (parameter or structure field) (Unrecognized or unsupported array type) in cvGetMat, file /builddir/build/BUILD/OpenCV-2.1.0/src/cxcore/cxarray.cpp, line 2476
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
cv.error: Unrecognized or unsupported array type
After hours of frustrating attempts, I am still stuck! Does anyone have any suggestions?
By the way, here is my Python version output:
Python 2.7 (r27:82500, Sep 16 2010, 18:03:06)
[GCC 4.5.1 20100907 (Red Hat 4.5.1-3)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
Thanks!
Best Answer
You are unlikely to get much faster than an FFT-based correlation method.
import numpy
from scipy import signal
data_length = 8192
a = numpy.random.randn(data_length)
b = numpy.zeros(data_length * 2)
b[data_length/2:data_length/2+data_length] = a # This works for data_length being even
# Do an array flipped convolution, which is a correlation.
c = signal.fftconvolve(b, a[::-1], mode='valid')
# Use numpy.correlate for comparison
d = numpy.correlate(a, a, mode='same')
# c will be exactly the same as d, except for the last sample (which
# completes the symmetry)
numpy.allclose(c[:-1], d) # Should be True
Now for a timing comparison:
In [12]: timeit b[data_length/2:data_length/2+data_length] = a; c = signal.fftconvolve(b, a[::-1], mode='valid')
100 loops, best of 3: 4.67 ms per loop
In [13]: timeit d = numpy.correlate(a, a, mode='same')
10 loops, best of 3: 69.9 ms per loop
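The same flipped-convolution trick can be applied directly to the two small arrays from the question (a sketch; for this symmetric template the values match what CV_TM_CCORR would compute):

```python
import numpy as np
from scipy import signal

x = np.array([1.0, 2.0, 2.0, 1.0])
t = np.array([2.0, 2.0])

# Correlation is convolution with the second array reversed;
# mode='valid' keeps only the fully overlapping positions.
c = signal.fftconvolve(x, t[::-1], mode='valid')
print(c)  # ≈ [6., 8., 6.]
```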
If you can live with a circular correlation, you can drop the copy into the zero-padded array. The timing gap will grow as data_length increases.

Original question on Stack Overflow (python - Fast cross-correlation method in Python): https://stackoverflow.com/questions/12323959/
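With circular correlation, the whole computation reduces to one FFT round trip with no padded copy at all. A sketch using numpy.fft, shown for the autocorrelation case from the answer (Wiener-Khinchin: the inverse FFT of the power spectrum is the circular autocorrelation):

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0, 4.0])

# Circular autocorrelation: corr[k] = sum_n a[n] * a[(n + k) mod N]
F = np.fft.fft(a)
corr = np.fft.ifft(F * np.conj(F)).real
print(corr)  # ≈ [30., 24., 22., 24.]
```

For two different arrays, the same identity gives a circular cross-correlation via np.fft.ifft(np.fft.fft(a) * np.conj(np.fft.fft(b))).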