Problem Description
I am running a C program using GCC and a proprietary DSP cross-compiler to simulate some functionality. I am using the following code to measure the execution time of a particular part of my program:
clock_t start, end;
double duration;

printf("DECODING DATA:\n");
start = clock();
conv3_dec(encoded, decoded, 3 * length, 0);
end = clock();
duration = (double)(end - start) / CLOCKS_PER_SEC;
printf("DECODING TIME = %f\n", duration);
where conv3_dec() is a function defined in my program.

Now the thing is, when my program runs, the conv3_dec() function runs for almost 2 hours, but the output from printf("DECODING TIME = %f\n", duration) says that the execution of the function finished in just half a second (DECODING TIME = 0.455443). This is very confusing to me.
I have used the clock_t technique to measure the runtimes of programs before, but the difference has never been this huge. Is this being caused by the cross-compiler? As a side note, the simulator simulates a DSP processor running at just 500 MHz, so could the difference in clock speeds between the DSP processor and my CPU be causing the error in measuring CLOCKS_PER_SEC?
Recommended Answer
For durations like two hours, I wouldn't be too concerned about clock(); it's far more useful for measuring sub-second durations.
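As a quick contrast between the two (a minimal sketch, assuming a POSIX-like system where clock() approximates the processor time used by the process), a call that spends most of its time sleeping or blocked barely advances clock(), even though plenty of wall-clock time passes:

#include <stdio.h>
#include <time.h>
#include <unistd.h>

int main (void) {
    clock_t c_start = clock ();
    time_t  t_start = time (0);

    sleep (5);    /* stand-in for work that mostly waits rather than computes */

    clock_t c_end = clock ();
    time_t  t_end = time (0);

    /* clock() counts processor time used, so it stays near zero here,
       while time() reports the roughly five seconds that actually elapsed. */
    printf ("clock(): %f s\n", (double)(c_end - c_start) / CLOCKS_PER_SEC);
    printf ("time() : %.0f s\n", difftime (t_end, t_start));
    return 0;
}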
Just use time() if you want the actual elapsed time, something like the following (dummy stuff supplied for what was missing):
#include <stdio.h>
#include <time.h>

// Dummy stuff starts here
#include <unistd.h>
#define encoded 0
#define decoded 0
#define length 0
static void conv3_dec (int a, int b, int c, int d) {
    sleep (20);    // stand-in for the real long-running decode
}
// Dummy stuff ends here

int main (void) {
    time_t start, end, duration;

    puts ("DECODING DATA:");
    start = time (0);
    conv3_dec (encoded, decoded, 3 * length, 0);
    end = time (0);

    duration = end - start;
    printf ("DECODING TIME = %ld\n", (long) duration);  // cast: time_t has no portable printf format

    return 0;
}
This generates:
DECODING DATA:
DECODING TIME = 20
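If you also need better than one-second resolution for the elapsed (wall-clock) time, one option on POSIX systems is clock_gettime() with CLOCK_MONOTONIC. Whether your DSP toolchain and simulator actually provide it is something you'd have to check, so treat this as a sketch rather than a drop-in replacement:

#include <stdio.h>
#include <time.h>

int main (void) {
    struct timespec t0, t1;

    clock_gettime (CLOCK_MONOTONIC, &t0);
    /* ... call conv3_dec() or whatever else you want to time here ... */
    clock_gettime (CLOCK_MONOTONIC, &t1);

    /* Combine seconds and nanoseconds into elapsed seconds as a double. */
    double elapsed = (double)(t1.tv_sec - t0.tv_sec)
                   + (double)(t1.tv_nsec - t0.tv_nsec) / 1e9;
    printf ("DECODING TIME = %f\n", elapsed);
    return 0;
}

On older glibc you may need to link with -lrt for clock_gettime(). If you stick with plain time(), difftime(end, start) is the portable way to take the difference; it returns a double you can print with %f.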