Problem description
I read up on the sieve of Eratosthenes while solving a question on Project Euler. I'm sure you guys know which question I'm talking about.

So here's the thing. My code manages to show all the primes under 1 million correctly. However, when I try the same implementation for 2 million, it gives me a segmentation fault... I have a certain idea of why the error occurs but don't know how to correct it... Here's the code for primes under 1 million.
#include<stdio.h>
int main(void)
{
int i,k=2;
int j;
int n=1000000;
int prime[2000000]={};
for(i=0;i<n;i++) // initializes the prime number array
{
prime[i]=i;
}
for(i=2;i<n;i++) // Implementation of the Sieve
{
if(prime[i]!=0)
{
for(j=2;j<n;j++)
{
prime[j*prime[i]]=0;
if(prime[i]*j>n)
break;
}
}
}
for(i=0;i<n;i++) // Prints the prime numbers
if(prime[i]!=0)
{
printf("%d\n", prime[i]);
}
return(0);
}
You're allocating a huge array on the stack:
int prime[2000000]={};
Four bytes times two million equals eight megabytes, which is often the maximum stack size. Allocating more than that results in a segmentation fault.
You should allocate the array on the heap instead:
#include <stdlib.h>   /* for malloc and free */

int *prime;
prime = malloc(2000000 * sizeof(int));
if (!prime) {
    /* not enough memory */
    return 1;
}
/* ... use prime ... */
free(prime);