
Problem Description



Alright...I've given the site a fair search and have read over many posts about this topic. I found this question: Code for a simple thread pool in C# especially helpful.

However, as it always seems, what I need varies slightly.

I have looked over the MSDN example and adapted it to my needs somewhat. The example I refer to is here: http://msdn.microsoft.com/en-us/library/3dasc8as(VS.80,printer).aspx

My issue is this. I have a fairly simple set of code that loads a web page via the HttpWebRequest and WebResponse classes and reads the results via a Stream. I fire off this method in a thread as it will need to be executed many times. The method itself is pretty short, but the number of times it needs to be fired (with varied data each time) varies. It can be anywhere from 1 to 200.
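For context, a minimal sketch of the kind of fetch method being described might look like the following; the class name, method name, and use of ReadToEnd are illustrative assumptions rather than the asker's actual code:

using System.IO;
using System.Net;

static class PageFetcher
{
    // Illustrative only: load a page with HttpWebRequest/WebResponse and read the result via a Stream.
    public static string FetchPage(string url)
    {
        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
        using (WebResponse response = request.GetResponse())
        using (Stream stream = response.GetResponseStream())
        using (StreamReader reader = new StreamReader(stream))
        {
            return reader.ReadToEnd();
        }
    }
}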

Everything I've read seems to indicate the ThreadPool class as the prime candidate. Here is where things get tricky. I might need to fire off this thing say 100 times, but I can only have 3 threads at most running (for this particular task).

I've tried setting the MaxThreads on the ThreadPool via:

ThreadPool.SetMaxThreads( 3, 3 );

I'm not entirely convinced this approach is working. Furthermore, I don't want to clobber other web sites or programs running on the system this will be running on. So, by limiting the # of threads on the ThreadPool, can I be certain that this pertains to my code and my threads only?

The MSDN example uses the event-driven approach and calls WaitHandle.WaitAll( doneEvents );, which is how I'm doing this.
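For reference, the shape of that MSDN pattern is roughly the sketch below: each work item is queued on the ThreadPool with its own ManualResetEvent, and the caller waits on all of them. The FetchWorkItem type and URL list are illustrative, and note that WaitHandle.WaitAll accepts at most 64 handles, so 200 items would need batching:

using System.Threading;

class FetchWorkItem
{
    public string Url;
    public ManualResetEvent Done;

    // WaitCallback signature: runs on a ThreadPool thread and signals Done when finished.
    public void Run(object state)
    {
        try
        {
            // ... load the page via HttpWebRequest/WebResponse here ...
        }
        finally
        {
            Done.Set();
        }
    }
}

class Program
{
    static void Main()
    {
        string[] urls = { "http://example.com/1", "http://example.com/2", "http://example.com/3" };
        ManualResetEvent[] doneEvents = new ManualResetEvent[urls.Length];

        for (int i = 0; i < urls.Length; i++)
        {
            doneEvents[i] = new ManualResetEvent(false);
            FetchWorkItem item = new FetchWorkItem { Url = urls[i], Done = doneEvents[i] };
            ThreadPool.QueueUserWorkItem(item.Run);
        }

        // Block until every queued work item has signalled completion (max 64 handles per call).
        WaitHandle.WaitAll(doneEvents);
    }
}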

So the heart of my question is: how does one ensure or specify a maximum number of threads that can run for their code, while having the code keep starting new threads as previous ones finish, up to some arbitrary total? Am I tackling this the right way?

Sincerely,

Jason


Okay, I've added a semaphore approach and completely removed the threadpool code. It seems simple enough. I got my info from: http://www.albahari.com/threading/part2.aspx

It's this example that showed me how:

[text below here is a copy/paste from the site]

A Semaphore with a capacity of one is similar to a Mutex or lock, except that the Semaphore has no "owner" – it's thread-agnostic. Any thread can call Release on a Semaphore, while with Mutex and lock, only the thread that obtained the resource can release it.

In the following example, ten threads execute a loop with a Sleep statement in the middle. A Semaphore ensures that not more than three threads can execute that Sleep statement at once:

class SemaphoreTest {
  static Semaphore s = new Semaphore (3, 3);  // Available=3; Capacity=3

  static void Main() {
    for (int i = 0; i < 10; i++) new Thread (Go).Start();
  }

  static void Go() {
    while (true) {
      s.WaitOne();
      Thread.Sleep (100);   // Only 3 threads can get here at once
      s.Release();
    }
  }
}
Solution

Note: if you are limiting this to "3" just so you don't overwhelm the machine running your app, I'd make sure this is a problem first. The threadpool is supposed to manage this for you. On the other hand, if you don't want to overwhelm some other resource, then read on!


You can't manage the size of the ThreadPool for your code alone: ThreadPool.SetMaxThreads changes the limit for the entire process-wide pool, not just your tasks.

In this case, I'd use a semaphore to manage access to your resource. In your case, your resource is running the web scrape, or calculating some report, etc.

To do this, in your static class, create a semaphore object:

static System.Threading.Semaphore S = new System.Threading.Semaphore(3, 3);

Then, in each thread, you reference that same shared semaphore S (don't create a new one per thread) and do this:

try
{
    // wait your turn (decrement)
    S.WaitOne();
    // do your thing
}
finally
{
    // release so others can go (increment)
    S.Release();
}

Each thread will block on S.WaitOne() until the semaphore gives it the signal to proceed. Once S has been decremented 3 times, any further threads will block until one of the running threads calls S.Release() and increments the count again.
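Put together with the original scenario, a minimal sketch might look like this; the Fetch method, URL list, and thread handling are illustrative assumptions, and WaitOne is taken before the try so Release only runs if a slot was actually acquired:

using System.Threading;

class Scraper
{
    // Shared semaphore: at most 3 fetches run at once (available = 3, capacity = 3).
    static Semaphore S = new Semaphore(3, 3);

    static void Main()
    {
        string[] urls = { "http://example.com/a", "http://example.com/b", "http://example.com/c" };
        Thread[] threads = new Thread[urls.Length];

        for (int i = 0; i < urls.Length; i++)
        {
            string url = urls[i];                 // copy for the closure
            threads[i] = new Thread(() => Fetch(url));
            threads[i].Start();
        }

        foreach (Thread t in threads)
            t.Join();                             // wait for all fetches to finish
    }

    static void Fetch(string url)
    {
        S.WaitOne();                              // wait for one of the 3 slots
        try
        {
            // ... load the page via HttpWebRequest/WebResponse here ...
        }
        finally
        {
            S.Release();                          // free the slot for the next thread
        }
    }
}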

This solution isn't perfect.


If you want something a little cleaner, and more efficient, I'd recommend going with a BlockingQueue approach wherein you enqueue the work you want performed into a global Blocking Queue object.

Meanwhile, you have three threads (which you created yourself, not thread pool threads) popping work off the queue to perform. This isn't that tricky to set up and is very fast and simple.

Examples:
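The example links from the original answer aren't reproduced here. On .NET 4 or later, one way to build that producer/consumer setup is BlockingCollection<T>; the sketch below (with illustrative names and URLs) uses three dedicated worker threads draining a shared queue:

using System.Collections.Concurrent;
using System.Threading;

class QueueWorker
{
    static BlockingCollection<string> queue = new BlockingCollection<string>();

    static void Main()
    {
        // Three dedicated worker threads (not ThreadPool threads) drain the shared queue.
        Thread[] workers = new Thread[3];
        for (int i = 0; i < 3; i++)
        {
            workers[i] = new Thread(Consume);
            workers[i].Start();
        }

        // Producer: enqueue however many work items there happen to be (1 to 200).
        for (int i = 1; i <= 100; i++)
            queue.Add("http://example.com/page" + i);

        queue.CompleteAdding();                   // tell consumers no more work is coming

        foreach (Thread w in workers)
            w.Join();                             // wait for the queue to drain
    }

    static void Consume()
    {
        // Blocks while the queue is empty; the loop ends once CompleteAdding
        // has been called and every queued item has been taken.
        foreach (string url in queue.GetConsumingEnumerable())
        {
            // ... load the page via HttpWebRequest/WebResponse here ...
        }
    }
}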
