After setting up some scheduled tasks, I'm having trouble understanding how to get Celery to run them.

First, my project directory is structured as follows:

(screenshot of the project directory layout omitted)

blogpodapi\api\__init__.py contains

from tasks import app
import celeryconfig


blogpodapi\api\celeryconfig.py contains

from datetime import timedelta

# Celery settings
CELERY_BROKER_URL = 'redis://localhost:6379/0'
BROKER_URL = 'redis://localhost:6379/0'
CELERY_RESULT_BACKEND = 'redis://localhost:6379/1'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TIMEZONE = 'UTC'
CELERY_IMPORTS = ("api.tasks",)

CELERYBEAT_SCHEDULE = {
    'write-test': {
        'task': 'api.tasks.addrandom',
        'schedule': timedelta(seconds=2),
        'args': (16000, 42)
    },
}
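(For comparison, the same schedule entry can also be written with celery.schedules.crontab when calendar-style timing is preferred over a fixed timedelta; this is a sketch of a once-per-minute variant, not part of the original config.)

```python
from celery.schedules import crontab

CELERYBEAT_SCHEDULE = {
    'write-test': {
        'task': 'api.tasks.addrandom',
        'schedule': crontab(minute='*'),  # fire once per minute
        'args': (16000, 42),
    },
}
```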


blogpodapi\api\tasks.py contains

from __future__ import absolute_import
import random
from celery import Celery
app = Celery('blogpodapi')


@app.task
def add(x, y):
    r = x + y
    print "task arguments: {x}, {y}".format(x=x, y=y)
    print "task result: {r}".format(r=r)
    return r


@app.task
def addrandom(x, *args): # *args are not used, just there to be interchangeable with add(x, y)
    y = random.randint(1,100)
    print "passing to add(x, y)"
    return add(x, y)
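(As a quick sanity check independent of the broker, the task bodies can be exercised by calling the functions synchronously in a plain Python shell; this bypasses Celery entirely. The sketch below mirrors tasks.py, rewritten with print() so it also runs on Python 3.)

```python
import random

def add(x, y):
    # same body as the @app.task-decorated add() above
    r = x + y
    print("task arguments: {x}, {y}".format(x=x, y=y))
    print("task result: {r}".format(r=r))
    return r

def addrandom(x, *args):
    # *args ignored, kept for signature parity with add(x, y)
    y = random.randint(1, 100)
    print("passing to add(x, y)")
    return add(x, y)

print(add(16000, 42))  # 16042
```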


blogpodapi\blogpodapi\__init__.py contains

from __future__ import absolute_import

# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app  # noqa


blogpodapi\blogpodapi\settings.py contains

...

# Celery settings
CELERY_BROKER_URL = 'redis://localhost:6379/0'
BROKER_URL = 'redis://localhost:6379/0'
CELERY_RESULT_BACKEND = 'redis://localhost:6379/1'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TIMEZONE = 'UTC'
CELERY_IMPORTS = ("api.tasks",)

...


I run celery -A blogpodapi worker --loglevel=info at the command prompt and get the following output:

D:\blogpodapi>celery -A blogpodapi worker --loglevel=info

 -------------- celery@JM v3.1.23 (Cipater)
---- **** -----
--- * ***  * -- Windows-8-6.2.9200
-- * - **** ---
- ** ---------- [config]
- ** ---------- .> app:         blogpodapi:0x348a940
- ** ---------- .> transport:   redis://localhost:6379/0
- ** ---------- .> results:     redis://localhost:6379/1
- *** --- * --- .> concurrency: 2 (prefork)
-- ******* ----
--- ***** ----- [queues]
 -------------- .> celery           exchange=celery(direct) key=celery


[tasks]
  . api.tasks.add
  . api.tasks.addrandom
  . blogpodapi.celery.debug_task

[2016-08-13 13:01:51,108: INFO/MainProcess] Connected to redis://localhost:6379/0
[2016-08-13 13:01:52,122: INFO/MainProcess] mingle: searching for neighbors
[2016-08-13 13:01:55,138: INFO/MainProcess] mingle: all alone
c:\python27\lib\site-packages\celery\fixups\django.py:265: UserWarning: Using settings.DEBUG leads to a memory leak, never use this setting in production environments!
  warnings.warn('Using settings.DEBUG leads to a memory leak, never '

[2016-08-13 13:02:00,157: WARNING/MainProcess] c:\python27\lib\site-packages\celery\fixups\django.py:265: UserWarning: Using settings.DEBUG leads to a memory leak, never use this setting in production environments!
  warnings.warn('Using settings.DEBUG leads to a memory leak, never '

[2016-08-13 13:02:27,790: WARNING/MainProcess] celery@JM ready.


Then I run celery -A blogpodapi beat at the command prompt and get the following:

D:\blogpodapi>celery -A blogpodapi beat
celery beat v3.1.23 (Cipater) is starting.
__    -    ... __   -        _
Configuration ->
    . broker -> redis://localhost:6379/0
    . loader -> celery.loaders.app.AppLoader
    . scheduler -> celery.beat.PersistentScheduler
    . db -> celerybeat-schedule
    . logfile -> [stderr]@%INFO
    . maxinterval -> now (0s)
[2016-08-13 13:02:51,937: INFO/MainProcess] beat: Starting...


For some reason I can't see my periodic tasks being logged at all. Am I doing something wrong?

Update: here is my celery.py ...

from __future__ import absolute_import
import os
from celery import Celery

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'blogpodapi.settings')

from django.conf import settings  # noqa

app = Celery('blogpodapi')

# Using a string here means the worker will not have to
# pickle the object when using Windows.
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)


@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))

Best answer

You need to run celery beat against the module that holds your Celery app and settings:

celery -A blogpodapi.celery beat --loglevel=INFO
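(One thing worth checking alongside that command: celery.py in the update calls app.config_from_object('django.conf:settings'), so this app only sees settings defined in blogpodapi\blogpodapi\settings.py. A CELERYBEAT_SCHEDULE that lives only in api\celeryconfig.py is never loaded by it. If that is the case here, the schedule entry would need to be moved or duplicated into the Django settings, e.g.:)

```python
# blogpodapi/settings.py (sketch; mirrors the entry from api/celeryconfig.py)
from datetime import timedelta

CELERYBEAT_SCHEDULE = {
    'write-test': {
        'task': 'api.tasks.addrandom',
        'schedule': timedelta(seconds=2),  # run every 2 seconds
        'args': (16000, 42),
    },
}
```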

Regarding "python - Not sure how to run Celery for periodic tasks under Windows", a similar question was found on Stack Overflow: https://stackoverflow.com/questions/38932602/
