Problem description
I have Airflow 1.9 running inside a virtual environment, set up with Celery and Redis, and it works well. However, I wanted to daemon-ize the setup and used the instructions here. It works well for the Webserver, Scheduler, and Flower, but fails for the Worker, which is, of course, the core of it all. My airflow-worker.service file looks like this:
[Unit]
Description=Airflow celery worker daemon
After=network.target postgresql.service mysql.service redis.service rabbitmq-server.service
Wants=postgresql.service mysql.service redis.service rabbitmq-server.service
[Service]
EnvironmentFile=/etc/default/airflow
User=root
Group=root
Type=simple
ExecStart=/bin/bash -c 'source /home/codingincircles/airflow-master/bin/activate ; airflow worker'
Restart=on-failure
RestartSec=10s
[Install]
WantedBy=multi-user.target
Curiously, if I run the ExecStart command on the CLI as is, it runs perfectly, tasks run, and everything is glorious. However, when I do a sudo service airflow-worker start, it takes a while to return to the prompt and nothing shows up in the Flower UI. When I do journalctl -u airflow-worker.service -e, this is what I see:
systemd[1]: Started Airflow celery worker daemon.
bash[12392]: [2018-04-09 21:52:41,202] {driver.py:120} INFO - Generating grammar tables from /usr/lib/python3.5/lib2to3/Grammar.txt
bash[12392]: [2018-04-09 21:52:41,252] {driver.py:120} INFO - Generating grammar tables from /usr/lib/python3.5/lib2to3/PatternGrammar.txt
bash[12392]: [2018-04-09 21:52:41,578] {configuration.py:206} WARNING - section/key [celery/celery_ssl_active] not found in config
bash[12392]: [2018-04-09 21:52:41,578] {default_celery.py:41} WARNING - Celery Executor will run without SSL
bash[12392]: [2018-04-09 21:52:41,579] {__init__.py:45} INFO - Using executor CeleryExecutor
systemd[1]: airflow-worker.service: Main process exited, code=exited, status=1/FAILURE
systemd[1]: airflow-worker.service: Unit entered failed state.
systemd[1]: airflow-worker.service: Failed with result 'exit-code'.
What am I doing wrong? Any other method of using Airflow works, except when I try to daemon-ize it. Even using the -D flag after the airflow commands works (like airflow worker -D), except I'm not sure whether that is the right/safe/recommended way of using it in production, and I would rather make it a service and use it. Please help.
Recommended answer
Your airflow-worker.service is trying to run the airflow worker as the root user. In order to run the airflow worker as root, you must set C_FORCE_ROOT="true" in your airflow environment file (/etc/default/airflow). However, this is not recommended, and I suspect it is not the best fix for you.
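For reference, a minimal sketch of what that override would look like, assuming /etc/default/airflow is the EnvironmentFile referenced in the unit above (C_FORCE_ROOT is the variable Celery checks before refusing to run as root; again, this route is discouraged):

# /etc/default/airflow
# Tell Celery to allow the worker process to run as root (not recommended).
C_FORCE_ROOT="true"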
When trying to run airflow worker manually as root, you should see a warning about this. Because you have not seen this warning, I suspect that you are able to start the worker manually without issue because you are running it as a properly configured airflow user and not as root. Thus, the recommended solution is to change the following lines in your airflow-worker.service file:
User=airflow
Group=airflow
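Assuming a dedicated airflow system user exists and can read the virtualenv, the [Service] section would then look roughly like this (paths copied from the unit above):

[Service]
EnvironmentFile=/etc/default/airflow
User=airflow
Group=airflow
Type=simple
ExecStart=/bin/bash -c 'source /home/codingincircles/airflow-master/bin/activate ; airflow worker'
Restart=on-failure
RestartSec=10s

After editing the unit file, reload systemd and restart the service:

sudo systemctl daemon-reload
sudo systemctl restart airflow-worker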