Airflow's gunicorn is flooding the error log


Problem description

I'm using Apache Airflow and noticed that gunicorn-error.log has grown to over 50 GB within 5 months. Most of the log messages are INFO-level entries like:

In the Airflow config file I am only able to set the log file path. Does anyone know how to change the gunicorn log level within Airflow? I do not need this fine-grained logging because it fills up my hard drive.

Accepted answer

I managed to solve the problem by setting an environment variable:

GUNICORN_CMD_ARGS="--log-level WARNING"
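For a non-Docker deployment, the same variable can simply be exported in the shell that launches the Airflow webserver. A minimal sketch (the `airflow webserver` invocation is commented out here and assumed to be run separately):

```shell
# GUNICORN_CMD_ARGS is read by gunicorn itself (supported since gunicorn 19.7),
# so any gunicorn command-line flag can be passed through it.
export GUNICORN_CMD_ARGS="--log-level WARNING"
echo "$GUNICORN_CMD_ARGS"

# Then start the webserver as usual, e.g.:
# airflow webserver
```

Because gunicorn reads the variable at startup, the exported value only takes effect for webserver processes launched after the export.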

If you set this in a docker-compose.yml file, the following has been tested with apache-airflow==1.10.6 and gunicorn==19.9.0:

environment:
    - 'GUNICORN_CMD_ARGS=--log-level WARNING'
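For context, a sketch of where that key sits inside a full service definition; the service name and image are assumptions, not part of the original answer:

```yaml
services:
  webserver:
    image: my-airflow-image   # assumption: your own image with apache-airflow==1.10.6
    environment:
      - 'GUNICORN_CMD_ARGS=--log-level WARNING'
```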

If you set it in a Dockerfile, the following has been tested with apache-airflow==1.10.6 and gunicorn==19.9.0:

ENV GUNICORN_CMD_ARGS --log-level WARNING
