I ran into the following apache-airflow datetime problem:

Process DagFileProcessor238215-Process:
Traceback (most recent call last):
  File "/usr/local/lib/python3.6/multiprocessing/process.py", line 258, in _bootstrap
    self.run()
  File "/usr/local/lib/python3.6/multiprocessing/process.py", line 93, in run
    self._target(*self._args, **self._kwargs)
  File "/usr/local/lib/python3.6/site-packages/airflow/jobs.py", line 388, in helper
    pickle_dags)
  File "/usr/local/lib/python3.6/site-packages/airflow/utils/db.py", line 74, in wrapper
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.6/site-packages/airflow/jobs.py", line 1832, in process_file
    self._process_dags(dagbag, dags, ti_keys_to_schedule)
  File "/usr/local/lib/python3.6/site-packages/airflow/jobs.py", line 1422, in _process_dags
    dag_run = self.create_dag_run(dag)
  File "/usr/local/lib/python3.6/site-packages/airflow/utils/db.py", line 74, in wrapper
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.6/site-packages/airflow/jobs.py", line 856, in create_dag_run
    next_run_date = dag.normalize_schedule(min(task_start_dates))
TypeError: '<' not supported between instances of 'str' and 'datetime.datetime'
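The exception comes from Python 3's refusal to order-compare str and datetime. A minimal sketch (with hypothetical values) of what `min(task_start_dates)` hits when one task carries a string start date:

```python
from datetime import datetime

# Hypothetical mix: one real datetime and one string start date.
task_start_dates = [datetime(2014, 1, 1), "2014-01-01"]

try:
    # min() must compare the elements with '<', which Python 3 forbids
    # between str and datetime.datetime.
    min(task_start_dates)
except TypeError as exc:
    print(exc)  # '<' not supported between instances of 'str' and 'datetime.datetime'
```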

I'm running apache-airflow in a Docker image from zhongjiajie/docker-airflow, which is based on puckel/docker-airflow.

And my DAG is defined like this:
from airflow import DAG
from airflow.models import Variable
from airflow.operators.dummy_operator import DummyOperator
from udf.udf_hive_operator import HiveOperator
from airflow.operators.hive_to_mysql import HiveToMySqlTransfer
from udf.udf_hive_to_oracle import HiveToOracleTransfer
from udf.utils.date_utils import gen_history_date_para, today_belong_business_day
from datetime import datetime, timedelta

TMPL_SQL_PATH = Variable.get("sql_path")
HIVE_DB = "default"
NOSTRICT_HIVE_PARTITION_MODE = "set hive.exec.dynamic.partition.mode=nonstrict;\n"

default_args = {
    "owner": "xx_monitor",
    "description": "workflow for xx monitor system",
    "depends_on_past": False,
    "start_date": datetime(2014, 1, 1),
    "email": ["[email protected]"],
    "email_on_failure": False,
    "email_on_retry": False,
    "retries": 3,
    "retry_delay": timedelta(minutes=5),
    # "queue": "bash_queue",
    # "pool": "backfill",
    # "priority_weight": 10,
    # "end_date": datetime(2016, 1, 1),
}

dag = DAG(
    dag_id="drug_monitor",
    default_args=default_args,
    schedule_interval="0 18 * * *",
    template_searchpath=TMPL_SQL_PATH
)
The udf module contains my user-defined functions.

But something strange happens:
  • I go to the webserver UI and toggle the DAG ON, but it still fails, and I see the error message above in the scheduler.
  • I run a backfill from the CLI with airflow backfill -s 20140101 -e 20180101 <DAG_ID>; after that the scheduler error message disappears and all tasks start running on schedule or are queued.

I tried several ways to fix this, but all of them failed:
  • Setting start_date in default_args to an airflow.utils.dates.days_ago object, e.g. days_ago(2018, 9, 5) — failed.
  • Setting start_date in default_args to an airflow.utils.timezone.datetime object, e.g. datetime(2018, 9, 5) — failed.
  • Setting the DAG's schedule_interval to a cron preset, e.g. @daily — failed.
  • Setting the DAG's schedule_interval to a datetime.timedelta object — failed.

Has anyone run into a problem like this, and how can I fix it?

Best Answer

In my DAG file, one task was defined with a parameter named start_date, passed as a string. Since that string was collected into task_start_dates alongside the real datetimes, min() failed with the TypeError above. Renaming the parameter fixed the problem.
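A minimal sketch of the pitfall (the operator stand-in and parameter names are hypothetical, not the real Airflow API): a keyword literally named start_date passed to a task shadows the datetime from default_args, so the scheduler later sees a string among the task start dates. Renaming the custom parameter avoids the collision.

```python
from datetime import datetime

class OperatorSketch:
    """Simplified stand-in for an Airflow operator (sketch, not the real API)."""
    def __init__(self, task_id, start_date=None, **kwargs):
        self.task_id = task_id
        # Whatever start_date is received is stored as-is; a string slips through
        # and later ends up in the scheduler's task_start_dates list.
        self.start_date = start_date or datetime(2014, 1, 1)

# Buggy: "start_date" was meant as a custom template parameter,
# but it overrides the task's real start_date with a plain string.
buggy = OperatorSketch(task_id="load", start_date="2018-09-05")

# Fixed: rename the custom parameter (e.g. "business_date") so the task
# keeps a genuine datetime start_date.
fixed = OperatorSketch(task_id="load", business_date="2018-09-05")

print(type(buggy.start_date))  # <class 'str'> -> breaks min(task_start_dates)
print(type(fixed.start_date))  # <class 'datetime.datetime'>
```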

Regarding "docker - Airflow date error dag.normalize_schedule TypeError", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/52176131/
