I'm running Airflow 1.9.0 with LocalExecutor and a PostgreSQL database on a Linux AMI. I want to trigger DAGs manually, but whenever I create a DAG with schedule_interval set to None or to '@once', the webserver's tree view crashes with the following error (I'm only showing the last call):
File "/usr/local/lib/python2.7/site-packages/croniter/croniter.py", line 467, in expand
raise CroniterBadCronError(cls.bad_length)
CroniterBadCronError: Exactly 5 or 6 columns has to be specified for iterator expression.
In addition, when I trigger the DAG manually, a DAG run starts, but the tasks themselves are never scheduled. I've looked around, but it seems I'm the only one getting this kind of error. Has anyone run into this before and found a fix?
Minimal example that triggers the issue:
import datetime as dt

from airflow import DAG
from airflow.operators.bash_operator import BashOperator

default_args = {
    'owner': 'me'
}

bash_command = """
echo "this is a test task"
"""

with DAG('schedule_test',
         default_args=default_args,
         start_date=dt.datetime(2018, 7, 24),
         schedule_interval='None',
         catchup=False
         ) as dag:
    first_task = BashOperator(task_id="first_task", bash_command=bash_command)
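The croniter error above comes down to a column count: because schedule_interval is the *string* 'None', Airflow hands it to croniter as if it were a cron expression, and croniter requires 5 or 6 space-separated fields. A simplified sketch of that check (an illustration of the idea, not croniter's actual code):

```python
def looks_like_valid_cron(expr):
    """Mimic croniter's column-count validation (simplified sketch)."""
    return len(expr.split()) in (5, 6)

print(looks_like_valid_cron('0 0 * * *'))  # True  - a real 5-column cron expression
print(looks_like_valid_cron('None'))       # False - one column, hence CroniterBadCronError
print(looks_like_valid_cron('@once'))      # False here; Airflow resolves presets before croniter
```

The string 'None' has exactly one column, so any view that tries to compute the next schedule crashes.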
Try this:
Set schedule_interval to None without the quotes, or don't specify schedule_interval in your DAG at all (it is None by default). More info here: the Airflow docs — search for BaseOperator. Like this:
from datetime import datetime

from airflow import DAG
from airflow.operators.bash_operator import BashOperator
from airflow.operators.dummy_operator import DummyOperator

default_args = {
    'owner': 'me'
}

bash_command = """
echo "this is a test task"
"""

with DAG('schedule_test',
         default_args=default_args,
         start_date=datetime(2018, 7, 24),
         schedule_interval=None,  # None without quotes: DAG runs only when triggered manually
         catchup=False
         ) as dag:
    t1 = DummyOperator(
        task_id='extract_data'
    )
    t2 = BashOperator(
        task_id='first_task',
        bash_command=bash_command
    )
    ##### ORCHESTRATION #####
    # t1 must finish before t2 can run.
    t2.set_upstream(t1)
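For reference, t2.set_upstream(t1) is equivalent to Airflow's bitshift syntax t1 >> t2. A toy sketch of how that dependency wiring works (the Task class below is hypothetical, for illustration only — not Airflow's implementation):

```python
class Task:
    """Toy stand-in for an Airflow operator (hypothetical)."""
    def __init__(self, task_id):
        self.task_id = task_id
        self.upstream = []

    def set_upstream(self, other):
        # Record that `other` must finish before this task runs.
        self.upstream.append(other)

    def __rshift__(self, other):
        # t1 >> t2 means "t2 runs after t1", i.e. t2.set_upstream(t1)
        other.set_upstream(self)
        return other

t1 = Task('extract_data')
t2 = Task('first_task')
t1 >> t2
print([t.task_id for t in t2.upstream])  # ['extract_data']
```

Either spelling produces the same dependency; the >> form reads left-to-right in execution order.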