I have an Airflow service currently running as separate Docker containers for the webserver and scheduler, both backed by a Postgres database. The DAGs are synced between the two instances and load correctly when the services start. However, if I add a new DAG to the DAG folder (on both containers) while the service is running, the DAG gets loaded into the DagBag but shows up in the web GUI with missing metadata. I can run "airflow initdb" after each update, but that doesn't feel right. Is there a better way for the scheduler and webserver to sync up with the database?
DAG updates should be picked up automatically. If they are not, it is usually because a change you made has "broken" the DAG.

To check whether your new tasks are actually being picked up, run this on your webserver:
airflow list_tasks <dag_name> --tree
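If `list_tasks` reports an error, one quick way to see the underlying exception is to import the DAG file directly, the same way Airflow's DagBag does when it parses your dags folder. A minimal sketch, assuming a hypothetical helper name; the path you pass would be your own DAG file inside the container:

```python
# Sketch: surface DAG parse errors by importing the file directly.
# load_dag_file and any paths passed to it are illustrative placeholders,
# not names from Airflow itself.
import importlib.util


def load_dag_file(path):
    """Import a DAG file; re-raises whatever exception breaks parsing."""
    spec = importlib.util.spec_from_file_location("candidate_dag", path)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)  # raises if the file has an error
    return module
```

If this raises, the traceback points at exactly what "broke" the DAG; if it imports cleanly, the problem is more likely a sync or metadata issue than a code error.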
If it reports that the DAG is not found, there is an error in it.

If it runs successfully, it should list all of your tasks, and they should be picked up in the Airflow UI on refresh.

If the new/updated tasks do not show up there, check the dags folder on your webserver and verify that the code is actually being updated.
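Since you are syncing the dags folder into two containers, one simple way to verify both copies actually match is to compare file hashes. A small sketch; the function name and any paths you would compare are placeholders for illustration:

```python
# Sketch: compare two copies of a DAG file (e.g. one checked out locally,
# one pulled from a container) by hashing them. file_hash is a made-up
# helper name, not an Airflow API.
import hashlib


def file_hash(path):
    """Return the SHA-256 hex digest of a file's contents."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()
```

You could copy the file out of each container (`docker cp`) and confirm `file_hash()` matches your source; if the webserver's copy differs, the sync itself is the problem rather than Airflow.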