I want to pause idle and redundant DAGs. How do I find out which DAGs are unpaused and which are already paused?
So I have a list of DAGs that I need to pause using a bash command (one that executes .bashrc) running airflow pause <dag_id>. I want to know whether the pause command succeeded by checking each DAG's state. I've looked at the airflow webserver, and it seems that all the paused DAGs are still running.
import subprocess

def pause_idle_dags(dags=["myTutorial"]):
    """
    Pauses dags from the airflow
    :param dags: dags considered to be idle
    :return: Success state
    """
    # TODO
    for dag in dags:
        command = "airflow pause {}".format(dag)
        print(executeBashCommand(command))

def executeBashCommand(command):
    print('========RUN========', command)
    p = subprocess.Popen(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    stdout, stderr = p.communicate()
    if p.returncode != 0:
        print('========STDOUT========\n', stdout.decode())
        print('========STDERR========\n', stderr.decode())
        # assumption: the raise was cut off in the original; CalledProcessError is one reasonable choice
        raise subprocess.CalledProcessError(p.returncode, command)
    return stdout.decode()  # assumption: the caller prints this return value
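One way to verify the result, rather than eyeballing the webserver, is to read the is_paused flag straight from Airflow's metadata database. A minimal sketch, assuming Airflow 1.x is importable and configured on the machine running the script:

from airflow import settings
from airflow.models import DagModel

def report_paused_state(dag_ids):
    """Print whether each DAG is currently paused in the metadata DB."""
    session = settings.Session()
    try:
        for dag_id in dag_ids:
            dag = (session.query(DagModel)
                          .filter(DagModel.dag_id == dag_id)
                          .first())
            if dag is None:
                print(dag_id, "-> not found in metadata DB")
            else:
                print(dag_id, "->", "paused" if dag.is_paused else "unpaused")
    finally:
        session.close()

report_paused_state(["myTutorial"])

Note, too, that pausing only stops the scheduler from creating new DAG runs; task instances that are already executing keep running, which would explain paused DAGs still appearing to run in the webserver.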
So I have 2 similar deployments on k8s pulling the same image from GitLab. Apparently this resulted in my second deployment going into a CrashLoopBackOff error, and I can't seem to connect to the port to check /healthz of my pod. Logging the pod says the pod received an interrupt signal, while describing the pod shows the following messages.
FirstSeen LastSeen Count From SubObjectPath Type Reason Message
--------- -------- ----- ---- ------------- -------- ------ -------
29m 29m 1 default-scheduler Normal Scheduled Successfully assigned java-kafka-rest-kafka-data-2-development-5c6f7f597-5t2mr to 172.18.14.110
29m 29m 1 kubelet, 172.18.14.110 Normal SuccessfulMountVolume MountVolume.SetUp succeeded for volume "default-token-m4m55"
29m 29m 1 kubelet, 172.18.14.110 spec.containers{consul} Normal Pulled Container image "..../consul-image:0.0.10" already present on machine
29m 29m 1 kubelet, 172.18.14.110 spec.containers{consul} Normal Created Created container
29m 29m 1 kubelet, 172.18.14.110 spec.containers{consul} Normal Started Started container
28m …
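As a side note (a suggestion, not part of the original post): when a pod is in CrashLoopBackOff, the logs of the crashed container are usually still retrievable with kubectl logs --previous. A small Python wrapper around that, using the pod name from the events above:

import subprocess

def previous_logs(pod, container=None):
    """Fetch logs from the previous (crashed) container instance."""
    cmd = ["kubectl", "logs", pod, "--previous"]
    if container:
        cmd += ["-c", container]  # e.g. the consul sidecar from the events
    return subprocess.run(cmd, capture_output=True, text=True).stdout

print(previous_logs("java-kafka-rest-kafka-data-2-development-5c6f7f597-5t2mr"))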
I'm currently trying to map a date column to its day of the week, with Monday as 1 and Sunday as 7, using the following query:

EXTRACT(DAYOFWEEK FROM dates) AS day_of_week
However, according to BQ's documentation, the function appears to treat Sunday as the first day of the week. Is there a way to work around this elegantly, without resorting to a conditional expression in my query and adjusting the result by hand?
BQ documentation:
DAYOFWEEK: Returns values in the range [1,7] with Sunday as the first day of the week.
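One arithmetic workaround (a suggestion, not from the original post) is MOD(EXTRACT(DAYOFWEEK FROM dates) + 5, 7) + 1, which shifts the Sunday-first values onto a Monday-first scale without any conditional expression. A quick Python sketch to confirm the mapping:

# Verify that MOD(dow + 5, 7) + 1 remaps BigQuery's Sunday-first
# DAYOFWEEK (Sunday=1 .. Saturday=7) to Monday=1 .. Sunday=7.
names = ["Sunday", "Monday", "Tuesday", "Wednesday",
         "Thursday", "Friday", "Saturday"]
for dow in range(1, 8):          # the value BigQuery would return
    iso = (dow + 5) % 7 + 1      # same arithmetic as the SQL MOD()
    print(names[dow - 1], "->", iso)  # Monday -> 1, ..., Sunday -> 7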
I'm trying out Airflow for the first time and attempting to connect it to a local SQLite database, but I can't seem to figure out how to actually do it.
I've read Airflow's documentation, set my executor to LocalExecutor, and set my sql_alchemy_conn to sqlite:////home/myName/Programs/sqlite3/DatabaseName.db, but it doesn't seem to work, as it throws:
Traceback (most recent call last):
File "/usr/local/bin/airflow", line 21, in <module>
from airflow import configuration
File "/usr/local/lib/python2.7/dist-packages/airflow/__init__.py", line 35, in <module>
from airflow import configuration as conf
File "/usr/local/lib/python2.7/dist-packages/airflow/configuration.py", line 520, in <module>
conf.read(AIRFLOW_CONFIG)
File "/usr/local/lib/python2.7/dist-packages/airflow/configuration.py", line 283, in read
self._validate()
File "/usr/local/lib/python2.7/dist-packages/airflow/configuration.py", line 169, in _validate
self.get('core', 'executor')))
airflow.exceptions.AirflowConfigException: error: cannot use sqlite with the LocalExecutor
The error occurs when I try to run airflow initdb. I tried googling and tried …
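For context, the check behind this exception is deliberate: SQLite supports only a single writer, so Airflow rejects any executor other than SequentialExecutor when sql_alchemy_conn points at a sqlite database; keeping the LocalExecutor requires Postgres or MySQL instead. A minimal sketch of the same validation, assuming the default ~/airflow/airflow.cfg location:

import configparser
import os

# Reproduce Airflow's config sanity check with plain configparser.
cfg = configparser.ConfigParser(interpolation=None)
cfg.read(os.path.expanduser("~/airflow/airflow.cfg"))  # default AIRFLOW_HOME

executor = cfg.get("core", "executor")
conn = cfg.get("core", "sql_alchemy_conn")

# SQLite allows one writer at a time, so only the single-process
# SequentialExecutor is accepted with it, matching the traceback above.
if conn.startswith("sqlite") and executor != "SequentialExecutor":
    print("invalid: use SequentialExecutor with sqlite, got", executor)
else:
    print("ok:", executor, "with", conn)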