Ami*_*man 5 python kubernetes airflow google-cloud-composer
I'm running an algorithm with GCP Composer, and at the end of the flow I want to run a task that performs several operations: copying files and folders from a volume to a bucket, and then deleting them. I'm trying to perform these copy and delete operations through a KubernetesPodOperator. I'm having trouble finding the right way to run multiple commands using "cmds"; I also tried combining "cmds" with "arguments". Here is my KubernetesPodOperator with the command and argument combinations I've tried:
post_algo_run = kubernetes_pod_operator.KubernetesPodOperator(
    task_id="multi-coher-post-operations",
    name="multi-coher-post-operations",
    namespace="default",
    image="google/cloud-sdk:alpine",
    ### doesn't work ###
    cmds=["gsutil", "cp", "/data/splitter-output*.csv", "gs://my_bucket/data", "&", "gsutil", "rm", "-r", "/input"],
    # Error:
    # [2022-01-27 09:31:38,407] {pod_manager.py:197} INFO - CommandException: Destination URL must name a directory, bucket, or bucket
    # [2022-01-27 09:31:38,408] {pod_manager.py:197} INFO - subdirectory for the multiple source form of the cp command.
    ####################
    ### doesn't work ###
    # cmds=["gsutil", "cp", "/data/splitter-output*.csv", "gs://my_bucket/data ;", "gsutil", "rm", "-r", "/input"],
    # [2022-01-27 09:34:06,865] {pod_manager.py:197} INFO - CommandException: Destination URL must name a directory, bucket, or bucket
    # [2022-01-27 09:34:06,866] {pod_manager.py:197} INFO - subdirectory for the multiple source form of the cp command.
    ####################
    ### only performs the first command - only copying ###
    # cmds=["bash", "-cx"],
    # arguments=["gsutil cp /data/splitter-output*.csv gs://my_bucket/data", "gsutil rm -r /input"],
    # [2022-01-27 09:36:09,164] {pod_manager.py:197} INFO - + gsutil cp '/data/splitter-output*.csv' gs://my_bucket/data
    # [2022-01-27 09:36:11,200] {pod_manager.py:197} INFO - Copying file:///data/splitter-output\Coherence Results-26-Jan-2022-1025Part1.csv [Content-Type=text/csv]...
    # [2022-01-27 09:36:11,300] {pod_manager.py:197} INFO - / [0 files][ 0.0 B/ 93.0 KiB]
    # / [1 files][ 93.0 KiB/ 93.0 KiB]
    # [2022-01-27 09:36:11,302] {pod_manager.py:197} INFO - Operation completed over 1 objects/93.0 KiB.
    # [2022-01-27 09:36:12,317] {kubernetes_pod.py:459} INFO - Deleting pod: multi-coher-post-operations.d66b4c91c9024bd289171c4d3ce35fdd
    ####################
    volumes=[
        Volume(
            name="nfs-pvc",
            configs={
                "persistentVolumeClaim": {"claimName": "nfs-pvc"}
            },
        )
    ],
    volume_mounts=[
        VolumeMount(
            name="nfs-pvc",
            mount_path="/data/",
            sub_path=None,
            read_only=False,
        )
    ],
)
I found a technique for running multiple commands. First, I worked out the relationship between the KubernetesPodOperator cmds and arguments properties and Docker's ENTRYPOINT and CMD: cmds overrides the image's original ENTRYPOINT, while arguments takes the place of Docker's CMD. Neither is run through a shell, which is why tokens like "&" and ";" in the attempts above were handed to gsutil as literal arguments rather than treated as command separators.
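A minimal sketch of that mapping (the task name, echo command, and import path here are illustrative only and not part of the original DAG; on older Composer/Airflow versions the operator is imported from airflow.contrib.operators.kubernetes_pod_operator instead):

from airflow.providers.cncf.kubernetes.operators.kubernetes_pod import KubernetesPodOperator

# cmds      -> the container's "command" -> replaces the image's ENTRYPOINT
# arguments -> the container's "args"    -> replaces the image's CMD
entrypoint_cmd_demo = KubernetesPodOperator(
    task_id="entrypoint-cmd-demo",           # hypothetical task, for illustration only
    name="entrypoint-cmd-demo",
    namespace="default",
    image="google/cloud-sdk:alpine",
    cmds=["echo"],                           # ENTRYPOINT becomes: echo
    arguments=["this text replaces CMD"],    # CMD becomes this single argument
)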
So, to run multiple commands from the KubernetesPodOperator, I used the following syntax. I set the KubernetesPodOperator cmds to run bash with -c:
cmds=["/bin/bash", "-c"],
and I set the KubernetesPodOperator arguments to run two echo commands joined with &&:
arguments=["echo hello && echo goodbye"],
So my KubernetesPodOperator looks like this:
stajoverflow_test = KubernetesPodOperator(
    task_id="stajoverflow_test",
    name="stajoverflow_test",
    namespace="default",
    image="google/cloud-sdk:alpine",
    cmds=["/bin/bash", "-c"],
    arguments=["echo hello && echo goodbye"],
)
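Applying the same bash -c pattern back to the original copy-and-delete task could look like the sketch below. It only reuses the paths, bucket, image, and nfs-pvc volume from the question; swap && for ; if the delete should run even when the copy fails:

post_algo_run = KubernetesPodOperator(
    task_id="multi-coher-post-operations",
    name="multi-coher-post-operations",
    namespace="default",
    image="google/cloud-sdk:alpine",
    cmds=["/bin/bash", "-c"],
    # One shell runs both commands; && only starts the rm if the cp succeeded.
    arguments=["gsutil cp /data/splitter-output*.csv gs://my_bucket/data && gsutil rm -r /input"],
    volumes=[
        Volume(
            name="nfs-pvc",
            configs={"persistentVolumeClaim": {"claimName": "nfs-pvc"}},
        )
    ],
    volume_mounts=[
        VolumeMount(
            name="nfs-pvc",
            mount_path="/data/",
            sub_path=None,
            read_only=False,
        )
    ],
)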