Can Terraform watch a directory for changes?

Joe*_*Joe 7 terraform

I want to watch a directory of files, and if one of them changes, re-upload the files and run some other tasks. My previous solution involved watching individual files, but that is error-prone because some files can be forgotten:

resource "null_resource" "deploy_files" {
  triggers = {
    file1 = "${sha1(file("my-dir/file1"))}"
    file2 = "${sha1(file("my-dir/file2"))}"
    file3 = "${sha1(file("my-dir/file3"))}"
    # have I forgotten one?
  }

  # Copy files then run a remote script.
  provisioner "file" { ... }
  provisioner "remote-exec" { ... }
}

My next solution was to compute a hash of the directory structure in one resource, then use that hash as the trigger in a second resource:

resource "null_resource" "watch_dir" {
  triggers = {
    always = "${uuid()}"
  }

  provisioner "local-exec" {
    command = "find my-dir  -type f -print0 | xargs -0 sha1sum | sha1sum > mydir-checksum"
  }
}


resource "null_resource" "deploy_files" {
  triggers = {
    file1 = "${sha1(file("mydir-checksum"))}"
  }

  # Copy files then run a remote script.
  provisioner "file" { ... }
  provisioner "remote-exec" { ... }
}

This works, except that changes to mydir-checksum are only picked up after the first apply. So I need to apply twice, which isn't great. It also feels like a hack.

I can't find a more obvious way to watch a whole directory for changes to its contents. Is there a standard way to do this?

gn0*_*00m 10

In Terraform 0.12 and later, you can use a for expression combined with the fileset function and one of the hashing functions to compute a combined checksum of the files in a directory:

> sha1(join("", [for f in fileset(path.cwd, "*"): filesha1(f)]))
"77e0b2785eb7405ea5b3b610c33c3aa2dccb90ea"

The expression above computes the sha1 checksum of every file in the current directory that matches the name pattern, joins the checksums into a single string, and finally computes the checksum of the resulting string. The null_resource example then looks like this, with the expression above as the trigger:

resource "null_resource" "deploy_files" {
  triggers = {
    dir_sha1 = sha1(join("", [for f in fileset("my-dir", "*"): filesha1(f)]))
  }

  provisioner "file" { ... }
  provisioner "remote-exec" { ... }
}

Note that fileset("my-dir", "*") does not consider files in subdirectories of my-dir. If you want to include those in the checksum, use the name pattern ** instead of *.
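For instance, a recursive variant might look like the following sketch (my-dir is a placeholder; each filename returned by fileset is joined back onto the directory path so that filesha1 can resolve it):

```hcl
resource "null_resource" "deploy_files" {
  triggers = {
    # "**" also matches files in subdirectories of my-dir.
    dir_sha1 = sha1(join("", [for f in fileset("my-dir", "**") : filesha1("my-dir/${f}")]))
  }

  provisioner "file" { ... }
  provisioner "remote-exec" { ... }
}
```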

  • If you are accessing files in a parent directory, you also need to pass that path to the filesha1 function. You can use a local to avoid repetition. For example: [for f in fileset("${path.module}/../website", "*"): filesha1("${path.module}/../website/${f}")] (6 upvotes)
  • I got it working like this: sha1(join("", [for f in fileset(".", "./path/to/files/*") : filesha1(f)])) (6 upvotes)

Ris*_*ale 5

I had the same requirement and implemented it with a data.external data source in the following way:

  • Wrote a script that gives me a checksum of the directory:

    #!/bin/bash
    #
    # This script calculates the MD5 checksum on a directory
    #
    
    # Exit if any of the intermediate steps fail
    set -e
    
    # Extract "DIRECTORY" argument from the input into
    # DIRECTORY shell variables.
    # jq will ensure that the values are properly quoted
    # and escaped for consumption by the shell.
    eval "$(jq -r '@sh "DIRECTORY=\(.directory)"')"
    
    # Placeholder for whatever data-fetching logic your script implements
    CHECKSUM=`find ${DIRECTORY} -type f | LC_ALL=C sort | xargs shasum -a 256 | awk '{ n=split ($2, tokens, /\//); print $1 " " tokens[n]} ' |  shasum -a 256 | awk '{ print $1 }'`
    
    # Safely produce a JSON object containing the result value.
    # jq will ensure that the value is properly quoted
    # and escaped to produce a valid JSON string.
    jq -n --arg checksum "$CHECKSUM" '{"checksum":$checksum}'
    
  • Created the data.external like this:

    data "external" "trigger" {
      program = ["${path.module}/dirhash.sh"]

      query = {
        directory = "${path.module}/<YOUR_DIR_PATH_TO_WATCH>"
      }
    }
    
  • Used the result output of the data source above as the trigger for the null_resource:

    resource "null_resource" "deploy_files" {
      # Changes to any configuration file require re-provisioning
      triggers = {
        md5 = "${data.external.trigger.result["checksum"]}"
      }
      ...
    }
    
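The checksum pipeline used by dirhash.sh can be exercised on its own. Below is a minimal sketch of mine (not part of the original answer) that drops the jq wrapper, substitutes GNU coreutils' sha256sum for shasum -a 256, and checks that the checksum is stable until a file's contents change:

```shell
#!/bin/sh
set -e

# Build a throwaway directory with two files.
DIR=$(mktemp -d)
printf 'hello' > "$DIR/a.txt"
printf 'world' > "$DIR/b.txt"

# Same idea as dirhash.sh: hash every file, sort for a stable order,
# strip the directory prefix from each path, then hash the combined list.
checksum() {
  find "$1" -type f | LC_ALL=C sort \
    | xargs sha256sum \
    | awk '{ n = split($2, parts, /\//); print $1 " " parts[n] }' \
    | sha256sum | awk '{ print $1 }'
}

c1=$(checksum "$DIR")
c2=$(checksum "$DIR")       # nothing changed: checksums match
printf 'x' >> "$DIR/a.txt"
c3=$(checksum "$DIR")       # contents changed: checksum differs

[ "$c1" = "$c2" ] && [ "$c1" != "$c3" ] && echo OK
```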

PS: the script depends on jq.

Update: revised the checksum calculation logic to compensate for find behaving differently across platforms.


krl*_*mlr 5

You can use the "archive_file" data source:

data "archive_file" "init" {
  type        = "zip"
  source_dir  = "data/"
  output_path = "data.zip"
}

resource "null_resource" "provision-builder" {
  triggers = {
    src_hash = "${data.archive_file.init.output_sha}"
  }

  provisioner "local-exec" {
    command = "echo Touché"
  }
}

The null resource will be re-provisioned only if the hash of the archive has changed. The archive is rebuilt during refresh whenever the contents of source_dir (data/ in this example) change.
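If some files under source_dir should not trigger re-provisioning (build artifacts, editor swap files), the archive_file data source also accepts an excludes argument; a sketch, with hypothetical paths:

```hcl
data "archive_file" "init" {
  type        = "zip"
  source_dir  = "data/"
  output_path = "data.zip"

  # Hypothetical exclusions; paths are resolved relative to source_dir.
  excludes = ["build/output.log", ".DS_Store"]
}
```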


Hau*_*eth 0

Terraform does not seem to provide any directory tree traversal function, so the only solution I can think of is to use some kind of external tool to do it, such as Make:

all: tf.plan

tf.plan: hash *.tf
	terraform plan -out=$@

hash: some/dir
	find $^ -type f -exec sha1sum {} + > $@

.PHONY: all hash

And then in your Terraform file:

resource "null_resource" "deploy_files" {
  triggers = {
    file1 = "${file("hash")}"
  }

  # Copy files then run a remote script.
  provisioner "file" { ... }
  provisioner "remote-exec" { ... }
}