I created a simple module:
.
├── inputs.tf
└── main.tf
The input variables are declared in inputs.tf:
variable "workers" {
type = number
description = "Amount of spark workers"
}
variable "values_values_path" {}
main.tf is:
resource "helm_release" "spark" {
name = "spark"
repository = "https://charts.bitnami.com/bitnami"
chart = "spark"
version = "1.2.21"
namespace = ...
set {
name = "worker.replicaCount"
value = var.workers
}
values = [
"${file("${var.custom_values_path}")}"
]
}
As you can see, I'm trying to deploy a Helm release, and I want to pass in a custom values file, parameterized as custom_values_path:
.
├── main.tf
├── provider.tf
└── spark-values.yaml
My main.tf here is:
module "spark" {
source = "../modules/spark"
workers = 1
custom_values_path = "./spark_values.yaml"
}
However, I get:
Error: Error in function call
on ../modules/spark/main.tf line 14, in resource "helm_release" "spark":
14: "${file("${var.custom_values_path}")}"
|----------------
| var.custom_values_path is "./spark_values.yaml"
Call to function "file" failed: no file exists at spark_values.yaml.
The full directory structure is:
.
├── stash
│   ├── main.tf
│   ├── provider.tf
│   └── spark-values.yaml
└── modules
    └── spark
        ├── inputs.tf
        └── main.tf
When I run terraform plan, I am inside ./stash. So the full command and output are:
$ cd ./stash
$ terraform plan

Error: Error in function call
on ../modules/spark/main.tf line 14, in resource "helm_release" "spark":
14: "${file("${var.custom_values_path}")}"
|----------------
| var.custom_values_path is "./spark_values.yaml"
Call to function "file" failed: no file exists at spark_values.yaml.
Why does the call to the function "file" fail with "no file exists"?
Answer by Ala*_*Dea (12 votes):
Since you are referencing a file in the calling module from the child module, you should use path.module to provide an absolute path based on the calling module, like this:
module "spark" {
source = "../modules/spark"
workers = 1
custom_values_path = "${path.module}/spark_values.yaml"
}
That said, I would recommend against referencing files across module boundaries in Terraform. You're better off keeping the dependencies between modules down to just variables, if only to avoid odd issues like this one. An alternative is to pass the whole file in as a variable (which is all the file function does anyway):
module "spark" {
source = "../modules/spark"
workers = 1
custom_values = file("${path.module}/spark_values.yaml")
}
Then modify your Spark module so that custom_values is expected to contain the file's contents rather than a path to it:
resource "helm_release" "spark" {
name = "spark"
repository = "https://charts.bitnami.com/bitnami"
chart = "spark"
version = "1.2.21"
namespace = ...
set {
name = "worker.replicaCount"
value = var.workers
}
values = [
var.custom_values
]
}
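As a minimal sketch of the matching change inside the module (the type and description below are assumptions, not from the original post), inputs.tf would then declare custom_values as a string holding the YAML contents instead of a path:

variable "workers" {
  type        = number
  description = "Amount of spark workers"
}

# Assumed replacement for custom_values_path: the variable now carries
# the rendered YAML contents rather than a path to a file on disk.
variable "custom_values" {
  type        = string
  description = "Contents of a custom Helm values file (YAML)"
}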
Looking at this, I suspect the values parameter wants a list(string), so you may need to use yamldecode on custom_values.