I use Terraform as the infrastructure framework in my application. Below is the configuration I use to deploy Python code to Lambda. It consists of three steps: 1. zip all the dependencies and source code into a zip file; 2. upload the zip file to an S3 bucket; 3. deploy it to the Lambda function.
But what happens is that the deploy command terraform apply fails with the following error:
Error: Error modifying Lambda Function Code quote-crawler: InvalidParameterValueException: Error occurred while GetObject. S3 Error Code: NoSuchKey. S3 Error Message: The specified key does not exist.
status code: 400, request id: 2db6cb29-8988-474c-8166-f4332d7309de
on config.tf line 48, in resource "aws_lambda_function" "test_lambda":
48: resource "aws_lambda_function" "test_lambda" {
Error: Error modifying Lambda Function Code praw_crawler: InvalidParameterValueException: Error occurred while GetObject. S3 Error Code: NoSuchKey. S3 Error Message: The specified key does not exist.
status code: 400, request id: e01c83cf-40ee-4919-b322-fab84f87d594
on config.tf line 67, in resource "aws_lambda_function" "praw_crawler":
67: resource "aws_lambda_function" "praw_crawler" {
This means the deployment file does not exist in the S3 bucket, yet when I run the command a second time it succeeds. It looks like a timing issue: right after the zip file is uploaded to the S3 bucket, the object does not yet exist there, which is why the first deployment fails; a few seconds later the second run completes quickly and successfully. Is there something wrong with my configuration file?
The full Terraform configuration file can be found at: https://github.com/zhaoyi0113/quote-datalake/blob/master/config.tf
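For reference, the failing shape is roughly the following (a simplified sketch, not the actual file: resource names are taken from the error output above; the bucket name, handler, and runtime are placeholders). Because s3_bucket and s3_key are plain strings rather than references, Terraform sees no dependency between the upload and the function and may update the function before the zip exists:
resource "aws_s3_bucket_object" "deploy_package" {
  bucket = "quote-datalake-deploy" # placeholder bucket name
  key    = "quote-crawler.zip"
  source = "quote-crawler.zip"
}

resource "aws_lambda_function" "test_lambda" {
  function_name = "quote-crawler"
  s3_bucket     = "quote-datalake-deploy" # plain string, no reference
  s3_key        = "quote-crawler.zip"     # plain string, no reference
  handler       = "handler.handler"       # placeholder
  runtime       = "python3.7"
  role          = "${aws_iam_role.role.arn}"
}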
You need to add the dependencies correctly for this to work; otherwise the apply will fail.
First, zip the files:
# Zip the Lambda function on the fly
data "archive_file" "source" {
  type        = "zip"
  source_dir  = "../lambda-functions/loadbalancer-to-es"
  output_path = "../lambda-functions/loadbalancer-to-es.zip"
}
Then upload it to S3. By specifying source = "${data.archive_file.source.output_path}", the upload references the archive, which makes the object resource depend on the zip:
# Upload the zip to S3, then update the Lambda function from S3
resource "aws_s3_bucket_object" "file_upload" {
  bucket = "${aws_s3_bucket.bucket.id}"
  key    = "lambda-functions/loadbalancer-to-es.zip"
  source = "${data.archive_file.source.output_path}" # depends on the zip archive
}
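As an optional hardening (my own addition, not part of the original answer): the archive_file data source also exports output_md5, which you can wire into the object's etag so the zip is re-uploaded whenever its contents change, not just when its path does. The same file_upload resource, extended:
resource "aws_s3_bucket_object" "file_upload" {
  bucket = "${aws_s3_bucket.bucket.id}"
  key    = "lambda-functions/loadbalancer-to-es.zip"
  source = "${data.archive_file.source.output_path}"

  # Assumption: re-upload when the archive contents change.
  etag = "${data.archive_file.source.output_md5}"
}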
Then you are good to go to deploy the Lambda. The line s3_key = "${aws_s3_bucket_object.file_upload.key}" does the magic: it makes the function depend on the uploaded object:
resource "aws_lambda_function" "elb_logs_to_elasticsearch" {
function_name = "alb-logs-to-elk"
description = "elb-logs-to-elasticsearch"
s3_bucket = "${var.env_prefix_name}${var.s3_suffix}"
s3_key = "${aws_s3_bucket_object.file_upload.key}" # its mean its depended on upload key
memory_size = 1024
timeout = 900
timeouts {
create = "30m"
}
runtime = "nodejs8.10"
role = "${aws_iam_role.role.arn}"
source_code_hash = "${base64sha256(data.archive_file.source.output_path)}"
handler = "index.handler"
}
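If you prefer to keep s3_key as a plain string, an explicit depends_on gives Terraform the same ordering edge (a sketch in the same 0.11 syntax, not from the original answer; the implicit attribute reference above is generally preferable because it also feeds the plan diff):
resource "aws_lambda_function" "elb_logs_to_elasticsearch" {
  function_name    = "alb-logs-to-elk"
  s3_bucket        = "${var.env_prefix_name}${var.s3_suffix}"
  s3_key           = "lambda-functions/loadbalancer-to-es.zip" # plain string
  runtime          = "nodejs8.10"
  role             = "${aws_iam_role.role.arn}"
  handler          = "index.handler"
  source_code_hash = "${data.archive_file.source.output_base64sha256}"

  # Explicit ordering: wait for the upload even without an attribute reference.
  depends_on = ["aws_s3_bucket_object.file_upload"]
}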