Asked by van*_*ish · tags: python, lambda, amazon-web-services, boto3
I'm trying to use a Lambda function with boto3 to copy files from one prefix to another within the same bucket, but I keep getting one of two errors:

An error occurred (AccessDenied) when calling the CopyObject operation.

or

An error occurred (403) when calling the HeadObject operation: Forbidden

depending on which copy method I use.

The Lambda function has a role assigned to it that I believe grants all the permissions it needs:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Action": [
                "s3:HeadObject",
                "s3:ListObjects"
            ],
            "Resource": [
                "arn:aws:s3:::bucket-name",
                "arn:aws:s3:::bucket-name/*"
            ],
            "Effect": "Allow"
        },
        {
            "Action": [
                "s3:GetObject",
                "s3:PutObject",
                "s3:DeleteObject"
            ],
            "Resource": [
                "arn:aws:s3:::bucket-name/folderA/folderB/*",
                "arn:aws:s3:::bucket-name/folderC/folderD/*",
                "arn:aws:s3:::bucket-name/folderE/folderF/*"
            ],
            "Effect": "Allow"
        }
    ]
}
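For reference, S3's IAM vocabulary has no standalone HeadObject or ListObjects actions: a HeadObject API call is authorized by s3:GetObject, and listing a bucket by s3:ListBucket. A sketch of the first statement using those action names (offered as a suggestion, not a confirmed fix for this error):

{
    "Action": [
        "s3:GetObject",
        "s3:ListBucket"
    ],
    "Resource": [
        "arn:aws:s3:::bucket-name",
        "arn:aws:s3:::bucket-name/*"
    ],
    "Effect": "Allow"
}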
The Lambda function is:
import urllib.parse

import boto3

# connect to S3
s3 = boto3.resource('s3')
dirs = {
"folderA/folderB": "folderC/folderD"
}
key = urllib.parse.unquote_plus(event['Records'][0]['s3']['object']['key'], encoding='utf-8')
etag = urllib.parse.unquote_plus(event['Records'][0]['s3']['object']['eTag'], encoding='utf-8')
bucket = event['Records'][0]['s3']['bucket']['name']
filePathName = key.split("/")
sourceDir = filePathName[0] + "/" + filePathName[1]
fileName = filePathName[2]
sourceKey = sourceDir + "/" + fileName
source = {'Bucket': bucket, 'Key': sourceKey}
destination = dirs[sourceDir] + "/" + fileName
##########
# Option 1: raises "An error occurred (AccessDenied) when calling the CopyObject operation."
##########
s3.Object(bucket, destination).copy_from(CopySource=source)

##########
# Option 2: raises "An error occurred (403) when calling the HeadObject operation: Forbidden"
##########
s3.meta.client.copy(source, bucket, destination)
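The key-to-destination mapping in the handler can be exercised locally without AWS credentials, which helps rule out the parsing logic as the cause. A minimal sketch (the helper name and sample file name are illustrative):

```python
import urllib.parse

# prefix mapping, as in the handler above
DIRS = {"folderA/folderB": "folderC/folderD"}

def destination_for(raw_key: str) -> str:
    """Map a URL-encoded S3 event key to its destination key."""
    # keys in S3 event records are URL-encoded; '+' stands for a space
    key = urllib.parse.unquote_plus(raw_key, encoding="utf-8")
    parts = key.split("/")
    source_dir = parts[0] + "/" + parts[1]
    file_name = parts[2]
    return DIRS[source_dir] + "/" + file_name

print(destination_for("folderA/folderB/report+2020.csv"))
# → folderC/folderD/report 2020.csv
```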
Edit: I forgot to mention that if I change the role to
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Action": [
                "s3:*"
            ],
            "Resource": [
                "arn:aws:s3:::bucket-name",
                "arn:aws:s3:::bucket-name/*"
            ],
            "Effect": "Allow"
        }
    ]
}
then the copy works.
I ran into a similar problem. The solution: the source in CopySource=source has to be the full path from the bucket root to the actual file, not a dictionary of bucket name and key. So I think your code might have to be:
s3.Object(bucket, destination).copy_from(CopySource=bucket + "/" + sourceKey)
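If the string form is used, it must contain a "/" separator and the full key down to the file, not just the prefix. A small sketch of building it (bucket name and key are placeholders; the actual copy call is shown commented out since it needs AWS credentials):

```python
bucket = "bucket-name"                   # placeholder bucket name
source_key = "folderA/folderB/file.txt"  # placeholder full source key

# string form of CopySource: "bucket/key" from the bucket root
copy_source = bucket + "/" + source_key

# the copy itself would then be:
# s3.Object(bucket, destination).copy_from(CopySource=copy_source)

print(copy_source)  # → bucket-name/folderA/folderB/file.txt
```

Note that boto3 also accepts the {'Bucket': ..., 'Key': ...} dictionary form for CopySource, so the string form is an alternative rather than a requirement.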