How to upload a file to a directory in an S3 bucket using boto

Dhe*_*dra 90 python amazon-s3 boto amazon-web-services

I want to copy a file to an s3 bucket using python.

For example: I have a bucket name = test. In the bucket, I have two folders named "dump" and "input". Now I want to copy a file from a local directory to the S3 "dump" folder using python... Can anyone help me?
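For background: S3 has no real directories. A "folder" such as "dump" is just a key prefix ending in "/", so "uploading into a folder" means building a key like "dump/report.csv". A minimal sketch of that (the helper name and file names here are just illustrative, not from any library):

```python
import os
import posixpath

def make_s3_key(folder, local_filename):
    # S3 keys always use '/', so join with posixpath rather than
    # os.path (which would emit backslashes on Windows).
    return posixpath.join(folder, os.path.basename(local_filename))

print(make_s3_key("dump", "/home/user/report.csv"))  # dump/report.csv
```

Whatever boto call you end up using, this key is what you pass as the object name.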

Fel*_*cia 95

Try this...

import boto
import boto.s3
import sys
from boto.s3.key import Key

AWS_ACCESS_KEY_ID = ''
AWS_SECRET_ACCESS_KEY = ''

bucket_name = AWS_ACCESS_KEY_ID.lower() + '-dump'
conn = boto.connect_s3(AWS_ACCESS_KEY_ID,
        AWS_SECRET_ACCESS_KEY)


bucket = conn.create_bucket(bucket_name,
    location=boto.s3.connection.Location.DEFAULT)

testfile = "replace this with an actual filename"
print('Uploading %s to Amazon S3 bucket %s' %
   (testfile, bucket_name))

def percent_cb(complete, total):
    sys.stdout.write('.')
    sys.stdout.flush()


k = Key(bucket)
k.key = 'my test file'
k.set_contents_from_filename(testfile,
    cb=percent_cb, num_cb=10)

[Update] I am not a pythonist, so thanks for the heads-up about the import statements. Also, I'd not recommend placing credentials inside your own source code. If you are running this inside AWS, use IAM credentials with instance profiles (http://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_use_switch-role-ec2_instance-profiles.html), and to keep the same behaviour in your dev/test environment, use Hologram from AdRoll (https://github.com/AdRoll/hologram).

  • I would avoid the multiple import lines, not pythonic. Move the import lines to the top, and for boto, you can use from boto.s3.connection import S3Connection; conn = S3Connection(AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY); bucket = conn.create_bucket(bucketname...); bucket.new_key(keyname,...).set_contents_from_filename.... (8 upvotes)
  • boto.s3.key.Key doesn't exist on 1.7.12 (2 upvotes)
  • To upload a file to an existing bucket, instead of creating a new one, replace this line: bucket = conn.create_bucket(bucket_name, location=boto.s3.connection.Location.DEFAULT) with this code: bucket = conn.get_bucket(bucket_name) (2 upvotes)
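On a side note, the percent_cb above only prints dots; reporting an actual percentage is just as easy. A small sketch (progress_str is a hypothetical helper, but the (complete, total) signature is the one boto passes to cb):

```python
import sys

def progress_str(complete, total):
    # Hypothetical helper: bytes transferred / total bytes as a percentage.
    pct = 100.0 * complete / total if total else 100.0
    return '%.0f%%' % pct

def percent_cb(complete, total):
    # Same (complete, total) signature that boto's cb= callback receives.
    sys.stdout.write('\r' + progress_str(complete, total))
    sys.stdout.flush()

print(progress_str(512, 1024))  # 50%
```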

vca*_*rel 44

No need to make it that complicated:

import boto
import boto.s3.key

s3_connection = boto.connect_s3()
bucket = s3_connection.get_bucket('your bucket name')
key = boto.s3.key.Key(bucket, 'some_file.zip')
with open('some_file.zip', 'rb') as f:  # binary mode for a zip archive
    key.send_file(f)

  • `key.set_contents_from_filename('some_file.zip')` would also work here. See the [doc](http://boto.cloudhackers.com/en/latest/ref/s3.html#boto.s3.key.Key.set_contents_from_filename). The corresponding code for boto3 can be found [here](http://boto3.readthedocs.io/en/latest/guide/s3-example-creating-buckets.html#upload-a-file-to-an-amazon-s3-bucket). (3 upvotes)
  • Yes.. less complicated and a commonly used practice (2 upvotes)

小智 35

I have used this and it is very simple to implement

import tinys3

conn = tinys3.Connection('S3_ACCESS_KEY','S3_SECRET_KEY',tls=True)

f = open('some_file.zip','rb')
conn.upload('some_file.zip',f,'my_bucket')

https://www.smore.com/labs/tinys3/

  • Since the tinys3 project has been abandoned, you should not use it. https://github.com/smore-inc/tinys3/issues/45 (6 upvotes)

Bor*_*ris 32

import boto3

s3 = boto3.resource('s3')
BUCKET = "test"

s3.Bucket(BUCKET).upload_file("your/local/file", "dump/file")

  • Can you explain this line s3.Bucket(BUCKET).upload_file("your/local/file", "dump/file") (2 upvotes)
  • @venkat "your/local/file" is a filepath such as "/home/file.txt" on the computer using python/boto, and "dump/file" is the key name used to store the file in the S3 bucket. See: http://boto3.readthedocs.io/en/latest/reference/services/s3.html#S3.Bucket.upload_file (2 upvotes)

Rom*_*rac 23

Upload file to s3 within a session with credentials.

import boto3

session = boto3.Session(
    aws_access_key_id='AWS_ACCESS_KEY_ID',
    aws_secret_access_key='AWS_SECRET_ACCESS_KEY',
)
s3 = session.resource('s3')
# Filename - File to upload
# Bucket - Bucket to upload to (the top level directory under AWS S3)
# Key - S3 object name (can contain subdirectories). If not specified then file_name is used
s3.meta.client.upload_file(Filename='input_file_path', Bucket='bucket_name', Key='s3_output_key')

  • It is the name of the file in the S3 bucket. (4 upvotes)

Man*_*hra 13

from boto3.s3.transfer import S3Transfer
import boto3
#have all the variables populated which are required below
client = boto3.client('s3', aws_access_key_id=access_key,aws_secret_access_key=secret_key)
transfer = S3Transfer(client)
transfer.upload_file(filepath, bucket_name, folder_name+"/"+filename)

  • @ManishMehra The answer would be better if you edited it to clear up colintobing's point of confusion; it's non-obvious which parameters refer to local paths and which to S3 paths without checking the docs or reading the comments. (Once done, you can flag to have all the comments here purged, since they'll be obsolete.) (3 upvotes)

Sam*_*Nde 13

This is a three liner. Just follow the instructions in the boto3 documentation.

import boto3
s3 = boto3.resource(service_name = 's3')
s3.meta.client.upload_file(Filename = 'C:/foo/bar/baz.filetype', Bucket = 'yourbucketname', Key = 'baz.filetype')

Some important arguments are:

Parameters:

  • Filename (str) -- The path to the file to upload.
  • Bucket (str) -- The name of the bucket to upload to.
  • Key (str) -- The name that you want to assign to your file in the s3 bucket. This could be the same as the name of the file or a different name of your choice, but the filetype should remain the same.

    Note: I assume that you have saved your credentials in a ~\.aws folder as suggested in the best configuration practices in the boto3 documentation.
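That ~/.aws/credentials file is a plain INI file, so you can inspect what boto3 will pick up using nothing but the standard library. A sketch, assuming the usual [default] profile layout (the key values below are made-up placeholders):

```python
import configparser

# A credentials file in the standard layout (normally ~/.aws/credentials).
sample = """\
[default]
aws_access_key_id = AKIAEXAMPLE
aws_secret_access_key = wJalrXUtnFEMI/EXAMPLEKEY
"""

cp = configparser.ConfigParser()
cp.read_string(sample)
print(cp["default"]["aws_access_key_id"])  # AKIAEXAMPLE
```

With the file in place, boto3.client('s3') needs no explicit keys at all.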


    Piy*_*are 12

    This also works:

    import os 
    import boto
    import boto.s3.connection
    from boto.s3.key import Key
    
    try:
    
        conn = boto.s3.connect_to_region('us-east-1',
        aws_access_key_id = 'AWS-Access-Key',
        aws_secret_access_key = 'AWS-Secrete-Key',
        # host = 's3-website-us-east-1.amazonaws.com',
        # is_secure=True,               # uncomment if you are not using ssl
        calling_format = boto.s3.connection.OrdinaryCallingFormat(),
        )
    
        bucket = conn.get_bucket('YourBucketName')
        key_name = 'FileToUpload'
        path = 'images/holiday' #Directory Under which file should get upload
        full_key_name = os.path.join(path, key_name)
        k = bucket.new_key(full_key_name)
        k.set_contents_from_filename(key_name)
    
    except Exception as e:
        print(str(e))
        print("error")
    
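One caveat with the os.path.join(path, key_name) line above: os.path.join uses the local OS separator, so on Windows it produces a backslash, which S3 treats as a literal character in the key rather than a folder delimiter. posixpath.join (or a plain '/'.join) keeps the key portable; a quick illustration using the stdlib's per-platform path modules:

```python
import ntpath
import posixpath

path, key_name = 'images/holiday', 'photo.jpg'

# What the answer's os.path.join would build on Windows:
print(ntpath.join(path, key_name))     # images/holiday\photo.jpg
# What S3 expects as a "folder" key:
print(posixpath.join(path, key_name))  # images/holiday/photo.jpg
```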


    Sha*_*kti 5

    import boto
    from boto.s3.key import Key
    
    AWS_ACCESS_KEY_ID = ''
    AWS_SECRET_ACCESS_KEY = ''
    END_POINT = ''                          # eg. us-east-1
    S3_HOST = ''                            # eg. s3.us-east-1.amazonaws.com
    BUCKET_NAME = 'test'        
    FILENAME = 'upload.txt'                
    UPLOADED_FILENAME = 'dumps/upload.txt'
    # include folders in file path. If it doesn't exist, it will be created
    
    s3 = boto.s3.connect_to_region(END_POINT,
                               aws_access_key_id=AWS_ACCESS_KEY_ID,
                               aws_secret_access_key=AWS_SECRET_ACCESS_KEY,
                               host=S3_HOST)
    
    bucket = s3.get_bucket(BUCKET_NAME)
    k = Key(bucket)
    k.key = UPLOADED_FILENAME
    k.set_contents_from_filename(FILENAME)
    


    Nou*_*pra 5

    Using boto3

    import logging
    import boto3
    from botocore.exceptions import ClientError
    
    
    def upload_file(file_name, bucket, object_name=None):
        """Upload a file to an S3 bucket
    
        :param file_name: File to upload
        :param bucket: Bucket to upload to
        :param object_name: S3 object name. If not specified then file_name is used
        :return: True if file was uploaded, else False
        """
    
        # If S3 object_name was not specified, use file_name
        if object_name is None:
            object_name = file_name
    
        # Upload the file
        s3_client = boto3.client('s3')
        try:
            response = s3_client.upload_file(file_name, bucket, object_name)
        except ClientError as e:
            logging.error(e)
            return False
        return True
    

    More info: https://boto3.amazonaws.com/v1/documentation/api/latest/guide/s3-uploading-files.html
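    One thing worth noticing in upload_file above: when object_name is omitted, it falls back to file_name unchanged, so a local path like '/tmp/data/report.csv' becomes the full key '/tmp/data/report.csv' in the bucket, leading-slash directories included. If you only want the base name as the key, pass it explicitly. A sketch of just that fallback logic (pure path handling, no AWS call; resolve_object_name is a made-up name mirroring the function above):

```python
import os

def resolve_object_name(file_name, object_name=None):
    # Mirrors the fallback in upload_file above: the whole local
    # path becomes the S3 key when no object_name is given.
    return object_name if object_name is not None else file_name

print(resolve_object_name('/tmp/data/report.csv'))
# /tmp/data/report.csv  (probably not the key you wanted)
print(resolve_object_name('/tmp/data/report.csv',
                          os.path.basename('/tmp/data/report.csv')))
# report.csv
```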