Transfer a file from AWS S3 to SFTP using Boto 3

fdr*_*igo 6 python sftp amazon-s3 amazon-web-services boto3

I'm a beginner with Boto 3, and I would like to transfer a file from an S3 bucket directly to an SFTP server.

My end goal is to write a Python script for AWS Glue.

I found some articles that show how to transfer a file from SFTP to an S3 bucket: https://medium.com/better-programming/transfer-file-from-ftp-server-to-a-s3-bucket-using-python-7f9e51f44e35

Unfortunately, I can't find anything that does the opposite. Do you have any suggestions/ideas?


My first (wrong) attempt is below.

But I would like to avoid downloading the file into local memory just to move it to the SFTP server.

import pysftp
import boto3

# get clients
s3_gl = boto3.client('s3', aws_access_key_id='', aws_secret_access_key='')

# parameters
bucket_gl = ''
gl_data = ''
gl_script = ''

# This pulls the whole object into memory and consumes the streaming body
source_response = s3_gl.get_object(Bucket=bucket_gl, Key=gl_script + 'file.csv')
print(source_response['Body'].read().decode('utf-8'))

#---------------------------------

srv = pysftp.Connection(host="", username="", password="")

with srv.cd('relevant folder in sftp'):
    # Does not work: pysftp's put() expects a local file path, and the
    # body stream was already consumed by the read() above
    srv.put(source_response['Body'].read().decode('utf-8'))

# Closes the connection
srv.close()

Mar*_*ryl 6

"transfer ... directly" can mean number of different things.

Let's assume that you want to transfer the file via the local machine (where the Python code runs), without actually storing a temporary copy of the file on the local file system.


For the SFTP upload, you can use the Paramiko library.

Assuming you already have your Paramiko SFTPClient (sftp) and Boto 3 client (s3) instances ready (which is covered in the article you have linked in your question), you can simply "glue" them together using file-like objects:

with sftp.open('/sftp/path/filename', 'wb', 32768) as f:
    s3.download_fileobj('mybucket', 'mykey', f)
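For completeness, a minimal self-contained sketch of wiring the two clients together could look like the following. The host name, credentials, bucket name and key are placeholder assumptions, not values from your environment:

import boto3
import paramiko

# Boto 3 S3 client (credentials resolved from the environment / instance role)
s3 = boto3.client('s3')

# Paramiko SSH transport and SFTP client (placeholder host and credentials)
transport = paramiko.Transport(('sftp.example.com', 22))
transport.connect(username='user', password='password')
sftp = paramiko.SFTPClient.from_transport(transport)

try:
    # Stream the S3 object straight into the remote file,
    # without writing a temporary copy to the local disk
    with sftp.open('/sftp/path/filename', 'wb', 32768) as f:
        s3.download_fileobj('mybucket', 'mykey', f)
finally:
    sftp.close()
    transport.close()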

For the purpose of the 32768 argument, see Writing to a file on SFTP server opened using pysftp "open" method is slow.