Ruby Azure Blob Storage: "RequestBodyTooLarge"

JP.*_*JP. 1 ruby azure

If I try to upload a large file (388.7 MB in this case) to Azure Blob Storage using the demo code, it fails:

begin
  content = File.open("big_file.dat", "rb") { |file| file.read }
  blob = azure_blob_service.create_block_blob(container.name, "image-blob", content)
  puts blob.name
rescue StandardError => e
  $stderr.puts e.message
end

# RequestBodyTooLarge (413): The request body is too large and exceeds the maximum permissible limit.

I read in the Blob Storage documentation that blobs can be up to 200 GB, so it looks like the Ruby API is not chunking the file upload correctly. Am I missing something?

dmi*_*ael 5

The current Ruby Azure SDK does have methods for doing chunked uploads, but there are no usage examples anywhere, and everything in the specs is mocked, which doesn't really help.

Getting chunked uploads working was quite fiddly, and this really should be built into the library. It took me several hours to get right; I hope this snippet helps.

Here is a very basic usage example:

require 'azure'
require 'base64'
require 'digest/md5'

# Monkey-patch File to yield the file in fixed-size chunks (1 MiB by default)
class ::File
  def each_chunk(chunk_size = 2**20)
    yield read(chunk_size) until eof?
  end
end
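To sanity-check the `each_chunk` patch independently of Azure, here is a small self-contained sketch (the chunk sizes and test data are mine, chosen so the final chunk is a partial one) that chunks a temp file and verifies the pieces reassemble into the original:

```ruby
require 'tempfile'

class ::File
  def each_chunk(chunk_size = 2**20)
    yield read(chunk_size) until eof?
  end
end

# Write a bit more than two chunks' worth so we get an uneven final chunk.
chunk_size = 2**20
data = 'x' * (chunk_size * 2 + 1234)

chunks = []
Tempfile.create('chunk_demo') do |tmp|
  tmp.binmode
  tmp.write(data)
  tmp.rewind
  tmp.each_chunk(chunk_size) { |c| chunks << c }
end

puts chunks.map(&:bytesize).inspect  # => [1048576, 1048576, 1234]
puts chunks.join == data             # => true
```

The last chunk is simply shorter than `chunk_size`; `read` returns whatever remains, and `eof?` ends the loop.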

container  = 'your container name'
blob       = 'your blob name'
block_list = []
service    = Azure::BlobService.new
counter    = 1

open('path/to/file', 'rb') do |f|
  f.each_chunk {|chunk|
    block_id = counter.to_s.rjust(5, '0')
    block_list << [block_id, :uncommitted]

    # You will likely want to get the MD5 for retries
    options = {
      content_md5: Base64.strict_encode64(Digest::MD5.digest(chunk)),
      timeout:     300 # 5 minutes
    }

    # create_blob_block returns the MD5 of the block as stored by the service
    md5 = service.create_blob_block(container, blob, block_id, chunk, options)
    counter += 1
  }
end

service.commit_blob_blocks(container, blob, block_list)
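The comment above mentions keeping the MD5 for retries but does not show the retry itself. A minimal, library-agnostic sketch: the `with_retries` helper and its parameters are my own invention, not part of the Azure SDK.

```ruby
# Hypothetical helper (not part of the SDK): run a transient operation,
# retrying up to `attempts` times before giving up.
def with_retries(attempts: 3)
  tries = 0
  begin
    tries += 1
    yield
  rescue StandardError => e
    retry if tries < attempts
    raise e
  end
end

# Usage sketch against the SDK calls in the answer above:
#   with_retries(attempts: 3) do
#     returned_md5 = service.create_blob_block(container, blob, block_id, chunk, options)
#     # Compare the echoed MD5 with the locally computed content_md5;
#     # a mismatch suggests corruption in transit, so raising triggers a retry.
#     raise 'MD5 mismatch' unless returned_md5 == options[:content_md5]
#   end
```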

Give me a few days and I should have a more reasonable encapsulation committed to https://github.com/dmichael/azure-contrib