python amazon-s3 amazon-web-services aws-lambda keras
I am trying to deploy a neural-network model that I trained locally on my machine with Keras. Locally I use the model like this:
from keras.models import load_model
model = load_model("/path/to/model/model.h5")
prediction = model.predict(x)
Now I need to use the same model in my Lambda function. I uploaded the model to an S3 bucket and then tried to open the file directly:
model = load_model("https://s3-eu-west-1.amazonaws.com/my-bucket/models/model.h5")
But it tells me the file does not exist; I assume this is a permissions issue. I also tried the same way I read JSON files from S3:
import boto3

client_s3 = boto3.client("s3")
result = client_s3.get_object(Bucket="my-bucket", Key='models/model.h5')
model = load_model(result["Body"].read())
But I get this error:
"stackTrace": [
[
"/var/task/lambda_function.py",
322,
"lambda_handler",
"model = load_model(result[\"Body\"].read())"
],
[
"/var/task/keras/models.py",
227,
"load_model",
"with h5py.File(filepath, mode='r') as f:"
],
[
"/var/task/h5py/_hl/files.py",
269,
"__init__",
"fid = make_fid(name, mode, userblock_size, fapl, swmr=swmr)"
],
[
"/var/task/h5py/_hl/files.py",
99,
"make_fid",
"fid = h5f.open(name, flags, fapl=fapl)"
],
[
"h5py/_objects.pyx",
54,
"h5py._objects.with_phil.wrapper",
null
],
[
"h5py/_objects.pyx",
55,
"h5py._objects.with_phil.wrapper",
null
],
[
"h5py/h5f.pyx",
78,
"h5py.h5f.open",
null
],
[
"h5py/defs.pyx",
621,
"h5py.defs.H5Fopen",
null
],
[
"h5py/_errors.pyx",
123,
"h5py._errors.set_exception",
null
]
],
"errorType": "UnicodeDecodeError",
"errorMessage": "'utf8' codec can't decode byte 0x89 in position 29: invalid start byte"
}
I suspect that result["Body"].read() cannot be fed to load_model directly: it returns the raw HDF5 bytes, while load_model expects a file path (it passes the argument straight to h5py.File), so the binary content ends up being handled as text, hence the UTF-8 decode error. What is the best way to load an h5py/Keras model from S3?
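As an aside, h5py 2.9 and later can open Python file-like objects, so in principle the bytes returned by get_object could be wrapped in an io.BytesIO and handed to load_model without touching disk. This is only an untested sketch and depends on the h5py/Keras versions packaged with the Lambda function:

import io
import boto3
from keras.models import load_model

client_s3 = boto3.client("s3")
result = client_s3.get_object(Bucket="my-bucket", Key="models/model.h5")

# h5py >= 2.9 accepts file-like objects, so the raw bytes can be wrapped
# in an in-memory buffer instead of being handed over as if they were a path.
model = load_model(io.BytesIO(result["Body"].read()))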
Solution: download the file to the /tmp/ folder first and load it from there:
client_s3.download_file("my-bucket", 'model.h5', "/tmp/model.h5")
model = load_model("/tmp/model.h5")
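Putting it together, a minimal Lambda handler sketch based on this approach could look like the following; the bucket name, object key, and event payload are placeholders, and the model is cached in a module-level variable so warm invocations skip the download:

import boto3
import numpy as np
from keras.models import load_model

s3 = boto3.client("s3")
model = None  # cached between warm invocations so the download and load run only once

def lambda_handler(event, context):
    global model
    if model is None:
        # /tmp is the only writable directory inside the Lambda environment
        s3.download_file("my-bucket", "model.h5", "/tmp/model.h5")
        model = load_model("/tmp/model.h5")
    x = np.array(event["x"])  # hypothetical input; adapt to the real payload
    prediction = model.predict(x)
    return {"prediction": prediction.tolist()}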