I'd like to know whether anyone has used both AWS Redshift and Snowflake, and in which use cases one is the better choice. I have used Redshift, but Snowflake was recently suggested to me as a good alternative. My use case is essentially retail marketing data that will be used by a handful of analysts who are not very SQL-savvy and will most likely work through a reporting tool.
I am trying to copy multiple files from a source bucket to a destination bucket using AWS Lambda, but I am getting the error below. The bucket structure is as follows:
mysrcbucket/Input/daily/acctno_pin_xref/ABC_ACCTNO_PIN_XREF_FULL_20170926_0.csv.gz
mysrcbucket/Input/daily/acctno_pin_xref/ABC_ACCTNO_PIN_XREF_FULL_20170926_1.csv.gz

mydestbucket/Input/daily/acctno_pin_xref/ABC_ACCTNO_PIN_XREF_FULL_20170926_0.csv.gz
mydestbucket/Input/daily/acctno_pin_xref/ABC_ACCTNO_PIN_XREF_FULL_20170926_1.csv.gz
mydestbucket/Input/daily/acctno_pin_xref/ABC_ACCTNO_PIN_XREF_count_20170926.inf
I wrote the Lambda function below, but it fails with the following error. Could someone help me understand what I am doing wrong?
{ "errorMessage": "expected string or bytes-like object", "errorType": "TypeError", "stackTrace": [
[
"/var/task/lambda_function.py",
17,
"lambda_handler",
"s3.Object(dest_bucket,dest_key).copy_from(CopySource= { 'Bucket': obj.bucket_name , 'Key' : obj.key})"
],
[
"/var/runtime/boto3/resources/factory.py",
520,
"do_action",
"response = action(self, *args, **kwargs)"
],
[
"/var/runtime/boto3/resources/action.py",
83,
"__call__",
"response = getattr(parent.meta.client, operation_name)(**params)"
],
[
"/var/runtime/botocore/client.py",
312,
"_api_call",
"return self._make_api_call(operation_name, kwargs)"
],
[
"/var/runtime/botocore/client.py",
575,
"_make_api_call",
"api_params, operation_model, context=request_context)"
],
[
"/var/runtime/botocore/client.py",
627,
"_convert_to_request_dict",
"params=api_params, model=operation_model, context=context)" …