I want to get logs from a subscription filter, put the logs into an S3 bucket, and also send them to ES.
Something like the diagram here:
https://aws.amazon.com/solutions/implementations/centralized-logging/
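For context, this is roughly how the log group is wired to a Firehose delivery stream through a subscription filter (a boto3 sketch only; the delivery stream ARN and IAM role ARN below are placeholders, not my real values):

import boto3

logs = boto3.client("logs")

# Forward every event from the log group to the Firehose delivery stream.
logs.put_subscription_filter(
    logGroupName="log_group_name",
    filterName="subscription_filter_name",
    filterPattern="",  # an empty pattern matches all log events
    destinationArn="arn:aws:firehose:us-east-1:123456789012:deliverystream/example-stream",
    roleArn="arn:aws:iam::123456789012:role/CWLtoFirehoseRole",
)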
When I use this function:
/*
For processing data sent to Firehose by Cloudwatch Logs subscription filters.
Cloudwatch Logs sends to Firehose records that look like this:
{
    "messageType": "DATA_MESSAGE",
    "owner": "123456789012",
    "logGroup": "log_group_name",
    "logStream": "log_stream_name",
    "subscriptionFilters": [
        "subscription_filter_name"
    ],
    "logEvents": [
        {
            "id": "01234567890123456789012345678901234567890123456789012345",
            "timestamp": 1510109208016,
            "message": "log message 1"
        },
        {
            "id": "01234567890123456789012345678901234567890123456789012345",
            "timestamp": 1510109208017,
            "message": "log message 2"
        },
        ...
    ]
}
The data is additionally compressed with GZIP.
The code below will:
1) Gunzip …

amazon-web-services elasticsearch kibana aws-lambda amazon-kinesis-firehose
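For reference, a minimal sketch of the transformation Lambda those comments describe, in Python (my own illustration of the gunzip/parse/extract steps, not the actual blueprint code; transform_log_event is a name I made up):

import base64
import gzip
import json

def transform_log_event(log_event):
    # One JSON document per line, so downstream S3/ES consumers can split records.
    return json.dumps(log_event) + "\n"

def lambda_handler(event, context):
    output = []
    for record in event["records"]:
        # Firehose hands the payload base64-encoded; CloudWatch Logs gzips it.
        payload = gzip.decompress(base64.b64decode(record["data"]))
        data = json.loads(payload)

        if data["messageType"] != "DATA_MESSAGE":
            # Control messages carry no log events; drop them.
            output.append({"recordId": record["recordId"], "result": "Dropped"})
            continue

        # Concatenate the transformed log events and hand them back to Firehose.
        transformed = "".join(transform_log_event(e) for e in data["logEvents"])
        output.append({
            "recordId": record["recordId"],
            "result": "Ok",
            "data": base64.b64encode(transformed.encode("utf-8")).decode("utf-8"),
        })
    return {"records": output}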