
CloudWatch logs stream to Lambda python

I have created a subscription filter in a CloudWatch log group and set it to stream to my Lambda function, but I am getting an error in the function.

Code:

import boto3
import binascii
import json
import base64
import zlib

def stream_gzip_decompress(stream):
    dec = zlib.decompressobj(32 + zlib.MAX_WBITS)  # offset 32 to skip the header
    foo=''
    for chunk in stream:
        rv = dec.decompress(chunk)
        if rv:
            foo += rv
    return foo

def lambda_handler(event, context):
    # Decode and decompress the AWS Log stream to extract json object
    stream=json.dumps(event['awslogs']['data'])
    f = base64.b64decode(stream)
    payload=json.loads(stream_gzip_decompress(f.decode(f)))
    print(payload)

Error:

Response:

{
  "errorMessage": "decode() argument 1 must be str, not bytes",
  "errorType": "TypeError",
  "stackTrace": [
    [
      "/var/task/lambda_function.py",
      34,
      "lambda_handler",
      "payload=json.loads(stream_gzip_decompress(f.decode(f)))"
    ]
  ]
}

Any help or clue would be greatly appreciated! If you have an alternative solution, please suggest it. My requirement is to handle logs from CloudWatch using Lambda.

Thanks in advance!

In case anyone else is looking for help with this topic.

I took a slightly different approach, but I did see an 'awslogs' key in the event.

Here is a sample that I was successful with: a Python 3.6 Lambda, with a CloudWatch trigger set up to call it.

import gzip
import json
import base64


def lambda_handler(event, context):
    print(f'Logging Event: {event}')
    print(f"Awslog: {event['awslogs']}")
    cw_data = event['awslogs']['data']
    print(f'data: {cw_data}')
    print(f'type: {type(cw_data)}')
    compressed_payload = base64.b64decode(cw_data)
    uncompressed_payload = gzip.decompress(compressed_payload)
    payload = json.loads(uncompressed_payload)

    log_events = payload['logEvents']
    for log_event in log_events:
        print(f'LogEvent: {log_event}')
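To sanity-check this decoding locally without deploying anything, you can build a synthetic event by reversing the pipeline: gzip-compress a JSON document shaped like a CloudWatch Logs subscription payload, base64-encode it, then run it through the same decode steps. The log group, stream name, and messages below are made-up placeholders.

```python
import gzip
import json
import base64

# Fake CloudWatch Logs payload; field names follow the subscription-filter
# format, but all values here are placeholders for local testing.
payload_in = {
    "messageType": "DATA_MESSAGE",
    "logGroup": "/aws/lambda/example",
    "logStream": "2020/01/01/[$LATEST]abcdef",
    "logEvents": [
        {"id": "1", "timestamp": 1577836800000, "message": "hello"},
        {"id": "2", "timestamp": 1577836801000, "message": "world"},
    ],
}

# Encode the way CloudWatch does: JSON -> gzip -> base64.
data = base64.b64encode(gzip.compress(json.dumps(payload_in).encode("utf-8")))
event = {"awslogs": {"data": data.decode("ascii")}}

# Decode the way the handler above does: base64 -> gunzip -> JSON.
compressed_payload = base64.b64decode(event["awslogs"]["data"])
payload_out = json.loads(gzip.decompress(compressed_payload))

for log_event in payload_out["logEvents"]:
    print(log_event["message"])
```

Passing `event` into the handler above should then print each log event, exactly as it would when triggered by a real subscription filter.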

Below is the outline I normally follow when processing CloudWatch Logs being sent to AWS Lambda.

import gzip
import json
from StringIO import StringIO

def lambda_handler(event, context):
    cw_data = str(event['awslogs']['data'])
    cw_logs = gzip.GzipFile(fileobj=StringIO(cw_data.decode('base64', 'strict'))).read()
    log_events = json.loads(cw_logs)
    for log_event in log_events['logEvents']:
        pass  # Process Logs

I see that you are treating the data sent to the AWS Lambda as a JSON object. You first want to base64-decode and then unzip the data. After decoding and decompressing, you should have the JSON object with the log information.

Here is quasar's answer converted to Python 3.

import gzip
import json
import base64
from io import BytesIO

def lambda_handler(event, context):
    cw_data = str(event['awslogs']['data'])
    cw_logs = gzip.GzipFile(fileobj=BytesIO(base64.b64decode(cw_data, validate=True))).read()
    log_events = json.loads(cw_logs)
    for log_event in log_events['logEvents']:
        pass  # Process Logs

The main change is using io.BytesIO and a different base64 decode function to get to the log event data.
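For what it's worth, the `validate=True` flag changes how stray characters are handled: by default `base64.b64decode` silently discards characters outside the base64 alphabet, while `validate=True` raises `binascii.Error` instead. A small sketch of the difference:

```python
import base64
import binascii

# Default (lenient) mode silently discards non-alphabet characters,
# such as the embedded space below.
lenient = base64.b64decode("aGVs bG8=")
print(lenient)  # b'hello'

# validate=True rejects the same input with binascii.Error instead.
try:
    base64.b64decode("aGVs bG8=", validate=True)
    strict_raised = False
except binascii.Error:
    strict_raised = True
```

Strict mode is a reasonable default here, since a corrupted `awslogs` payload is better surfaced as a decode error than as a gzip failure further down.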
