
AWS CloudWatch Logs to Azure Log Analytics

I am aware of the HTTP Data Collector API that can be used to send data into Azure Log Analytics; my question here is specifically about getting AWS CloudWatch data into Azure. We have an Azure-hosted application plus 13 external serverless AWS Lambda functions, and we want to import the logs of those functions into Azure. I know from the documentation that there is a Python function that can be used as an AWS Lambda function, and the Python example is in the MSFT documentation. What I am failing to understand is what JSON format the AWS log collector needs to create so that it can be sent to Azure Log Analytics. Are there any examples of this? Any help on how this can be done would be appreciated. I have also come across this blog, but it is Splunk-specific: https://www.splunk.com/blog/2017/02/03/how-to-easily-stream-aws-cloudwatch-logs-to-splunk.html

Never mind; I was able to dig a little deeper and found that in AWS I can stream the logs from one Lambda function to another Lambda function through a subscription. Once that was set up, all I did was consume that stream, build the JSON on the fly, and send it to Azure Log Analytics. In case you or anyone else is interested, here is the code:
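For reference, the event that the handler below receives from the CloudWatch Logs subscription arrives base64-encoded and gzip-compressed under event["awslogs"]["data"]. Once decoded, the payload looks roughly like the following sketch (all values here are made-up placeholders); the logEvents list is what you would reshape if you want to send individual records rather than the whole blob:

{
    "messageType": "DATA_MESSAGE",
    "owner": "123456789012",
    "logGroup": "/aws/lambda/my-serverless-function",
    "logStream": "2020/01/01/[$LATEST]abcdef1234567890",
    "subscriptionFilters": ["to-azure-log-analytics"],
    "logEvents": [
        {
            "id": "eventId1",
            "timestamp": 1578923440000,
            "message": "START RequestId: 00000000-0000-0000-0000-000000000000 Version: $LATEST"
        }
    ]
}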

import json
import hashlib
import hmac
import base64
import boto3
import gzip

from botocore.vendored import requests
from datetime import datetime

# Update the customer ID to your Log Analytics workspace ID
customer_id = "XXXXXXXYYYYYYYYYYYYZZZZZZZZZZ"

# For the shared key, use either the primary or the secondary Connected Sources client authentication key
shared_key = "XXXXXXXXXXXXXXXXXXXXXXXXXX"

# The log type is the name of the event that is being submitted
log_type = 'AWSLambdafuncLogReal'

# Sample payload (a JSON array of records) for the Data Collector API;
# kept for reference only, it is not used by the Lambda handler below
json_data = [{
"slot_ID": 12345,
"ID": "5cdad72f-c848-4df0-8aaa-ffe033e75d57",
"availability_Value": 100,
"performance_Value": 6.954,
"measurement_Name": "last_one_hour",
"duration": 3600,
"warning_Threshold": 0,
"critical_Threshold": 0,
"IsActive": "true"
},
{
"slot_ID": 67890,
"ID": "b6bee458-fb65-492e-996d-61c4d7fbb942",
"availability_Value": 100,
"performance_Value": 3.379,
"measurement_Name": "last_one_hour",
"duration": 3600,
"warning_Threshold": 0,
"critical_Threshold": 0,
"IsActive": "false"
}]
#body = json.dumps(json_data)
#####################
######Functions######
#####################

# Build the API signature
def build_signature(customer_id, shared_key, date, content_length, method, content_type, resource):
    x_headers = 'x-ms-date:' + date
    string_to_hash = method + "\n" + str(content_length) + "\n" + content_type + "\n" + x_headers + "\n" + resource
    bytes_to_hash = bytes(string_to_hash, encoding="utf-8")
    decoded_key = base64.b64decode(shared_key)
    encoded_hash = base64.b64encode(
        hmac.new(decoded_key, bytes_to_hash, digestmod=hashlib.sha256).digest()).decode()
    authorization = "SharedKey {}:{}".format(customer_id, encoded_hash)
    return authorization

# Build and send a request to the POST API
def post_data(customer_id, shared_key, body, log_type):
    method = 'POST'
    content_type = 'application/json'
    resource = '/api/logs'
    rfc1123date = datetime.utcnow().strftime('%a, %d %b %Y %H:%M:%S GMT')
    print(rfc1123date)
    content_length = len(body)
    signature = build_signature(customer_id, shared_key, rfc1123date, content_length, method, content_type, resource)
    uri = 'https://' + customer_id + '.ods.opinsights.azure.com' + resource + '?api-version=2016-04-01'

    headers = {
        'content-type': content_type,
        'Authorization': signature,
        'Log-Type': log_type,
        'x-ms-date': rfc1123date
    }
    response = requests.post(uri, data=body, headers=headers)
    if (response.status_code >= 200 and response.status_code <= 299):
        print("Accepted")
    else:
        print("Response code: {}".format(response.status_code))
        print(response.text)

def lambda_handler(event, context):
    # CloudWatch Logs delivers the subscription payload base64-encoded and gzip-compressed
    cloudwatch_event = event["awslogs"]["data"]
    decode_base64 = base64.b64decode(cloudwatch_event)
    decompress_data = gzip.decompress(decode_base64)
    log_data = json.loads(decompress_data)
    print(log_data)
    # Re-serialize the decoded payload and post it to Azure Log Analytics
    awslogdata = json.dumps(log_data)
    post_data(customer_id, shared_key, awslogdata, log_type)
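
One more note: each source log group still needs a subscription filter that targets the forwarder Lambda above, and CloudWatch Logs must be allowed to invoke it. Below is a minimal, hypothetical sketch of that wiring using boto3; the function ARN, account ID, region, and log group name are placeholders to replace with your own. Once events flow, the records show up in Log Analytics under a custom log table named after the Log-Type header, e.g. AWSLambdafuncLogReal_CL.

import boto3

logs_client = boto3.client('logs')
lambda_client = boto3.client('lambda')

# Hypothetical placeholders; replace with your own ARNs and names
forwarder_arn = 'arn:aws:lambda:us-east-1:123456789012:function:cloudwatch-to-azure'
source_log_group = '/aws/lambda/my-serverless-function'

# Allow CloudWatch Logs to invoke the forwarder Lambda
lambda_client.add_permission(
    FunctionName=forwarder_arn,
    StatementId='cloudwatch-logs-invoke',
    Action='lambda:InvokeFunction',
    Principal='logs.amazonaws.com',
    SourceArn='arn:aws:logs:us-east-1:123456789012:log-group:' + source_log_group + ':*'
)

# Stream every event (empty filter pattern) from the source log group to the forwarder
logs_client.put_subscription_filter(
    logGroupName=source_log_group,
    filterName='to-azure-log-analytics',
    filterPattern='',
    destinationArn=forwarder_arn
)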
