
Azure log analytics data collector rest API connection timeout error from Databricks

I'm trying to send some custom logs to Log Analytics from a Databricks notebook using the Microsoft tutorial, but I'm getting a REST API connection timeout error.

ConnectionError: HTTPSConnectionPool(host='XXXXXXX-XXXX-XXXX-XXXX-XXXXXXXX.ods.opinsights.azure.com', port=443): Max retries exceeded with url: /api/logs?api-version=2016-04-01 (Caused by NewConnectionError('<urllib3.connection.VerifiedHTTPSConnection object at 0x7fbed9108310>: Failed to establish a new connection: [Errno 110] Connection timed out'))

Any suggestions? How can I allow Azure Databricks to access the Log Analytics workspace?

Please find below a sample Python script showing how to submit custom log data with the Azure Monitor HTTP Data Collector API.

import json
import requests
import datetime
import hashlib
import hmac
import base64

#Retrieve your Log Analytics Workspace ID from your Key Vault Databricks Secret Scope
wks_id = dbutils.secrets.get(scope = "keyvault_scope", key = "wks-id-logaw1")

#Retrieve your Log Analytics Primary Key from your Key Vault Databricks Secret Scope
wks_shared_key = dbutils.secrets.get(scope = "keyvault_scope", key = "wks-shared-key-logaw1")

#The log type is the name of the event that is being submitted
log_type = 'WebMonitorTest'

#An example JSON web monitor object
json_data = [{
  "slot_ID": 12345,
  "ID": "5cdad72f-c848-4df0-8aaa-ffe033e75d57",
  "availability_Value": 100,
  "performance_Value": 6.954,
  "measurement_Name": "last_one_hour",
  "duration": 3600,
  "warning_Threshold": 0,
  "critical_Threshold": 0,
  "IsActive": "true"
},
{
  "slot_ID": 67890,
  "ID": "b6bee458-fb65-492e-996d-61c4d7fbb942",
  "availability_Value": 100,
  "performance_Value": 3.379,
  "measurement_Name": "last_one_hour",
  "duration": 3600,
  "warning_Threshold": 0,
  "critical_Threshold": 0,
  "IsActive": "false"
}]
body = json.dumps(json_data)

#####################
######Functions######
#####################

#Build the API signature
def build_signature(customer_id, shared_key, date, content_length, method, content_type, resource):
  x_headers = 'x-ms-date:' + date
  string_to_hash = method + "\n" + str(content_length) + "\n" + content_type + "\n" + x_headers + "\n" + resource
  bytes_to_hash = str.encode(string_to_hash,'utf-8')  
  decoded_key = base64.b64decode(shared_key)
  encoded_hash = (base64.b64encode(hmac.new(decoded_key, bytes_to_hash, digestmod=hashlib.sha256).digest())).decode()
  authorization = "SharedKey {}:{}".format(customer_id,encoded_hash)
  return authorization

#Build and send a request to the POST API
def post_data(customer_id, shared_key, body, log_type):
  method = 'POST'
  content_type = 'application/json'
  resource = '/api/logs'
  rfc1123date = datetime.datetime.utcnow().strftime('%a, %d %b %Y %H:%M:%S GMT')
  content_length = len(body)
  signature = build_signature(customer_id, shared_key, rfc1123date, content_length, method, content_type, resource)
  uri = 'https://' + customer_id + '.ods.opinsights.azure.com' + resource + '?api-version=2016-04-01'

  headers = {
      'content-type': content_type,
      'Authorization': signature,
      'Log-Type': log_type,
      'x-ms-date': rfc1123date
  }

  response = requests.post(uri, data=body, headers=headers)
  if 200 <= response.status_code <= 299:
      print('Accepted')
  else:
      print("Response code: {}".format(response.status_code))
      
#Post the log
post_data(wks_id, wks_shared_key, body, log_type)
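Separately, since the failure mode in the question is a long hang followed by `Connection timed out`, it may help to pass an explicit `timeout` to `requests.post` so an unreachable endpoint fails fast instead of waiting on the OS default. A minimal sketch (the wrapper name is mine, not part of the tutorial):

```python
import requests

def post_with_timeout(uri, body, headers, timeout=10):
    """POST with an explicit timeout; return the status code, or None if unreachable."""
    try:
        # timeout covers both establishing the connection and waiting for a response
        response = requests.post(uri, data=body, headers=headers, timeout=timeout)
        return response.status_code
    except (requests.exceptions.ConnectionError, requests.exceptions.Timeout):
        return None  # endpoint unreachable or too slow -- likely a network/firewall issue
```

A `None` return means the request never reached the service, which points at networking rather than the signature or payload.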

This article helps you write custom logs to Log Analytics through Databricks on Azure.
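Note that the error reported above is a network-level timeout rather than an authentication failure, which usually means the cluster cannot open an outbound connection to `*.ods.opinsights.azure.com` on port 443 (for example, blocked by an NSG or firewall rule). A simple, hypothetical connectivity probe you could run from a notebook cell to confirm:

```python
import socket

def can_reach(host, port=443, timeout=5):
    """Return True if a TCP connection to host:port succeeds within timeout seconds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Replace with your real workspace ingestion host before running:
# print(can_reach("XXXXXXX.ods.opinsights.azure.com"))
```

If this returns `False` from the cluster but `True` from your own machine, the fix lies in the networking configuration (firewall/NSG rules or a private endpoint), not in the script.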
