
How to structure container logs in Vertex AI?

I have a model deployed in Vertex AI. From the logs it seems that Vertex AI has ingested the log into a message field inside the jsonPayload field, but I would like to structure the jsonPayload field so that every key in message becomes a field within jsonPayload, i.e. flatten/extract message.
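
For illustration, roughly the shape the logs have now versus the shape I would like (the field names are made-up examples):

# Current: the container output is nested under jsonPayload.message
current_payload = {
    "message": {
        "user_id": "123",
        "prediction": 0.87,
        "latency_ms": 42,
    }
}

# Desired: every key of message promoted to a top-level field of jsonPayload
desired_payload = {
    "user_id": "123",
    "prediction": 0.87,
    "latency_ms": 42,
}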

The logs in Stackdriver follow a defined LogEntry schema. Cloud Logging uses structured logs, where log entries use the jsonPayload field to add structure to their payload.

For Vertex AI, the parameters are passed inside the message field that we see in the logs. These log structures are predefined. However, if you want to extract the fields that are present inside the message block, you can refer to the workarounds mentioned below:

1. Create a sink:

  • You can export your logs to a Cloud Storage bucket, BigQuery, Pub/Sub, etc.
  • If you use BigQuery as the sink, then in BigQuery you can use the JSON functions to extract the required data (see the first sketch after this list).

2. Download the logs and write your custom code:

  • You can download the log files and then write your custom logic to extract data as per your requirements.
  • You can refer to the client library (Python) to write the custom logic and use Python's JSON functions (see the second sketch after this list).
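
A minimal sketch of the BigQuery option, assuming the sink created a table like the one below and that message holds a JSON string; the project, dataset, table, and field names are all assumptions to replace with your own:

from google.cloud import bigquery

client = bigquery.Client()

# JSON_EXTRACT_SCALAR pulls individual keys out of the serialized
# jsonPayload.message column of the exported log table.
query = """
    SELECT
      timestamp,
      JSON_EXTRACT_SCALAR(jsonPayload.message, '$.user_id')    AS user_id,
      JSON_EXTRACT_SCALAR(jsonPayload.message, '$.prediction') AS prediction
    FROM `my-project.my_log_dataset.my_endpoint_logs`
"""
for row in client.query(query).result():
    print(row.user_id, row.prediction)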
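
A minimal sketch of the custom-code option, assuming message holds a JSON string; it lists recent entries for an endpoint with the Cloud Logging Python client and flattens message locally (the filter and field handling are assumptions to adapt):

import json
from google.cloud import logging as cloud_logging

client = cloud_logging.Client()

# Adjust the resource type / endpoint_id filter to your deployment.
log_filter = (
    'resource.type="aiplatform.googleapis.com/Endpoint" '
    'AND resource.labels.endpoint_id="YOUR_ENDPOINT_ID"'
)

for entry in client.list_entries(filter_=log_filter, max_results=20):
    payload = entry.payload  # a dict for structured (jsonPayload) entries
    if isinstance(payload, dict) and "message" in payload:
        try:
            # If message is itself JSON, parse it and merge its keys upward.
            message_fields = json.loads(payload["message"])
            flattened = {k: v for k, v in payload.items() if k != "message"}
            flattened.update(message_fields)
        except (TypeError, ValueError):
            flattened = payload
        print(flattened)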

Using the Cloud Logging client library to write structured logs against a Vertex AI endpoint:

(Make sure you have a service account with permissions to write logs into your GCP project.)

import json
import logging

import google.cloud.logging_v2 as logging_v2
from google.api_core.client_options import ClientOptions
from google.oauth2 import service_account

# The fields you want to appear as top-level keys of jsonPayload.
data_to_write_to_endpoint = {"key1": "value1"}

# JSON key of a service account permitted to write logs into the GCP
# project where your endpoint is.
credentials = service_account.Credentials.from_service_account_info(
    json.loads(SERVICE_ACCOUNT_KEY_JSON)
)
client = logging_v2.client.Client(
    credentials=credentials,
    client_options=ClientOptions(api_endpoint="logging.googleapis.com"),
)

# This represents your Vertex AI endpoint.
resource = logging_v2.Resource(
    type="aiplatform.googleapis.com/Endpoint",
    labels={"endpoint_id": YOUR_ENDPOINT_ID, "location": ENDPOINT_REGION},
)

# Optional: also route standard Python logging through Cloud Logging.
client.setup_logging(log_level=logging.INFO)

logger = client.logger("YOUR_LOGGER_NAME")
logger.log_struct(
    info=data_to_write_to_endpoint,
    severity="INFO",
    resource=resource,
)
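
Note that log_struct writes the dict as the entry's jsonPayload, so every key of data_to_write_to_endpoint shows up as a top-level field of jsonPayload rather than being nested under message, which is the flattened structure asked about above.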
