
No Logging on Azure DevOps Pipeline

Update:

Is it possible to add or change the command that runs the pipeline on Azure DevOps?


Running my program locally in Visual Studio Code, I do get output.

However, running my GitHub source branch on Azure DevOps produces no output.

I followed a Stack Overflow answer that referenced this solution from a GitHub issue.

I have implemented the following, but Azure's raw logs come back blank for my Python logging output.

test_logging.py

import logging

filename = "my.log"

global logger
logger = logging.getLogger()
logger.setLevel(logging.INFO)
formatter = logging.Formatter("%(asctime)s - %(name)s - %(levelname)s - %(message)s")
open(filename, "w").close()  # empty logs
fileHandler = logging.FileHandler(filename)
fileHandler.setFormatter(formatter)
fileHandler.setLevel(logging.INFO)
logger.addHandler(fileHandler)

logger.error('TEST')

# fetch logs
with open(filename, "r") as fileHandler:
    logs = [log.rstrip() for log in fileHandler.readlines()]
open(filename, "w").close()  # empty logs
print('logs = ', logs)
# printed output: logs =  []

host.json

{
  "version":  "2.0",
  "logging": {
    "fileLoggingMode": "always",
    "logLevel": {
      "default": "Debug"
    }
  } 
}

I then tried this alternative host.json from the post:

"logging": {
    "fileLoggingMode": "debugOnly",
    "logLevel": {
        "default": "None",
        "Host.Results": "Information",
        "Function": "Information",
        "Host.Aggregator": "Information"
    },
    "applicationInsights": {
        "samplingSettings": {
            "isEnabled": false,
            "maxTelemetryItemsPerSecond": 5
        }
    }
}

azure-pipeline-ontology_tagger.yaml

# ##########
# A build run against multiple Python targets
# ##########

resources:
- repo: self

variables:
  tag: '$(Build.SourceBranchName)-$(Build.BuildNumber)'
  imageName: '$(Build.Repository.Name)-ontology_tagger'
  artifactFeed: grandproject/private-sources
  repositoryUrl: private-sources
  packageDirectory: workers/ontology_tagger

trigger:
  batch: true
  branches:
    include:
    - master
    - development
    - releases/*
  paths:
    include:
    - "workers/ontology_tagger"
    exclude:
    - "workers"
    - "*.md"
pr:
  branches:
    include:
    - master
    - development
    - releases/*
  paths:
    include:
    - "workers/ontology_tagger"
    exclude:
    - "workers"
    - "*.md"

stages:
- stage: BuildWP
  displayName: Build Workers python package
  jobs:

  - job: Build
    displayName: Build Worker python image

    pool:
      name: EKS-grandproject-dev

    steps:
    - bash: env

    - task: PipAuthenticate@0
      displayName: Authenticate with artifact feed
      inputs:
        artifactFeeds: $(artifactFeed)

    - task: TwineAuthenticate@1
      displayName: Authenticate with artifact feed
      inputs:
        artifactFeed: $(artifactFeed)

    - bash: echo "##vso[task.setvariable variable=POETRY_HTTP_BASIC_AZURE_PASSWORD;isOutput=true]$(echo $PIP_EXTRA_INDEX_URL | sed -r 's|https://(.+):(.+)@.*|\2|')"
      name: "PIPAUTH"

    - task: Bash@3
      displayName: Test worker
      inputs:
        targetType: 'inline'
        workingDirectory: '$(packageDirectory)'
        script: |
          docker build . --progress plain --pull --target test \
          --build-arg POETRY_HTTP_BASIC_AZURE_PASSWORD=${PIPAUTH_POETRY_HTTP_BASIC_AZURE_PASSWORD} \
          --build-arg ATLASSIAN_TOKEN=$(ATLASSIAN_TOKEN)
    - task: Bash@3
      displayName: Build and publish package
      inputs:
        targetType: 'inline'
        workingDirectory: '$(packageDirectory)'
        script: |
          set -e
          cp $(PYPIRC_PATH) ./
          docker build . --target package --progress plain  --build-arg REPO=$(repositoryUrl)
    - task: Bash@3
      displayName: Build docker image
      inputs:
        targetType: 'inline'
        workingDirectory: '$(packageDirectory)'
        script: |
          docker build . --tag '$(imageName):$(tag)' --progress plain --pull --target production \
          --build-arg POETRY_HTTP_BASIC_AZURE_PASSWORD=${PIPAUTH_POETRY_HTTP_BASIC_AZURE_PASSWORD} \
          --label com.azure.dev.image.build.sourceversion=$(Build.SourceVersion) \
          --label com.azure.dev.image.build.sourcebranchname=$(Build.SourceBranchName) \
          --label com.azure.dev.image.build.buildnumber=$(Build.BuildNumber)
    - task: ECRPushImage@1
      displayName: Push image with 'latest' tag
      condition: and(succeeded(),eq(variables['Build.SourceBranchName'], 'master'))
      inputs:
        awsCredentials: 'dev-azure-devops'
        regionName: 'eu-central-1'
        imageSource: 'imagename'
        sourceImageName: $(imageName)
        sourceImageTag: $(tag)
        repositoryName: $(imageName)
        pushTag: 'latest'
        autoCreateRepository: true

    - task: ECRPushImage@1
      displayName: Push image with branch name tag
      condition: and(succeeded(),ne(variables['Build.SourceBranchName'], 'merge'))
      inputs:
        awsCredentials: 'iotahoe-dev-azure-devops'
        regionName: 'eu-central-1'
        imageSource: 'imagename'
        sourceImageName: $(imageName)
        sourceImageTag: $(tag)
        repositoryName: $(imageName)
        pushTag: '$(Build.SourceBranchName)'
        autoCreateRepository: true

    - task: ECRPushImage@1
      displayName: Push image with uniq tag
      condition: and(succeeded(),ne(variables['Build.SourceBranchName'], 'merge'))
      inputs:
        awsCredentials: 'dev-azure-devops'
        regionName: 'eu-central-1'
        imageSource: 'imagename'
        sourceImageName: $(imageName)
        sourceImageTag: $(tag)
        repositoryName: $(imageName)
        pushTag: $(tag)
        autoCreateRepository: true
        outputVariable: 'ECR_PUSHED_IMAGE_NAME'

Please let me know if there is anything else I should provide.

I think you are fundamentally mixing up a few things here: the links you provided and followed give guidance on setting up logging in Azure Functions. However, you appear to be talking about logging in Azure Pipelines, which is a completely different thing. So, just to be clear:

Azure Pipelines runs the build and deployment jobs that deploy the code in your GitHub repository to Azure Functions. The pipeline executes on an Azure Pipelines agent, which can be either Microsoft-hosted or self-hosted. Assuming you are running the pipeline on Microsoft-hosted agents, you should not expect those agents to have any of the capabilities that Azure Functions has (nor should they be executing code targeted at Azure Functions in the first place). If you want to execute Python code in your pipeline, you should first look at which Python-related capabilities are pre-installed on the hosted agents and work from there: https://docs.microsoft.com/en-us/azure/devops/pipelines/agents/hosted?view=azure-devops&tabs=yaml
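
For illustration, here is a minimal sketch (with a hypothetical file name) of a Python script that a pipeline script step could run on the agent, for example with "python pipeline_log_demo.py". Anything the step writes to stdout/stderr shows up directly in the job's console log, so a StreamHandler is enough and no file round-trip is needed:

# pipeline_log_demo.py -- hypothetical name; run it from a pipeline script step,
# e.g.: python pipeline_log_demo.py
import logging
import sys

# Send log records to stdout; the Azure Pipelines agent captures a step's
# stdout/stderr into the job's console log.
logger = logging.getLogger(__name__)
logger.setLevel(logging.INFO)
handler = logging.StreamHandler(sys.stdout)
handler.setFormatter(logging.Formatter("%(asctime)s - %(name)s - %(levelname)s - %(message)s"))
logger.addHandler(handler)

logger.info("This line appears in the Azure Pipelines job log.")
print("Plain print() output appears there as well.")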

If you want to log information about the pipeline run itself, you should start by checking the "Enable system diagnostics" option when queuing the pipeline manually. To add more logging of your own, see the logging commands documentation: https://docs.microsoft.com/en-us/azure/devops/pipelines/scripts/logging-commands?view=azure-devops&tabs=bash
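
As an illustration of those logging commands, the sketch below (with placeholder messages) prints a few of the documented ##vso / ##[...] commands from a Python step; the agent scans each step's stdout for these prefixes and acts on them:

# logging_commands_demo.py -- placeholder messages; the command syntax is the
# Azure Pipelines logging-command format described in the linked docs.
print("##[group]Ontology tagger checks")                              # collapsible group in the log
print("##vso[task.logissue type=warning]Something looks suspicious")  # surfaces a warning in the run summary
print("##vso[task.logissue type=error]Something actually failed")     # surfaces an error in the run summary
print("##vso[task.setvariable variable=myOutput]some-value")          # exposes a variable to later steps
print("##[endgroup]")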

For logging in Azure Functions, you probably want to start here: https://docs.microsoft.com/en-us/azure/azure-functions/functions-monitoring , but that is an entirely different topic from logging in Azure Pipelines.
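
For completeness, logging inside a Python Azure Function goes through the standard logging module in the function entry point, and the Functions host forwards those records to its log stream / Application Insights. A minimal sketch of an HTTP-triggered function, assuming the classic main(req) programming model:

# __init__.py of an HTTP-triggered function (classic Python programming model).
import logging

import azure.functions as func


def main(req: func.HttpRequest) -> func.HttpResponse:
    # Records written via the standard logging module are picked up by the
    # Functions host and routed to its log stream / Application Insights.
    logging.info("Python HTTP trigger function processed a request.")
    return func.HttpResponse("OK", status_code=200)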
