
User ID in Azure Data Lake (Gen2) logs

I am getting my head around the logs in Azure, particularly the logs I get from an ADLS Gen2 data lake. Is it really true that I can't get the login / user ID for changes made to the data lake?

I have these fields:

TenantId, TimeGenerated [UTC], AccountName, Location, Protocol, OperationName, AuthenticationType, StatusCode, StatusText, DurationMs, ServerLatencyMs, Uri, CallerIpAddress, CorrelationId, SchemaVersion, OperationVersion, AuthenticationHash, UserAgentHeader, ClientRequestId, Etag, ServiceType, RequestHeaderSize, ResponseHeaderSize, LastModifiedTime [UTC], Category, TlsVersion, SourceSystem, Type, _ResourceId

AuthenticationType just says AccountKey.

CallerIpAddress is the IP address of the user.

But a user ID like 123@domain.com or similar is what I am looking for. So how do I include fields that describe how the operation was authenticated?

To be clear (after some input from KarthikBhyresh-MT):

I have my own ADLS account that I am playing around in. In the Azure portal, under ADLS > Diagnostic settings (classic), I have enabled Blob logging version 2.0 for Read/Write/Delete data (just as suggested).

I then use Microsoft Azure Storage Explorer to upload some files, delete some of them again, and generally generate something to log.

In the Azure portal, under ADLS > Logs (preview), I read the StorageBlobLogs table.

If I run the simplest query, where RequesterUpn is not empty, I get my username for an event where AuthenticationType is OAuth. That is the login to the service.

But when I find the OperationName DeleteFile, I have no information about who did it. I have the AuthenticationHash (1) and CallerIpAddress (2), and I could look up the IP address from the OAuth log event to tie the delete action to a name.
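The correlation idea described above can be sketched in Python. This is a hypothetical illustration, not an official tool: the record shapes below are simplified stand-ins for exported StorageBlobLogs rows (real records include a port in the caller IP and many more fields), and matching by IP can of course produce false positives behind NAT.

```python
# Hypothetical sketch: tie key-authenticated DeleteFile events to a user by
# matching CallerIpAddress against OAuth events that do carry a RequesterUpn.
# Record dicts are simplified stand-ins for StorageBlobLogs rows.

def correlate_by_ip(delete_events, oauth_events):
    """Annotate each DeleteFile event with the UPN(s) seen from the same IP."""
    upns_by_ip = {}
    for ev in oauth_events:
        upns_by_ip.setdefault(ev["CallerIpAddress"], set()).add(ev["RequesterUpn"])
    return [
        {**ev, "SuspectedUpns": sorted(upns_by_ip.get(ev["CallerIpAddress"], set()))}
        for ev in delete_events
        if ev["OperationName"] == "DeleteFile"
    ]

deletes = [{"OperationName": "DeleteFile", "CallerIpAddress": "10.0.0.5"}]
logins = [{"CallerIpAddress": "10.0.0.5", "RequesterUpn": "user@domain.com"}]
print(correlate_by_ip(deletes, logins))
```

Note that this is only a heuristic; the cleaner fix, as the answer below suggests, is to make every request authenticate with OAuth so the UPN lands in the delete event itself.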

If you have turned on the below specifics in the ADLS account:


Optionally route the logs if needed.


Then you can see information about the ADLS account in the logs in the storage account.

Here is a sample record containing a UPN. You can find it at identity.requester.upn when OAuth is the authentication type used; this is supported in version 2.0 of Storage Analytics logging.

{
    "time": "2021-10-30T05:12:17.3923930Z",
    "resourceId": "/subscriptions/<Subscription-id>/resourceGroups/<resource-group-name>/providers/Microsoft.Storage/storageAccounts/<storage-account-name>/blobServices/default",
    "category": "StorageRead",
    "operationName": "GetBlobServiceProperties",
    "operationVersion": "2020-08-04",
    "schemaVersion": "1.0",
    "statusCode": 200,
    "statusText": "Success",
    "durationMs": 712,
    "callerIpAddress": "<ip-address><port>",
    "correlationId": "fced83b0-xxxx-xxxx-xxxx-cd769c000000",
    "identity": {
        "type": "OAuth",
        "tokenHash": "E098F823BC1BE1D9AC73F22F82xxxxxxxxxxxxxxxxxxxx5537E013A5E6BDF71E",
        "requester": {
            "appId": "691458b9-xxxx-xxxx-xxxx-ed83a7f1b41c",
            "audience": "https://storage.azure.com/",
            "objectId": "b1c5060f-xxxx-xxxx-xxxx-31cce61160f4",
            "tenantId": "72f988bf-xxxx-xxxx-xxxx-2d7cd011db47",
            "tokenIssuer": "https://sts.windows.net/72f988bf-xxxx-xxxx-xxxx-2d7cd011db47/",
            "upn": "user@domain.com"
        }
    },
    "location": "East US",
    "properties": {
        "accountName": "<storage-account-name>",
        "userAgentHeader": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/95.0.4638.54 Safari/537.36 Edg/95.0.1020.30",
        "serviceType": "blob",
        "objectKey": "/<storage-account-name>",
        "serverLatencyMs": 711,
        "requestHeaderSize": 2713,
        "responseHeaderSize": 282,
        "responseBodySize": 802,
        "tlsVersion": "TLS 1.2"
    },
    "uri": "https://<storage-account-name>.blob.core.windows.net:443/?restype=service&comp=properties&_=1635xxxx35961",
    "protocol": "HTTPS",
    "resourceType": "Microsoft.Storage/storageAccounts/blobServices"
}
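To pull the UPN out of such a record programmatically, a minimal sketch could look like the following. The field path is taken from the sample record above; the assumption (consistent with the question) is that only OAuth-authenticated records carry a requester section, so key- or SAS-authenticated records yield None.

```python
import json

def extract_upn(record):
    """Return identity.requester.upn from a storage log record, if present."""
    identity = record.get("identity", {})
    if identity.get("type") != "OAuth":
        return None  # AccountKey / SAS records carry no UPN
    return identity.get("requester", {}).get("upn")

# Trimmed-down version of the sample record above
sample = json.loads('''{"identity": {"type": "OAuth",
    "requester": {"upn": "user@domain.com"}}}''')
print(extract_upn(sample))  # user@domain.com
```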

Refer to the official MS Storage Analytics log format documentation for more details.

Once I had set Allow storage account key access to Disabled, I had OAuth on every StorageRead, StorageWrite and StorageDelete event.


MS Documentation


 