
OMS Log Analytics 8MB PowerBI Query Limit workarounds?

So I am querying data directly from OMS Log Analytics using PowerBI Desktop, and I believe there is an 8MB hard limit on the data returned by the query. The problem is that I need to query about 30,000 rows, but I hit the 8MB limit at around 18,000 rows. Is it possible to break the query up, so that, for example, query1 returns rows 1-18,000, query2 returns rows 18,001-28,000, and so on, and then merge the queries in PowerBI to get a view of all the data?

The problem is that my experience in this field, DAX in particular, is quite limited, so I don't know how to specify this in the advanced editor. Any help here would be highly appreciated.

Thanks!

Same issue. Solved it.

My Need: I have a table in Azure Log Analytics (LA) that accumulates about ~35K rows per day. I needed to get all rows from LA into PowerBI for analysis.

My Issue: I crafted the KQL query I wanted in the LA Logs web UX. I then selected the "Export -> Export to PowerBI M query" feature, pasted it into a BLANK query in PowerBI, and authenticated. I noticed a few bizarre behaviors:

1) Like you said, I was getting a rolling ~35K rows of data; each query would trim just a bit off the first date in my KQL range.

2) Also, I found that for each day, the query would opportunistically trim off some rows, as if it were 'guessing' which data I didn't need in order to fit into a limit.

3) No matter what KQL `| where TimeGenerated >= ago(xd)` clause I wrote, I clearly wasn't going to get back more than the limits it held me to.

My Solution - and it works great. In PowerQuery, I created a new blank table in PowerQuery/M (not a DAX table!). In that table I used DateTimeZone.UtcNow() to start it off with today's date, then I added a column called [Days Back] and added rows for -1, -2, -3 ... -7. Then, with some M, I added another column that offsets Today by [Days Back], giving me a history of dates. (Screenshot of the date table omitted.)
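A minimal M sketch of such a date table might look like the following (the column names `Days Back` and `TimeGeneratedDateLoop` are illustrative assumptions, since the original screenshot is not available):

```
let
  // Anchor on today's date (UTC) so each refresh rolls the window forward
  Today = DateTimeZone.UtcNow(),
  // One offset per historical day to fetch
  DaysBack = {-1, -2, -3, -4, -5, -6, -7},
  // Turn the list of offsets into a one-column table
  DatesTable = Table.FromList(DaysBack, Splitter.SplitByNothing(), {"Days Back"}),
  // Add a date column: today shifted back by [Days Back] days
  WithDate = Table.AddColumn(
    DatesTable,
    "TimeGeneratedDateLoop",
    each Date.From(Date.AddDays(DateTimeZone.RemoveZone(Today), [Days Back])),
    type date
  )
in
  WithDate
```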

Now I have a table from which I can iterate over each date in history and pass it to my KQL query as a parameter: `| where TimeGeneratedDate == todatetime('"& Date.ToText(TimeGeneratedDateLoop) & "')`

As you can see, after I edited my main LA query to use TimeGeneratedDateLoop as a parameter, I can now get each full day's worth of records without hitting the LA limit. Note that in my case, no single day breaches the 8MB limit. If yours does, then you can attack this problem with 12-hour breakdowns instead of full days.
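Once the function below is saved as a query named FxDailyQuery, invoking it per date and stitching the results together could look roughly like this (a sketch; `DatesTable` and the `TimeGeneratedDateLoop` column name are assumptions about how the date table was set up):

```
let
  // DatesTable has one row per historical date in a [TimeGeneratedDateLoop] column
  WithResults = Table.AddColumn(
    DatesTable,
    "DayRows",
    each FxDailyQuery([TimeGeneratedDateLoop])
  ),
  // Combine the per-day result tables into a single table for analysis
  Combined = Table.Combine(WithResults[DayRows])
in
  Combined
```

Because each invocation pulls only one day's rows, every individual request stays under the 8MB response limit.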

Here's my final M query for the function:

NOTE: I also removed this line from the pre-generated query: `"prefer"="ai.response-thinning=true"` <- I don't know if it helped, but setting it to false didn't work.

let
  // Function: fetch one full day of Log Analytics rows for the given date
  FxDailyQuery = (TimeGeneratedDateLoop as date) => 
    let
      AnalyticsQuery = 
        let
          // POST the KQL query to the Log Analytics REST API
          Source = Json.Document(Web.Contents(
            "https://api.loganalytics.io/v1/workspaces/xxxxx-202d-xxxx-a351-xxxxxxxxxxxx/query", 
            [
              Query = [#"query"
                = "YourLogAnalyticsTbl
| extend TimeGeneratedDate = bin(TimeGenerated, 1d)
| where notempty(Col1)
| where notempty(Col2) 
| where TimeGenerated >= ago(30d) 
| where TimeGeneratedDate == todatetime('"& Date.ToText(TimeGeneratedDateLoop) & "')
", #"x-ms-app" = "OmsAnalyticsPBI"], 
              Timeout = #duration(0, 0, 4, 0)
            ]
          )),
          // Map Log Analytics column types onto Power Query types
          TypeMap = #table({"AnalyticsTypes", "Type"}, {
            {"string", Text.Type}, 
            {"int", Int32.Type}, 
            {"long", Int64.Type}, 
            {"real", Double.Type}, 
            {"timespan", Duration.Type}, 
            {"datetime", DateTimeZone.Type}, 
            {"bool", Logical.Type}, 
            {"guid", Text.Type}, 
            {"dynamic", Text.Type}
          }),
          // Unpack the first result table from the JSON response
          DataTable = Source[tables]{0},
          Columns = Table.FromRecords(DataTable[columns]),
          ColumnsWithType = Table.Join(Columns, {"type"}, TypeMap, {"AnalyticsTypes"}),
          Rows = Table.FromRows(DataTable[rows], Columns[name]),
          // Apply the mapped Power Query type to each column
          Table = Table.TransformColumnTypes(Rows, Table.ToList(
            ColumnsWithType, 
            (c) => {c{0}, c{3}}
          ))
        in
          Table
    in
      AnalyticsQuery
in
  FxDailyQuery
