
Aggregate JSON events into an array in Azure Stream Analytics

I'm new to Azure Stream Analytics and its query language. I have an ASA job which reads JSON data coming from my IoT Hub and routes it to different functions based on one of the values. This is what I have now:

SELECT
    *
INTO
    storage
FROM
    iothub

SELECT
    *
INTO
    storageQueueFunction
FROM
    iothub
WHERE
    recType LIKE '3'

SELECT
    *
INTO
    deviceTwinD2CFunctionApp
FROM
    iothub
WHERE
    recType LIKE '50'
    
SELECT
    *
INTO
    heartbeatD2CFunctionApp
FROM
    iothub
WHERE
    recType LIKE '51'

SELECT
    *
INTO
    ackC2D
FROM
    iothub
WHERE
    recType LIKE '54'

I'm pretty sure this could be done more efficiently, but it's working for now.
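
(For what it's worth, one common tidy-up is to name the input once in a `WITH` step and route from that step. A sketch of the same routing, untested and assuming the same input/output names:

```sql
-- Define one named step over the input, then route from it.
WITH events AS (
    SELECT * FROM iothub
)

SELECT * INTO storage FROM events

SELECT * INTO storageQueueFunction FROM events WHERE recType LIKE '3'

SELECT * INTO deviceTwinD2CFunctionApp FROM events WHERE recType LIKE '50'

SELECT * INTO heartbeatD2CFunctionApp FROM events WHERE recType LIKE '51'

SELECT * INTO ackC2D FROM events WHERE recType LIKE '54'
```

The behavior is the same; it just avoids repeating the source name in every branch.)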

My problem is that when a large number of events come in with recType 54, I think they are overloading my Function App "ackC2D".

My idea is to batch these types of events into a JSON array using something like a rolling window of 5 seconds, then send that array to the output, where I can parse through the array event by event.

I haven't been able to find anything like this online, the closest I can find is aggregating data then outputting a calculation on the aggregate.

Is what I'm trying to do possible?

Thanks!

When configuring the Azure Function output, you have the ability to specify the 'Max batch size' and 'Max batch count' properties. If a lot of input events arrive rapidly, keeping a high value for these properties will result in fewer calls to your Azure Function output (by automatically batching many output events into a single HTTP request).
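
If you do want to build the array in the query itself, as you describe, Stream Analytics also has a `Collect()` aggregate that gathers every record in a window into one array. A sketch for the recType 54 branch, untested and assuming events can be timestamped by the enqueue time:

```sql
-- Emit one record per 5-second tumbling window; the 'events' field
-- holds an array of all recType 54 records seen in that window.
SELECT
    Collect() AS events
INTO
    ackC2D
FROM
    iothub TIMESTAMP BY EventEnqueuedUtcTime
WHERE
    recType LIKE '54'
GROUP BY
    TumblingWindow(second, 5)
```

Your Function App then receives one payload per window and can iterate over the `events` array instead of being invoked once per event.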

