
How to Post JSON to Durable Azure Function (Python)?

I would like to call Durable Azure Functions from Azure Data Factory (ADF). I want to post JSON to the function and get the status once processing has completed. My ultimate goal is to successfully run a function that takes 10 minutes without hitting a timeout.

I have already successfully executed an Azure Function activity from ADF using the GET method.

Now I need advice on modifying the orchestrator's Python code so that it accepts JSON and uses the JSON values to filter which result set is processed, e.g. {"Country": "Japan"}.

The current code base is from this tutorial: https://docs.microsoft.com/en-us/azure/azure-functions/durable/quickstart-python-vscode

I'm following the Durable Functions instructions from here: http://datanrg.blogspot.com/2020/10/using-durable-functions-in-azure-data.html

# This function is an HTTP starter function for Durable Functions.
# Before running this sample, please:
# - create a Durable orchestration function
# - create a Durable activity function (default name is "Hello")
# - add azure-functions-durable to requirements.txt
# - run pip install -r requirements.txt

import logging

import azure.functions as func
import azure.durable_functions as df


async def main(req: func.HttpRequest, starter: str) -> func.HttpResponse:
    client = df.DurableOrchestrationClient(starter)
    instance_id = await client.start_new(req.route_params["functionName"], None, None)

    logging.info(f"Started orchestration with ID = '{instance_id}'.")

    return client.create_check_status_response(req, instance_id)

#  This function is not intended to be invoked directly. Instead it will be
# triggered by an HTTP starter function.
# Before running this sample, please:
# - create a Durable activity function (default name is "Hello")
# - create a Durable HTTP starter function
# - add azure-functions-durable to requirements.txt
# - run pip install -r requirements.txt

import logging
import json

import azure.functions as func
import azure.durable_functions as df


def orchestrator_function(context: df.DurableOrchestrationContext):
    result1 = yield context.call_activity('Hello', "Tokyo")
    result2 = yield context.call_activity('Hello', "Seattle")
    result3 = yield context.call_activity('Hello', "London")
    return [result1, result2, result3]

main = df.Orchestrator.create(orchestrator_function)


# This function is not intended to be invoked directly. Instead it will be
# triggered by an orchestrator function.
# Before running this sample, please:
# - create a Durable orchestration function
# - create a Durable HTTP starter function
# - add azure-functions-durable to requirements.txt
# - run pip install -r requirements.txt

import logging


def main(name: str) -> str:
    return f"Hello {name}!"

To restate the question: I need advice on modifying the orchestrator's Python code so that it accepts JSON and uses the JSON values, e.g. {"Country": "Japan"}, to filter which result set is processed.

import logging

import azure.durable_functions as df
import azure.functions as func


async def main(documents: func.DocumentList, starter: str):
    client = df.DurableOrchestrationClient(starter)
    # Pass the JSON payload as the client_input argument of start_new;
    # doc1, doc2 and doc3 are placeholders for the actual documents.
    instance_id = await client.start_new('MyDFOrchestrator', None, {"doc_list": [{doc1}, {doc2}, {doc3}]})
    logging.info(f"Started orchestration ID {instance_id}")

It should be fine to pass JSON as an input value to the orchestrator. There is an example here which does something similar; although that example uses an HTTP trigger, the relevant part has nothing to do with which trigger you use in the starter/triggering function.
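
For instance, here is a minimal sketch of that idea applied to the quickstart code above. Everything specific in it is an assumption for illustration: the ADF activity is assumed to POST a body like {"Country": "Japan"}, and the cities_by_country mapping merely stands in for whatever result-set filtering you actually need. The starter reads the body with req.get_json() and forwards it as the orchestration input; the orchestrator reads it back with context.get_input().

# HttpStart/__init__.py (sketch)
import logging

import azure.functions as func
import azure.durable_functions as df


async def main(req: func.HttpRequest, starter: str) -> func.HttpResponse:
    client = df.DurableOrchestrationClient(starter)

    # Forward the posted JSON body (e.g. {"Country": "Japan"}) as the orchestration input.
    payload = req.get_json()
    instance_id = await client.start_new(req.route_params["functionName"], None, payload)

    logging.info(f"Started orchestration with ID = '{instance_id}'.")
    return client.create_check_status_response(req, instance_id)


# Orchestrator/__init__.py (sketch)
import azure.durable_functions as df


def orchestrator_function(context: df.DurableOrchestrationContext):
    payload = context.get_input() or {}
    country = payload.get("Country")

    # Hypothetical filter: only process the cities for the requested country.
    cities_by_country = {"Japan": ["Tokyo", "Osaka"], "US": ["Seattle"], "UK": ["London"]}

    results = []
    for city in cities_by_country.get(country, []):
        result = yield context.call_activity('Hello', city)
        results.append(result)
    return results


main = df.Orchestrator.create(orchestrator_function)

ADF would then POST the JSON to the starter endpoint and poll the statusQueryGetUri returned in the 202 response until the orchestration completes, which is how a 10-minute run avoids the HTTP timeout.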

Alternatively, you can create a concrete serializable class holding the model/entity structure (much cleaner than raw JSON). To make a class serializable, all that is required is for it to export two static methods: to_json() and from_json(). The Durable Functions framework will internally call these methods to serialize and deserialize your custom class.
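
A rough sketch of what such a class could look like, following the to_json()/from_json() convention described above (the CountryFilter name and its single field are made up for this example):

import json


class CountryFilter:
    """Illustrative input model wrapping the posted {"Country": "Japan"} value."""

    def __init__(self, country: str):
        self.country = country

    @staticmethod
    def to_json(obj: "CountryFilter") -> str:
        # Called by the Durable Functions framework when serializing the input.
        return json.dumps({"Country": obj.country})

    @staticmethod
    def from_json(json_str: str) -> "CountryFilter":
        # Called by the framework when rebuilding the object on the other side.
        data = json.loads(json_str)
        return CountryFilter(data["Country"])

The starter could then pass CountryFilter("Japan") as the client_input to start_new(), and context.get_input() in the orchestrator would hand back a CountryFilter instance instead of a plain dict.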
