
Error code 2200 - "payload too large" on data factory copy job of json from Azure Blob to Azure Data Lake Gen2

I have made a Data Factory copy job that is supposed to copy JSON files from Blob Storage to JSON in Azure Data Lake Gen 2. I have made several other copy jobs that work, but never from JSON to JSON before, and in this instance I keep getting the error:

Error code: 2200. Failure type: User configuration issue. Details: The payload including configurations on activity/dataset/linked service is too large. Please check whether you have settings with very large values and try to reduce their size.

I have already tried reducing the block size on the sink, but that only makes it fail faster, so I am not sure what the problem is. The JSON files are pretty big and include output from forecasting algorithms, so there are time series, model parameters, and other data in the same JSON document.

Here is the first part of the JSON script for the copy activity, up to the mapping, in case it helps:

```json
{
    "name": "BlobStorage_To_DataLakeG2",
    "properties": {
        "description": "This job is intended to perform data copies of json-files from blob storage to ADLS gen2 for selected files.",
        "activities": [
            {
                "name": "TotalLoadForecast_ADLSG2_json",
                "type": "Copy",
                "dependsOn": [],
                "policy": {
                    "timeout": "7.00:00:00",
                    "retry": 0,
                    "retryIntervalInSeconds": 30,
                    "secureOutput": false,
                    "secureInput": false
                },
                "userProperties": [],
                "typeProperties": {
                    "source": {
                        "type": "JsonSource",
                        "storeSettings": {
                            "type": "AzureBlobStorageReadSettings",
                            "recursive": true,
                            "enablePartitionDiscovery": true,
                            "partitionRootPath": "totalloadforecastoutput/"
                        },
                        "formatSettings": {
                            "type": "JsonReadSettings"
                        }
                    },
                    "sink": {
                        "type": "JsonSink",
                        "storeSettings": {
                            "type": "AzureBlobFSWriteSettings",
                            "blockSizeInMB": 4
                        },
                        "formatSettings": {
                            "type": "JsonWriteSettings"
                        }
                    },
                    "enableStaging": false,
                    "translator": {
                        "type": "TabularTranslator",
                        "mappings": [
```

The setting that seems to cause the problem is "enablePartitionDiscovery": true. Setting it to false makes the job succeed. Perhaps it does not work with JSON documents.
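For reference, a minimal sketch of the adjusted source block with that workaround applied, assuming the rest of the activity stays exactly as posted above; partitionRootPath is dropped here since it is only used together with partition discovery:

```json
"source": {
    "type": "JsonSource",
    "storeSettings": {
        "type": "AzureBlobStorageReadSettings",
        "recursive": true,
        "enablePartitionDiscovery": false
    },
    "formatSettings": {
        "type": "JsonReadSettings"
    }
}
```

This only reflects the workaround described above; it does not explain why partition discovery inflates the activity payload for these JSON files.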
