
Getting error on null and empty string while copying a csv file from blob container to Azure SQL DB

I tried all combinations of data types for my data, but each time my Data Factory pipeline gives me this error:

{ "errorCode": "2200", "message": "ErrorCode=UserErrorColumnNameNotAllowNull,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Empty or Null string found in Column Name 2. Please make sure column name not null and try again.,Source=Microsoft.DataTransfer.Common,'", "failureType": "UserError", "target": "xxx", "details": [] } { "errorCode": "2200", "message": "ErrorCode=UserErrorColumnNameNotAllowNull,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=在列名 2 中找到空或空字符串。请确保列名不为空再试一次。,Source=Microsoft.DataTransfer.Common,'", "failureType": "UserError", "target": "xxx", "details": [] }

My Copy data activity source code is something like this:

{
    "name": "xxx",
    "description": "uuu",
    "type": "Copy",
    "dependsOn": [],
    "policy": {
        "timeout": "7.00:00:00",
        "retry": 0,
        "retryIntervalInSeconds": 30,
        "secureOutput": false,
        "secureInput": false
    },
    "userProperties": [],
    "typeProperties": {
        "source": {
            "type": "DelimitedTextSource",
            "storeSettings": {
                "type": "AzureBlobStorageReadSettings",
                "recursive": true,
                "wildcardFileName": "*"
            },
            "formatSettings": {
                "type": "DelimitedTextReadSettings"
            }
        },
        "sink": {
            "type": "AzureSqlSink"
        },
        "enableStaging": false,
        "translator": {
            "type": "TabularTranslator",
            "mappings": [
                { "source": { "name": "populationId", "type": "Guid" }, "sink": { "name": "PopulationID", "type": "String" } },
                { "source": { "name": "inputTime", "type": "DateTime" }, "sink": { "name": "inputTime", "type": "DateTime" } },
                { "source": { "name": "inputCount", "type": "Decimal" }, "sink": { "name": "inputCount", "type": "Decimal" } },
                { "source": { "name": "inputBiomass", "type": "Decimal" }, "sink": { "name": "inputBiomass", "type": "Decimal" } },
                { "source": { "name": "inputNumber", "type": "Decimal" }, "sink": { "name": "inputNumber", "type": "Decimal" } },
                { "source": { "name": "utcOffset", "type": "String" }, "sink": { "name": "utcOffset", "type": "Int32" } },
                { "source": { "name": "fishGroupName", "type": "String" }, "sink": { "name": "fishgroupname", "type": "String" } },
                { "source": { "name": "yearClass", "type": "String" }, "sink": { "name": "yearclass", "type": "String" } }
            ]
        }
    },
    "inputs": [
        { "referenceName": "DelimitedTextFTDimensions", "type": "DatasetReference" }
    ],
    "outputs": [
        { "referenceName": "AzureSqlTable1", "type": "DatasetReference" }
    ]
}

Can anyone please help me understand the issue? I see in some blogs they suggest using treatnullasempty, but I am not allowed to modify the JSON. Is there a way to do that?

I suggest using a Data Flow Derived Column transformation; it can help you build an expression to replace the null column values.

For example:

Derived Column: if Column_2 is null, return 'dd':

iifNull(Column_2,'dd')


Then map the column in the sink.
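Putting those steps together, a rough sketch of the resulting mapping data flow script is below. The stream names (source1, DerivedColumn1, sink1), the column list, and the sink options are placeholders for illustration, not taken from the asker's pipeline:

source(output(
        Column_1 as string,
        Column_2 as string
    ),
    allowSchemaDrift: true,
    validateSchema: false) ~> source1
source1 derive(Column_2 = iifNull(Column_2, 'dd')) ~> DerivedColumn1
DerivedColumn1 sink(allowSchemaDrift: true,
    validateSchema: false) ~> sink1

The derive transformation rewrites Column_2 in place, so by the time the rows reach the Azure SQL sink the nulls have already been replaced with 'dd'.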

Reference: Data transformation expressions in mapping data flow

Hope this helps.

Fixed it. It was a simple fix: one of the columns in my destination table was marked as NOT NULL. I changed it to allow NULL and it worked.
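For anyone landing here with the same problem, a change like that can be made with a T-SQL ALTER TABLE statement. This is only a sketch: the table and column names below are hypothetical, and you must restate the column's existing data type when changing its nullability:

-- Hypothetical table and column; keep the column's current data type,
-- only the NULL/NOT NULL constraint changes.
ALTER TABLE dbo.MyTargetTable
ALTER COLUMN yearclass NVARCHAR(50) NULL;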


