What are some ways to handle bad files in Azure Data Factory Copy Activity
I am using the default Data Factory Copy activity to load files from JSON blobs (the source, filtered by modified timestamp) into a SQL DB table (the destination). With the fault-tolerance settings, an incompatible row gets skipped; however, if a bad file is not valid JSON, the activity errors out and retries instead of skipping that file.
What are some ways to identify/skip incompatible or corrupt files in the ADF Copy activity?
Thanks in advance.
According to Fault tolerance of copy activity in Azure Data Factory, the Copy activity's fault tolerance only supports binary files and tabular data.
Identifying invalid JSON files is not supported yet.
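As a workaround (not a built-in ADF feature), you could pre-validate the blobs before the Copy activity runs, e.g. from an Azure Function or a custom activity, and pass only the names of parseable files on to the copy. A minimal sketch using only Python's standard `json` module; the `partition_blobs` helper and its blob-name-to-content mapping are hypothetical names for illustration:

```python
import json


def is_valid_json(text: str) -> bool:
    """Return True if text parses as JSON, False otherwise."""
    try:
        json.loads(text)
        return True
    except json.JSONDecodeError:
        return False


def partition_blobs(blobs: dict) -> tuple:
    """Split a {blob_name: content} mapping into (valid, invalid) name lists.

    In a real pipeline the contents would come from blob storage
    (e.g. via the azure-storage-blob SDK); here they are plain strings.
    """
    valid, invalid = [], []
    for name, content in blobs.items():
        (valid if is_valid_json(content) else invalid).append(name)
    return valid, invalid
```

The list of valid names could then be fed to the Copy activity (for example via a pipeline parameter or a staging location that holds only validated files), while invalid files are moved aside or logged for inspection.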