
Azure Data Factory - Copy data is converting excel file to application/octet-stream

I have a pipeline in Data Factory that moves an Excel file from a folder named inbound to a folder named raw, but the copied file ends up with the content type "application/octet-stream". How do I keep the file as "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet"? Here is the JSON I have for the copy activity.

{
    "name": "Copy from Inbound",
    "type": "Copy",
    "dependsOn": [],
    "policy": {
        "timeout": "7.00:00:00",
        "retry": 0,
        "retryIntervalInSeconds": 30,
        "secureOutput": false,
        "secureInput": false
    },
    "userProperties": [],
    "typeProperties": {
        "source": {
            "type": "DelimitedTextSource",
            "storeSettings": {
                "type": "AzureBlobFSReadSettings",
                "recursive": true,
                "enablePartitionDiscovery": false
            },
            "formatSettings": {
                "type": "DelimitedTextReadSettings"
            }
        },
        "sink": {
            "type": "DelimitedTextSink",
            "storeSettings": {
                "type": "AzureBlobFSWriteSettings"
            },
            "formatSettings": {
                "type": "DelimitedTextWriteSettings",
                "quoteAllText": true,
                "fileExtension": ".txt"
            }
        },
        "enableStaging": false,
        "translator": {
            "type": "TabularTranslator",
            "typeConversion": true,
            "typeConversionSettings": {
                "allowDataTruncation": true,
                "treatBooleanAsNumber": false
            }
        }
    },
    "inputs": [
        {
            "referenceName": "Blob",
            "type": "DatasetReference",
            "parameters": {
                "container": "inbound",
                "folder": {
                    "value": "@replace(pipeline().parameters.filePath, 'inbound/', '')",
                    "type": "Expression"
                },
                "file": {
                    "value": "@pipeline().parameters.fileName",
                    "type": "Expression"
                }
            }
        }
    ],
    "outputs": [
        {
            "referenceName": "Blob",
            "type": "DatasetReference",
            "parameters": {
                "container": "raw",
                "folder": {
                    "value": "@concat(replace(pipeline().parameters.filePath, 'inbound/', ''),'/',formatDateTime(utcNow(),'yyyy-mm-dd'))",
                    "type": "Expression"
                },
                "file": {
                    "value": "@pipeline().parameters.fileName",
                    "type": "Expression"
                }
            }
        }
    ]
}

Unfortunately, the Copy data activity has no setting for specifying the content-type property of a file when it is loaded to storage. The default content type applied to a file loaded to blob storage is application/octet-stream.

You can raise a feature request for Azure Data Factory.


As a temporary workaround, you can follow the suggestion given by MartinJaffer-MSFT in the Microsoft Q&A forum.

Use a Web activity after your Copy data activity to call the blob service's Set Blob Properties operation, which overwrites the content type on the blob the copy just wrote.
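A minimal sketch of that Web activity is below. It is not the exact activity from the post: the storage account name, the blob path in the url, the service version, and the managed-identity (MSI) authentication are all assumptions you would adapt to your environment. The request is a PUT against the Set Blob Properties REST operation (`?comp=properties`) and sets the `x-ms-blob-content-type` header to the Excel MIME type:

```json
{
    "name": "Set Blob Content Type",
    "type": "WebActivity",
    "dependsOn": [
        {
            "activity": "Copy from Inbound",
            "dependencyConditions": [ "Succeeded" ]
        }
    ],
    "typeProperties": {
        "url": "https://mystorageaccount.blob.core.windows.net/raw/path/to/file.xlsx?comp=properties",
        "method": "PUT",
        "headers": {
            "x-ms-version": "2021-08-06",
            "x-ms-blob-content-type": "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet"
        },
        "body": "",
        "authentication": {
            "type": "MSI",
            "resource": "https://storage.azure.com/"
        }
    }
}
```

In a real pipeline you would build the url with the same `@concat(...)` expression used for the sink folder and file name, so the Web activity targets the blob the copy produced. With MSI authentication the data factory's managed identity needs a role such as Storage Blob Data Contributor on the storage account; appending a SAS token to the url is an alternative.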

