
Send Azure IoT data to Azure Gen2

Currently my IoT Hub is sending data to a Storage Account Gen2, into a folder called rawdata. But I want to route the data into three folders: raw data goes to the raw folder, transformed data goes to the transform folder, and so on.

Current setting: (screenshot) Kindly point me to any relevant article.

Thanks, Anuj

The IoT Hub file upload feature is something totally different. To send (and transform) telemetry to a sink like Data Lake Store Gen2, use something like Stream Analytics: https://learn.microsoft.com/en-us/azure/stream-analytics/blob-storage-azure-data-lake-gen2-output

In one streaming job you can have multiple outputs, each with different transformations applied.
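As a rough sketch of what that could look like, here is a hypothetical Stream Analytics query with one IoT Hub input and two Data Lake Gen2 outputs. The input/output aliases (`iothub-input`, `raw-output`, `transform-output`) and the telemetry fields (`deviceId`, `temperature`) are assumptions for illustration; you would define the actual inputs and outputs in the job configuration, pointing each output at a different folder path.

```sql
-- Pass every incoming message through unchanged to the raw folder
-- (the output "raw-output" would point at the raw/ path).
SELECT
    *
INTO
    [raw-output]
FROM
    [iothub-input]

-- Apply a transformation (here: a projection plus a filter) and write
-- the result to the transform folder ("transform-output").
SELECT
    deviceId,
    temperature,
    EventEnqueuedUtcTime AS ingestTime
INTO
    [transform-output]
FROM
    [iothub-input]
WHERE
    temperature IS NOT NULL
```

Both SELECT statements run in the same job, so a single stream feeds both folders without duplicating ingestion.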


Edit based on comments:

You can also do simple filtering (but no transformations) using IoT Hub routing queries: https://learn.microsoft.com/en-us/azure/iot-hub/iot-hub-devguide-routing-query-syntax

Create different custom endpoints for your storage account (with different paths). Then create different custom routes towards these endpoints.
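The routing queries for those routes might look like the following sketch. `processingType` is an assumed application property that the device would set on each message; it is not a built-in IoT Hub property.

```sql
-- Route attached to the endpoint that writes under the raw/ path:
processingType = 'raw'

-- Route attached to the endpoint that writes under the transform/ path:
processingType = 'transform'

-- Filtering on the message body is also possible, but only if the
-- message is sent with contentType = 'application/json' and
-- contentEncoding = 'utf-8':
$body.messageType = 'raw'
```

Each message is evaluated against every route's condition, so a message matching only one condition lands in only that folder.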

