

Real-time data analytics using Elastic Stack on data residing in Azure Data Lake Storage Gen2

How can we create a real-time data pipeline when the data resides in Azure Data Lake Storage Gen2 and the analytics has to be done using the Elastic Stack?

Which integration tool or technique could be used to complete this design?

As @Nick.McDermaid mentioned in the comment, you need to reconsider your design. AFAIK, there is no tool available that can integrate Azure Data Lake Storage Gen2 and the Elastic Stack for real-time analytics.
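In practice, the lack of a ready-made connector means that any ADLS Gen2 to Elasticsearch feed has to be hand-rolled glue code that copies files out of the lake and indexes them, which ends up being batch-oriented rather than real-time. As a rough illustration only, here is a minimal Python sketch assuming the azure-storage-file-datalake and elasticsearch packages; the account URL, file system, directory, and index names are hypothetical placeholders, not real resources.

```python
# Minimal batch-style sketch: poll a directory in ADLS Gen2 and bulk-index
# newline-delimited JSON files into Elasticsearch. All names below are placeholders.
import json

from azure.storage.filedatalake import DataLakeServiceClient
from elasticsearch import Elasticsearch, helpers

ACCOUNT_URL = "https://<storage-account>.dfs.core.windows.net"  # hypothetical account
FILE_SYSTEM = "raw-events"                                      # hypothetical container
DIRECTORY = "events/2024"                                       # hypothetical folder
INDEX = "adls-events"                                           # hypothetical ES index

datalake = DataLakeServiceClient(account_url=ACCOUNT_URL, credential="<account-key>")
es = Elasticsearch("http://localhost:9200")
fs_client = datalake.get_file_system_client(FILE_SYSTEM)

def actions():
    # Walk every file under the directory and yield one bulk action per JSON line.
    for path in fs_client.get_paths(path=DIRECTORY):
        if path.is_directory:
            continue
        file_client = fs_client.get_file_client(path.name)
        content = file_client.download_file().readall().decode("utf-8")
        for line in content.splitlines():
            if line.strip():
                yield {"_index": INDEX, "_source": json.loads(line)}

# One-off batch copy into Elasticsearch; this is not a real-time pipeline.
helpers.bulk(es, actions())
```

A script like this would still need to be scheduled and to track which files it has already processed, which is exactly the kind of design concern that the managed streaming/analytics services below handle for you.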

Alternatively, a better way to implement your requirement is to use the Azure products designed for real-time analytics, such as Azure Stream Analytics and Azure Synapse Analytics. You can also consider Azure Data Factory for data movement and transformation.

You can check out this page to learn more about all the analytics products available in Azure. Choose the one that best suits your requirement and try to implement it using the official documentation examples.


