
What are the different ways to expose Azure Data Lake Gen2 to customers for querying purposes?

In the present business scenario, our complete platform is based on MS SQL Server (Source -> ETL -> Target). For BI reporting purposes, we create customer-wise data marts and provide the necessary access. Customers either connect directly to the database with the provided credentials or use the built-in query editor panel in the provided application.

Our plan is to migrate to the Azure Data Lake platform. With this change, we would like to keep the same data flow pattern and grant each customer access to their own specific Data Lake blobs, which are arranged in a database-table fashion.
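To make the intended layout concrete, here is a rough sketch (Python, azure-storage-file-datalake SDK) of the kind of per-customer folder scoping we have in mind. The account, container, path, and object-ID values are made up for illustration, and this is just one option we are considering, not something we have built:

```python
# Sketch only: grant one customer's AAD identity read/list access to "their"
# folder in the lake, where folders are laid out like database/table paths.
# All names, keys, and IDs below are placeholders.
from azure.storage.filedatalake import DataLakeServiceClient

ACCOUNT_URL = "https://mydatalakeacct.dfs.core.windows.net"   # hypothetical account
ACCOUNT_KEY = "<storage-account-key>"                         # or an AAD credential
CUSTOMER_OBJECT_ID = "00000000-0000-0000-0000-000000000000"   # customer's AAD object id

service = DataLakeServiceClient(account_url=ACCOUNT_URL, credential=ACCOUNT_KEY)
filesystem = service.get_file_system_client("curated")        # container plays the "database" role

# One directory per customer, sub-directories per "table".
customer_dir = filesystem.get_directory_client("customer_a/sales")

# POSIX-style ACL: base owner/group/other entries plus read + execute (traverse/list)
# for this one customer. Note: this applies to the directory itself; existing child
# files would need their own ACL entries (or a recursive update).
full_acl = (
    "user::rwx,group::r-x,other::---,"
    f"user:{CUSTOMER_OBJECT_ID}:r-x"
)
customer_dir.set_access_control(acl=full_acl)
```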

So the question is: is it possible to expose a specific blob to a specific customer in any way? If yes, how? If not, what are the alternatives so that customers can query the Data Lake?
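To illustrate what "expose a specific blob to a specific customer" would mean in practice, something like a per-customer, read-only SAS URL is the sort of thing we are imagining. Below is a minimal sketch using the azure-storage-blob Python SDK (a Gen2 account can be reached through the blob endpoint), again with made-up account, container, and path names:

```python
# Sketch only: issue a time-limited, read-only SAS URL for a single file in the
# lake and hand that URL to one customer. Names and keys are placeholders.
from datetime import datetime, timedelta
from azure.storage.blob import generate_blob_sas, BlobSasPermissions

ACCOUNT_NAME = "mydatalakeacct"
ACCOUNT_KEY = "<storage-account-key>"
CONTAINER = "curated"                                 # container plays the "database" role
BLOB_PATH = "customer_a/sales/orders.parquet"         # path plays the "table" role

sas_token = generate_blob_sas(
    account_name=ACCOUNT_NAME,
    container_name=CONTAINER,
    blob_name=BLOB_PATH,
    account_key=ACCOUNT_KEY,
    permission=BlobSasPermissions(read=True),         # read-only
    expiry=datetime.utcnow() + timedelta(days=7),     # time-limited access
)

# This URL is what would be shared with that one customer.
customer_url = (
    f"https://{ACCOUNT_NAME}.blob.core.windows.net/{CONTAINER}/{BLOB_PATH}?{sas_token}"
)
print(customer_url)
```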

PS: I know it is possible to push data from the data lake into a database and grant access to that database. However, we are looking for something on the Data Lake platform itself, if possible.
