SQL View on Delta Lake table
I need to create an abstraction on top of an existing Delta Lake table in Databricks. Is it possible to create something like a SQL Server view based on a Delta Lake table in Spark?
A SQL view can be created on a Delta Lake table in multiple ways now.
CREATE OR REPLACE VIEW sqlView AS SELECT col1, .., coln FROM delta_table
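If the abstraction only needs to live for the current Spark session, a session-scoped temporary view works as well (the view and table names here are illustrative, matching the example above):

```sql
-- Session-scoped: the view disappears when the Spark session ends,
-- unlike CREATE OR REPLACE VIEW, which persists in the metastore.
CREATE OR REPLACE TEMPORARY VIEW sqlTempView AS
SELECT col1, col2 FROM delta_table;
```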
Alternatively, the Delta Lake Hive connector can expose the Delta table in Hive:
ADD JAR /path/to/delta-core-shaded-assembly_2.11-0.1.0.jar;
ADD JAR /path/to/hive-delta_2.11-0.1.0.jar;
SET hive.input.format=io.delta.hive.HiveInputFormat;
SET hive.tez.input.format=io.delta.hive.HiveInputFormat;
CREATE EXTERNAL TABLE deltaTable(col1 INT, col2 STRING)
STORED BY 'io.delta.hive.DeltaStorageHandler'
LOCATION '/delta/table/path';
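Once the external table is registered, it can be queried from Hive like any other table, and a view can be layered on top of it in the usual way (the view name below is illustrative):

```sql
-- A Hive view over the Delta-backed external table defined above.
CREATE VIEW deltaView AS
SELECT col1, col2 FROM deltaTable;

SELECT * FROM deltaView;
```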
For more details: https://github.com/delta-io/connectors
A view can be created in Delta Lake just like in relational databases, using the DDL statement below:
CREATE OR REPLACE VIEW SampleDB.Sample_View
AS
SELECT
    ColA,
    ColB
FROM SampleDB.Sample_Table;
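The semantics are the same as in SQL Server or any relational database: a view is a stored query evaluated at read time, not a copy of the data, so changes to the underlying Delta table are visible through the view immediately. A minimal illustration of that behavior using Python's built-in sqlite3 (not Spark; just to show what a view is):

```python
import sqlite3

# In-memory database purely to illustrate view semantics;
# the same behavior applies to a view over a Delta Lake table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sample_table (col_a TEXT, col_b INTEGER)")
conn.execute("INSERT INTO sample_table VALUES ('x', 1), ('y', 2)")

# The view is a stored query, not a snapshot of the data.
conn.execute("CREATE VIEW sample_view AS SELECT col_a, col_b FROM sample_table")

# Rows inserted after the view was created are visible through it.
conn.execute("INSERT INTO sample_table VALUES ('z', 3)")
rows = conn.execute("SELECT col_a FROM sample_view ORDER BY col_b").fetchall()
print(rows)  # [('x',), ('y',), ('z',)]
```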