How to read a Snowflake table from the Spark connector when the table was created with quotes around its name?
I know how to read from a Snowflake table with the Spark connector like below:
# sfParams is a dict with all the Snowflake credentials
df = spark.read.format("snowflake") \
    .options(**sfParams) \
    .option('dbtable', 'TABLE1') \
    .load()
This works perfectly fine. But if the table was created with quotes around its name in Snowflake, like

CREATE TABLE DB1.SCHEMA1."MY.TABLE2"

Spark is not able to handle it. I tried:
# sfParams is a dict with all the Snowflake credentials
df = spark.read.format("snowflake") \
    .options(**sfParams) \
    .option('dbtable', '"MY.TABLE2"') \
    .load()
But it throws:

invalid URL prefix found in: 'MY.TABLE2'
When referencing objects with quoted identifiers in code, the quotes need to be escaped. For example:

'"MY.TABLE2"'

should be:

'\"MY.TABLE2\"'
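The escaping above can be wrapped in a small helper so the quotes always reach the connector as part of the `dbtable` value. This is a sketch, not connector API: `quote_identifier` is a hypothetical name, and note that in Python source `'\"MY.TABLE2\"'` and `'"MY.TABLE2"'` are the same string, since the backslash in a single-quoted literal is redundant.

```python
def quote_identifier(name: str) -> str:
    """Wrap a Snowflake identifier in double quotes (hypothetical helper).

    Embedded double quotes are doubled, per Snowflake's quoting rules,
    so the connector treats the whole string, dots included, as a
    single identifier rather than a database.schema.table path.
    """
    return '"' + name.replace('"', '""') + '"'


# Usage with the Spark read from the question (sfParams as before):
# df = spark.read.format("snowflake") \
#     .options(**sfParams) \
#     .option('dbtable', quote_identifier('MY.TABLE2')) \
#     .load()
```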