
How to read Snowflake table from Spark connector when the table was created with quotes around it?

I know how to read from a Snowflake table with the Spark connector, like below:

# sfParams is a dict with all the Snowflake credentials/options stored
df = spark.read.format("snowflake") \
               .options(**sfParams) \
               .option('dbtable', 'TABLE1').load()

This works perfectly fine. But if the table was created with quotes around its name in Snowflake, like CREATE TABLE DB1.SCHEMA1."MY.TABLE2", Spark is not able to parse it. I tried:

# sfParams is the same dict of Snowflake credentials/options as above
df = spark.read.format("snowflake") \
               .options(**sfParams) \
               .option('dbtable', '"MY.TABLE2"').load()

But it throws: invalid URL prefix found in: 'MY.TABLE2'

When using objects with such quoted identifiers in code, the quotes need to be escaped, like:

'"MY.TABLE2"'

should be:

'\"MY.TABLE2\"'
