
How to check if a query is pushed down from Databricks to Snowflake?

I'm trying to use query pushdown from Databricks to Snowflake. I'm reading data from Snowflake (the data source) into Databricks, creating DataFrames, and applying joins, filters, and aggregate functions. The code runs fine, but I can't tell whether the query is pushed down to Snowflake. How can I check whether the query ran on Snowflake or on the Spark (Databricks) cluster? The code is roughly shaped like the sketch below.
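A simplified sketch of the setup (connection options, table names, and column names are placeholders, not my real ones; spark is the SparkSession provided by the Databricks notebook):

    from pyspark.sql import functions as F

    sf_options = {
        "sfURL": "<account>.snowflakecomputing.com",
        "sfUser": "<user>",
        "sfPassword": "<password>",
        "sfDatabase": "<database>",
        "sfSchema": "<schema>",
        "sfWarehouse": "<warehouse>",
    }

    # Read two Snowflake tables into DataFrames ("snowflake" is the short
    # format name on Databricks; elsewhere use "net.snowflake.spark.snowflake").
    orders = (spark.read.format("snowflake")
              .options(**sf_options)
              .option("dbtable", "ORDERS")
              .load())
    customers = (spark.read.format("snowflake")
                 .options(**sf_options)
                 .option("dbtable", "CUSTOMERS")
                 .load())

    # Join, filter and aggregate -- the operations I expect to be pushed down.
    result = (orders.join(customers, "CUSTOMER_ID")
              .filter(F.col("STATUS") == "SHIPPED")
              .groupBy("REGION")
              .agg(F.sum("AMOUNT").alias("TOTAL_AMOUNT")))
    result.show()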

There are at least these two ways:

  1. Use the Query History page in the Snowflake UI to see whether the query ran on the Snowflake side.

  2. Enable debug logging on the Spark connector by setting the log level on the Spark context to DEBUG, then look at the driver logs for what the connector sends to Snowflake (a PySpark sketch that inspects the query plan directly follows this list):

    sc.setLogLevel('DEBUG')
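A complementary check from the Databricks side is to look at the query plan itself. The sketch below assumes the result DataFrame from the sketch in the question: when pushdown succeeds, the physical plan collapses the join/filter/aggregate into a single Snowflake relation scan carrying the generated SQL, instead of showing Spark-side Join and HashAggregate operators. The Utils.getLastSelect() call reaches the Snowflake Scala connector through the JVM bridge and only works if that connector class is on the cluster's classpath.

    # Print the physical plan: with pushdown, the join/filter/aggregate are folded
    # into one Snowflake relation scan rather than Spark Join/HashAggregate nodes.
    result.explain(True)

    # Ask the connector for the last SELECT it actually sent to Snowflake
    # (Utils belongs to the Snowflake Spark connector; accessed via the JVM bridge).
    print(spark._jvm.net.snowflake.spark.snowflake.Utils.getLastSelect())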
