
Generate SQL from Spark SQL

I have Scala code that builds a Spark DataFrame from a join of a few frames, a filter, and a dynamic part.

Is it possible to generate the equivalent classic SQL and write it to a log, to keep control of the process?

    val target = List(
        ("0", "0", "Orange", "2020-03-10")
    ).toDF("id", "new_id", "name", "process_date")

    // ... dynamic part of code ...

    increment.as("i")
        .join(target.as("t"), $"t.id" === $"i.id", "left")
        .select(selectFields: _*)

I want to get something like this in the logs:

    select field1, field2, ....
    from increment i join target t on t.id = i.id

Maybe you could try writing to the log yourself just after the query's execution.

Something like this example:

    import org.apache.log4j.{Level, Logger}

    Logger.getRootLogger.setLevel(Level.ERROR)

    // your code here
    increment.as("i")
        .join(target.as("t"), $"t.id" === $"i.id", "left")
        .select(selectFields: _*)

    Logger.getRootLogger.error("""SELECT column1, column2, column3 FROM table JOIN table1 ON (id = id1) WHERE column3 = ..... GROUP BY ....""")
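
Note that the SQL string above is written by hand, so it can drift from what the DataFrame code actually does. Spark does not reverse-generate classic SQL text from DataFrame operations, but the Dataset API does expose the derived query plan through queryExecution, which you can log instead. A minimal sketch, reusing the increment, target and selectFields names from the question:

    import org.apache.log4j.Logger

    val result = increment.as("i")
        .join(target.as("t"), $"t.id" === $"i.id", "left")
        .select(selectFields: _*)

    // queryExecution carries the parsed, analyzed, optimized and physical
    // plans Spark built from the DataFrame code; its toString is a loggable
    // description of the query that will actually run.
    Logger.getRootLogger.error(result.queryExecution.toString)

    // Alternatively, result.explain(true) prints the same plans to stdout.

This is not the classic select ... from ... text asked for, but because it is generated from the real query, the logged joins and filters can never go stale.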
