
How does the Databricks 'date_trunc' function run in the back end?

I would like to see the source code of the date_trunc function in Databricks. The PySpark source code doesn't answer my question. Basically, I want to know what happens at the core; for example, does it run a regexp pattern/method, or does it have its own algorithm?

Can anyone help? Thanks!

Spark code is actually Scala code running on the JVM, even though you can use it from Python, and it can be found on GitHub: https://github.com/apache/spark
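For example, even when date_trunc is called from PySpark, the PySpark function is only a thin wrapper that forwards the call to the Scala API on the JVM. Here is a minimal Scala sketch of that entry point, org.apache.spark.sql.functions.date_trunc; the object name DateTruncExample, the local SparkSession setup and the sample data are just for illustration:

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, date_trunc}

object DateTruncExample {
  def main(args: Array[String]): Unit = {
    // Local session for illustration only; on Databricks a `spark` session already exists.
    val spark = SparkSession.builder()
      .appName("date_trunc-demo")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    val df = Seq("2023-03-15 10:27:45.123456").toDF("ts")
      .select(col("ts").cast("timestamp").as("ts"))

    // date_trunc(format, column) is the JVM-side function that the PySpark wrapper calls into.
    df.select(date_trunc("hour", col("ts")).as("truncated")).show(truncate = false)

    spark.stop()
  }
}

That expression is then evaluated by Spark's Catalyst engine in Scala, which is where the code below lives.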

I believe the code you are looking for is at https://github.com/apache/spark/blob/b6aea1a8d99b3d99e91f7f195b23169d3d61b6a7/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/DateTimeUtils.scala#L971

def truncTimestamp(micros: Long, level: Int, zoneId: ZoneId): Long = {
  // Time zone offsets have a maximum precision of seconds (see `java.time.ZoneOffset`). Hence
  // truncation to microsecond, millisecond, and second can be done
  // without using time zone information. This results in a performance improvement.
  level match {
    case TRUNC_TO_MICROSECOND => micros
    case TRUNC_TO_MILLISECOND =>
      micros - Math.floorMod(micros, MICROS_PER_MILLIS)
    case TRUNC_TO_SECOND =>
      micros - Math.floorMod(micros, MICROS_PER_SECOND)
    case TRUNC_TO_MINUTE => truncToUnit(micros, zoneId, ChronoUnit.MINUTES)
    case TRUNC_TO_HOUR => truncToUnit(micros, zoneId, ChronoUnit.HOURS)
    case TRUNC_TO_DAY => truncToUnit(micros, zoneId, ChronoUnit.DAYS)
    case _ => // Try to truncate date levels
      val dDays = microsToDays(micros, zoneId)
      daysToMicros(truncDate(dDays, level), zoneId)
  }
}
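So, to answer the question directly: the truncation itself does not use a regexp. For the microsecond/millisecond/second levels it is plain integer arithmetic on the microseconds-since-epoch value (Math.floorMod), for minute/hour/day it delegates to java.time via truncToUnit, and coarser levels fall through to date arithmetic. Below is a minimal standalone sketch (not Spark's actual code path) that mimics this arithmetic; the object name TruncSketch is just for illustration, and the constants mirror Spark's DateTimeConstants (MICROS_PER_MILLIS = 1,000, MICROS_PER_SECOND = 1,000,000):

import java.time.{Instant, ZoneId, ZonedDateTime}
import java.time.temporal.ChronoUnit

object TruncSketch {
  // Constants mirroring Spark's DateTimeConstants.
  val MICROS_PER_MILLIS = 1000L
  val MICROS_PER_SECOND = 1000000L

  // Second-level truncation: pure integer arithmetic, like the TRUNC_TO_SECOND
  // branch above -- no string or regexp processing involved.
  def truncToSecond(micros: Long): Long =
    micros - Math.floorMod(micros, MICROS_PER_SECOND)

  // Minute/hour/day truncation needs the time zone, so it goes through
  // java.time, much like Spark's truncToUnit helper.
  def truncToUnit(micros: Long, zoneId: ZoneId, unit: ChronoUnit): Long = {
    val instant = Instant.ofEpochSecond(
      Math.floorDiv(micros, MICROS_PER_SECOND),
      Math.floorMod(micros, MICROS_PER_SECOND) * 1000L) // micros -> nanos
    val truncated = ZonedDateTime.ofInstant(instant, zoneId).truncatedTo(unit)
    truncated.toEpochSecond * MICROS_PER_SECOND + truncated.getNano / 1000L
  }

  // Helper just for printing the results as readable timestamps.
  def microsToInstant(micros: Long): Instant =
    Instant.ofEpochSecond(
      Math.floorDiv(micros, MICROS_PER_SECOND),
      Math.floorMod(micros, MICROS_PER_SECOND) * 1000L)

  def main(args: Array[String]): Unit = {
    val ts = Instant.parse("2023-03-15T10:27:45.123456Z")
    val micros = ts.getEpochSecond * MICROS_PER_SECOND + ts.getNano / 1000L
    val utc = ZoneId.of("UTC")
    println(microsToInstant(truncToSecond(micros)))                      // 2023-03-15T10:27:45Z
    println(microsToInstant(truncToUnit(micros, utc, ChronoUnit.HOURS))) // 2023-03-15T10:00:00Z
    println(microsToInstant(truncToUnit(micros, utc, ChronoUnit.DAYS)))  // 2023-03-15T00:00:00Z
  }
}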
