Scala - How to convert a Date String to a timestamp in a Spark SQL query?
I have a formattedDataInputDateTime String that I want to insert into a table as the second field, typed as a Timestamp.
// Returns 2019-10-30T13:00Z
val localDateTimeZoned = OffsetDateTime
  .of(java.time.LocalDate.parse(currentDate), java.time.LocalTime.now, ZoneOffset.UTC)
  .truncatedTo(ChronoUnit.HOURS)

// Returns 2019-10-30T13:00:00.000+0000
val formattedDataInputDateTime: String =
  localDateTimeZoned.format(DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH:mm:ss.SSSxx"))
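As a side note, format already returns a String, so the trailing .toString above is redundant. The pattern can be sanity-checked in isolation by round-tripping it through a parse (a minimal sketch using a fixed date and time as stand-ins for currentDate and now, so the example is deterministic):

```scala
import java.time.{LocalDate, LocalTime, OffsetDateTime, ZoneOffset}
import java.time.format.DateTimeFormatter
import java.time.temporal.ChronoUnit

val formatter = DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH:mm:ss.SSSxx")

// Fixed inputs (stand-ins for currentDate / LocalTime.now) so the output is stable
val localDateTimeZoned = OffsetDateTime
  .of(LocalDate.parse("2019-10-30"), LocalTime.of(13, 42), ZoneOffset.UTC)
  .truncatedTo(ChronoUnit.HOURS)

// format already yields a String; no .toString needed
val formattedDataInputDateTime: String = localDateTimeZoned.format(formatter)
// → "2019-10-30T13:00:00.000+0000"
```

Parsing the formatted string back with the same formatter recovers the original value, which confirms the pattern is self-consistent.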
So I wrote the following query, but can't figure out how to insert formattedDataInputDateTime as a timestamp here:
spark.sql(
s"""INSERT INTO main.basic_metrics
|VALUES ('metric_name', ???,
|'metric_type', current_timestamp, false)""".stripMargin)
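For reference, one way to fill the ??? slot is to interpolate the already-formatted string and let Spark parse it with to_timestamp. This is a sketch against a hypothetical local table, since main.basic_metrics and its schema aren't shown; it also uses the 'Z' pattern letter, which matches a "+0000" offset under Spark 2.x's SimpleDateFormat-style patterns:

```scala
import org.apache.spark.sql.SparkSession

object InsertTimestampSketch extends App {
  val spark = SparkSession.builder.master("local[1]").appName("ts-sketch").getOrCreate()

  val formattedDataInputDateTime = "2019-10-30T13:00:00.000+0000"

  // Hypothetical stand-in for main.basic_metrics, with a TIMESTAMP second column
  spark.sql("""CREATE TABLE IF NOT EXISTS basic_metrics
              |(name STRING, event_time TIMESTAMP, metric_type STRING,
              | inserted_at TIMESTAMP, deleted BOOLEAN) USING parquet""".stripMargin)

  // Interpolate the string into the SQL and parse it with to_timestamp
  spark.sql(
    s"""INSERT INTO basic_metrics
       |VALUES ('metric_name',
       |  to_timestamp('$formattedDataInputDateTime', "yyyy-MM-dd'T'HH:mm:ss.SSSZ"),
       |  'metric_type', current_timestamp, false)""".stripMargin)

  spark.table("basic_metrics").show(false)
  spark.stop()
}
```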
I've tried to test this approach but it resulted in the following error:
val ts = cast(unix_timestamp("$formattedDataInputDateTime", "yyyy-MM-dd'T'HH:mm:ss.SSSxx") as timestamp)
type mismatch;
found : String("$formattedDataInputDateTime")
required: org.apache.spark.sql.Column
This basically means the $ is inside the quoted string. It should be outside, like $"formattedDataInputDateTime".
You are passing a String instead of a Column; you can wrap it using lit:
val ts = unix_timestamp(lit(formattedDataInputDateTime), "yyyy-MM-dd'T'HH:mm:ss.SSSxx").cast("timestamp")
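In context, the wrapped column can be used like any other expression, for example when building the row as a DataFrame instead of concatenating SQL. This is a sketch: the column names are hypothetical stand-ins for the main.basic_metrics schema, and it uses the 'Z' pattern letter (which matches "+0000" under Spark 2.x's SimpleDateFormat-style patterns):

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{current_timestamp, lit, unix_timestamp}

object LitSketch extends App {
  val spark = SparkSession.builder.master("local[1]").appName("lit-sketch").getOrCreate()
  import spark.implicits._  // needed for toDF on a Seq

  val formattedDataInputDateTime = "2019-10-30T13:00:00.000+0000"

  // lit() turns the Scala String into a Column; unix_timestamp parses it to
  // epoch seconds, and the cast produces a proper TIMESTAMP column
  val ts = unix_timestamp(lit(formattedDataInputDateTime), "yyyy-MM-dd'T'HH:mm:ss.SSSZ")
    .cast("timestamp")

  // Hypothetical column names standing in for main.basic_metrics' schema
  val row = Seq(("metric_name", "metric_type", false))
    .toDF("name", "metric_type", "deleted")
    .withColumn("event_time", ts)
    .withColumn("inserted_at", current_timestamp())

  row.show(false)
  spark.stop()
}
```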
However, you can also get the current date and format it using the Spark functions current_date and date_format.
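That suggestion can be sketched as follows; date_trunc (available since Spark 2.3) mirrors the truncatedTo(ChronoUnit.HOURS) step, and the whole java.time round-trip through a String disappears:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{current_timestamp, date_format, date_trunc}

object FormatSketch extends App {
  val spark = SparkSession.builder.master("local[1]").appName("fmt-sketch").getOrCreate()

  // Truncate the current timestamp to the hour, then render it in the target format
  val df = spark.range(1).select(
    date_format(date_trunc("hour", current_timestamp()), "yyyy-MM-dd'T'HH:mm:ss.SSSZ")
      .as("formatted"))

  df.show(false)
  spark.stop()
}
```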