
convert epoch to datetime in Scala / Spark

I'm converting a String representing a DateTime to unix_time (epoch) using:

import org.joda.time.format.DateTimeFormat

def strToTime(x: String): Long =
  DateTimeFormat.forPattern("YYYY-MM-dd HH:mm:ss").parseDateTime(x).getMillis() / 1000

to get a list of Longs like this:

.map(p => List(strToTime(p(0))))

My question is: what is the easiest way to do the reverse? Something like:

def timeToStr(x: Long):String = { x*1000L.toDateTime}

that I could use on the above List[Long].

I have read Convert seconds since epoch to joda DateTime in Scala but can't apply it successfully.

The opposite of parseDateTime is print :)

import org.joda.time.format.DateTimeFormat

def timeToStr(epochMillis: Long): String =
  DateTimeFormat.forPattern("YYYY-MM-dd HH:mm:ss").print(epochMillis)
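For reference, the same round-trip can be sketched with the JDK's java.time API instead of Joda-Time (the object name `EpochRoundTrip` and the UTC assumption are mine, not from the original answer):

```scala
import java.time.{Instant, LocalDateTime, ZoneOffset}
import java.time.format.DateTimeFormatter

object EpochRoundTrip {
  // Note lowercase yyyy: in java.time, uppercase YYYY means week-based year.
  private val fmt = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss")

  // String -> epoch seconds, interpreting the string as UTC
  def strToTime(s: String): Long =
    LocalDateTime.parse(s, fmt).toEpochSecond(ZoneOffset.UTC)

  // epoch seconds -> String, formatted in UTC
  def timeToStr(epochSeconds: Long): String =
    fmt.format(LocalDateTime.ofInstant(Instant.ofEpochSecond(epochSeconds), ZoneOffset.UTC))
}
```

Pinning the zone to UTC keeps `strToTime` and `timeToStr` exact inverses of each other; with a default-zone formatter the round-trip can shift across DST transitions.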

You have a precedence problem: .toDateTime is applied to 1000L before the * multiplication. Bracket the operations to make the call order explicit:

// assumes nscala-time's implicit enrichment of Long is in scope
def timeToStr(x: Long): DateTime = (x * 1000L).toDateTime
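The same bracketing fix can be shown without the nscala-time enrichment, using only the JDK's java.time (a sketch, not the original answer's code):

```scala
import java.time.Instant

// epoch seconds -> ISO-8601 string; the parentheses around (x * 1000L)
// ensure the multiply happens before the method call
def timeToStr(x: Long): String = Instant.ofEpochMilli(x * 1000L).toString
```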

Here is my approach:

import java.util.Date
import java.text.SimpleDateFormat

def epochToDate(epochMillis: Long): String = {
  val df = new SimpleDateFormat("yyyy-MM-dd") // uses the JVM default time zone
  df.format(new Date(epochMillis))
}

A test run:

scala> epochToDate(1515027919000L)
res0: String = 2018-01-03
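Note that 1515027919000L is 2018-01-04 01:05:19 UTC, so the 2018-01-03 result above reflects the answerer's JVM default time zone. If you want zone-independent output, a variant (the name `epochToDateUtc` is mine) can pin the formatter to UTC:

```scala
import java.text.SimpleDateFormat
import java.util.{Date, TimeZone}

// like epochToDate above, but fixed to UTC instead of the JVM default zone
def epochToDateUtc(epochMillis: Long): String = {
  val df = new SimpleDateFormat("yyyy-MM-dd")
  df.setTimeZone(TimeZone.getTimeZone("UTC"))
  df.format(new Date(epochMillis))
}
```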
