
Convert string to BigInt in a Spark Scala DataFrame

I am trying to insert values from a DataFrame whose fields are string type into a PostgreSQL database where the fields are bigint type.

I couldn't find how to cast them to bigint. When I used IntegerType before, I had no problem, but with this DataFrame the cast gives me negative integers.

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.types.IntegerType

val sparkSession = SparkSession.builder.master("local").appName("spark session example").getOrCreate()

val cabArticleGold = sparkSession.sqlContext.load("jdbc",
  Map("url" -> "jdbc:oracle:thin:System/maher@//localhost:1521/XE",
      "dbtable" -> "IPTECH.TMP_ARTCAB")).select("CODEART", "CAB").limit(10)
import sparkSession.sqlContext.implicits._
cabArticleGold.show()
cabArticleGold.withColumn("CAB", 'CAB.cast(IntegerType)).foreach(row => println(row(1)))

which prints:

232524399
-1613725482
232524423
-1613725465
232524437
-1191331072
3486
-1639094853
232524461
1564177573

Any help with using BigInt would be appreciated. I know that Scala supports BigInt, but how can I do it here?
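The negative values are the signature of 32-bit integer overflow: anything above Int.MaxValue (2,147,483,647) wraps around when truncated to 32 bits. A minimal sketch of the same wraparound in plain Scala (the value 2681241814L is a hypothetical input, chosen only to reproduce the first negative number in the output above):

// Int.MaxValue is 2147483647; larger values wrap when truncated to 32 bits
val tooBig = 2681241814L
println(tooBig.toInt)   // prints -1613725482, matching the output above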

For large integers you should use LongType instead:

cabArticleGold.withColumn("CAB", 'CAB.cast(LongType))

or

cabArticleGold.withColumn("CAB", 'CAB.cast("long"))
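Both forms need the matching imports; here is a minimal self-contained sketch, assuming the cabArticleGold DataFrame from the question:

import org.apache.spark.sql.types.LongType
import sparkSession.implicits._   // enables the 'CAB symbol-to-Column syntax

val cabLong = cabArticleGold.withColumn("CAB", 'CAB.cast(LongType))
cabLong.printSchema()             // CAB is now: long (nullable = true)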

You can also use DecimalType:

cabArticleGold.withColumn("CAB", 'CAB.cast(DecimalType(38, 0)))

or

cabArticleGold.withColumn("CAB", 'CAB.cast("decimal(38, 0)"))
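To actually land the values in the PostgreSQL bigint column, the casted DataFrame can be written back over JDBC. A sketch under assumed connection settings (the URL, table name, and credentials below are placeholders, not from the question):

import java.util.Properties
import org.apache.spark.sql.SaveMode

val props = new Properties()
props.setProperty("user", "postgres")   // placeholder credentials
props.setProperty("password", "secret")
props.setProperty("driver", "org.postgresql.Driver")

// Spark maps LongType to PostgreSQL BIGINT, so the insert no longer overflows
cabLong.write
  .mode(SaveMode.Append)
  .jdbc("jdbc:postgresql://localhost:5432/mydb", "public.tmp_artcab", props)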


 