How to round a decimal in Spark SQL
I have a Scala Spark SQL DataFrame with the following data. How do I discretize/round the scores into the buckets given below? I have not found any predefined function for this. Can anyone please help me figure this out?
49.5 --> 49.50 to 49.99
50.0 --> 50.00 to 50.49
50.5 --> 50.50 to 50.99
4.5 --> 4.50 to 4.99
5.0 --> 5.00 to 5.49
9.5 --> 9.50 to 9.99
10.0 --> 10.00 to 10.49
I want the value 49.5 for the range 49.50 to 49.99, the value 10 for the range 10 to 10.49, and so on.
A more general solution that rounds any number down to the nearest 0.5 (I added two more rows to illustrate):
val df2 = df.withColumn("val2", ((col("val") / 0.5).cast("int"))*0.5)
df2.show
+-----+----+
| val|val2|
+-----+----+
| 49.5|49.5|
|49.99|49.5|
| 50.0|50.0|
| 1.1| 1.0|
| 9.9| 9.5|
| 10.0|10.0|
+-----+----+
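For reference, the input DataFrame above can be reproduced like this. The local SparkSession setup is an assumption added for illustration; the column name `val` is taken from the output shown. The trick is to divide by the bucket width (0.5), truncate with an `int` cast, and multiply back:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.col

// Assumed local SparkSession, just to make the sketch self-contained
val spark = SparkSession.builder()
  .master("local[*]")
  .appName("round-to-half")
  .getOrCreate()
import spark.implicits._

val df = Seq(49.5, 49.99, 50.0, 1.1, 9.9, 10.0).toDF("val")

// val / 0.5 counts how many half-units fit; the int cast drops the
// fractional part; multiplying by 0.5 snaps back to the bucket boundary
val df2 = df.withColumn("val2", (col("val") / 0.5).cast("int") * 0.5)
df2.show()
```

The same pattern generalizes to any bucket width: replace 0.5 with, say, 0.25 to snap to quarter units.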
For Spark SQL:
df.createOrReplaceTempView("df")
spark.sql("select val, int(val / 0.5) * 0.5 as val2 from df").show
+-----+----+
| val|val2|
+-----+----+
| 49.5|49.5|
|49.99|49.5|
| 50.0|50.0|
| 1.1| 1.0|
| 9.9| 9.5|
| 10.0|10.0|
+-----+----+
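One caveat worth noting (my addition, not part of the original answer): the `int` cast truncates toward zero, so negative values are bucketed upward. For example, -1.1 / 0.5 = -2.2 truncates to -2, giving -1.0 rather than -1.5. If negative inputs matter and you want consistent round-down behavior, `floor` (which rounds toward negative infinity) does that:

```scala
// floor(-2.2) = -3, so -1.1 maps to -1.5 here,
// whereas int(-2.2) = -2 would map it to -1.0
spark.sql("select val, floor(val / 0.5) * 0.5 as val2 from df").show
```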