
Convert column of binary string to int in spark dataframe python

So I have a dataframe with one column like this:

+----------+
|some_colum|
+----------+
|        10|
|        00|
|        00|
|        10|
|        10|
|        00|
|        10|
|        00|
|        00|
|        10|
+----------+

where the column some_colum contains binary strings.

I want to convert this column to decimal.

I've tried doing

data = data.withColumn("some_colum", int(col("some_colum"), 2))

But this doesn't seem to work, as I get the error:

int() can't convert non-string with explicit base

I think cast() might be able to do the job but I'm unable to figure it out. Any ideas?

I think int cannot be applied directly to a column. You can use it in a udf:

from pyspark.sql import functions
from pyspark.sql.types import IntegerType

# UDF that parses a base-2 string into a Python int
binary_to_int = functions.udf(lambda x: int(x, 2), IntegerType())
data = data.withColumn("some_colum", binary_to_int("some_colum"))
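
For completeness, here is a minimal self-contained sketch of the UDF approach; the SparkSession setup, the df variable, the sample rows, and the some_colum_int column name are illustrative assumptions, not part of the original question:

from pyspark.sql import SparkSession, functions
from pyspark.sql.types import IntegerType

spark = SparkSession.builder.getOrCreate()

# Sample data mirroring the question's column of binary strings
df = spark.createDataFrame([("10",), ("00",), ("10",)], ["some_colum"])

# UDF that parses each base-2 string into a Python int
binary_to_int = functions.udf(lambda x: int(x, 2), IntegerType())

# Add the decoded value as a new column: "10" -> 2, "00" -> 0
df = df.withColumn("some_colum_int", binary_to_int("some_colum"))
df.show()

Note that a Python UDF ships every value through the Python worker; Spark's built-in functions.conv("some_colum", 2, 10), followed by a cast to int, should achieve the same conversion without a UDF.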
