
Split a string into two columns in a Spark DataFrame

I have a DataFrame with a row value "My name is Rahul". I want to put "My name is" in one column and "Rahul" in another. There is no delimiter I can use with the split function. How can I do this in Spark?

Instead of the split function, use the regexp_extract function in Spark.

Regex Explanation:

(.*)\\s(.*) // the first (.*) is greedy, so capture group 1 matches everything up to the last space (\s); capture group 2 matches everything after that space.

Example:

import org.apache.spark.sql.functions.regexp_extract
import spark.implicits._ // for toDF and the $ column syntax (spark is the active SparkSession)

val df = Seq("My name is Rahul").toDF("text") // sample string

df.withColumn("col1", regexp_extract($"text", "(.*)\\s(.*)", 1))
  .withColumn("col2", regexp_extract($"text", "(.*)\\s(.*)", 2))
  .show()

Result:

+----------------+----------+-----+
|            text|      col1| col2|
+----------------+----------+-----+
|My name is Rahul|My name is|Rahul|
+----------------+----------+-----+
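
As a side note (not part of the original answer, just a sketch under the same assumptions): Spark's split function also accepts a Java regex, so you can split on the last space directly by matching a space that is followed only by non-space characters, then pick the two parts with getItem.

import org.apache.spark.sql.functions.split

// Split on the space that is followed only by non-space characters
// up to the end of the string, i.e. the last space.
val parts = split($"text", "\\s(?=\\S+$)")

df.withColumn("col1", parts.getItem(0))
  .withColumn("col2", parts.getItem(1))
  .show()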
