
Enclose column names that contain `,` in quotes in a Spark DataFrame

I have a DataFrame where some column names contain a `,`. In the example below, the second column name contains a `,`.

Now I want to enclose the names of those columns in double quotes (`""`).

The following is the code I have:

def testWriteDataframeToCSV(): Unit = {
    // generateDF is a test helper that builds a DataFrame from rows and column names
    val df = generateDF(Array(
      ("1", "4567-01", "one", 1, 1.0, "1", "1.1"),
      ("2", "4568-02", "two", 2, 2.0, "2", "2.2"),
      ("3", "4569-25", "three", 3, 3.0, "3", "3")
    ), Seq("Id", "Course,No", "data1", "data2", "data3", "data4", "data5"))

    val take: Option[Int] = None
    val sample: Option[Float] = None

    val header = df.schema.fieldNames.mkString(",") + "\n"
  }

Current header:

header = "Id,Course,No,data1,data2,data3,data4,data5\n"

Expected header:

header = "Id,"Course,No",data1,data2,data3,data4,data5\n"

You just need to find which field names contain a `,` and add quotes accordingly. I have used Scala's triple-quoted interpolator (s"""…""") so there is no need to escape the single quote on each side:

df.schema.fieldNames.map{ f => if (f.contains(",")) s""""${f}"""" else f }.mkString(",")
//String = Id,"Course,No",data1,data2,data3,data4,data5
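If a column name could also contain double quotes or newlines, the CSV convention (RFC 4180) additionally requires doubling any embedded quotes inside a quoted field. Here is a sketch extending the same idea; `csvQuote` is a hypothetical helper operating on plain strings, so no Spark is needed to try it:

```scala
// Quote a CSV field per RFC 4180: wrap it in double quotes if it
// contains a delimiter, a quote, or a newline, and double any
// embedded quotes inside the wrapped value.
def csvQuote(field: String): String =
  if (field.exists(c => c == ',' || c == '"' || c == '\n'))
    "\"" + field.replace("\"", "\"\"") + "\""
  else
    field

val names = Seq("Id", "Course,No", "He said \"hi\"", "data1")
val header = names.map(csvQuote).mkString(",") + "\n"
// header == "Id,\"Course,No\",\"He said \"\"hi\"\"\",data1\n"
```

For the header in the question, `df.schema.fieldNames.map(csvQuote).mkString(",")` would produce the same result as the one-liner above, while also handling the rarer edge cases.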
