
Enclose column names that contain , in quotes in a Spark dataframe

I have a dataframe where some of the column names contain a comma ( , ). As shown below, the second column name contains a comma.

Now I want to enclose the column names that contain a comma in double quotes ("").

The following is the code I have:

def testWriteDataframeToCSV(): Unit = {
    // generateDF (defined elsewhere) builds a DataFrame from the given rows and column names
    val df = generateDF(Array(
      ("1", "4567-01", "one", 1, 1.0, "1", "1.1"),
      ("2", "4568-02", "two", 2, 2.0, "2", "2.2"),
      ("3", "4569-25", "three", 3, 3.0, "3", "3")
    ), Seq("Id", "Course,No", "data1", "data2", "data3", "data4", "data5"))

    val take: Option[Int] = None
    val sample: Option[Float] = None

    // Build the CSV header from the schema's field names
    val header = df.schema.fieldNames.mkString(",") + "\n"
  }

Current header:

header = "Id,Course,No,data1,data2,data3,data4,data5\n"

Expected header:

header = "Id,"Course,No",data1,data2,data3,data4,data5\n"

You just need to find which field names contain a , and add quotes accordingly. I have used Scala's triple quotes s""" """ so there is no need to escape the quote on each side:

df.schema.fieldNames.map{ f => if (f.contains(",")) s""""${f}"""" else f }.mkString(",")
//String = Id,"Course,No",data1,data2,data3,data4,data5
