
How to load data, with an array-type column, from CSV into Spark DataFrames

I have a CSV file as shown below:

name,age,languages,experience
'Alice',31,['C++', 'Java'],2
'Bob',34,['Java', 'Python'],2
'Smith',35,['Ruby', 'Java'],3
'David',36,['C', 'Java', 'R'],4

When the data is loaded, all columns are loaded as strings by default.

scala> val df = spark.read.format("csv").option("header",true).load("data.csv")
df: org.apache.spark.sql.DataFrame = [name: string, age: string ... 2 more fields]

scala> df.show()
+-------+---+------------------+----------+
|   name|age|         languages|experience|
+-------+---+------------------+----------+
|'Alice'| 31|   ['C++', 'Java']|         2|
|  'Bob'| 34|['Java', 'Python']|         2|
|'Smith'| 35|  ['Ruby', 'Java']|         3|
|'David'| 36|['C', 'Java', 'R']|         4|
+-------+---+------------------+----------+

scala> df.printSchema()
root
 |-- name: string (nullable = true)
 |-- age: string (nullable = true)
 |-- languages: string (nullable = true)
 |-- experience: string (nullable = true)

So I defined a custom schema with String, Integer, Array and Integer data types:

import org.apache.spark.sql.types.{StructField, StructType, StringType, ArrayType, IntegerType}

val custom_schema = new StructType(Array(
  StructField("name", StringType),
  StructField("age", IntegerType),
  StructField("languages", ArrayType(StringType)),
  StructField("experience", IntegerType)
))

When I load the data with the custom schema, it throws an error.

Terminal output after running the command:

scala> val df = spark.read.format("csv").option("header",true).schema(custom_schema).load("data.csv")
org.apache.spark.sql.AnalysisException: CSV data source does not support array<string> data type.
  at org.apache.spark.sql.execution.datasources.DataSourceUtils$.$anonfun$verifySchema$1(DataSourceUtils.scala:67)
  at org.apache.spark.sql.execution.datasources.DataSourceUtils$.$anonfun$verifySchema$1$adapted(DataSourceUtils.scala:65)
  at scala.collection.Iterator.foreach(Iterator.scala:941)
  at scala.collection.Iterator.foreach$(Iterator.scala:941)
  at scala.collection.AbstractIterator.foreach(Iterator.scala:1429)
  at scala.collection.IterableLike.foreach(IterableLike.scala:74)
  at scala.collection.IterableLike.foreach$(IterableLike.scala:73)
  at org.apache.spark.sql.types.StructType.foreach(StructType.scala:102)
  at org.apache.spark.sql.execution.datasources.DataSourceUtils$.verifySchema(DataSourceUtils.scala:65)
  at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:445)
  at org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:326)
  at org.apache.spark.sql.DataFrameReader.$anonfun$load$3(DataFrameReader.scala:308)
  at scala.Option.getOrElse(Option.scala:189)
  at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:308)
  at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:240)
  ... 47 elided

How can I load the data into a Spark DataFrame with this column as an array?

You can convert it to an array after reading it from the file, by using regexp_replace to remove the brackets ( [] ) and split to split the remaining string on commas ( , ), e.g.:

import org.apache.spark.sql.functions.{col, regexp_replace, split}

val df = spark.read.format("csv").option("header", true).load("data.csv")

val transformedDf = df.withColumn(
                         "languages",
                         split(
                             regexp_replace(col("languages"), "\\[|\\]", ""),  // strip the [ and ] characters
                             ","
                         )
                    )
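
Note that after this split each element still carries the single quote characters and the leading space from the CSV (e.g. 'C++', 'Java'). As a minimal follow-up sketch (my own variant, not part of the original answer), the quotes can be dropped in the same regexp_replace call and the string split on a comma plus optional whitespace:

import org.apache.spark.sql.functions.{col, regexp_replace, split}

// Variant: remove [, ] and ' in one pass, then split on "," followed by
// optional spaces, so each array element is a bare language name.
val cleanedDf = df.withColumn(
                     "languages",
                     split(
                         regexp_replace(col("languages"), "[\\[\\]']", ""),
                         ",\\s*"
                     )
                )

// languages is now array<string>; the numeric columns can still be cast
// separately if needed, e.g. cleanedDf.withColumn("age", col("age").cast("int"))
cleanedDf.printSchema()
cleanedDf.show(false)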
