

How to use Spark-Scala to download a CSV file from the web?


How to use Spark-Scala to download a CSV file from the web and load the file into a spark-csv DataFrame?

Currently I depend on curl in a shell command to get my CSV file.

Here is the syntax I want to enhance:

/* fb_csv.scala
This script should load FB prices from Yahoo.

Demo:
spark-shell -i fb_csv.scala
*/

// Fetch the prices by shelling out to curl:
import sys.process._
"/usr/bin/curl -o /tmp/fb.csv http://ichart.finance.yahoo.com/table.csv?s=FB".!

import org.apache.spark.sql.SQLContext

val sqlContext = new SQLContext(sc)

val fb_df = sqlContext.read.format("com.databricks.spark.csv").option("header","true").option("inferSchema","true").load("/tmp/fb.csv")

fb_df.head(9)

I want to enhance the above script so it is pure Scala with no shell syntax inside.

val content = scala.io.Source.fromURL("http://ichart.finance.yahoo.com/table.csv?s=FB").mkString

val list = content.split("\n").filter(_ != "")

val rdd = sc.parallelize(list)

// toDF needs the SQL implicits in scope; note the result is a single string
// column per line, not a parsed CSV.
import sqlContext.implicits._
val df = rdd.toDF
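
One way to drop the shell dependency while keeping the rest of the original script unchanged is to download the file with plain JVM I/O and then load it with spark-csv as before. This is a minimal sketch, assuming it runs inside spark-shell where sc is already defined; unlike the attempt above, the CSV is actually parsed into columns because it goes through the spark-csv reader.

import java.net.URL
import java.nio.file.{Files, Paths, StandardCopyOption}
import org.apache.spark.sql.SQLContext

// Pure-JVM replacement for the curl call: stream the CSV straight to /tmp/fb.csv.
val in = new URL("http://ichart.finance.yahoo.com/table.csv?s=FB").openStream()
try {
  Files.copy(in, Paths.get("/tmp/fb.csv"), StandardCopyOption.REPLACE_EXISTING)
} finally {
  in.close()
}

// Load the downloaded file with spark-csv, exactly as in the original script.
val sqlContext = new SQLContext(sc)
val fb_df = sqlContext.read
  .format("com.databricks.spark.csv")
  .option("header", "true")
  .option("inferSchema", "true")
  .load("/tmp/fb.csv")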

Found a better answer in Process CSV from REST API into Spark.

Here you go:

import scala.io.Source._
import org.apache.spark.sql.Dataset

// spark is the SparkSession provided by spark-shell; its implicits are needed for .toDS().
import spark.implicits._

val url = "http://ichart.finance.yahoo.com/table.csv?s=FB"
val res = fromURL(url).mkString.split("\n").filter(_.nonEmpty).toList
val csvData: Dataset[String] = spark.sparkContext.parallelize(res).toDS()

val frame = spark.read.option("header", true).option("inferSchema", true).csv(csvData)
frame.printSchema()
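
For completeness, the parsed frame can be inspected the same way as the original fb_df. Note that reading a Dataset[String] with spark.read.csv is only available from Spark 2.2 onwards, so older clusters would need the file-based spark-csv route shown earlier.

// Peek at the first rows, the counterpart of fb_df.head(9) in the original script.
frame.show(9)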
