
Scala: Write Random Values to JSON and Save in File then Analyze in Spark

I would like to write ten (or a billion) events to JSON and save as files.

I am writing in a Databricks notebook in Scala. I want the JSON string to have randomly generated values for fields like "Carbs":

{"Username": "patient1", "Carbs": 92, "Bolus": 24, "Basal": 1.33, "Date": "2017-06-28", "Timestamp": "2017-06-28 21:59:..."}

I successfully used the following to generate a date column and save the result as a JSON file.

import org.apache.spark.sql.functions.current_date

val dateDF = spark.range(10)
  .withColumn("today", current_date())

But what is the best way to write random values to an Array and then save the Array as a JSON file?

Convert the RDD to a DataFrame, then save it in JSON format:

dataframe.write.mode("append").json(path)
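To generate the random values themselves, here is a minimal self-contained sketch in plain Scala using `scala.util.Random`. The field ranges (Carbs up to 150, Bolus up to 30, Basal up to 3.0) are illustrative assumptions, not taken from the question:

```scala
import scala.util.Random

// Build one JSON event string with random clinical values.
// Ranges are hypothetical; adjust to your domain.
def randomEvent(rng: Random, i: Int): String = {
  val carbs = rng.nextInt(150)
  val bolus = rng.nextInt(30)
  val basal = BigDecimal(rng.nextDouble() * 3)
    .setScale(2, BigDecimal.RoundingMode.HALF_UP)
  s"""{"Username": "patient$i", "Carbs": $carbs, "Bolus": $bolus, "Basal": $basal}"""
}

val rng = new Random(42) // fixed seed so runs are reproducible
val lines = (1 to 10).map(i => randomEvent(rng, i))
lines.foreach(println)
```

For a billion rows you would not build strings on the driver like this; instead, generate the columns distributed with `org.apache.spark.sql.functions.rand` on `spark.range(...)`, write with `dataframe.write.mode("append").json(path)` as above, and read the files back with `spark.read.json(path)` for analysis.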
