Moving average of dataset in Apache Spark and Scala
I have to perform the following task on a dataset, using Apache Spark with Scala as the programming language:
deviceid,bytes,eventdate
15590657,246620,20150630
14066921,1907,20150621
14066921,1906,20150626
6522013,2349,20150626
6522013,2525,20150613
Group the data by device ID, so that we now have a map of deviceid => (bytes, eventdate).
For each device, sort the set by eventdate. We now have an ordered set of bytes, based on eventdate, for each device.
Pick the last 30 days of bytes from this ordered set.
Using a window of 30 days, find the moving average of bytes for the last date.
Using the same 30-day window, find the standard deviation of bytes for the last date.
Return two values in the result: (mean - k * stddev) and (mean + k * stddev) [assume k = 3]. A sketch of these windowed statistics follows this list.
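For concreteness, here is a minimal sketch in plain Scala of the last three steps, assuming the per-device byte values have already been sorted ascending by eventdate and that the series is non-empty; the helper name windowStats and its parameters are illustrative, not part of the original problem statement.

// Sketch: mean, stddev, and the (mean - k*stddev, mean + k*stddev) band
// over the last `window` observations of a series sorted by eventdate.
// Assumes a non-empty input series.
def windowStats(sortedBytes: Seq[Double], window: Int = 30, k: Double = 3.0): (Double, Double) = {
  val lastN = sortedBytes.takeRight(window)   // at most the last `window` values
  val n = lastN.length
  val mean = lastN.sum / n
  val variance = lastN.map(x => (x - mean) * (x - mean)).sum / n  // population variance
  val stddev = math.sqrt(variance)
  (mean - k * stddev, mean + k * stddev)
}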
I am using Apache Spark 1.3.0. The actual dataset is wider, and it ultimately has to run over a billion rows.
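At that scale it is worth noting that groupBy materializes every record for a device on a single executor. One possible alternative (an illustrative sketch, not from the original post; it assumes the deviceAggregateLogs RDD built further down) is to cap the per-device state at the most recent 30 records during the shuffle:

// Illustrative sketch: bound per-device state to the `window` newest records,
// so no device's full history has to fit in memory at once.
val window = 30
val lastNPerDevice = deviceAggregateLogs
  .map(d => (d.device_id, List((d.eventdate, d.bytes))))
  .reduceByKey((a, b) => (a ++ b).sortBy(-_._1).take(window))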
Here is the data structure for the dataset.
package com.testing

case class DailyDeviceAggregates(
  device_id: Int,
  bytes: Long,
  eventdate: Int
) extends Ordered[DailyDeviceAggregates] {
  // Sort by eventdate (ascending)
  def compare(that: DailyDeviceAggregates): Int = {
    eventdate - that.eventdate
  }
}

object DailyDeviceAggregates {
  // Parse one CSV line of the form: deviceid,bytes,eventdate
  def parseLogLine(logline: String): DailyDeviceAggregates = {
    val c = logline.split(",")
    DailyDeviceAggregates(c(0).toInt, c(1).toLong, c(2).toInt)
  }
}
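As a quick sanity check (illustrative, not in the original post), the parser applied to the first sample row:

// Parses "deviceid,bytes,eventdate" into a DailyDeviceAggregates instance
val rec = DailyDeviceAggregates.parseLogLine("15590657,246620,20150630")
// rec == DailyDeviceAggregates(15590657, 246620L, 20150630)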
The DeviceAnalyzer class looks like this:
package com.testing

import com.testing.DailyDeviceAggregates
import org.apache.spark.{SparkContext, SparkConf}

import scala.util.Sorting

object DeviceAnalyzer {
  def main(args: Array[String]) {
    val sparkConf = new SparkConf().setAppName("Device Statistics Analyzer")
    val sc = new SparkContext(sparkConf)
    val logFile = args(0)

    val deviceAggregateLogs = sc.textFile(logFile).map(DailyDeviceAggregates.parseLogLine).cache()

    // Group the daily aggregates by device ID
    val deviceIdsMap = deviceAggregateLogs.groupBy(_.device_id)

    deviceIdsMap.foreach(a => {
      // I am stuck here !!
    })

    sc.stop()
  }
}
But at this point I am stuck on the actual implementation of this algorithm.
I have a very crude implementation that does the job, but it is not up to the mark. Sorry, I am very new to Scala/Spark, so my questions are quite basic. Here is what I have now:
import com.testing.DailyDeviceAggregates
import org.apache.spark.{SparkContext, SparkConf}
import org.apache.spark.mllib.linalg.{Vector, Vectors}
import org.apache.spark.mllib.stat.{MultivariateStatisticalSummary, Statistics}

import scala.util.Sorting

object DeviceAnalyzer {
  def main(args: Array[String]) {
    val sparkConf = new SparkConf().setAppName("Device Analyzer")
    val sc = new SparkContext(sparkConf)
    val logFile = args(0)

    val deviceAggregateLogs = sc.textFile(logFile).map(DailyDeviceAggregates.parseLogLine).cache()

    // Calculate statistics based on bytes
    val deviceIdsMap = deviceAggregateLogs.groupBy(_.device_id)

    // Note: this foreach runs on the executors, so the println output goes
    // to the executor logs, not the driver console.
    deviceIdsMap.foreach(a => {
      val device_id = a._1              // the device ID
      val allaggregates = a._2.toArray  // all daily aggregates for this device

      // Sorting.quickSort sorts the array in place and returns Unit, which is
      // why assigning its result and printing it showed nothing useful.
      Sorting.quickSort(allaggregates)  // sorts by eventdate via Ordered[DailyDeviceAggregates]

      val byteValues = allaggregates.map(_.bytes.toDouble)
      val count = byteValues.length
      val sum = byteValues.sum
      val xbar = sum / count
      val sumSquaredDeviations = byteValues.map(x => (x - xbar) * (x - xbar)).sum
      val stddev = math.sqrt(sumSquaredDeviations / count)

      println(device_id + "," + xbar + "," + stddev)

      // Statistics.colStats expects an RDD[Vector], not a single local Vector,
      // so it cannot be applied directly inside this foreach:
      // val vector: Vector = Vectors.dense(byteValues)
      // val summary: MultivariateStatisticalSummary = Statistics.colStats(...)
    })

    sc.stop()
  }
}
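For what it is worth, here is one possible direction (a sketch under my own assumptions, not a verified solution), reusing deviceIdsMap and the imports from the block above: compute the statistics with mapValues so the results come back as an RDD that the driver can collect, and apply the 30-day window with takeRight. The band constant k = 3 follows the problem statement above.

// Illustrative sketch: per-device (mean - 3*stddev, mean + 3*stddev) over the
// last 30 observations, returned to the driver instead of printed on executors.
val k = 3.0
val window = 30
val bands = deviceIdsMap.mapValues { aggregates =>
  val sorted = aggregates.toArray
  Sorting.quickSort(sorted)                           // ascending by eventdate
  val lastN = sorted.takeRight(window).map(_.bytes.toDouble)
  val mean = lastN.sum / lastN.length
  val variance = lastN.map(x => (x - mean) * (x - mean)).sum / lastN.length
  val stddev = math.sqrt(variance)
  (mean - k * stddev, mean + k * stddev)
}
bands.collect().foreach { case (id, (lo, hi)) => println(s"$id,$lo,$hi") }

With very many devices, saveAsTextFile would be a more realistic sink than collect, and the reduceByKey capping shown earlier would avoid holding any device's full history in the groupBy.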
I would be grateful if someone could suggest improvements for the following: