
Spark-sql error when defining a UDAF

Spark version 1.6.0 on AWS EMR with a Zeppelin notebook.

I defined a UDAF with the following code:

import org.apache.spark.sql.types._
import org.apache.spark.sql.Row;
import org.apache.spark.sql.expressions.MutableAggregationBuffer
import org.apache.spark.sql.expressions.UserDefinedAggregateFunction

import java.text.SimpleDateFormat
import java.util.Date

class AggregateTS extends UserDefinedAggregateFunction{
    def inputSchema: StructType = StructType(StructField("input", StringType) :: Nil)

    def bufferSchema: StructType = StructType(StructField("intermediate", StringType)::Nil)

    def dataType: DataType = StringType

    def deterministic: Boolean = true

    def initialize(buffer: MutableAggregationBuffer): Unit = {
        buffer(0) = "Init"
    }

    def update(buffer: MutableAggregationBuffer, input: Row): Unit = {
        if (buffer.getAs[String](0) == "Init"){
            buffer(0) = input.getAs[String](0)
        }
        else{
            // combine the incoming timestamp string with the running value
            // (average_ts is a string-averaging helper defined elsewhere by the author)
            buffer(0) = average_ts(input.getAs[String](0), buffer.getAs[String](0))
        }
    }

    def merge(buffer1: MutableAggregationBuffer, buffer2: Row): Unit = {
        buffer1(0) = average_ts(buffer1.getAs[String](0), buffer2.getAs[String](0))
    }

    def evaluate(buffer: Row): Any = {
        buffer.getAs[String](0)
    }
}

From this I get a compile error:

error: not found: type DataType
       def dataType: DataType = StringType

What does this mean?

Solved it myself. It appears to have been an import conflict. I changed the import statement to an explicit one:

import org.apache.spark.sql.types.{DataType}

and then it worked.
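For reference, here is a minimal sketch of how the explicit imports and the UDAF could be used on a DataFrame in Spark 1.6. The DataFrame df, the columns id and ts, the table name my_table, and the output column avg_ts are made-up names for illustration only:

// explicit imports instead of relying on the wildcard, to avoid the name clash
import org.apache.spark.sql.types.{DataType, StringType, StructType, StructField}
import org.apache.spark.sql.functions.col

val aggTS = new AggregateTS()

// use the UDAF directly in a DataFrame aggregation
// (df, "id" and "ts" are hypothetical)
val result = df.groupBy(col("id")).agg(aggTS(col("ts")).as("avg_ts"))

// or register it and call it from Spark SQL
sqlContext.udf.register("agg_ts", aggTS)
val resultSql = sqlContext.sql("SELECT id, agg_ts(ts) AS avg_ts FROM my_table GROUP BY id")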

