Creating a Random Feature Array in Spark DataFrames
When creating an ALS model, we can extract a userFactors DataFrame and an itemFactors DataFrame. These DataFrames contain columns holding arrays.

I would like to generate some random data and union it to the userFactors DataFrame.

Here is my code:
val df1: DataFrame = Seq((123, 456, 4.0), (123, 789, 5.0), (234, 456, 4.5), (234, 789, 1.0)).toDF("user", "item", "rating")

val model1 = new ALS()
  .setImplicitPrefs(true)
  .fit(df1)

val iF = model1.itemFactors
val uF = model1.userFactors
I then create a random DataFrame with the following function, which uses a VectorAssembler:
def makeNew(df: DataFrame, rank: Int): DataFrame = {
  var df_dummy = df
  var inputCols: Array[String] = Array()
  for (i <- 0 to rank) {
    df_dummy = df_dummy.withColumn("feature".concat(i.toString), rand())
    inputCols = inputCols :+ "feature".concat(i.toString)
  }
  val assembler = new VectorAssembler()
    .setInputCols(inputCols)
    .setOutputCol("userFeatures")
  val output = assembler.transform(df_dummy)
  output.select("user", "userFeatures")
}
Then, I create a DataFrame with the new user IDs and add the random vectors and biases:
val usersDf: DataFrame = Seq((567), (678)).toDF("user")
var usersFactorsNew: DataFrame = makeNew(usersDf, 20)
The problem arises when I union the two DataFrames:

usersFactorsNew.union(uF)

This produces the error:
org.apache.spark.sql.AnalysisException: Union can only be performed on tables with the compatible column types. struct<type:tinyint,size:int,indices:array<int>,values:array<double>> <> array<float> at the second column of the second table;;
If I print the schemas, the uF DataFrame has its feature vector as type Array[Float], while the usersFactorsNew DataFrame has its feature vector as type Vector.

My question is how to change the type of the Vector to an Array in order to perform the union. I tried writing this udf, with little success:
val toArr: org.apache.spark.ml.linalg.Vector => Array[Double] = _.toArray
val toArrUdf = udf(toArr)
Perhaps VectorAssembler is not the best tool for this task; however, for now it is the only option I have found. I would welcome better suggestions.
Instead of creating dummy data and using a VectorAssembler to generate the random feature vector, you can simply use a UDF directly. The userFactors from the ALS model will return an Array[Float], so the output of the UDF should match that:
import scala.util.Random

val createRandomArray = udf((rank: Int) => {
  Array.fill(rank)(Random.nextFloat)
})
Note that this gives numbers in the interval [0.0, 1.0) (the same range as the rand() used in the question); if a different range is required, adjust accordingly.
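For a different range, one option is to rescale the output of nextFloat with an affine transform. The helper below is a sketch, not part of the original answer, and the name randomFloats is illustrative:

```scala
import scala.util.Random

// Sketch: Random.nextFloat yields values in [0.0, 1.0), so
// min + r * (max - min) maps them into an arbitrary [min, max).
def randomFloats(rank: Int, min: Float, max: Float): Array[Float] =
  Array.fill(rank)(min + Random.nextFloat() * (max - min))
```

Wrapping this body in a udf as above would then produce feature values in the chosen interval.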
Using a rank of 3 and the usersDf:
val usersFactorsNew = usersDf.withColumn("userFeatures", createRandomArray(lit(3)))
this gives a DataFrame like the following (with random feature values, of course):
+----+----------------------------------------------------------+
|user|userFeatures |
+----+----------------------------------------------------------+
|567 |[0.6866711267486822,0.7257031656127676,0.983562255688249] |
|678 |[0.7013908820314967,0.41029552817665327,0.554591149586789]|
+----+----------------------------------------------------------+
It should now be possible to union this DataFrame with the uF DataFrame.
The reason the UDF in the question did not work is that it returns an Array[Double], while the union needs an Array[Float]. It can be fixed with a map(_.toFloat):
val toArr: org.apache.spark.ml.linalg.Vector => Array[Float] = _.toArray.map(_.toFloat)
val toArrUdf = udf(toArr)
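As a minimal sketch of what this fixed udf does element-wise, outside of Spark: Vector.toArray yields an Array[Double], and mapping toFloat over it produces the Array[Float] element type that userFactors uses.

```scala
// Vector.toArray yields Array[Double]; map(_.toFloat) converts
// each element so the result matches userFactors' array<float>.
val fromVector: Array[Double] = Array(0.25, 0.5, 0.75)
val asFloats: Array[Float] = fromVector.map(_.toFloat)
```

Note the conversion narrows the precision; for randomly generated features this loss is harmless.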
Your whole procedure is correct, and even your udf function runs successfully. All you need to do is change the last part of the makeNew function to:
def makeNew(df: DataFrame, rank: Int): DataFrame = {
  var df_dummy = df
  var inputCols: Array[String] = Array()
  for (i <- 0 to rank) {
    df_dummy = df_dummy.withColumn("feature".concat(i.toString), rand())
    inputCols = inputCols :+ "feature".concat(i.toString)
  }
  val assembler = new VectorAssembler()
    .setInputCols(inputCols)
    .setOutputCol("userFeatures")
  val output = assembler.transform(df_dummy)
  output.select(col("id"), toArrUdf(col("userFeatures")).as("features"))
}
and you should be good to go. (Note that I created usersDf with an id column rather than a user column.)
val usersDf: DataFrame = Seq((567), (678)).toDF("id")
var usersFactorsNew: DataFrame = makeNew(usersDf, 20)
usersFactorsNew.union(uF).show(false)
You should get:
+---+------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
|id |features |
+---+------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
|567|[0.8259185719733708, 0.327713892339658, 0.049547223031371046, 0.056661808506210054, 0.5846626163454274, 0.038497936270104005, 0.8970865088803417, 0.8840660648882804, 0.837866669938156, 0.9395263094918058, 0.09179528484355126, 0.4915430644129799, 0.11083447052043116, 0.5122858182953718, 0.4302683812966408, 0.3862741815833828, 0.6189322403095068, 0.3000371006293433, 0.09331299668168902, 0.7421838728601371, 0.855867963988993]|
|678|[0.7686514248005568, 0.5473580740023187, 0.072945344124282, 0.36648594574355287, 0.9780202082328863, 0.5289221651923784, 0.3719451099963028, 0.2824660794505932, 0.4873197501260199, 0.9364676464120849, 0.011539929543513794, 0.5240615794930654, 0.6282546154521298, 0.995256022569878, 0.6659179561266975, 0.8990775317754092, 0.08650071017556926, 0.5190186149992805, 0.056345335742325475, 0.6465357505620791, 0.17913532817943245] |
|123|[0.04177388548851013, 0.26762014627456665, -0.19617630541324615, 0.34298020601272583, 0.19632814824581146, -0.2748605012893677, 0.07724890112876892, 0.4277132749557495, 0.1927199512720108, -0.40271613001823425] |
|234|[0.04139673709869385, 0.26520395278930664, -0.19440513849258423, 0.3398836553096771, 0.1945556253194809, -0.27237895131111145, 0.07655145972967148, 0.42385169863700867, 0.19098000228405, -0.39908021688461304] |
+---+------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+