
How to connect to Neo4j in Spark worker nodes?

I need to fetch a small subgraph inside a Spark map function. I have tried AnormCypher and the neo4j-spark-connector, but neither works: AnormCypher throws a Java IOException (I open the connection inside a mapPartitions function and test against a local server), and the neo4j-spark-connector fails with the "Task not serializable" exception shown below.

Is there a good way to fetch a subgraph (or to connect to a graph database such as Neo4j) from Spark worker nodes?

Exception in thread "main" org.apache.spark.SparkException: Task not serializable
    at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:298)
    at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:288)
    at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:108)
    at org.apache.spark.SparkContext.clean(SparkContext.scala:2094)
    at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1.apply(RDD.scala:793)
    at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1.apply(RDD.scala:792)
    at ....

My code snippet, using neo4j-spark-connector 2.0.0-m2:

val neo = Neo4j(sc) // this runs on the driver

// this is called from inside a map function, i.e. on the executors
def someFunctionToBeMapped(p: List[Long]) = {
  val metaGraph = neo.cypher(
      "match p = (a:TourPlace) -[r:could_go_to] -> (b:TourPlace) " +
      "return a.id, r.distance, b.id")
    .loadRowRdd
    .map(row => ((row(0).asInstanceOf[Long], row(2).asInstanceOf[Long]), row(1).asInstanceOf[Double]))
    .collect().toList
  metaGraph
}

The AnormCypher code is:

def partitionMap(partition: Iterator[List[Long]]) = {
  import org.anormcypher._
  import play.api.libs.ws._
  // Provide an instance of WSClient
  val wsclient = ning.NingWSClient()
  // Set up the REST client. The Neo4jConnection type annotation is needed
  // so that the default Neo4jConnection -> Neo4jTransaction conversion is
  // in the implicit scope.
  implicit val connection: Neo4jConnection = Neo4jREST("127.0.0.1", 7474, "neo4j", "000000")(wsclient)
  // Provide an ExecutionContext
  implicit val ec = scala.concurrent.ExecutionContext.global

  // Materialize with toList before closing the client: partition.filter
  // is lazy, and closing wsclient first would run the queries against a
  // closed client when the iterator is finally consumed.
  val res = partition.filter { placeList =>
    val startPlace = Cypher(
      "match p = (a:TourPlace) -[r:could_go_to] -> (b:TourPlace) " +
      "return p")().flatMap(row => row.data)
    startPlace.nonEmpty // the filter predicate must return a Boolean
  }.toList
  wsclient.close()
  res.iterator
}
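A likely cause of the reported IOException: the original version of this function called wsclient.close() before the lazy partition.filter iterator was consumed, so the queries ran against a closed client; the toList above forces evaluation first.

More generally, connecting from worker nodes works when each partition opens its own connection, so that nothing created on the driver is captured in the closure. Below is a minimal sketch using the official neo4j-java-driver over Bolt; the endpoint bolt://localhost:7687, the credentials, and the property types (integer ids, float distance) are assumptions, not taken from the question:

import org.apache.spark.{SparkConf, SparkContext}
import org.neo4j.driver.v1.{AuthTokens, GraphDatabase}
import scala.collection.JavaConverters._

object WorkerSideSubgraph {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("WorkerSideSubgraph"))
    val placeLists = sc.parallelize(Seq(List(1L, 2L), List(3L)))

    val edges = placeLists.mapPartitions { partition =>
      // One Bolt connection per partition, opened on the worker itself,
      // so no driver-side object needs to be serialized.
      val driver = GraphDatabase.driver("bolt://localhost:7687",
        AuthTokens.basic("neo4j", "000000")) // placeholder credentials
      val session = driver.session()

      // Fetch the small subgraph once per partition.
      val subgraph = session.run(
          "MATCH (a:TourPlace)-[r:could_go_to]->(b:TourPlace) " +
          "RETURN a.id AS src, r.distance AS dist, b.id AS dst")
        .asScala
        .map(rec => ((rec.get("src").asLong(), rec.get("dst").asLong()),
                     rec.get("dist").asDouble()))
        .toMap

      // Materialize before closing the session: mapPartitions returns a
      // lazy iterator, and the session must outlive every query.
      val rows = partition.map { placeList =>
        placeList.flatMap(id => subgraph.filterKeys(_._1 == id))
      }.toList

      session.close()
      driver.close()
      rows.iterator
    }

    println(edges.count())
    sc.stop()
  }
}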

I have used Spark standalone mode and was able to connect to the Neo4j database.

Versions used:

Spark 2.1.0

neo4j-spark-connector 2.1.0-m2

My code:

import org.apache.spark.{SparkConf, SparkContext}
import org.neo4j.spark._

val sparkConf = new SparkConf().setAppName("Neo4j").setMaster("local")
val sc = new SparkContext(sparkConf)
println("*** Getting Started ***")
val neo = Neo4j(sc)
val df = neo.cypher("MATCH (n) RETURN id(n) AS id").loadDataFrame
println(df.count)

Spark submit (note there must be no spaces around the = in --conf key=value, or spark-submit parses the pieces as separate arguments):

spark-submit --class package.classname --jars pathofneo4jsparkconnectoryJAR --conf spark.neo4j.bolt.password=***** targetJarFile.jar
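The Bolt settings can also be set on the SparkConf in code instead of via --conf; a small sketch, where the URL and password values are placeholders:

val sparkConf = new SparkConf()
  .setAppName("Neo4j")
  .setMaster("local")
  // equivalent to passing --conf on the command line; the values
  // below are placeholders for the real endpoint and password
  .set("spark.neo4j.bolt.url", "bolt://localhost:7687")
  .set("spark.neo4j.bolt.user", "neo4j")
  .set("spark.neo4j.bolt.password", "*****")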
