
elasticsearch-spark failing to index type after parent is set

I first set up the types manually through the REST API with the following command:

curl -XPUT localhost:9200/myIndex/ -d '{
  "mappings" : { 
          "company": {}, 
          "people": {
               "_parent" : {
                   "type" : "company"
                }
           }
       }
}'
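As a sanity check (not part of the original question), the mapping can be read back to confirm that the `_parent` entry was actually applied to the `people` type:

```shell
# Fetch the mapping for myIndex and verify that the "people" type
# carries a "_parent" entry pointing at "company".
curl -XGET 'localhost:9200/myIndex/_mapping?pretty'
```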

However, the following code fails at the Spark layer.

This is the people job:

import com.mongodb.spark.MongoSpark
import org.apache.spark.sql.SparkSession
import org.elasticsearch.spark.sql._

object PeopleDataCleaner {
  def main(args: Array[String]): Unit = {
    val liftedArgs = args.lift
    val mongoURL = liftedArgs(0).getOrElse("mongodb://127.0.0.1/mg_test.lc_data_test")
    val elasticsearchHost = liftedArgs(1).getOrElse("52.35.155.55")
    val elasticsearchPort = liftedArgs(2).getOrElse("9200")
    val mongoReadPreferences = liftedArgs(3).getOrElse("primary")
    val spark = SparkSession.builder()
      .appName("Mongo Data Cleaner")
      .master("local[*]")
      .config("spark.mongodb.input.uri", mongoURL)
      .config("mongo.input.query", "{currentCompanies : {$exists: true, $ne: []}}")
      .config("mongo.readPreference.name", mongoReadPreferences)
      .config("es.nodes", elasticsearchHost)
      .config("es.port", elasticsearchPort)
      .getOrCreate()
    import spark.implicits._
    val data = MongoSpark.load[LCDataRecord](spark)
      .as[LCDataRecord]
      .filter { record =>
        record.currentCompanies != null &&
        record.currentCompanies.nonEmpty &&
        record.linkedinId != null
      }
      .map { record =>
        val moddedCurrentCompanies = record.currentCompanies
          .filter { currentCompany => currentCompany.link != null && currentCompany.link != "" }
        record.copy(currentCompanies = moddedCurrentCompanies)
      }
      .flatMap { record =>
          record.currentCompanies.map { currentCompany =>
            currentCompanyToFlatPerson(record, currentCompany)
          }
      }
      .saveToEs("myIndex/people", Map(
        "es.mapping.id" -> "idField",
        "es.mapping.parent" -> "companyLink"
      ))
  }
}

This is the company job:

import com.mongodb.spark.MongoSpark
import org.apache.spark.sql.SparkSession
import org.elasticsearch.spark.sql._

object CompanyDataCleaner {
  def main(args: Array[String]): Unit = {
    val liftedArgs = args.lift
    val mongoURL = liftedArgs(0).getOrElse("mongodb://127.0.0.1/mg_test.lc_data_test")
    val elasticsearchHost = liftedArgs(1).getOrElse("localhost")
    val elasticsearchPort = liftedArgs(2).getOrElse("9200")
    val mongoReadPreferences = liftedArgs(3).getOrElse("primary")
    val spark = SparkSession.builder()
      .appName("Mongo Data Cleaner")
      .master("local[*]")
      .config("spark.mongodb.input.uri", mongoURL)
      .config("mongo.input.query", "{currentCompanies : {$exists: true, $ne: []}}")
      .config("mongo.readPreference.name", mongoReadPreferences)
      .config("es.index.auto.create", "true")
      .config("es.nodes", elasticsearchHost)
      .config("es.port", elasticsearchPort)
      .getOrCreate()

    import spark.implicits._
    val data = MongoSpark
      .load[LCDataRecord](spark)
      .as[LCDataRecord]
      .filter { record => record.currentCompanies != null && record.currentCompanies.nonEmpty }
      .flatMap(record => record.currentCompanies)
      .filter { record => record.link != null }
      .dropDuplicates("link")
      .map(formatCompanySizes)
      .map(companyToFlatCompany)
      .saveToEs("myIndex/company", Map("es.mapping.id" -> "link"))

  }
}

It fails with org.apache.spark.util.TaskCompletionListenerException: Can't specify parent if no parent field has been configured. Indexing the companies into Elasticsearch first is not the problem; my understanding is that the mapping above should already have defined the parent/child relationship.

EDIT: Indexing through the REST-based bulk API, or through the plain REST index API, does not run into this problem.
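For reference, when going through the REST bulk API directly (which does not hit this error), the parent is carried in the action metadata of each request. A minimal sketch against the mapping above; the document id "p1", parent id "c1", and the document body are made up for illustration:

```shell
# Bulk-index one child document into the "people" type, passing the
# parent id in the action metadata. "_parent" must reference an existing
# "company" id; the ids and fields here are hypothetical.
curl -XPOST 'localhost:9200/_bulk' --data-binary '{ "index" : { "_index" : "myIndex", "_type" : "people", "_id" : "p1", "_parent" : "c1" } }
{ "name" : "Jane Doe", "companyLink" : "c1" }
'
```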

EDIT 2: Changing .config("es.index.auto.create", "true") to .config("es.index.auto.create", "false") solved my problem. It appears that even when the index and type already exist, EsSpark still tries to create them, and that is not a legal operation when the type has a _parent field set.
