
Circe couldn't convert raw JSON to case class. Error: could not find Lazy implicit value of type io.circe.generic.decoding.DerivedDecoder

I have defined several case classes for the JSON representation, but I'm not sure whether I did it correctly, since there are a lot of nested case classes. Entities like spec, metadata, and so on are of type JSONObject, as is the Custom object itself.

Here are all the classes I have defined:

  case class CustomObject(apiVersion: String, kind: String, metadata: Metadata, spec: Spec, labels: Object, version: String)

  case class Metadata(creationTimestamp: String, generation: Int, uid: String, resourceVersion: String, name: String, namespace: String, selfLink: String)

  case class Spec(mode: String, image: String, imagePullPolicy: String, mainApplicationFile: String, mainClass: String, deps: Deps, driver: Driver, executor: Executor, subresources: Subresources)

  case class Driver(cores: Double, coreLimit: String, memory: String, serviceAccount: String, labels: Labels)

  case class Executor(cores: Double, instances: Double, memory: String, labels: Labels)

  case class Labels(version: String)

  case class Subresources(status: Status)

  case class Status()

  case class Deps()

And this is the JSON structure of the custom K8s object I need to convert:

{
    "apiVersion": "sparkoperator.k8s.io/v1alpha1",
    "kind": "SparkApplication",
    "metadata": {
        "creationTimestamp": "2019-01-11T15:58:45Z",
        "generation": 1,
        "name": "spark-example",
        "namespace": "default",
        "resourceVersion": "268972",
        "selfLink": "/apis/sparkoperator.k8s.io/v1alpha1/namespaces/default/sparkapplications/spark-example",
        "uid": "uid"
    },
    "spec": {
        "deps": {},
        "driver": {
            "coreLimit": "1000m",
            "cores": 0.1,
            "labels": {
                "version": "2.4.0"
            },
            "memory": "1024m",
            "serviceAccount": "default"
        },
        "executor": {
            "cores": 1,
            "instances": 1,
            "labels": {
                "version": "2.4.0"
            },
            "memory": "1024m"
        },
        "image": "gcr.io/ynli-k8s/spark:v2.4.0,
        "imagePullPolicy": "Always",
        "mainApplicationFile": "http://localhost:8089/spark_k8s_airflow.jar",
        "mainClass": "org.apache.spark.examples.SparkExample",
        "mode": "cluster",
        "subresources": {
            "status": {}
        },
        "type": "Scala"
    }
}

Update: I want to convert the JSON into a case class using Circe, but with these classes I get this error:

Error: could not find Lazy implicit value of type io.circe.generic.decoding.DerivedDecoder[dataModel.CustomObject]
    implicit val customObjectDecoder: Decoder[CustomObject] = deriveDecoder[CustomObject]

I have defined implicit decoders for all the case classes:

  implicit val customObjectLabelsDecoder: Decoder[Labels] = deriveDecoder[Labels]
  implicit val customObjectSubresourcesDecoder: Decoder[Subresources] = deriveDecoder[Subresources]
  implicit val customObjectDepsDecoder: Decoder[Deps] = deriveDecoder[Deps]
  implicit val customObjectStatusDecoder: Decoder[Status] = deriveDecoder[Status]
  implicit val customObjectExecutorDecoder: Decoder[Executor] = deriveDecoder[Executor]
  implicit val customObjectDriverDecoder: Decoder[Driver] = deriveDecoder[Driver]
  implicit val customObjectSpecDecoder: Decoder[Spec] = deriveDecoder[Spec]
  implicit val customObjectMetadataDecoder: Decoder[Metadata] = deriveDecoder[Metadata]
  implicit val customObjectDecoder: Decoder[CustomObject] = deriveDecoder[CustomObject]

You can't derive decoding for CustomObject because of its labels: Object member.

All decoding in circe is driven by static types, and circe doesn't provide encoders or decoders for types like Object or Any, which carry no useful static information.
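If a field genuinely has to hold arbitrarily shaped data, one option (a sketch, not from the original answer) is to type it as circe's own Json, for which a Decoder already exists. The DynamicLabels class below is hypothetical, just to illustrate the idea:

```scala
import io.circe.{Decoder, Json}
import io.circe.generic.semiauto.deriveDecoder

// Hypothetical wrapper: Json can represent any JSON value, so derivation works,
// unlike a field typed as Object or Any.
case class DynamicLabels(labels: Json)

implicit val dynamicLabelsDecoder: Decoder[DynamicLabels] = deriveDecoder

// io.circe.jawn.decode[DynamicLabels]("""{"labels": {"version": "2.4.0"}}""")
// succeeds, keeping the labels value as raw Json for later inspection.
```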

If you change that case class to something like this:

case class CustomObject(apiVersion: String, kind: String, metadata: Metadata, spec: Spec)

...and leave the rest of your code as is, with the import:

import io.circe.Decoder, io.circe.generic.semiauto.deriveDecoder

...and define your JSON document as doc (after adding a quotation mark to the "image": "gcr.io/ynli-k8s/spark:v2.4.0, line to make it valid JSON), the following should work just fine:

scala> io.circe.jawn.decode[CustomObject](doc)
res0: Either[io.circe.Error,CustomObject] = Right(CustomObject(sparkoperator.k8s.io/v1alpha1,SparkApplication,Metadata(2019-01-11T15:58:45Z,1,uid,268972,spark-example,default,/apis/sparkoperator.k8s.io/v1alpha1/namespaces/default/sparkapplications/spark-example),Spec(cluster,gcr.io/ynli-k8s/spark:v2.4.0,Always,http://localhost:8089/spark_k8s_airflow.jar,org.apache.spark.examples.SparkExample,Deps(),Driver(0.1,1000m,1024m,default,Labels(2.4.0)),Executor(1.0,1.0,1024m,Labels(2.4.0)),Subresources(Status()))))

Despite what one of the other answers says, circe can definitely derive encoders and decoders for case classes with no members; that's definitely not the issue here.
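A quick way to convince yourself of that claim is to derive a decoder for a memberless case class and decode an empty object (Empty here is just an illustrative name):

```scala
import io.circe.Decoder
import io.circe.generic.semiauto.deriveDecoder

// A case class with no members, like Status() and Deps() in the question.
case class Empty()

// Semi-automatic derivation works fine for it:
implicit val emptyDecoder: Decoder[Empty] = deriveDecoder

// io.circe.jawn.decode[Empty]("{}") // Right(Empty())
```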

As a side note, I wish it were possible to have better error messages than this one:

Error: could not find Lazy implicit value of type io.circe.generic.decoding.DerivedDecoder[dataModel.CustomObject]

But given that circe-generic currently has to use Shapeless's Lazy, this is about the best we can get. You could try circe-derivation as a mostly drop-in alternative to circe-generic's semi-automatic derivation, which has better error messages (among some other advantages), or you could use a compiler plugin like splain, which is designed specifically to give better error messages even in the presence of things like shapeless.Lazy.

As a final note, you can clean up your semi-automatic definitions a bit by letting the type parameter on deriveDecoder be inferred:

implicit val customObjectLabelsDecoder: Decoder[Labels] = deriveDecoder

This is entirely a matter of taste, but I find it a little less noisy to read.
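Applied to all of the definitions above (with the labels: Object member dropped from CustomObject, as discussed), the full cleaned-up version might look like this:

```scala
import io.circe.Decoder
import io.circe.generic.semiauto.deriveDecoder

// The same case classes as in the question, minus the underivable labels: Object field.
case class Labels(version: String)
case class Status()
case class Deps()
case class Subresources(status: Status)
case class Driver(cores: Double, coreLimit: String, memory: String, serviceAccount: String, labels: Labels)
case class Executor(cores: Double, instances: Double, memory: String, labels: Labels)
case class Spec(mode: String, image: String, imagePullPolicy: String, mainApplicationFile: String, mainClass: String, deps: Deps, driver: Driver, executor: Executor, subresources: Subresources)
case class Metadata(creationTimestamp: String, generation: Int, uid: String, resourceVersion: String, name: String, namespace: String, selfLink: String)
case class CustomObject(apiVersion: String, kind: String, metadata: Metadata, spec: Spec)

// Each type parameter is inferred from the declared Decoder type on the left.
implicit val labelsDecoder: Decoder[Labels] = deriveDecoder
implicit val statusDecoder: Decoder[Status] = deriveDecoder
implicit val depsDecoder: Decoder[Deps] = deriveDecoder
implicit val subresourcesDecoder: Decoder[Subresources] = deriveDecoder
implicit val driverDecoder: Decoder[Driver] = deriveDecoder
implicit val executorDecoder: Decoder[Executor] = deriveDecoder
implicit val specDecoder: Decoder[Spec] = deriveDecoder
implicit val metadataDecoder: Decoder[Metadata] = deriveDecoder
implicit val customObjectDecoder: Decoder[CustomObject] = deriveDecoder
```

The decoders for the leaf types still need to be in implicit scope before the ones that depend on them, so the ordering above (Labels and Status first, CustomObject last) is deliberate.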
