How to write a case class for an enum column in an Apache Spark Dataset?

story1, 10, small
story2, 20, medium
story3, 3, small
story4, 50, xlarge

I want to convert my data to a Dataset. I have a column named storyType whose values are small, medium, large, or xlarge, and I don't know how to write my case class in this situation:

case class Story(name:String, point: Int, storyType: ???)

Try

sealed trait StoryType
case object Small extends StoryType
case object Medium extends StoryType
case object Large extends StoryType
case object XLarge extends StoryType
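
Note that Spark's built-in product encoders only handle fields of primitive and product types, so a Story with a StoryType field needs an explicit encoder such as Kryo. Below is a minimal sketch of parsing the sample rows into a typed Dataset; the fromString helper, the StoryDemo object, and the local[*] session are illustrative assumptions, not part of the original question or answer.

import org.apache.spark.sql.{Encoder, Encoders, SparkSession}

object StoryType {
  // Hypothetical helper: map the raw CSV value onto the matching case object
  def fromString(s: String): StoryType = s.trim.toLowerCase match {
    case "small"  => Small
    case "medium" => Medium
    case "large"  => Large
    case "xlarge" => XLarge
    case other    => throw new IllegalArgumentException(s"Unknown storyType: $other")
  }
}

case class Story(name: String, point: Int, storyType: StoryType)

object StoryDemo {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().master("local[*]").appName("story-demo").getOrCreate()

    // Spark has no built-in encoder for sealed-trait fields,
    // so fall back to Kryo (each Story is stored as one binary column).
    implicit val storyEncoder: Encoder[Story] = Encoders.kryo[Story]

    val lines = Seq("story1, 10, small", "story2, 20, medium",
                    "story3, 3, small", "story4, 50, xlarge")
    val stories = spark.createDataset(lines.map { line =>
      val Array(name, point, kind) = line.split(",").map(_.trim)
      Story(name, point.toInt, StoryType.fromString(kind))
    })

    stories.show()
    spark.stop()
  }
}

The Kryo encoder trades columnar access for simplicity: storyType can no longer be filtered or grouped as a named column. If that matters, a common alternative is to keep the field as a String in the case class and convert to StoryType only at the application boundary.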
