
Create an empty DF using schema from another DF (Scala Spark)

I have to compare a DF with another one that has the same schema, read from a specific path. That path may contain no files, so I thought I should compare it against an empty DF with the same columns as the original.

So I am trying to create a DF with the schema of another DF that contains many columns, but I can't find a solution for this. I have read the following posts, but none of them helped me:

How to create an empty DataFrame with a specified schema?

How to create an empty DataFrame? Why "ValueError: RDD is empty"?

How to create an empty dataFrame in Spark

How can I do it in Scala? Or is there a better option?

originalDF.limit(0) will return an empty DataFrame with the same schema.
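A minimal sketch of this, plus an equivalent explicit construction via `createDataFrame`; the local session and the sample `originalDF` are illustrative, not from the question:

```scala
import org.apache.spark.sql.{Row, SparkSession}

object EmptyDFExample extends App {
  // Illustrative local session; in a real job the session already exists
  val spark = SparkSession.builder()
    .master("local[*]")
    .appName("empty-df-example")
    .getOrCreate()
  import spark.implicits._

  // Hypothetical stand-in for the original DF with many columns
  val originalDF = Seq((1, "a"), (2, "b")).toDF("id", "name")

  // Option 1: limit(0) keeps the schema but returns no rows
  val emptyDF1 = originalDF.limit(0)

  // Option 2: build an empty DataFrame explicitly from the schema
  val emptyDF2 = spark.createDataFrame(
    spark.sparkContext.emptyRDD[Row],
    originalDF.schema
  )

  assert(emptyDF1.count() == 0 && emptyDF1.schema == originalDF.schema)
  assert(emptyDF2.count() == 0 && emptyDF2.schema == originalDF.schema)

  spark.stop()
}
```

Either empty DF can then be used as the fallback side of the comparison when the path yields no files.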
