
How can I convert a pyspark.sql.dataframe.DataFrame back to a SQL table in a Databricks notebook?

I created a DataFrame of type pyspark.sql.dataframe.DataFrame by executing the following line:

dataframe = sqlContext.sql("select * from my_data_table")

How can I convert this back to a Spark SQL table that I can run SQL queries on?

You can create a temporary view from your DataFrame by using createOrReplaceTempView. In your case it would be:

dataframe.createOrReplaceTempView("mytable")

After this you can query mytable using SQL.
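For example, a minimal end-to-end sketch, assuming a Databricks notebook where the SparkSession is available as spark and my_data_table exists in the metastore:

# Minimal sketch, assuming `spark` (SparkSession) is available, as in a
# Databricks notebook, and `my_data_table` exists in the metastore.
dataframe = spark.sql("SELECT * FROM my_data_table")

# Register the DataFrame as a temporary view so it can be queried with SQL.
dataframe.createOrReplaceTempView("mytable")

# Plain SQL now works against the view.
result = spark.sql("SELECT COUNT(*) FROM mytable")
result.show()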

If your Spark version is ≤ 1.6.2, you can use registerTempTable instead.
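A rough equivalent for those older versions, assuming the sqlContext from the question is in scope:

# For Spark <= 1.6.x, register the DataFrame with the older API
# (assuming the `sqlContext` from the question is in scope).
dataframe.registerTempTable("mytable")

# Query it the same way.
sqlContext.sql("SELECT * FROM mytable LIMIT 10").show()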
