
PySpark: ImportError: cannot import name 'st_makePoint'

I am trying to insert some data into a PostgreSQL database using PySpark. One field in the PostgreSQL table is defined with the data type GEOGRAPHY(Point). I have written the PySpark code below to create this field from longitude and latitude:

from pyspark.sql.functions import col, st_makePoint

df = ...  # load the input file into a PySpark dataframe
df = df.withColumn("Location", st_makePoint(col("Longitude"), col("Latitude")))
The next step is to load the data into PostgreSQL.

But I am getting the following error:

ImportError: cannot import name 'st_makePoint'

I thought st_makePoint was part of pyspark.sql.functions, so I am not sure why it is giving this error. Please help. Also, if there is a better way of populating the GEOGRAPHY(Point) field in PostgreSQL from PySpark, please let me know.

Check the GeoMesa documentation:

Registering user-defined types and functions can be done manually by invoking geomesa_pyspark.init_sql() on the Spark session object:

import geomesa_pyspark
geomesa_pyspark.init_sql(spark)

Then you can use st_makePoint as a Spark SQL function.
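
For completeness, here is a minimal sketch of how the whole flow could look. It assumes the GeoMesa Spark runtime JAR is already on the Spark classpath; the input path, JDBC URL, table name and credentials are placeholders; and the final WKT/JDBC step is my own suggestion rather than part of this answer:

import geomesa_pyspark
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Register GeoMesa's user-defined types and spatial SQL functions on this session
geomesa_pyspark.init_sql(spark)

# Placeholder input: load your file however you already do
df = spark.read.csv("input.csv", header=True, inferSchema=True)

# st_makePoint is registered as a Spark SQL function, not a Python symbol,
# so call it through expr() instead of importing it from pyspark.sql.functions
df = df.withColumn("Location", F.expr("st_makePoint(Longitude, Latitude)"))

# Assumption, not part of the original answer: serialize the point to WKT and
# write it over JDBC, letting PostgreSQL cast the text into GEOGRAPHY(Point).
# The URL, table, credentials and the stringtype=unspecified flag are placeholders.
df = df.withColumn("Location", F.expr("st_asText(Location)"))
(df.write
   .format("jdbc")
   .option("url", "jdbc:postgresql://localhost:5432/mydb?stringtype=unspecified")
   .option("dbtable", "my_table")
   .option("user", "user")
   .option("password", "password")
   .mode("append")
   .save())

Whether PostgreSQL accepts the text-to-GEOGRAPHY cast depends on your JDBC driver settings, so treat that last step as a starting point rather than a guaranteed recipe.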
