I have created my Spark pool in Azure Synapse, but when I try creating my database using the code below it gives me an error. Any help with what I'm doing wrong?
spark.sql("CREATE DATABASE nyctaxi_ksa7")
df.write.mode("overwrite").saveAsTable("nyctaxi.trip")
AnalysisException Traceback (most recent call last)
/tmp/ipykernel_6859/2561526676.py in <module>
----> 1 spark.sql("CREATE DATABASE nyctaxi_ksa7")
2 df.write.mode("overwrite").saveAsTable("nyctaxi.trip")
/opt/spark/python/lib/pyspark.zip/pyspark/sql/session.py in sql(self, sqlQuery)
721 [Row(f1=1, f2='row1'), Row(f1=2, f2='row2'), Row(f1=3, f2='row3')]
722 """
--> 723 return DataFrame(self._jsparkSession.sql(sqlQuery), self._wrapped)
724
725 def table(self, tableName):
~/cluster-env/env/lib/python3.8/site-packages/py4j/java_gateway.py in __call__(self, *args)
1302
1303 answer = self.gateway_client.send_command(command)
-> 1304 return_value = get_return_value(
1305 answer, self.gateway_client, self.target_id, self.name)
1306
/opt/spark/python/lib/pyspark.zip/pyspark/sql/utils.py in deco(*a, **kw)
115 # Hide where the exception came from that shows a non-Pythonic
116 # JVM exception message.
--> 117 raise converted from None
118 else:
119 raise
AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.NullPointerException
This error is sometimes seen when there is an issue with the workspace itself. You can try creating a new workspace; if the error persists, raise a support ticket so the support engineers can investigate the issue further.
I have reproduced this and created the database in a Synapse Spark pool without errors:
spark.sql("CREATE DATABASE sampledb1")
df.write.mode("overwrite").saveAsTable("sampledb1.tb1")