
Create Spark DataFrame from Pandas Dataframe Error

I am trying to create a Spark DataFrame from a pandas DataFrame and have tried many workarounds, but I keep failing. Here is my code; I am simply following one of the many basic examples:

import pandas as pd
from pyspark import SparkContext
from pyspark.sql import SQLContext

test = pd.DataFrame([1, 2, 3, 4, 5])
type(test)

sc = SparkContext(master="local[4]")
sqlCtx = SQLContext(sc)
spark_df = sqlCtx.createDataFrame(test)

I originally tried this with a pandas DataFrame that has 2000 columns and hundreds of thousands of rows, but I created the small test DataFrame above just to make sure the problem wasn't my data. Indeed, I get the exact same error:

 Py4JJavaError: An error occurred while calling o596.get.
: java.util.NoSuchElementException: spark.sql.execution.pandas.respectSessionTimeZone
    at org.apache.spark.sql.internal.SQLConf$$anonfun$getConfString$2.apply(SQLConf.scala:884)
    at org.apache.spark.sql.internal.SQLConf$$anonfun$getConfString$2.apply(SQLConf.scala:884)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.sql.internal.SQLConf.getConfString(SQLConf.scala:884)
    at org.apache.spark.sql.RuntimeConfig.get(RuntimeConfig.scala:74)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
    at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
    at py4j.Gateway.invoke(Gateway.java:280)
    at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
    at py4j.commands.CallCommand.execute(CallCommand.java:79)
    at py4j.GatewayConnection.run(GatewayConnection.java:214)
    at java.lang.Thread.run(Thread.java:748)

Do you have this set?

export PYTHONPATH=$SPARK_HOME/python/:$PYTHONPATH

Also, just to be sure, add the path to the py4j zip in the Spark directory (mine is py4j-0.10.1-src.zip):

export PYTHONPATH=$SPARK_HOME/python/lib/py4j-0.10.1-src.zip:$PYTHONPATH
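
If you want to verify that the interpreter is now picking up the Spark installation rather than some other copy, a quick check like the following should work (a minimal sketch; the exact paths and the py4j version depend on your setup):

# Hypothetical sanity check: run in a fresh Python session after exporting PYTHONPATH.
# Both imports should resolve from the Spark distribution.
import pyspark
import py4j

print(pyspark.__file__)    # expected to point somewhere under $SPARK_HOME/python/
print(py4j.__version__)    # should match the py4j-*-src.zip you put on PYTHONPATH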

I resolved the issue - I had forgotten to add the following lines of code at the beginning of my Anaconda notebook:

import findspark 
findspark.init()
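
For completeness, here is roughly what the working notebook cell looks like with that fix applied (a minimal sketch that reuses the SparkContext/SQLContext code from the question; it assumes SPARK_HOME is set or Spark is somewhere findspark can detect):

# findspark must run before any pyspark import so the right Spark install is used
import findspark
findspark.init()

import pandas as pd
from pyspark import SparkContext
from pyspark.sql import SQLContext

sc = SparkContext(master="local[4]")
sqlCtx = SQLContext(sc)

test = pd.DataFrame([1, 2, 3, 4, 5])
spark_df = sqlCtx.createDataFrame(test)
spark_df.show()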
