I can run my Python + PySpark script from the Unix command line by typing
pyspark script.py
But how do I run script.py from within the pyspark shell? This seems like an elementary question, but I can't find the answer anywhere. I tried
execfile('script.py')
but I get an error that includes:
ValueError: Cannot run multiple SparkContexts at once
Could the error come from script.py trying to create a new SparkContext variable?
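For reference, the kind of script I mean looks roughly like this (an illustrative sketch, not my actual file; the app name is made up):

# script.py -- illustrative sketch
from pyspark import SparkContext

sc = SparkContext(appName="MyApp")  # the script creates its own context

rdd = sc.parallelize([1, 2, 3])
print(rdd.sum())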
When you launch the pyspark interactive shell, it usually prints: SparkContext available as sc, HiveContext available as sqlContext.
If your script file contains
sc = SparkContext()
try commenting that line out.
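Alternatively, instead of commenting the line out, you could rewrite the script so it runs both inside the shell and under spark-submit. A minimal sketch, assuming Spark 1.4+ where SparkContext.getOrCreate() is available; it returns the existing context if one is already running, and creates one otherwise:

# script.py -- sketch assuming Spark 1.4+
from pyspark import SparkContext

sc = SparkContext.getOrCreate()  # reuses the shell's sc instead of creating a second one

rdd = sc.parallelize([1, 2, 3])
print(rdd.sum())

With that change, execfile('script.py') in the shell should run without the ValueError, and the same file still works from the command line.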