How to specify a python version in a Databricks Cluster

I am trying to install a wheel on a Databricks cluster. Unfortunately, this wheel has the requirement:

python_requires='==3.6.8'

On Databricks clusters, Python 3.7.3 is used, so the installation of the wheel fails. How can I install a lower Python version on those clusters?

What I tried:

Switching to an Anaconda-supported cluster and creating a virtualenv with the specific version in the init script (roughly the sketch below) --> this runs into an error that stops the cluster from starting (based on this: https://docs.databricks.com/runtime/mlruntime.html ).
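
Roughly, the kind of init script I mean (a sketch only; the conda path is an assumption on my part, not the exact script I ran):

#!/bin/bash
# Sketch: create a conda env pinned to the required Python version
# (assuming conda lives at /databricks/conda/bin/conda on the ML runtime)
/databricks/conda/bin/conda create -y -n py368 python=3.6.8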

Is there another way to set up a virtualenv that can be used on all nodes of the cluster?

Thanks!


Update

So I tried the next thing:

I created an init script:

#!/bin/bash
# Download and unpack the CPython 3.6.8 sources
wget https://www.python.org/ftp/python/3.6.8/Python-3.6.8.tgz
tar xvf Python-3.6.8.tgz
ls
pwd
cd Python-3.6.8
ls
pwd
# Build and install it alongside the existing interpreter
./configure --enable-optimizations --enable-shared
make -j8
sudo make altinstall
# Sanity check: start the new interpreter
python3.6

to install Python 3.6.8 on the cluster (this takes quite a while).

The init script fails --> here is the error log:

find: ‘build’: No such file or directory
find: ‘build’: No such file or directory
find: ‘build’: No such file or directory
find: ‘build’: No such file or directory
make[1]: [clean] Error 1 (ignored)
Executing <Task finished coro=<CoroutineTests.test_async_def_wrapped.<locals>.start() done, defined at /Python-3.6.8/Lib/test/test_asyncio/test_pep492.py:150> result=None created at /Python-3.6.8/Lib/asyncio/base_events.py:463> took 0.168 seconds
stty: 'standard input': Inappropriate ioctl for device
/Python-3.6.8/Modules/expat/xmlparse.c: In function ‘appendAttributeValue’:
/Python-3.6.8/Modules/expat/xmlparse.c:5577:40: warning: array subscript is above array bounds [-Warray-bounds]
           if (!poolAppendChar(pool, buf[i]))
                                        ^
/Python-3.6.8/Modules/expat/xmlparse.c:545:27: note: in definition of macro ‘poolAppendChar’
    : ((*((pool)->ptr)++ = c), 1))
                           ^
/Python-3.6.8/Modules/expat/xmlparse.c:5577:40: warning: array subscript is above array bounds [-Warray-bounds]
           if (!poolAppendChar(pool, buf[i]))
                                        ^
/Python-3.6.8/Modules/expat/xmlparse.c:545:27: note: in definition of macro ‘poolAppendChar’
    : ((*((pool)->ptr)++ = c), 1))
                           ^
python3.6: error while loading shared libraries: libpython3.6m.so.1.0: cannot open shared object file: No such file or directory

The unpacking and loading of the tar works fine; the errors occur after the second ls/pwd. In general, Python does get installed "somewhere" so far. How can I redirect the installation so that it ends up at /databricks/python3/bin/python?
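
What I suspect is missing is a --prefix for configure and an ldconfig call so the shared library can be found; something along these lines is what I have in mind (the prefix, the ldconfig step, and the spark-env.sh line are guesses on my part, not verified):

#!/bin/bash
# Sketch only: build into a known prefix, register the shared library,
# then point Spark at the new interpreter instead of overwriting /databricks/python3
wget https://www.python.org/ftp/python/3.6.8/Python-3.6.8.tgz
tar xvf Python-3.6.8.tgz
cd Python-3.6.8
./configure --enable-optimizations --enable-shared --prefix=/usr/local
make -j8
sudo make altinstall
sudo ldconfig   # so python3.6 can locate libpython3.6m.so.1.0
echo "PYSPARK_PYTHON=/usr/local/bin/python3.6" >> /databricks/spark/conf/spark-env.sh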

Thank you!

Something like this should work in the init script:

#!/bin/bash
# Install Anaconda 5.2.0 (which ships a Python 3.6 base environment) into /anaconda3
sudo wget https://repo.continuum.io/archive/Anaconda3-5.2.0-Linux-x86_64.sh
sudo bash Anaconda3-5.2.0-Linux-x86_64.sh -b -p /anaconda3
# Point Spark at the Anaconda interpreter
echo "PYSPARK_PYTHON=/anaconda3/bin/python3" >> /databricks/spark/conf/spark-env.sh

DBR 5.5 should have Python 3.6 as well; you could try using that runtime version.
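
To confirm which interpreter a cluster actually uses, a quick check from an init script or a notebook shell cell could look like this (the path is the usual DBR location mentioned above):

#!/bin/bash
# Print the version of the interpreter Databricks ships with the runtime
/databricks/python3/bin/python --version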
