
Snowflake Python Connector error in Databricks

I have a simple Python script that uses the Snowflake Python Connector to connect and call a Snowflake stored procedure. The script had been running fine through Databricks, but I am now getting the following error when creating a connection object (ctx). It still runs fine from my laptop using a Jupyter notebook, just no longer through Databricks. Using Python 3.

ssl_wrap_socket_with_ocsp() got an unexpected keyword argument 'cert_reqs'

Here's my code:

import snowflake.connector as sc

Username = dbutils.secrets.get(scope = "SnowFlake", key = "username")         
Password = dbutils.secrets.get(scope = "SnowFlake", key = "password")

ctx = sc.connect(
    account='myaccount',
    user=Username,
    password=Password,
    warehouse='myWH',
    database='myDB',
    schema='Public'
)

This was a bug introduced in Snowflake Python Connector version 2.0.3, and it has since been resolved in the most recent release.

The script had been running fine through Databricks, but I am now getting the following error when creating a connection object (ctx). It still runs fine from my laptop using a Jupyter notebook, just no longer through Databricks.

The change in behaviour is most likely because the installation scripts used in your deployment do not pin versions. If a fresh install of snowflake-connector-python is done via pip without explicitly specifying a version, pip will always pull the most recently published release. That is usually undesirable for production workloads, because it pulls in new changes (including breakages, deprecations, or removals) before a developer has tested against them.
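As a sketch of what pinning looks like, assuming a pip-based install on the Databricks cluster (the version number below is illustrative; pin whatever release you have actually tested against):

```
# requirements.txt -- pin the connector so every fresh install is reproducible
# ==2.0.4 is an example; choose the tested version for your workload
snowflake-connector-python==2.0.4
```

With the version pinned, upgrades become a deliberate step (edit the pin, test, redeploy) rather than a side effect of the next cluster restart.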
