
How do I disable the security certificate check in Python requests

I am using

import requests
requests.post(url='https://foo.example', data={'bar':'baz'})

but I get a requests.exceptions.SSLError. The website has an expired certificate, but I am not sending sensitive data, so it doesn't matter to me. I would imagine there is an argument like verify=False that I could use, but I can't seem to find it.

From the documentation:

requests can also ignore verifying the SSL certificate if you set verify to False.

>>> requests.get('https://kennethreitz.com', verify=False)
<Response [200]>

If you're using a third-party module and want to disable the checks, here's a context manager that monkey-patches requests so that verify=False becomes the default and the warning is suppressed.

import warnings
import contextlib

import requests
from urllib3.exceptions import InsecureRequestWarning

old_merge_environment_settings = requests.Session.merge_environment_settings

@contextlib.contextmanager
def no_ssl_verification():
    opened_adapters = set()

    def merge_environment_settings(self, url, proxies, stream, verify, cert):
        # Verification happens only once per connection so we need to close
        # all the opened adapters once we're done. Otherwise, the effects of
        # verify=False persist beyond the end of this context manager.
        opened_adapters.add(self.get_adapter(url))

        settings = old_merge_environment_settings(self, url, proxies, stream, verify, cert)
        settings['verify'] = False

        return settings

    requests.Session.merge_environment_settings = merge_environment_settings

    try:
        with warnings.catch_warnings():
            warnings.simplefilter('ignore', InsecureRequestWarning)
            yield
    finally:
        requests.Session.merge_environment_settings = old_merge_environment_settings

        for adapter in opened_adapters:
            try:
                adapter.close()
            except Exception:  # best-effort cleanup
                pass

Here's how you use it:

with no_ssl_verification():
    requests.get('https://wrong.host.badssl.example/')
    print('It works')

    requests.get('https://wrong.host.badssl.example/', verify=True)
    print('Even if you try to force it to')

requests.get('https://wrong.host.badssl.example/', verify=False)
print('It resets back')

session = requests.Session()
session.verify = True

with no_ssl_verification():
    session.get('https://wrong.host.badssl.example/', verify=True)
    print('Works even here')

try:
    requests.get('https://wrong.host.badssl.example/')
except requests.exceptions.SSLError:
    print('It breaks')

try:
    session.get('https://wrong.host.badssl.example/')
except requests.exceptions.SSLError:
    print('It breaks here again')

Note that this code closes all open adapters that handled a patched request once you leave the context manager. This is because requests maintains a per-session connection pool, and certificate validation happens only once per connection, so unexpected things like this will happen:

>>> import requests
>>> session = requests.Session()
>>> session.get('https://wrong.host.badssl.example/', verify=False)
/usr/local/lib/python3.7/site-packages/urllib3/connectionpool.py:857: InsecureRequestWarning: Unverified HTTPS request is being made. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  InsecureRequestWarning)
<Response [200]>
>>> session.get('https://wrong.host.badssl.example/', verify=True)
/usr/local/lib/python3.7/site-packages/urllib3/connectionpool.py:857: InsecureRequestWarning: Unverified HTTPS request is being made. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  InsecureRequestWarning)
<Response [200]>

Use requests.packages.urllib3.disable_warnings() and verify=False on requests methods.

import requests
from urllib3.exceptions import InsecureRequestWarning

# Suppress only the single warning from urllib3 needed.
requests.packages.urllib3.disable_warnings(category=InsecureRequestWarning)

# Set `verify=False` on `requests.post`.
requests.post(url='https://example.com', data={'bar':'baz'}, verify=False)

To add to Blender's answer, you can disable SSL certificate validation for all requests made through a session by setting Session.verify = False:

import requests

session = requests.Session()
session.verify = False
session.post(url='https://example.com', data={'bar':'baz'})

Note that urllib3 (which Requests uses) strongly discourages making unverified HTTPS requests and will raise an InsecureRequestWarning.
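A minimal sketch combining the two ideas above (session-wide verify=False plus silencing the resulting InsecureRequestWarning); the example.com URL is a placeholder, so the actual request is left commented out:

```python
import requests
from urllib3.exceptions import InsecureRequestWarning

# Silence only the warning urllib3 raises for unverified HTTPS requests.
requests.packages.urllib3.disable_warnings(category=InsecureRequestWarning)

session = requests.Session()
session.verify = False  # every request made through this session skips verification

# session.post(url='https://example.com', data={'bar': 'baz'})
```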

This can also be done with an environment variable:

export CURL_CA_BUNDLE=""
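The same thing can be done from inside a Python process, provided the variable is set before the first request is made. This sketch relies on requests treating an empty CA-bundle value as falsy, which effectively disables verification:

```python
import os

# An empty CURL_CA_BUNDLE means "no CA bundle"; requests treats the
# empty string as falsy, so certificate verification is skipped for
# libraries that honor this cURL-compatible variable.
os.environ['CURL_CA_BUNDLE'] = ""
```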

If you want to send a POST request with verify=False, you can call requests.api.request directly (this is what requests.post does internally):

import requests

requests.api.request('post', url, data={'bar':'baz'}, json=None, verify=False)

If you are writing a scraper and really don't care about the SSL certificate, you can set it globally:

import ssl

ssl._create_default_https_context = ssl._create_unverified_context

DO NOT USE IN PRODUCTION不要在生产中使用
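As an offline illustration of what this override does: any stdlib client that calls the patched factory afterwards gets a context with hostname checking off and certificate validation disabled:

```python
import ssl

# Replace the default HTTPS context factory with the unverified one.
ssl._create_default_https_context = ssl._create_unverified_context

# Contexts produced after the patch skip both checks.
ctx = ssl._create_default_https_context()
print(ctx.check_hostname is False)       # True
print(ctx.verify_mode == ssl.CERT_NONE)  # True
```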

What worked for me: a workaround for the verify=False bug

Due to a bug, session.verify = False is ignored when a CA-bundle environment variable (CURL_CA_BUNDLE) is set, so we clear that variable as well.

import os
import requests

session = requests.Session()
session.verify = False
session.trust_env = False  # ignore environment settings such as proxies and CA bundles
os.environ['CURL_CA_BUNDLE'] = ""  # clear any CA bundle path that would override verify=False
session.post(url='https://example.com', data={'bar': 'baz'})

I'm not sure trust_env is needed here; setting it to False already tells requests to ignore environment variables such as CURL_CA_BUNDLE.

First import ssl, then create a context like this with three lines of code in your Python script:

ctx = ssl.create_default_context()
ctx.check_hostname = False
ctx.verify_mode = ssl.CERT_NONE

An example that I have used for HTML parsing with BeautifulSoup:

import urllib.request, urllib.parse, urllib.error

from bs4 import BeautifulSoup
import ssl

ctx = ssl.create_default_context()
ctx.check_hostname = False
ctx.verify_mode = ssl.CERT_NONE


url = input('Enter - ')
html = urllib.request.urlopen(url, context=ctx).read()
soup = BeautifulSoup(html, 'html.parser')

Python 3.6+

import warnings
warnings.filterwarnings("ignore", message="Unverified HTTPS request")
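A quick offline check of that filter, using the warnings module's own machinery; the message argument is a regex matched against the start of the warning text:

```python
import warnings

# Ignore any warning whose text starts with "Unverified HTTPS request".
warnings.filterwarnings("ignore", message="Unverified HTTPS request")

with warnings.catch_warnings(record=True) as caught:
    warnings.warn("Unverified HTTPS request is being made")  # matched: suppressed
    warnings.warn("something else entirely")                 # not matched: recorded

print(len(caught))  # 1
```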
