I am trying to connect to an Elasticsearch node from Python with SSL.
I'm using the basic code for that:
from elasticsearch import Elasticsearch
from ssl import create_default_context
context = create_default_context(cafile="path/to/cafile.pem")
es = Elasticsearch("https://elasticsearch.url:port", ssl_context=context, http_auth=('elastic','yourpassword'))
From: https://github.com/elastic/elasticsearch-py
I need to supply the cafile.pem and http_auth parameters. On the server where my Python is running, the SSL connection is already set up, so I can do basic queries to Elasticsearch. It was set up using the keys in the ~/.ssh directory: id_rsa and id_rsa.pub.
So, now I am wondering whether I should supply the id_rsa.pub key in place of path/to/cafile.pem, and if yes, then I would need to change the permissions of the ~/.ssh folder, which seems like a bad idea from a security perspective. I am also not sure that .pub is the same format as .pem; do I need to convert it first? And should http_auth simply be omitted, since I do not use any password when I run simple queries from the terminal?
How should I go about this issue of setting up access in Python to ES with SSL according to best practices?
I tried both the .pub file and a .pem generated from it (https://serverfault.com/questions/706336/how-to-get-a-pem-file-from-ssh-key-pair), but both failed in create_default_context with an unknown error raised from context.load_verify_locations(cafile, capath, cadata).
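To illustrate the failure: an OpenSSH public key such as id_rsa.pub is not an X.509 certificate, so the ssl module rejects it. A minimal stdlib sketch (the key string below is a made-up placeholder) reproduces the error from load_verify_locations:

```python
import ssl

# Made-up stand-in for the contents of id_rsa.pub (OpenSSH format,
# NOT a PEM-encoded X.509 certificate).
fake_ssh_pub = "ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAAB user@host"

ctx = ssl.create_default_context()
try:
    ctx.load_verify_locations(cadata=fake_ssh_pub)
    loaded = True
except ssl.SSLError:
    # The ssl module only accepts X.509 certificates here, so an
    # OpenSSH public key fails to parse.
    loaded = False
```

This is why neither id_rsa.pub nor a PEM-wrapped version of the same SSH key can serve as cafile: the parameter expects a CA certificate, not an SSH key.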
The answer for my particular case turned out to be very simple. I found it here:
https://elasticsearch-py.readthedocs.io/en/master/
es = Elasticsearch(['https://user:secret@localhost:443'])
Just specifying an https URL with the credentials inline worked right away.
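One caveat with embedding credentials in the URL: if the password contains characters such as @, : or /, it must be percent-encoded first. A small stdlib sketch (the credentials are placeholders):

```python
from urllib.parse import quote

user = "elastic"
password = "p@ss:word/1"  # placeholder containing characters that would break the URL

# Percent-encode both parts before embedding them in the URL.
url = "https://{}:{}@localhost:443".format(quote(user, safe=""), quote(password, safe=""))
# url == "https://elastic:p%40ss%3Aword%2F1@localhost:443"
```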
Elasticsearch Docker image & Python 2.7. I copied the SSL certificate file to the root of the project, made sure it is readable (ownership and group ownership allow read access), and put the login and password into constants.
es = Elasticsearch(
    hosts=["https://localhost:9200"],
    http_auth=(USR_LOGIN, USR_PASS),
    use_ssl=True,
    verify_certs=True,
    ca_certs="./http_ca.crt",
)
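Rather than hard-coding USR_LOGIN and USR_PASS as constants in the source, they could be read from environment variables; a small sketch (the variable names ES_LOGIN and ES_PASS are assumptions, with demo fallbacks):

```python
import os

# Hypothetical environment variable names; the defaults are only for the demo.
USR_LOGIN = os.environ.get("ES_LOGIN", "elastic")
USR_PASS = os.environ.get("ES_PASS", "changeme")
```

This keeps credentials out of version control and lets the Docker setup inject them at run time.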
For self-signed certificates, use:
from elastic_transport import NodeConfig
from elasticsearch import AsyncElasticsearch
client = AsyncElasticsearch(
    hosts=[
        NodeConfig(
            scheme="https",
            host="<host URL>",
            port=443,
            verify_certs=False,
            ca_certs=None,
            ssl_show_warn=False,
        )
    ],
    http_auth=("username", "password"),
    verify_certs=False,
    ca_certs="/path/to/cafile.pem",    # PEM format
    client_cert="/path/to/tls.cert",   # PEM format
    client_key="/path/to/tls.key",     # PEM format
)
await client.info()  # async client calls must be awaited inside an async function
Explanation:
verify_certs=False stops the underlying Python SSL module from verifying the self-signed certs, while the request is still sent to the server over TLS. For certificates that are not self-signed, you should enable verify_certs=True.
The example uses AsyncElasticsearch, but if you need the sync Elasticsearch version, it should be directly compatible, as all the parameters are the same. (See: https://elasticsearch-py.readthedocs.io/en/v8.8.1/async.html#getting-started-with-async)

"So, now I am wondering whether I should supply id_rsa.pub key in place of path/to/cafile.pem, and if yes, then I would need to change permissions of ~/.ssh folder which seems like not a good idea from security perspective."
These SSH keys are most likely not related to Elasticsearch at all; they allow you to connect to and authenticate with the server that runs Elasticsearch.
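Since OpenSSH keys, PEM certificates, and private keys all look like opaque text files at a glance, a rough way to tell them apart is to look at the first line of the file. This is only a heuristic sketch, not a real parser:

```python
def identify_key_material(first_line):
    """Rough classification of a key/cert file by its first line (heuristic)."""
    if first_line.startswith(("ssh-rsa", "ssh-ed25519", "ecdsa-sha2")):
        return "openssh public key"   # e.g. id_rsa.pub; not usable as cafile
    if first_line.startswith("-----BEGIN CERTIFICATE-----"):
        return "pem certificate"      # what cafile / ca_certs expects
    if "PRIVATE KEY" in first_line:
        return "private key"          # never pass this as a CA file
    return "unknown"
```

Only a file that classifies as a PEM certificate is a candidate for the cafile / ca_certs parameter.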