
Not able to connect to Cloud SQL private IP via Cloud Run

I am trying to connect to Cloud SQL (Postgres) from a simple Python application. When I run the container locally using Cloud Shell, I get the messages below for a while and then finally an error:

vikrant@cloudshell:~/test/docker (test-project)$ docker run -ti --name=my-test-container test-docker-image
2022/06/12 03:56:09 current FDs rlimit set to 1048576, wanted limit is 8500. Nothing to do here.
2022/06/12 03:56:09 using credential file for authentication; email=service-gcf-transform@test-project.iam.gserviceaccount.com
2022/06/12 03:56:10 Listening on 127.0.0.1:5432 for test-project:europe-west1:test-uat-pgsql-65d01f80
2022/06/12 03:56:10 Ready for new connections
2022/06/12 03:56:11 Generated RSA key in 286.419278ms
connecting to postgres database
2022/06/12 03:56:19 New connection for "test-project:europe-west1:test-uat-pgsql-65d01f80"
2022/06/12 03:56:19 refreshing ephemeral certificate for instance test-projec:europe-west1:test-uat-pgsql-65d01f80
2022/06/12 03:56:22 Scheduling refresh of ephemeral certificate in 54m57s

  File "/usr/local/lib/python3.10/site-packages/psycopg2/__init__.py", line 122, in connect
    conn = _connect(dsn, connection_factory=connection_factory, **kwasync)
sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) server closed the connection unexpectedly
        This probably means the server terminated abnormally
        before or while processing the request.

Below is the sample code. I also tried it with the Cloud SQL IP directly.

import sqlalchemy
import os

db_user='test-user'
db_pass='test-pswd'
db_name='test-database'

db_engine = None

host = '127.0.0.1'
port ='5432'

drivername='postgresql+psycopg2'

def _create_engine():

    return sqlalchemy.create_engine(
        sqlalchemy.engine.url.URL(
            drivername=drivername,
            host=host,
            username=db_user,
            password=db_pass,
            database=db_name,
        )
    )


def get_engine():
    global db_engine
    if not db_engine:
        db_engine = _create_engine()
    return db_engine


def read_postgres(conn):
    sql = """
    select bucket_id,object_id from test_requests limit 5
    """
    result = conn.execute(sqlalchemy.text(sql)).fetchall()
    result_list = []
    for r in result:
        d = dict(zip(r.keys(), r))
        result_list.append(d)  # append inside the loop, otherwise only the last row is kept

    return result_list

def connect_postgres():
    print('connecting to postgres database')
    with get_engine().connect() as conn:
        print('connected to postgres database')
        result = read_postgres(conn)
        result_list=[]
        for r in result:
            print(f"row data as:{r}")
            print(f"bucket name:{r['bucket_id']}")
            print(f"object name:{r['object_id']}")

connect_postgres()

gcloud run deploy run-postgres \
--image eu.gcr.io/test-project/test-docker-image \
--add-cloudsql-instances=test-project:europe-west1:test-uat-pgsql-65d01f80 \
--region=europe-west1 \
--service-account service-cce-transform@test-project.iam.gserviceaccount.com \
--vpc-connector test-uat-cf-connector

Running it locally also does not work.

Dockerfile:

FROM python:3-alpine

WORKDIR /usr/src/app
COPY main.py .
COPY requirements.txt .
COPY test_cf_SA_key.json .

ADD start.sh /
RUN chmod +x /start.sh

RUN wget https://dl.google.com/cloudsql/cloud_sql_proxy.linux.amd64 -O cloud_sql_proxy
RUN chmod +x cloud_sql_proxy

RUN cp ./cloud_sql_proxy /cloud_sql_proxy

RUN pip install --no-cache-dir -r requirements.txt

CMD ["/start.sh"]

I run the application code via start.sh to work around a proxy startup timing issue, following a suggestion I read in a Stack Overflow answer.

#!/bin/sh

./cloud_sql_proxy -instances=test-project:europe-west1:test-uat-pgsql-65d01f80=tcp:5432 -credential_file=test_cf_SA_key.json &

chmod +x cloud_sql_proxy
sleep 10
python3 main.py

The main reason you can't connect to the Cloud SQL instance is the way you are using the Cloud SQL Auth proxy. The proxy cannot reach a private-IP instance when it runs in an environment outside the VPC of that instance. Neither your local machine nor Cloud Shell is part of the VPC where your SQL instance lives.

An environment where you can test it would be, for example, a GCE instance in the same VPC as your SQL instance. I tested a simplified version of your code and could query the instance from both GCE and a Cloud Run service; the snippets I used are below.

However, using the Auth proxy inside a Cloud Run container to connect to a private Cloud SQL instance is redundant. The official guide shows the recommended setup for querying a private Cloud SQL instance from Cloud Run: a Serverless VPC Access connector places your Cloud Run container in the same VPC as your SQL instance, so it can connect directly to the private IP. Let me know if this was useful.
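For that direct connection (no proxy sidecar), the only change to the Python code is pointing the host at the instance's private IP instead of 127.0.0.1. A minimal sketch of building the DSN, assuming the private IP and credentials arrive as environment variables (the `DB_HOST`/`DB_USER`/`DB_PASS`/`DB_NAME` names are illustrative, not part of the original code):

```python
import os
from urllib.parse import quote_plus

def build_pg_dsn():
    """Build a postgresql+psycopg2 DSN for a direct private-IP connection.

    With a Serverless VPC Access connector attached to the Cloud Run
    service, the container can reach the instance's private IP directly,
    so no proxy process is needed. Env var names here are placeholders.
    """
    host = os.environ.get("DB_HOST", "10.0.0.3")   # private IP of the instance
    user = os.environ.get("DB_USER", "test-user")
    password = os.environ.get("DB_PASS", "test-pswd")
    name = os.environ.get("DB_NAME", "test-database")
    # quote_plus escapes characters like '@' or '/' in the password
    return f"postgresql+psycopg2://{user}:{quote_plus(password)}@{host}:5432/{name}"

dsn = build_pg_dsn()
print(dsn)
```

The resulting string can be passed straight to `sqlalchemy.create_engine(dsn)` in place of the `URL` object used above.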

Dockerfile

FROM python:3-alpine

WORKDIR /app
COPY main.py .
COPY requirements.txt .
COPY test_service_acctkey.json .

COPY start.sh .
RUN chmod +x ./start.sh

RUN wget https://dl.google.com/cloudsql/cloud_sql_proxy.linux.amd64 -O cloud_sql_proxy
RUN chmod +x cloud_sql_proxy

RUN pip install --no-cache-dir -r requirements.txt

CMD ["./start.sh"]

start.sh

#!/bin/sh

./cloud_sql_proxy -instances=<INSTANCE_CONNECTION_NAME>=tcp:5432 -credential_file=<PATH_TO_SERVICE_ACCTKEY> -ip_address_types=PRIVATE &

chmod +x cloud_sql_proxy
sleep 10
gunicorn --bind :8080 --workers 1 --threads 8 --timeout 0 main:app

main.py

from flask import Flask
import sqlalchemy
import os

db_user='USER'
db_pass='PASS'
db_name='DATABASE'
db_engine = None

host = '127.0.0.1'
port ='5432'

drivername='postgresql+psycopg2'

app = Flask(__name__)

def _create_engine():

    return sqlalchemy.create_engine(
        sqlalchemy.engine.url.URL(
            drivername=drivername,
            host=host,
            username=db_user,
            password=db_pass,
            database=db_name,
        )
    )


def get_engine():
    global db_engine
    if not db_engine:
        db_engine = _create_engine()
    return db_engine


def read_postgres(conn):
    sql = """
    select * from testtable;
    """
    result = conn.execute(sqlalchemy.text(sql)).fetchall()
    return result

def connect_postgres():
    with get_engine().connect() as conn:
        result = read_postgres(conn)
        return result[0][1]

@app.route('/')
def entry():
    sql_data = connect_postgres()
    return f'TEST: {sql_data}'

if __name__ == '__main__':
    app.run(debug=True, host="0.0.0.0", port=int(os.environ.get("PORT", 8080)))

Deployment command (for testing)

gcloud run deploy SERVICE_NAME \
--source . \
--service-account=SERVICE_ACCT_ADDRESS \
--max-instances=3 \
--allow-unauthenticated \
--region=us-central1 \
--vpc-connector=SERVERLESS_CONNECTOR_NAME \
--no-cpu-throttling
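As an aside, the `--add-cloudsql-instances` flag from the question mounts a Unix domain socket at `/cloudsql/<INSTANCE_CONNECTION_NAME>` inside the Cloud Run container. If you keep that flag, you can connect through the socket instead of TCP; psycopg2 treats a host value beginning with `/` as a socket directory. A sketch (the env var name and credentials are placeholders):

```python
import os

# --add-cloudsql-instances mounts a Unix socket under /cloudsql/<CONNECTION_NAME>
conn_name = os.environ.get(
    "INSTANCE_CONNECTION_NAME",
    "test-project:europe-west1:test-uat-pgsql-65d01f80",
)
socket_dir = f"/cloudsql/{conn_name}"
# psycopg2 interprets a leading-slash host as a socket directory,
# so the DSN carries it in the ?host= query parameter
dsn = f"postgresql+psycopg2://test-user:test-pswd@/test-database?host={socket_dir}"
print(dsn)
```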
