I'm running into an issue with two Docker containers: everything works fine when writing to the MySQL DB, but I'm seeing occasional errors in the MySQL log:
2020-09-18 17:03:02 21 [Warning] Aborted connection 21 to db: 'database' user: 'dbuser' host: '172.18.0.5' (Got an error reading communication packets)
2020-09-18 17:05:02 47 [Warning] Aborted connection 47 to db: 'database' user: 'dbuser' host: '172.18.0.5' (Got an error reading communication packets)
2020-09-18 17:08:02 49 [Warning] Aborted connection 49 to db: 'database' user: 'dbuser' host: '172.18.0.5' (Got an error reading communication packets)
2020-09-18 17:08:02 48 [Warning] Aborted connection 48 to db: 'database' user: 'dbuser' host: '172.18.0.5' (Got an error reading communication packets)
2020-09-18 17:08:02 50 [Warning] Aborted connection 50 to db: 'database' user: 'dbuser' host: '172.18.0.5' (Got an error reading communication packets)
2020-09-18 17:10:03 52 [Warning] Aborted connection 52 to db: 'database' user: 'dbuser' host: '172.18.0.5' (Got an error reading communication packets)
2020-09-18 17:10:03 51 [Warning] Aborted connection 51 to db: 'database' user: 'dbuser' host: '172.18.0.5' (Got an error reading communication packets)
My Python code that's writing to the DB is:
mydb = mysql.connector.connect(
    host="mysqlprd",
    user="dbuser",
    passwd="password",
    database="database"
)
mycursor = mydb.cursor()
sql = "INSERT INTO filldbstats VALUES ....."
mycursor.executemany(sql, val)
mydb.commit()
Is there a way to pass a timeout in the MySQL part of the Python script, or is it something that needs to be set on the DB side?
Thanks
As far as I know, you can't set a per-query timeout through MySQL Connector/Python itself, but here you can find a good solution using a decorator:
Note: part of the code is taken from here
import multiprocessing.pool
import functools

# Define a decorator for timeout
def timeout(max_timeout):
    """Timeout decorator, parameter in seconds."""
    def timeout_decorator(item):
        """Wrap the original function."""
        @functools.wraps(item)
        def func_wrapper(*args, **kwargs):
            """Closure for function."""
            pool = multiprocessing.pool.ThreadPool(processes=1)
            async_result = pool.apply_async(item, args, kwargs)
            # raises multiprocessing.TimeoutError if execution exceeds max_timeout
            return async_result.get(max_timeout)
        return func_wrapper
    return timeout_decorator

@timeout(3)  # kills execution if it takes more than 3 seconds
def make_consult_aux(query):
    """Send your query to the DB."""
    return res  # return your query operation

def make_consult(query):
    try:
        return make_consult_aux(query)
    except multiprocessing.TimeoutError:
        return ""  # empty answer to handle the timeout
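As a quick sanity check, here is a self-contained sketch of the same decorator with a fast and a deliberately slow function standing in for the DB write (the `fast_insert`/`slow_insert` names are mine, not from the question; `time.sleep` simulates a hung query):

```python
import multiprocessing.pool
import functools
import time

def timeout(max_timeout):
    """Timeout decorator, parameter in seconds."""
    def timeout_decorator(item):
        @functools.wraps(item)
        def func_wrapper(*args, **kwargs):
            pool = multiprocessing.pool.ThreadPool(processes=1)
            try:
                # raises multiprocessing.TimeoutError if it runs too long
                return pool.apply_async(item, args, kwargs).get(max_timeout)
            finally:
                pool.close()
        return func_wrapper
    return timeout_decorator

@timeout(1)
def fast_insert():
    return "committed"    # stands in for executemany + commit

@timeout(1)
def slow_insert():
    time.sleep(5)         # stands in for a write that hangs
    return "committed"

print(fast_insert())      # the fast path returns normally
try:
    slow_insert()
except multiprocessing.TimeoutError:
    print("insert timed out")  # the slow path is cut off after 1 second
```

Note that the timed-out worker thread keeps running in the background; the decorator only stops the caller from waiting on it, so the underlying DB call may still complete (or be aborted by the server), which matches the "Aborted connection" warnings in the log.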
Also, you may want to use psycopg2 (for PostgreSQL) or pandas for SQL queries, with pandas.read_sql(query, connection).
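For illustration, here is a minimal pandas.read_sql sketch; it uses an in-memory SQLite database as a stand-in for the MySQL server (the table name `filldbstats` is borrowed from the question, the columns are invented):

```python
import sqlite3
import pandas as pd

# In-memory SQLite table standing in for the MySQL DB
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE filldbstats (id INTEGER, value TEXT)")
conn.executemany("INSERT INTO filldbstats VALUES (?, ?)", [(1, "a"), (2, "b")])
conn.commit()

# read_sql returns the result set as a DataFrame
df = pd.read_sql("SELECT * FROM filldbstats", conn)
print(df)
```

With a real MySQL server you would pass a SQLAlchemy engine or DBAPI connection as the second argument instead of the sqlite3 connection.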