
How can I quickly query a Postgres table using a large Python list and psycopg2?

I have a Python list of upwards of 2000 elements, and a Postgres table I need to query for each of these elements. Currently I am using the following:

cur.execute("SELECT * FROM patent_info WHERE patentno = ANY (%s);", (list_of_patent,))

This queries the table for every element in the list, and afterwards I store the data in a dataframe using:

data = cur.fetchall()
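For context, a minimal sketch of that fetch step, assuming pandas is used for the dataframe (the question doesn't say which library; the helper name `rows_to_frame` is hypothetical). `cursor.description` carries the column metadata, so the column names can be preserved:

```python
import pandas as pd  # assumption: the dataframe is a pandas DataFrame


def rows_to_frame(cur):
    """Build a DataFrame from an already-executed cursor, keeping column names.

    cur.description is a sequence of column descriptors; the first field
    of each descriptor is the column name.
    """
    cols = [d[0] for d in cur.description]
    return pd.DataFrame(cur.fetchall(), columns=cols)
```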

The issue is that, with over 2000 elements, the response takes a while to generate, sometimes causing my web app or the browser to time out.

I am looking for a way to query all the elements from the table faster or more efficiently.

I thought about using a for loop over every element in the list with cur.fetchone(), but I assumed that would be slower.
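That assumption is reasonable: a per-element loop issues one network round trip per patent number, so the latency scales with the length of the list. A sketch of the loop being considered (the function name `fetch_one_by_one` is hypothetical):

```python
def fetch_one_by_one(cur, list_of_patent):
    """One SELECT per element -- one client/server round trip each time,
    so total time grows linearly with len(list_of_patent)."""
    rows = []
    for patentno in list_of_patent:
        cur.execute(
            "SELECT * FROM patent_info WHERE patentno = %s;", (patentno,)
        )
        row = cur.fetchone()
        if row is not None:  # skip patent numbers with no match
            rows.append(row)
    return rows
```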

Do a join instead:

cursor.execute("""
    select *
    from
        patent_info
        inner join
        unnest(%s) u(patentno) using (patentno)
""", (list_of_patent,))
