I'm curious about best practices, and any performance to be gained, when working with lists of large strings in Python. Specifically, I have a list that contains different PostgreSQL queries as strings, and I'm wondering about the best way to initialize the list. Consider the following two methods:
Method 1 - Creating the list in code:
query_load = [
    ("SELECT val_1, COUNT(*) as frequency "
     "FROM table "
     "GROUP BY val_1 "
     "ORDER BY val_1 ASC"),
    ("SELECT val_2, COUNT(*) as frequency "
     "FROM table "
     "GROUP BY val_2 "
     "ORDER BY val_2 ASC"),
    ("SELECT val_3, COUNT(*) as frequency "
     "FROM table "
     "GROUP BY val_3 "
     "ORDER BY val_3 ASC"),
    ("SELECT val_4, COUNT(*) as frequency "
     "FROM table "
     "GROUP BY val_4 "
     "ORDER BY val_4 ASC"),
    ...
]
Method 2 - Reading queries into list from file
with open("..Desktop/my_queries.txt") as f:
    my_list = [line.rstrip('\n') for line in f]
As far as readability and fewer lines of code go, Method 2 appears to be the better choice, but I'd like to stay in line with best practices. Additionally, will reading a file line by line into a list hurt performance by a non-trivial amount?
Just use a triple-quoted string; SQL won't care about the whitespace used to make your Python source readable.
query_load = [
    """
    SELECT val_1, COUNT(*) as frequency
    FROM table
    GROUP BY val_1
    ORDER BY val_1 ASC
    """,
    """
    SELECT val_2, COUNT(*) as frequency
    FROM table
    GROUP BY val_2
    ORDER BY val_2 ASC
    """,
    # etc
]
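To see that the extra whitespace is harmless, here's a minimal sketch using the stdlib sqlite3 driver as a stand-in for PostgreSQL (the table and data are made up for the demo); the indented, multi-line query executes exactly as a single-line one would:

```python
import sqlite3

# Stand-in demo: sqlite3 (stdlib) instead of PostgreSQL; the driver
# does not care about the leading whitespace or newlines in the query.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (val_1 INTEGER)")
conn.executemany("INSERT INTO t (val_1) VALUES (?)", [(1,), (1,), (2,)])

query = """
    SELECT val_1, COUNT(*) as frequency
    FROM t
    GROUP BY val_1
    ORDER BY val_1 ASC
"""
rows = conn.execute(query).fetchall()
# rows is [(1, 2), (2, 1)]: val_1 = 1 twice, val_1 = 2 once
```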
Whether you hard-code the queries or read them from external files is really a separate question that is only marginally related to the appearance of your source code.
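If you do go the external-file route, note that reading line by line (as in Method 2) would break any query that spans multiple lines. One option is to split on the statement terminator instead; here's a sketch under the assumption that the queries live one-per-statement in a `.sql` file, separated by semicolons (the file name and contents are made up for the demo):

```python
from pathlib import Path

# Hypothetical setup: write two queries to a .sql file, separated
# by semicolons, so the demo is self-contained.
Path("my_queries.sql").write_text(
    "SELECT val_1, COUNT(*) as frequency FROM table "
    "GROUP BY val_1 ORDER BY val_1 ASC;\n"
    "SELECT val_2, COUNT(*) as frequency FROM table "
    "GROUP BY val_2 ORDER BY val_2 ASC;\n"
)

# Split on ";" and drop empty fragments; unlike a line-by-line read,
# this keeps a multi-line query together as one string.
raw = Path("my_queries.sql").read_text()
query_load = [q.strip() for q in raw.split(";") if q.strip()]
```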