I'm trying to insert data into a Redshift table using Python in an AWS Glue job. Some columns in my test table (which is loaded from a CSV) contain single quotes (like Carl's), and when I run the script the insert statement is prepared as Insert into Details values ('1','Mark Jason','Carl's', 'NY'), so the code fails on the stray quote.
I have tried to use parameterized statements but can't get correct results. Please help me see where I'm going wrong.
import pg8000
from datetime import datetime

conn = pg8000.connect(user='xxx', password='xx', host='xxx.redshift.amazonaws.com', port=5000, database='xx')
cursor = conn.cursor()
currentDT = datetime.now().strftime("%Y-%m-%d %H:%M:%S")
dest_table = "detail"
stg_table = "test"
dest_columns = ['id', 'Name', 'Store', 'Location', 'Insert_time']
stg_columns = ['id', 'Name', 'Store', 'Location']
add_column = currentDT

dest_col_str = ""
for i in dest_columns:
    dest_col_str += i + ", "
dest_col_str = dest_col_str[:-2]

data = "select distinct * from " + stg_table
cursor.execute(data)
List = cursor.fetchall()
for i in range(len(List)):
    e = List[i]
    source_data = ""
    for x in e:
        if str(x) == '':
            source_data += 'null'
        elif type(x) == datetime or type(x) == str or type(x) == unicode:
            source_data += "'" + str(x) + "'"
        else:
            source_data += str(x)
        source_data += ','
    query = "INSERT INTO " + dest_table + "(" + dest_col_str + " , " + add_column + ") values (%s,%s,%s,%s,%s,%s,%s,%s,%s,%s,%s,%s,%s)"
    data = source_data + add_column
    cursor.execute(query, data)
cursor.execute("COMMIT;")
Use a backslash before the quote in 's', as follows.
('1','Mark Jason','Carl\'s', 'NY')
That will solve your issue.
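If you do build the literal by hand, the SQL-standard escape is to double the embedded single quote rather than backslash it, which also works in Redshift. A small sketch (sql_quote is a hypothetical helper name):

```python
def sql_quote(value):
    # SQL-standard escaping: double any embedded single quote,
    # then wrap the whole value in quotes.
    return "'" + str(value).replace("'", "''") + "'"

# sql_quote("Carl's") produces the literal 'Carl''s'
```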