How could I patch the SQL statement generated by the pandas to_sql() function, so the newly created table uses the MyISAM storage engine?
I need MyISAM because of a very large number of columns, which currently causes issues with the default storage engine InnoDB ("Row size too large (> 8126)").
I am aware that it is possible to set the storage engine explicitly in a MySQL statement upon table creation:
CREATE TABLE t (i INT) ENGINE = MYISAM;
Perhaps it's possible to patch the SQL that is generated by the to_sql() function?
This is how I currently create the table:
df.to_sql(con=engine, name="generated_" + reportConfiguration.shortName + "_" + reportConfiguration.marketplace, if_exists='replace',index=False)
To my knowledge, the MySQL storage engine cannot be set through df.to_sql()
or through engine = create_engine('mysql+pymy....://x@y/z').
However, the storage engine of a MySQL table can be changed after the table has been created: by executing an ALTER TABLE command over the engine connection, you can modify the table's storage engine.
Example:
from sqlalchemy import text

with engine.begin() as conn:
    # Convert the freshly created table to MyISAM.
    conn.execute(text("ALTER TABLE table_name ENGINE = MYISAM"))
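Putting the two steps together, here is a minimal sketch of a helper that writes the DataFrame and then converts the new table to MyISAM. The connection URL, table name, and the helper name to_sql_myisam are placeholders for illustration, not part of any library API:

```python
import pandas as pd
from sqlalchemy import create_engine, text

def to_sql_myisam(df, engine, table_name):
    """Write df to MySQL via to_sql(), then convert the table to MyISAM.

    table_name is interpolated into the ALTER TABLE statement, so it must
    come from trusted code, never from user input.
    """
    # Step 1: create (or replace) the table; it is built with the
    # server's default storage engine (usually InnoDB).
    df.to_sql(con=engine, name=table_name, if_exists="replace", index=False)
    # Step 2: switch the new table's storage engine to MyISAM.
    with engine.begin() as conn:
        conn.execute(text(f"ALTER TABLE `{table_name}` ENGINE = MYISAM"))

# Usage (connection URL is a placeholder -- adjust to your server):
# engine = create_engine("mysql+pymysql://user:password@host/dbname")
# to_sql_myisam(df, engine, "generated_report")
```

Note the caveat: if the initial CREATE TABLE already fails under InnoDB because of the row-size limit, the table never exists to be altered, so you would instead need to create the table manually with ENGINE = MYISAM and append to it with if_exists='append'.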