I am using Python with Snowflake, and I need to move a pandas dataframe with hundreds of columns into a Snowflake table. I am at a loss for a way to do this without typing out each column name and its data type in a "CREATE TABLE ..." statement inside the Snowflake connector cursor.
Does anyone have any advice? Any guidance would be helpful.
Thank you
Pandas .to_sql()
should work. If you tried it and it didn't get you the results you wanted, please include those details in your question.
The following can create the table with the correct schema:
df.head(0).to_sql(
    'MY_TABLE', con=connection, if_exists="replace", index=False)
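As a minimal end-to-end sketch of that two-step pattern (create the schema from an empty frame, then append the rows) — here using an in-memory SQLite engine as a stand-in for a Snowflake SQLAlchemy connection, which you would build with snowflake-sqlalchemy in practice:

```python
import pandas as pd
from sqlalchemy import create_engine

# Stand-in for a Snowflake engine; swap in your snowflake-sqlalchemy URL.
engine = create_engine("sqlite://")

df = pd.DataFrame({"ID": [1, 2], "NAME": ["a", "b"]})

# head(0) writes an empty frame: the table is created with the
# inferred column types but no rows.
df.head(0).to_sql("my_table", con=engine, if_exists="replace", index=False)

# Then load the data into the now-existing table.
df.to_sql("my_table", con=engine, if_exists="append", index=False)
```

This avoids hand-writing the schema entirely, since to_sql infers column types from the dataframe's dtypes.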
Otherwise, see below.
You don't need to "type" every column and its data type; you can use Python to generate a CREATE TABLE statement that matches your dataframe.
I did something similar with JS:
CREATE or replace PROCEDURE create_wide_table()
RETURNS VARCHAR
LANGUAGE javascript
AS
$$
var ss = 'create or replace table wide2000 (id int';
for (const x of Array(2000).keys()) {
    ss += ', a' + x + ' int default uniform(1, 10000, random())';
}
ss += ');';
//return ss;
var rs = snowflake.execute( { sqlText: ss } );
return 'Done.';
$$;
That will generate and execute a SQL query that looks like:
create or replace table wide2000
(id int
, a0 int default uniform(1, 10000, random())
, a1 int default uniform(1, 10000, random())
, ...
);
(from https://stackoverflow.com/a/67132536/132438 )
In your case, use Python to iterate over the dataframe's column names and dtypes, and generate the CREATE TABLE statement the same way.
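A minimal sketch of that generation step in Python. The dtype-to-Snowflake-type mapping below is a hypothetical starting point, not exhaustive; extend it to cover the dtypes that actually appear in your dataframe:

```python
import pandas as pd

# Hypothetical mapping from pandas dtypes to Snowflake column types;
# extend as needed for your data.
TYPE_MAP = {
    "int64": "INTEGER",
    "float64": "FLOAT",
    "bool": "BOOLEAN",
    "datetime64[ns]": "TIMESTAMP_NTZ",
    "object": "VARCHAR",
}

def create_table_sql(df: pd.DataFrame, table_name: str) -> str:
    """Build a CREATE TABLE statement matching the dataframe's columns."""
    cols = ", ".join(
        f'"{name}" {TYPE_MAP.get(str(dtype), "VARCHAR")}'
        for name, dtype in df.dtypes.items()
    )
    return f"create or replace table {table_name} ({cols})"

df = pd.DataFrame({"id": [1], "price": [9.99], "label": ["x"]})
print(create_table_sql(df, "MY_TABLE"))
# -> create or replace table MY_TABLE ("id" INTEGER, "price" FLOAT, "label" VARCHAR)
```

You can then pass the resulting string to your Snowflake connector cursor's execute() call, no matter how many hundreds of columns the dataframe has.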