
How to upload multiple CSV files to a SQL Server database using R

I have a number of folders, each storing >50 different CSV files with different names, different numbers of variables, and different lengths. I have to upload them to a SQL Server database (SSMS), and instead of uploading them one by one I would like to ask how to batch upload them all at once, keeping their unique names (i.e. Demographics, Biologic, Sites, Labs, OthLabs, Subjects, etc.). They are not big (under 5 MB per folder). I've tried creating a list with csv <- dir(pattern = "csv"), but haven't figured out how to use it with DBI::dbWriteTable(conn = con, name =, value =, overwrite = FALSE). I would be very grateful if someone who has dealt with this in the past could help me.

Update

The tables in question are all different, with different dimensions and lengths (i.e. (130, 12), (151, 33), (1, 6), (603, 16), etc.). What I'm trying to do is upload them under their own names. I've tried the code below:

alldat <- lapply(csv, read.csv)
for (i in alldat) {
  dbWriteTable(conn = con,
               name = "i",
               value = i,
               overwrite = FALSE)
}

Yes, it works, but it creates only one table called "i". Any recommendation on what the arguments for name = and value = should be so that the loop keeps copying the tables from R to SQL Server under their table names?

Other functions that I've tried unsuccessfully are list2env, assign, and do.call(rbind, i).

Like @r2evans, I do this using DBI rather than dbplyr (I often use the two packages together).

Below is a cut-down version of the custom function I use. You can find the full version here.

copy_r_to_sql <- function(db_connection, db, schema, sql_table_name, r_table_name) {

  suppressMessages( # mutes translation message
    DBI::dbWriteTable(
      db_connection,
      DBI::Id(
        catalog = db,
        schema = schema,
        table = sql_table_name
      ),
      r_table_name
    )
  )
}

For uploading multiple tables, either loop through them or use an apply function.
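As a minimal sketch of the loop route, assuming con is an open DBI connection and that "my_database" and "my_schema" are placeholders for your own database and schema names:

```r
# Sketch only: `con`, "my_database", and "my_schema" are placeholders.
csv_files <- list.files("C:/path/to/folder", pattern = "\\.csv$", full.names = TRUE)

# Read every file into a named list; names come from the file names
alldat <- lapply(csv_files, read.csv)
names(alldat) <- gsub("\\.csv$", "", basename(csv_files))

# Upload each data frame under its own name
for (nm in names(alldat)) {
  copy_r_to_sql(con, "my_database", "my_schema",
                sql_table_name = nm, r_table_name = alldat[[nm]])
}
```

Naming the list first means the table names travel with the data frames, so the loop never has to re-derive them from the paths.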

So, after finally trying my hand at writing functions and a loop, this is what I came up with. Thanks to @Simon.SA and @r2evans for their input.

library(DBI)

# Connect to the SQL Server database
con <- DBI::dbConnect(odbc::odbc(),
                      driver = "SQL Server",
                      server = "server_name",
                      database = "database_name",
                      uid = "username",
                      pwd = "password")

# Set the path to the folder containing the CSV files
folder <- "C:/path/to/folder"

# Get a list of the CSV files in the folder
csv_files <- list.files(folder, pattern = "\\.csv$")

# Loop through the CSV files
for (csv_file in csv_files) {
  # Read the CSV file into a data frame
  df <- read.csv(file.path(folder, csv_file))
  
  # Get the name of the table to import the data into
  table_name <- gsub("\\.csv$", "", csv_file)
  
  # Import the data into the SQL Server database
  dbWriteTable(con, table_name, df)
}
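If the loop is re-run, dbWriteTable() will fail on tables that already exist. A hedged variant that skips tables already present in the database (assuming the same con, folder, and csv_files objects as above):

```r
for (csv_file in csv_files) {
  # Derive the table name from the file name
  table_name <- gsub("\\.csv$", "", csv_file)

  # Only write the table if it is not already in the database
  if (!DBI::dbExistsTable(con, table_name)) {
    df <- read.csv(file.path(folder, csv_file))
    DBI::dbWriteTable(con, table_name, df)
  }
}
```

Alternatively, pass overwrite = TRUE to dbWriteTable() if the intent is to replace existing tables rather than skip them.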
