
Amazon RDS w/ SQL Server won't allow bulk insert from CSV source

I've tried two methods and both fall flat...

BULK INSERT TEMPUSERIMPORT1357081926 
FROM 'C:\uploads\19E0E1.csv' 
WITH (FIELDTERMINATOR = ',',ROWTERMINATOR = '\n')

You do not have permission to use the bulk load statement.

but you cannot enable that SQL role with Amazon RDS?

So I tried using OPENROWSET, but it requires ad hoc queries to be enabled, which I don't have permission to do!

You can enable ad hoc distributed queries by heading to your Amazon Management Console, navigating to the RDS menu, creating a DB parameter group with ad hoc distributed queries set to 1, and then attaching that parameter group to your DB instance.

Don't forget to reboot your DB once you have made these changes.
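After the reboot, one quick way to confirm the change took effect (not part of the original answer, but standard SQL Server) is to query `sys.configurations`:

```sql
-- Confirm 'Ad Hoc Distributed Queries' is active after the reboot;
-- value_in_use should be 1 once the parameter group change is applied.
SELECT name, value, value_in_use
FROM sys.configurations
WHERE name = 'Ad Hoc Distributed Queries';
```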

Here is the source of my information:

http://blogs.lessthandot.com/index.php/datamgmt/dbadmin/turning-on-optimize-for-ad/

Hope this helps you.

I know this question is really old, but it was the first result when I searched for bulk inserting into an AWS SQL Server RDS instance. Things have changed, and you can now do it after integrating the RDS instance with S3. I answered this in more detail on another question, but the overall gist is that you set up the instance with the proper role, put your file on S3, and then copy the file over to RDS with the following commands:

exec msdb.dbo.rds_download_from_s3
        @s3_arn_of_file='arn:aws:s3:::bucket_name/bulk_data.csv',
        @rds_file_path='D:\S3\seed_data\data.csv',
        @overwrite_file=1;
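The download runs as an asynchronous task, so it can be worth polling its status before attempting the import. A sketch using the status function AWS documents for S3 integration (not part of the original answer):

```sql
-- List the status of recent S3-integration tasks for this instance;
-- wait until the download task's lifecycle reports SUCCESS
-- before running BULK INSERT against the downloaded file.
SELECT *
FROM msdb.dbo.rds_fn_task_status(NULL, 0);
```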

Then BULK INSERT will work:

BULK INSERT [target_table] -- the target table name was missing in the original post
FROM 'D:\S3\seed_data\data.csv'
WITH
(
    FIRSTROW = 2,
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n'
)
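Once the import succeeds, the staged copy can be deleted from the instance's local storage using AWS's S3-integration cleanup procedure (a sketch, reusing the path from above):

```sql
-- Remove the downloaded file from the RDS instance's D:\S3 storage
exec msdb.dbo.rds_delete_from_filesystem
        @rds_file_path='D:\S3\seed_data\data.csv';
```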

AWS doc

2022

I'm adding this for anyone like me who wants to quickly insert data into RDS from C#.

While RDS allows CSV bulk uploads directly from S3, there are times when you just want to upload data straight from your program.

I've written a C# utility method that uses a StringBuilder to concatenate statements, doing 2000 inserts per call, which is far faster than an ORM like Dapper that does one insert per call.

This method should handle date, int, double, and varchar fields, but I haven't had to use it for character escaping or anything like that.

// call as (rows are anonymous objects, one property per column):
FastInsert.Insert(MyDbConnection, new object[] { new { someField = "someValue" } }, "my_table");


// requires: using System; using System.Collections.Generic; using System.Data;
// using System.Linq; using System.Text; plus Dapper (for connection.Execute)
class FastInsert
{
    static int rowSize = 2000;

    internal static void Insert(IDbConnection connection, object[] data, string targetTable)
    {
        // read the column names from the first row's properties
        var props = data[0].GetType().GetProperties();
        var names = props.Select(x => x.Name).ToList();

        foreach (var batch in data.Batch(rowSize))
        {
            var sb = new StringBuilder($"insert into {targetTable} ({string.Join(",", names)}) ");
            string lastLine = "";

            foreach (var row in batch)
            {
                sb.Append(lastLine);
                // format DateTime values explicitly; everything else via ToString()
                // (no quote escaping and no NULL handling, per the caveat above)
                var values = props.Select(x => x.GetValue(row))
                                  .Select(x => x is DateTime dt ? dt.ToString("yyyy-MM-dd HH:mm:ss") : x.ToString());
                lastLine = $"select '{string.Join("','", values)}' union all ";
            }

            // drop the trailing " union all " from the final select
            // (SQL Server needs no FROM clause here, so no "from dual")
            lastLine = lastLine.Substring(0, lastLine.Length - " union all ".Length);
            sb.Append(lastLine);
            var fullQuery = sb.ToString();
            connection.Execute(fullQuery); // Dapper extension method
        }
    }
}

static class Extensions
{
    public static IEnumerable<T[]> Batch<T>(this IEnumerable<T> source, int size) //split an IEnumerable into batches
    {
        T[] bucket = null;
        var count = 0;

        foreach (var item in source)
        {
            if (bucket == null)
                bucket = new T[size];

            bucket[count++] = item;

            if (count != size)
                continue;

            yield return bucket;

            bucket = null;
            count = 0;
        }

        // Return the last bucket with all remaining elements
        if (bucket != null && count > 0)
        {
            Array.Resize(ref bucket, count);
            yield return bucket;
        }
    }
}
