Upload large SQL database to Azure Table Storage
I have a database table with 10 fields and over 30 million rows. It seems like an ideal fit for Table Storage, since I only ever need to search on one column and then return the rest.
I wrote a program that fetches rows from the database and uploads them to Table Storage, but at this rate it will take at least 9 or 10 days to finish.
Is there a faster way to upload a full table into Azure Table Storage?
Cloud Storage Studio, a commercial package from Cerebrata, has this capability built in. I believe they multithread the upload, although I haven't verified that specifically. It will still take a while to get everything over the wire.
Probably the quickest way would be to upload the raw data to blob storage and write a worker role, running in the same data center, that reads the blob and writes to table storage. With a lot of threads and a good partitioning strategy you could get through it quickly. However, the time it takes to implement that might cost you more than you would save over just doing it the "slow" way.
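A minimal sketch of the fan-out part of that idea, with a hypothetical `write_rows_to_table` placeholder standing in for whatever Table Storage client call you actually use:

```python
from concurrent.futures import ThreadPoolExecutor

def write_rows_to_table(rows):
    """Placeholder for the real Table Storage write call (assumed, not a real SDK API).
    Returns the number of rows it handled."""
    return len(rows)

def parallel_upload(rows, chunk_size=1000, max_workers=8):
    """Split the dataset into fixed-size chunks and push each chunk to
    Table Storage on a thread pool, so many writes are in flight at once."""
    chunks = [rows[i:i + chunk_size] for i in range(0, len(rows), chunk_size)]
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        written = sum(pool.map(write_rows_to_table, chunks))
    return written
```

Running this inside a role in the same data center removes the WAN round-trip from every write, which is where most of the 9-to-10-day estimate comes from.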
You will get the best performance from inside the data center. If you really have that much data, it may well be worth the effort to compress it into a blob, upload that blob to storage, and then run a role in the same data center that downloads, decompresses, and inserts the contents. This will be orders of magnitude faster than attempting it remotely.
If you can also order the data by partition key, you can insert it in batches (100 entries, or 4 MB's worth, at a time). You can parallelize across batches as well.
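Table Storage entity group transactions require every entity in a batch to share the same PartitionKey and allow at most 100 entities (about 4 MB) per batch. A minimal sketch of just the grouping step, independent of any particular SDK:

```python
from itertools import groupby

def make_batches(entities, max_batch=100):
    """Group entities (dicts carrying a 'PartitionKey') into Table-Storage-legal
    batches: every batch holds a single partition key and at most max_batch
    entities. Assumes the input is already sorted by PartitionKey, as the
    answer above recommends."""
    batches = []
    for _, group in groupby(entities, key=lambda e: e["PartitionKey"]):
        group = list(group)
        for i in range(0, len(group), max_batch):
            batches.append(group[i:i + max_batch])
    return batches
```

Each resulting batch can then be submitted as one round-trip instead of 100, and independent batches can be uploaded concurrently.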
I'm not aware of anything that gives you this out of the box, so for now you will probably have to write it yourself.
Here is a stored procedure I wrote that lets you load very large datasets. You have to do one table at a time, and it comes with a few caveats, but with it I uploaded 7 GB, roughly 10 million rows, in 10 minutes.
More information is in the CodeProject article I wrote here: http://www.codeproject.com/Articles/773469/Automating-Upload-of-Large-Datasets-to-SQL-Azure
/*---------------------------------------------------------------------------------------------------------------------------------------------------------
Document Title: usp_BulkAzureInsert.sql
Script Purpose: Dynamically insert large datasets into Azure
Script Notes: 1) This assumes the current user has write access to the C drive for file copy
If the current windows user does not have access override by hardcoding in an export folder location
Leave ExportFolder as 'NULL' for C:\Users\CurrentUser\AzureExports
2) User must have permission to create permanent tables (dropped at the end of the script but used for staging)
Parameters: @DB_Schema_Table = DatabaseName.Schema.TableName (3 part no server)
@AzureTableName = DatabaseName.Schema.TableName (3 part no server)
@AzureServer = Azure Server location ending in .net (no HTTP)
@AzureClusteredIDX = Azure requires each table to have a clustered index. Comma delimited index definition.
@AzureUserName = Azure User Name
@AzurePassword = Azure Password
@ExportFolder = 'NULL' defaults to C:\Users\CurrentUser\AzureExports - Use this to override
@CleanupDatFiles = Set to 1 to delete the directory and files created in the upload process
@ViewOutput = Set to 1 to view insert information during upload
--Sample Execution
EXECUTE usp_BulkAzureInsert
@DB_Schema_Table = 'MyDatabase.dbo.Customers',
@AzureTableName = 'AZ001.dbo.Customers',
@AzureServer = 'abcdef123.database.windows.net',
@AzureClusteredIDX = 'CustomerID, FirstName, LastName',
@AzureUserName = 'AzureUserName',
@AzurePassword = 'MyPassword123',
@ExportFolder = 'NULL',
@CleanupDatFiles = 1,
@ViewOutput = 1
-----------------------------------------------------------------------------------------------------------------------------------------------------------*/
IF OBJECT_ID ( N'usp_BulkAzureInsert', N'P' ) IS NOT NULL
DROP PROCEDURE usp_BulkAzureInsert;
GO
CREATE PROCEDURE usp_BulkAzureInsert
@DB_Schema_Table NVARCHAR(100),
@AzureTableName NVARCHAR(100),
@AzureClusteredIDX NVARCHAR(100),
@AzureServer NVARCHAR(100),
@AzureUserName NVARCHAR(100),
@AzurePassword NVARCHAR(100),
@ExportFolder NVARCHAR(100),
@CleanupDatFiles BIT,
@ViewOutput BIT
AS
/*---------------------------------------------------------------------------------------------------------------------------------------------------------
Start with Error Checks
-----------------------------------------------------------------------------------------------------------------------------------------------------------*/
IF ( SELECT CONVERT(INT, ISNULL(value, value_in_use)) FROM sys.configurations WHERE name = N'xp_cmdshell' ) = 0
BEGIN
RAISERROR ('ERROR: xp_cmdshell is not enabled on this server/database',16,1)
RETURN
END
/*---------------------------------------------------------------------------------------------------------------------------------------------------------
Declare and Set Script Variables
-----------------------------------------------------------------------------------------------------------------------------------------------------------*/
IF ( @ViewOutput = 1 )
BEGIN
SET NOCOUNT ON;
END
DECLARE @CMD NVARCHAR(1000), @SQL NVARCHAR(MAX), @i TINYINT = 1, @NTILE VARCHAR(10), @NTILE_Value TINYINT, @TempTableName VARCHAR(1000),
@ColumnNames NVARCHAR(MAX), @TableName VARCHAR(100), @Server NVARCHAR(100)
--Set the export folder to the default location if the override was not used
IF @ExportFolder = 'NULL'
BEGIN
SET @ExportFolder = N'C:\Users\' +
CAST(REVERSE(LEFT(REVERSE(SYSTEM_USER), CHARINDEX('\', REVERSE(SYSTEM_USER))-1)) AS VARCHAR(100)) +
N'\AzureExports';
END;
--Set a permanent object name based on the source table name
SET @TempTableName = ( LEFT(@DB_Schema_Table, CHARINDEX('.',@DB_Schema_Table)-1) +
'.dbo.TempAzure' +
CAST(REVERSE(LEFT(REVERSE(@DB_Schema_Table), CHARINDEX('.', REVERSE(@DB_Schema_Table))-1)) AS VARCHAR(100)) )
/*---------------------------------------------------------------------------------------------------------------------------------------------------------
Calculate the amount of files to split the dataset into (No more than 250,000 lines per file)
-----------------------------------------------------------------------------------------------------------------------------------------------------------*/
SET @SQL = ' SELECT @NTILE = CEILING((CAST(COUNT(*) AS FLOAT) / 250000)) FROM ' + @DB_Schema_Table +'; ';
EXECUTE sp_executesql @SQL, N'@NTILE VARCHAR(100) OUTPUT', @NTILE = @NTILE OUTPUT;
SET @NTILE_Value = CAST(@NTILE AS TINYINT);
SET @TableName = CAST(REVERSE(LEFT(REVERSE(@DB_Schema_Table), CHARINDEX('.', REVERSE(@DB_Schema_Table))-1)) AS VARCHAR(100));
SET @Server = ( SELECT @@SERVERNAME );
/*---------------------------------------------------------------------------------------------------------------------------------------------------------
Create a folder to stage the DAT files in
-----------------------------------------------------------------------------------------------------------------------------------------------------------*/
--Remove the directory if it already exists and was not previously deleted
SET @CMD = N'rmDir /Q /S ' + @ExportFolder;
EXECUTE master.dbo.xp_cmdshell @CMD, NO_OUTPUT;
--Create a folder to hold the export files
SET @CMD = N' mkDir ' + @ExportFolder;
EXECUTE master.dbo.xp_cmdshell @CMD, NO_OUTPUT;
/*---------------------------------------------------------------------------------------------------------------------------------------------------------
Create a staging table that breaks the file into sections based on the NTILE_Value
-----------------------------------------------------------------------------------------------------------------------------------------------------------*/
--Find the names of the columns in the table
IF OBJECT_ID('tempdb.dbo.#ColumnNames') IS NOT NULL
DROP TABLE #ColumnNames
CREATE TABLE #ColumnNames
(
ColumnOrder INTEGER IDENTITY(1,1) NOT NULL,
ColumnName NVARCHAR(100) NOT NULL
);
INSERT INTO #ColumnNames
SELECT COLUMN_NAME
FROM information_schema.columns
WHERE table_name = @TableName
ORDER BY ordinal_position
--Create a list of the column names
SELECT @ColumnNames = COALESCE(@ColumnNames + ', ', '') + CAST(ColumnName AS VARCHAR(MAX))
FROM #ColumnNames;
--Split the results by the NTILE_Value
DECLARE @Column1 NVARCHAR(100) = ( SELECT ColumnName FROM #ColumnNames WHERE ColumnOrder = 1 );
SET @SQL = ' IF OBJECT_ID(''' + @TempTableName + ''') IS NOT NULL
DROP TABLE ' + @TempTableName + '
SELECT ' + @ColumnNames + ', ' + '
NTILE(' + @NTILE + ') OVER(ORDER BY ' + @Column1 + ') AS NTILE_Value
INTO ' + @TempTableName + '
FROM ' + @DB_Schema_Table
EXECUTE (@SQL);
--Now split the dataset into equal sizes creating a DAT file for each batch
WHILE @i <= @NTILE_Value
BEGIN
SET @SQL = 'IF OBJECT_ID(''' + @TempTableName + 'DatFile'') IS NOT NULL
DROP TABLE ' + @TempTableName + 'DatFile
SELECT ' + @ColumnNames + '
INTO ' + @TempTableName + 'DatFile
FROM ' + @TempTableName + '
WHERE NTILE_Value = ' + CAST(@i AS VARCHAR(3)) + '
CREATE CLUSTERED INDEX IDX_TempAzureData ON ' + @TempTableName + 'DatFile ( ' + @AzureClusteredIDX + ' )';
EXECUTE (@SQL);
SET @CMD = N'bcp ' + @TempTableName + 'DatFile out ' +
@ExportFolder + N'\' + @TableName + 'DatFile' +
CAST(@i AS NVARCHAR(3)) + '.dat -S ' + @Server + ' -T -n -q';
IF ( @ViewOutput = 1 )
BEGIN
EXECUTE master.dbo.xp_cmdshell @CMD;
END
ELSE
EXECUTE master.dbo.xp_cmdshell @CMD, NO_OUTPUT;
SET @i += 1;
END
--Clean up the temp tables
SET @SQL = ' DROP TABLE ' + @TempTableName;
EXECUTE (@SQL);
SET @SQL = ' DROP TABLE ' + @TempTableName + 'DatFile' ;
EXECUTE (@SQL);
/*---------------------------------------------------------------------------------------------------------------------------------------------------------
Insert the data into the AzureDB
-----------------------------------------------------------------------------------------------------------------------------------------------------------*/
--Reset the Variable
SET @i = 1;
--Move each batch file into the DB
WHILE @i <= @NTILE_Value
BEGIN
SET @CMD = N'Bcp ' + @AzureTableName + ' in ' +
@ExportFolder + N'\' + @TableName + 'DatFile' + CAST(@i AS NVARCHAR(3)) + '.dat -n -U ' +
@AzureUserName + '@' + LEFT(@AzureServer, CHARINDEX('.',@AzureServer)-1) +
N' -S tcp:' + @AzureServer +
N' -P ' + @AzurePassword;
IF ( @ViewOutput = 1 )
BEGIN
EXECUTE master.dbo.xp_cmdshell @CMD;
END
ELSE
EXECUTE master.dbo.xp_cmdshell @CMD, NO_OUTPUT;
SET @i += 1;
END
/*---------------------------------------------------------------------------------------------------------------------------------------------------------
Cleanup the finished tables
-----------------------------------------------------------------------------------------------------------------------------------------------------------*/
IF ( @CleanupDatFiles = 1 )
BEGIN
SET @CMD = N'rmDir /Q /S ' + @ExportFolder;
EXECUTE master.dbo.xp_cmdshell @CMD, NO_OUTPUT;
END
/*---------------------------------------------------------------------------------------------------------------------------------------------------------
Script End
-----------------------------------------------------------------------------------------------------------------------------------------------------------*/
IF ( @ViewOutput = 1 )
BEGIN
SET NOCOUNT OFF;
END