
Copying data from one database to another - Out of Memory Exception

I have a task where I want to copy all data from one database to another database, skipping 2 tables. There are more than 200 tables.

I already have the table structure ready in my 2nd database.

So as a solution I created a page, and on a button click I run the code below:

DataSet ds = new DataSet();
string connectionString = "Data Source=COMP112\\MSSQLSERVER2014;Initial Catalog=HCMBL;Integrated Security=True;Persist Security Info=True";
SqlConnection con = new SqlConnection(connectionString);
// Get the table names from the database, skipping the excluded tables
string sqlTable = "SELECT * FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_TYPE='BASE TABLE' and TABLE_Schema='" + Session["SchemaName"].ToString() + "' and TABLE_NAME!='ENTRY' and TABLE_NAME!='OT' and TABLE_NAME!='BL_ENTRY' and TABLE_NAME!='BL_OT'";
con.Open();
SqlDataAdapter da = new SqlDataAdapter();
SqlCommand cmd = new SqlCommand(sqlTable, con);
cmd.CommandType = CommandType.Text;
da.SelectCommand = cmd;
da.Fill(ds);
con.Close();
// Read the source connection string from Web.config
string strcon = ConfigurationManager.ConnectionStrings["SPSchema"].ConnectionString;

for (int i = 0; i < ds.Tables[0].Rows.Count; i++)
{
    if (!(ds.Tables[0].Rows[i]["TABLE_NAME"].ToString().Contains("Asp")))
    {
        // Empty the destination table before copying
        string deleteQuery = "Truncate table " + Session["SchemaName"].ToString() + "." + ds.Tables[0].Rows[i]["TABLE_NAME"];
        con.Open();
        SqlCommand cmdDelete = new SqlCommand(deleteQuery, con);
        cmdDelete.ExecuteNonQuery();
        con.Close();

        // Load the entire source table into memory
        DataSet dataSet = new DataSet();
        SqlConnection conn = new SqlConnection(strcon);
        conn.Open();
        string selectData = "select * from " + Session["SchemaName"].ToString() + "." + ds.Tables[0].Rows[i]["TABLE_NAME"];
        DataTable dataTable = new DataTable();
        SqlDataAdapter dataAdapter = new SqlDataAdapter(selectData, conn);
        dataAdapter.FillSchema(dataSet, SchemaType.Mapped);
        dataAdapter.Fill(dataSet);
        dataTable = dataSet.Tables[0];
        conn.Close();

        if (dataSet.Tables[0].Rows.Count > 0)
        {
            // Connect to the second database and insert the rows
            SqlConnection conn2 = new SqlConnection(connectionString);
            conn2.Open();
            SqlBulkCopy bulkCopy = new SqlBulkCopy(conn2);
            bulkCopy.DestinationTableName = Session["SchemaName"].ToString() + "." + ds.Tables[0].Rows[i]["TABLE_NAME"].ToString();
            bulkCopy.WriteToServer(dataTable);
            conn2.Close();
        }
    }
}

When I run the above code, it throws an out of memory exception after inserting data into fewer than 10 tables, and the program crashes.

How can I handle this? I tried increasing SQL Server's memory limit, but I still get the same error.

Is there any other way to achieve the task?

What you are doing is very far from the best solution. You are using an ASP.NET MVC process to load all the data of your entire database into memory, and then output it to another database. If your database is anything more than small and trivial, that will most definitely fill your process's allotted memory.

This type of task should never be done through the memory of a process, but rather by using some form of Backup/Restore pattern.
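As a rough illustration of the Backup/Restore route in T-SQL (the database names, logical file names, and paths below are placeholders; since a restore copies the whole database, you would drop the tables you want to skip afterwards):

-- Back up the source database to a file (path is a placeholder)
BACKUP DATABASE SourceDb TO DISK = N'C:\Backups\SourceDb.bak' WITH INIT;

-- Restore the backup under the target database name
-- (the logical names used in WITH MOVE must match the source database's files)
RESTORE DATABASE TargetDb
FROM DISK = N'C:\Backups\SourceDb.bak'
WITH MOVE N'SourceDb' TO N'C:\Data\TargetDb.mdf',
     MOVE N'SourceDb_log' TO N'C:\Data\TargetDb_log.ldf',
     REPLACE;

-- Then remove the tables that should not be copied (placeholder names)
-- DROP TABLE TargetDb.dbo.ENTRY;
-- DROP TABLE TargetDb.dbo.OT;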

You should look into SSIS projects and create an extract, transform, and load (ETL) solution, which can be triggered from your ASP.NET MVC solution asynchronously.

An SSIS solution can be triggered from C# code in this way:

// Requires a reference to Microsoft.SqlServer.ManagedDTS and
// using Microsoft.SqlServer.Dts.Runtime;
var app = new Application();
var package = app.LoadPackage("compiled-package.dtsx", null);
var results = package.Execute(); // returns a DTSExecResult (Success/Failure)

See this question for a little more information (not specifically about duplicating databases, but has information about triggering SSIS packages from code): How to execute an SSIS package from .NET?

Alternatively

You also have the option of running a query against both databases at once; however, this requires some additional plumbing. The user account of your ASP.NET MVC solution needs to have access to both databases. If your databases are hosted on different servers, you also need to link one server to the other: Create linked servers
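A minimal sketch of creating such a link in T-SQL (the linked server name and data source below are placeholders; see the linked documentation for the full set of options):

-- Register the remote server as a linked server
EXEC sp_addlinkedserver
    @server = N'REMOTESRV',
    @srvproduct = N'',
    @provider = N'SQLNCLI',
    @datasrc = N'RemoteHost\MSSQLSERVER2014';

-- Map the current login to itself on the linked server
EXEC sp_addlinkedsrvlogin
    @rmtsrvname = N'REMOTESRV',
    @useself = N'TRUE';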

To perform an insert directly from the output of a select, consider this:

string source = "NAME_OF_SOURCE_DATABASE";
string target = "NAME_OF_TARGET_DATABASE";
string schema = Session["SchemaName"].ToString();
string table = ds.Tables[0].Rows[i]["TABLE_NAME"].ToString();

// Uncomment this if you need to deal with autoincrement columns
/*string idInsQuery = $"SET IDENTITY_INSERT {target}.{schema}.{table} ON";
var idInsCommand = new SqlCommand(idInsQuery, conn);
idInsCommand.ExecuteNonQuery();*/

string insQuery = $"INSERT INTO {target}.{schema}.{table} SELECT * FROM {source}.{schema}.{table}";
var insCommand = new SqlCommand(insQuery, conn);
insCommand.ExecuteNonQuery();

// Uncomment this if you need to deal with autoincrement columns
/*string idInsQuery2 = $"SET IDENTITY_INSERT {target}.{schema}.{table} OFF";
var idInsCommand2 = new SqlCommand(idInsQuery2, conn);
idInsCommand2.ExecuteNonQuery();*/

This will only work if the table structures are identical. There might be problems with autoincrement ids or columns with default values, too.

This will copy data from a table in database 1 to a table in database 2

INSERT INTO db2.dbo.table2 (col1, col2)
SELECT col1, col2 FROM db1.dbo.table1

Run this SQL statement and the data will be copied without a round trip to your app.

Let me know if you find my approach useful. First of all, why write a whole application to do this job when SQL Server has built-in features to do it?

My approach would be to configure a linked server and decide which tables you want to copy and which you do not: https://docs.microsoft.com/en-us/sql/relational-databases/linked-servers/create-linked-servers-sql-server-database-engine

Secondly, you can write a simple stored procedure and schedule it in SQL Server to push data into the other server's database on whatever schedule you need. This way you can control it in any number of ways, for example handling any dependencies (table level or business level).
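As a rough sketch, such a stored procedure could simply run an INSERT ... SELECT per table across the linked server (the linked server name, database, schema, and table names below are placeholders; the real procedure would list every table you want to copy):

CREATE PROCEDURE dbo.CopyTablesToTarget
AS
BEGIN
    SET NOCOUNT ON;

    -- Empty the remote destination table, then push the local source rows across the linked server
    DELETE FROM REMOTESRV.TargetDb.dbo.Table1;
    INSERT INTO REMOTESRV.TargetDb.dbo.Table1
    SELECT * FROM dbo.Table1;

    -- ...repeat for each table to copy, leaving out the excluded ones
END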

To do this in T-SQL, you can use the following system stored procedures to schedule a daily job. This example schedules the job daily at 1:00 AM. See the Microsoft documentation for details on the syntax of the individual stored procedures and the valid ranges of their parameters.

DECLARE @job_name NVARCHAR(128), @description NVARCHAR(512), @owner_login_name NVARCHAR(128), @database_name NVARCHAR(128);

SET @job_name = N'Some Title';
SET @description = N'Periodically do something';
SET @owner_login_name = N'login';
SET @database_name = N'Database_Name';

-- Delete job if it already exists:
IF EXISTS(SELECT job_id FROM msdb.dbo.sysjobs WHERE (name = @job_name))
BEGIN
    EXEC msdb.dbo.sp_delete_job
        @job_name = @job_name;
END

-- Create the job:
EXEC  msdb.dbo.sp_add_job
    @job_name=@job_name, 
    @enabled=1, 
    @notify_level_eventlog=0, 
    @notify_level_email=2, 
    @notify_level_netsend=2, 
    @notify_level_page=2, 
    @delete_level=0, 
    @description=@description, 
    @category_name=N'[Uncategorized (Local)]', 
    @owner_login_name=@owner_login_name;

-- Add server:
EXEC msdb.dbo.sp_add_jobserver @job_name=@job_name;

-- Add step to execute SQL:
EXEC msdb.dbo.sp_add_jobstep
    @job_name=@job_name,
    @step_name=N'Execute SQL', 
    @step_id=1, 
    @cmdexec_success_code=0, 
    @on_success_action=1, 
    @on_fail_action=2, 
    @retry_attempts=0, 
    @retry_interval=0, 
    @os_run_priority=0, 
    @subsystem=N'TSQL', 
    @command=N'EXEC my_stored_procedure; -- OR ANY SQL STATEMENT', 
    @database_name=@database_name, 
    @flags=0;

-- Update job to set start step:
EXEC msdb.dbo.sp_update_job
    @job_name=@job_name, 
    @enabled=1, 
    @start_step_id=1, 
    @notify_level_eventlog=0, 
    @notify_level_email=2, 
    @notify_level_netsend=2, 
    @notify_level_page=2, 
    @delete_level=0, 
    @description=@description, 
    @category_name=N'[Uncategorized (Local)]', 
    @owner_login_name=@owner_login_name, 
    @notify_email_operator_name=N'', 
    @notify_netsend_operator_name=N'', 
    @notify_page_operator_name=N'';

-- Schedule job:
EXEC msdb.dbo.sp_add_jobschedule
    @job_name=@job_name,
    @name=N'Daily',
    @enabled=1,
    @freq_type=4,
    @freq_interval=1, 
    @freq_subday_type=1, 
    @freq_subday_interval=0, 
    @freq_relative_interval=0, 
    @freq_recurrence_factor=1, 
    @active_start_date=20170101, --YYYYMMDD
    @active_end_date=99991231, --YYYYMMDD (this represents no end date)
    @active_start_time=010000, --HHMMSS
    @active_end_time=235959; --HHMMSS

Let me know in case you need more details on this.

Thanks, Ayan
