
How can I speed up this C# function / SQL insert?

This code serializes an array of integers and then inserts it into a SQL table. It isn't as fast as I need it to be. Could I be doing anything more efficiently?

Thanks!

    public void SetItem(long Id, int[] items)
    {
        using (MemoryStream stream = new MemoryStream())
        {
            foreach (int d in items)
            {
                var bin = BitConverter.GetBytes(d); //Serialize
                stream.Write(bin, 0, bin.Length);
            }
            var array = stream.ToArray();

            using (SqlCommand cmd = new SqlCommand("INSERT INTO Items(Id, Item, DateCreated) VALUES (@Id, @binaryValue, @dateCreated)", sqlConnection))
            {
                cmd.Parameters.Add("@binaryValue", SqlDbType.VarBinary, array.Length).Value = array;
                cmd.Parameters.Add("@Id", SqlDbType.BigInt).Value = Id;
                cmd.Parameters.Add("@dateCreated", SqlDbType.DateTime2).Value = DateTime.Now;
                cmd.ExecuteNonQuery();
            }
        }
    }

I advise you to divide this function in two: one part building the byte array, the other doing the insert into the DB.

Then run profiling and see whether your byte array code is slow or whether it is a DB problem.

Maybe you are trying to accelerate something that isn't slow :)
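
A minimal timing sketch of that split (Serialize and Insert are hypothetical helpers holding the two halves of the original SetItem; nothing here comes from the original code):

    // Hypothetical split of SetItem so each half can be profiled separately.
    // Serialize() would hold the BitConverter loop, Insert() the SqlCommand part.
    public void SetItemTimed(long id, int[] items)
    {
        var sw = System.Diagnostics.Stopwatch.StartNew();
        byte[] payload = Serialize(items);
        Console.WriteLine($"serialize: {sw.ElapsedMilliseconds} ms");

        sw.Restart();
        Insert(id, payload);
        Console.WriteLine($"insert:    {sw.ElapsedMilliseconds} ms");
    }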

If you're inserting a lot of rows, the SqlBulkCopy class is much faster than calling INSERT many times. See this blog post for an example.
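
A rough sketch of how that could apply here, assuming the Items table and columns from the question, an already-open sqlConnection, and a hypothetical rowsToInsert collection:

    // Sketch: stage the rows in a DataTable, then bulk-load them in one operation.
    var table = new DataTable();
    table.Columns.Add("Id", typeof(long));
    table.Columns.Add("Item", typeof(byte[]));
    table.Columns.Add("DateCreated", typeof(DateTime));

    foreach (var row in rowsToInsert)   // hypothetical collection of (Id, payload) pairs
        table.Rows.Add(row.Id, row.Payload, DateTime.Now);

    using (var bulk = new SqlBulkCopy(sqlConnection))
    {
        bulk.DestinationTableName = "Items";
        bulk.ColumnMappings.Add("Id", "Id");
        bulk.ColumnMappings.Add("Item", "Item");
        bulk.ColumnMappings.Add("DateCreated", "DateCreated");
        bulk.WriteToServer(table);
    }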

The first thing I would try is preallocating the byte[] for the memory stream:

var array = new byte[sizeof(int) * items.Length];
using (MemoryStream stream = new MemoryStream(array))
{
    // ... rest is almost the same
}
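
Filled out, that approach looks roughly like this; the preallocated buffer itself is the result, so the stream.ToArray() copy is no longer needed:

    var array = new byte[sizeof(int) * items.Length];
    using (MemoryStream stream = new MemoryStream(array))
    {
        foreach (int d in items)
        {
            var bin = BitConverter.GetBytes(d); // still one small allocation per int
            stream.Write(bin, 0, bin.Length);
        }
    }
    // 'array' now holds the serialized data; pass it to the SqlCommand as before.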

You could create a stored procedure with your insert command. It is faster because the procedure is already compiled on the SQL side.

Something like this:

SqlConnection conn = new SqlConnection(actual_string);
conn.Open();

// Create the command that calls the procedure
SqlCommand cmd = new SqlCommand("EXEC insert_test @var1, @var2, @var3, @str1, @str2", conn);

// Iterate through all of the objects
try {
    for (int i = 0; i < 10000; i++) {
        cmd.Parameters.Clear();
        cmd.Parameters.Add(new SqlParameter("@var1", var1));
        cmd.Parameters.Add(new SqlParameter("@var2", var2));
        cmd.Parameters.Add(new SqlParameter("@var3", var3));
        cmd.Parameters.Add(new SqlParameter("@str1", str1));
        cmd.Parameters.Add(new SqlParameter("@str2", str2));

        // Execute the insert
        cmd.ExecuteNonQuery();
    }
} finally {
    conn.Close();
}

But my preference is to send an XML document to a procedure.

You can see more in this article.
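
For illustration, a hedged sketch of the XML route; the procedure name insert_items_xml and its @items parameter are hypothetical, and on the SQL side the procedure would shred the document with nodes()/value() before inserting:

    // Sketch: pack all values into one XML document and make a single call.
    var xml = new System.Text.StringBuilder("<items>");
    foreach (int d in items)
        xml.Append("<i>").Append(d).Append("</i>");
    xml.Append("</items>");

    using (var cmd = new SqlCommand("insert_items_xml", sqlConnection))
    {
        cmd.CommandType = CommandType.StoredProcedure;
        cmd.Parameters.Add("@Id", SqlDbType.BigInt).Value = Id;
        cmd.Parameters.Add("@items", SqlDbType.Xml).Value = xml.ToString();
        cmd.ExecuteNonQuery();
    }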

My first inclination would be to pre-allocate an array to be used by the MemoryStream, and then use BinaryWriter to write to it:

var OutputArray = new byte[items.Length * 4];
using (var ms = new MemoryStream(OutputArray))
{
    using (var writer = new BinaryWriter(ms))
    {
        foreach (var i in items)
        {
            writer.Write(i);
        }
    }
}
// You can now send the OutputArray to SQL server

BinaryWriter does not use BitConverter.GetBytes internally. Rather, it extracts the bytes one at a time from the int and places them in a buffer. Then the buffer is written to the stream. BitConverter, on the other hand, allocates a new 4-byte buffer every time you call it.
