
Strange EF6 performance issue on ExecuteStoreCommand

I have a strange performance issue executing a simple SQL MERGE command with Entity Framework 6.

First, my Entity Framework code:

var command = @"MERGE [StringData] AS TARGET
                USING (VALUES (@DCStringID_Value, @TimeStamp_Value)) AS SOURCE ([DCStringID], [TimeStamp])
                    ON TARGET.[DCStringID] = SOURCE.[DCStringID] AND TARGET.[TimeStamp] = SOURCE.[TimeStamp]
                WHEN MATCHED THEN
                   UPDATE
                      SET [DCVoltage] = @DCVoltage_Value,
                          [DCCurrent] = @DCCurrent_Value
                WHEN NOT MATCHED THEN
                      INSERT ([DCStringID], [TimeStamp], [DCVoltage], [DCCurrent]) 
                      VALUES (@DCStringID_Value, @TimeStamp_Value, @DCVoltage_Value, @DCCurrent_Value);";

using (EntityModel context = new EntityModel())
{                            
    for (int i = 0; i < 100; i++)
    {
        var entity = _buffer.Dequeue();

        context.ContextAdapter.ObjectContext.ExecuteStoreCommand(command, new object[]
            {
                new SqlParameter("@DCStringID_Value", entity.DCStringID),
                new SqlParameter("@TimeStamp_Value", entity.TimeStamp),
                new SqlParameter("@DCVoltage_Value", entity.DCVoltage),
                new SqlParameter("@DCCurrent_Value", entity.DCCurrent),
            });
    }
}

Execution time: ~20 seconds.

This looked a little slow, so I tried running the same command directly in Management Studio (also 100 times in a row).

SQL Server Management Studio:


Execution time: <1 second.

OK, that is strange!?

Some tests:

First I compared both execution plans (Entity Framework and SSMS), but they are absolutely identical.

(Screenshot comparing the two identical execution plans omitted.)

Second, I tried using a transaction inside my code.

using (PowerdooModel context = PowerdooModel.CreateModel())
{
    using (var dbContextTransaction = context.Database.BeginTransaction())
    {
        try
        {
            for (int i = 0; i < 100; i++)
            {
                var entity = _buffer.Dequeue(); // dequeue inside the loop, as in the first snippet

                context.ContextAdapter.ObjectContext.ExecuteStoreCommand(command, new object[]
                    {
                        new SqlParameter("@DCStringID_Value", entity.DCStringID),
                        new SqlParameter("@TimeStamp_Value", entity.TimeStamp),
                        new SqlParameter("@DCVoltage_Value", entity.DCVoltage),
                        new SqlParameter("@DCCurrent_Value", entity.DCCurrent),
                    });
            }

            dbContextTransaction.Commit();
        }
        catch (Exception)
        {
            dbContextTransaction.Rollback();
            throw; // rethrow so failures are not silently swallowed
        }
    }
}

Third, I added 'OPTION (RECOMPILE)' to the command to avoid parameter sniffing.

Execution time: still ~10 seconds, which is still very poor performance.

Question: what am I doing wrong? Please give me a hint.

-------- Some more tests - edited 18.11.2016 ---------

If I execute the commands inside a transaction (like above), the following times come up:

Duration complete: 00:00:06.5936006

Average Command: 00:00:00.0653457

Commit: 00:00:00.0590299
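
For reference, a minimal sketch of how such numbers can be collected with Stopwatch (assuming the same command, _buffer, context and transaction as in the snippet above; the timer names are illustrative):

using System;
using System.Data.SqlClient;
using System.Diagnostics;

// Inside the transaction block shown above:
var total = Stopwatch.StartNew();
var perCommand = new Stopwatch();

for (int i = 0; i < 100; i++)
{
    var entity = _buffer.Dequeue();

    perCommand.Start();
    context.ContextAdapter.ObjectContext.ExecuteStoreCommand(command,
        new SqlParameter("@DCStringID_Value", entity.DCStringID),
        new SqlParameter("@TimeStamp_Value", entity.TimeStamp),
        new SqlParameter("@DCVoltage_Value", entity.DCVoltage),
        new SqlParameter("@DCCurrent_Value", entity.DCCurrent));
    perCommand.Stop();
}

var commit = Stopwatch.StartNew();
dbContextTransaction.Commit();
commit.Stop();

Console.WriteLine("Duration complete: {0}", total.Elapsed);
Console.WriteLine("Average Command:   {0}", TimeSpan.FromTicks(perCommand.Elapsed.Ticks / 100));
Console.WriteLine("Commit:            {0}", commit.Elapsed);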

Isn't it strange that the commit takes nearly no time, and that the average command takes nearly as long as the commit itself?

In SSMS you are running one long batch that contains 100 separate MERGE statements. In C# you are running 100 separate batches, each with its own round trip to the server. Obviously that takes longer.

Running 100 separate batches in 100 separate transactions obviously takes longer than running 100 batches in 1 transaction. Your measurements confirm that and show you how much longer.


To make it efficient, use a single MERGE statement that processes all 100 rows from a table-valued parameter in one go. See also Table-Valued Parameters for .NET Framework.

Often a table-valued parameter is a parameter of a stored procedure, but you don't have to use a stored procedure. It could be a single statement, but instead of multiple simple scalar parameters you'd pass the whole table at once.

I have never used Entity Framework, so I can't show you a C# example of how to call it. I'm sure that if you search for "how to pass table-valued parameter in entity framework" you'll find an example. I use the DataTable class to pass a table as a parameter.
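
For illustration, a minimal C# sketch of building such a DataTable (it reuses the entity fields and the _buffer queue from the question; the column types are assumptions and must match the table type defined below):

using System;
using System.Data;

var table = new DataTable();
table.Columns.Add("DCStringID", typeof(int));
table.Columns.Add("TimeStamp",  typeof(DateTime));
table.Columns.Add("DCVoltage",  typeof(double));
table.Columns.Add("DCCurrent",  typeof(double));

// Collect all buffered rows into the table instead of sending them one by one.
while (_buffer.Count > 0)
{
    var entity = _buffer.Dequeue();
    table.Rows.Add(entity.DCStringID, entity.TimeStamp, entity.DCVoltage, entity.DCCurrent);
}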

I can show you an example of a T-SQL stored procedure.

First you define a table type that pretty much follows the definition of your StringData table:

CREATE TYPE dbo.StringDataTableType AS TABLE(
    DCStringID int NOT NULL,
    TimeStamp datetime2(0) NOT NULL,
    DCVoltage float NOT NULL,
    DCCurrent float NOT NULL
)

Then you use it as a type for the parameter:

CREATE PROCEDURE dbo.MergeStringData
    @ParamRows dbo.StringDataTableType READONLY
AS
BEGIN
    SET NOCOUNT ON;
    SET XACT_ABORT ON;

    BEGIN TRANSACTION;
    BEGIN TRY

        MERGE INTO dbo.StringData WITH (HOLDLOCK) as Dst
        USING
        (
            SELECT
                TT.DCStringID
                ,TT.TimeStamp
                ,TT.DCVoltage
                ,TT.DCCurrent
            FROM
                @ParamRows AS TT
        ) AS Src
        ON
            Dst.DCStringID = Src.DCStringID AND
            Dst.TimeStamp  = Src.TimeStamp
        WHEN MATCHED THEN
        UPDATE SET
            Dst.DCVoltage  = Src.DCVoltage
            ,Dst.DCCurrent = Src.DCCurrent
        WHEN NOT MATCHED BY TARGET THEN
        INSERT
            (DCStringID
            ,TimeStamp
            ,DCVoltage
            ,DCCurrent)
        VALUES
            (Src.DCStringID
            ,Src.TimeStamp
            ,Src.DCVoltage
            ,Src.DCCurrent)
        ;

        COMMIT TRANSACTION;
    END TRY
    BEGIN CATCH
        -- TODO: handle the error
        ROLLBACK TRANSACTION;
    END CATCH;
END

Again, it doesn't have to be a stored procedure, it can be just a MERGE statement with one table-valued parameter.

I'm pretty sure that it would be much faster than your loop with 100 separate queries.
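
For completeness, a hedged sketch of how the asker's code could call this procedure through the same ObjectContext, passing the DataTable built earlier as a table-valued parameter (untested; the type and procedure names are the ones defined above):

using System.Data;
using System.Data.SqlClient;

var tvp = new SqlParameter("@ParamRows", SqlDbType.Structured)
{
    TypeName = "dbo.StringDataTableType", // the table type created above
    Value = table                         // the DataTable holding all 100 rows
};

// One round trip to the server instead of 100 separate commands.
context.ContextAdapter.ObjectContext.ExecuteStoreCommand(
    "EXEC dbo.MergeStringData @ParamRows;", tvp);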

Details on why there should be a HOLDLOCK hint with MERGE: "UPSERT" Race Condition With MERGE


A side note: it is not strange that the Commit in your last test is very fast. It doesn't do much, because everything has already been written to the database. If you tried to do a Rollback, that would take some time.

It looks like the difference is that you are using OPTION (RECOMPILE) in your EF code only, but when you run the code in SSMS there is no recompile. Recompiling 100 times would certainly add some time to execution.

Related to the answer from @Vladimir Baranov, I took a look at how Entity Framework supports bulk operations (in my case, bulk merge).

Short answer... no! Bulk delete, update and merge operations are not supported.

Then I found this sweet little library!

zzzproject Entity Framework Extensions

It provides a ready-made extension that works the way @Vladimir Baranov described (see the usage sketch after this list):

  • Create a temporary table in SQL Server.
  • Bulk Insert data with .NET SqlBulkCopy into the temporary table.
  • Perform a SQL statement between the temporary table and the destination table.
  • Drop the temporary table from SQL Server.
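
A minimal usage sketch, based on the library's documented BulkMerge extension method (it assumes a StringData entity class mapped to the [StringData] table; exact options may vary by version):

using System.Collections.Generic;

using (EntityModel context = new EntityModel())
{
    var entities = new List<StringData>();
    while (_buffer.Count > 0)
        entities.Add(_buffer.Dequeue());

    // BulkMerge performs the temp-table + SqlBulkCopy + MERGE steps listed above.
    context.BulkMerge(entities);
}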

You can find a short interview by Jonathan Allen about the library's functions and how it can dramatically improve Entity Framework performance with bulk operations.

For me it really boosted the performance, from ~10 seconds down to under 1 second. Nice!

To be clear: I don't work for them and am not affiliated with them in any way; this is not an advertisement. Yes, the library is commercial, but the basic version is free.
