DateTime Behavior in SQL Server & C#

I am pulling data from an XML feed via C# code. When I get the data, there is a datetime in the following format:

  <a:updated>2010-11-05T20:21:43.8303793Z</a:updated> 

I read that into a variable of type DateTime. I then use EF to put that data into a table in my DB.

Later, when my code loops back around to download the feed again, I check to see if the value in the UPDATED field stored in my DB is the same as what is returned on the XML feed.

I am evaluating:

if (currentApp.Updated < app.Updated)

where currentApp.Updated is what's in my DB, and app.Updated is what has been read from the most recent XML feed download. Running it just now, both are showing (in the debug window):

{12/11/2010 8:13:44 PM}

but the IF statement is evaluating as TRUE. Looking deeper into the objects reveals:

currentApp.Updated.Ticks = 634276952242500000

app.Updated.Ticks = 634276952242511865

So when storing to the DB, it would appear that either C#, EF, or SQL Server is dropping the last five digits (11865) and replacing them with zeros (00000).

Color me frustrated. Any ideas on how to solve this problem? Ideally, that IF statement would evaluate to false, since the UPDATED times are effectively the same.

One workaround is to compare with a tolerance instead of testing the two values for strict ordering:

  TimeSpan tsUpdated = app.Updated - currentApp.Updated;

  if (tsUpdated.TotalMinutes > 1)
  // Your current set of statements
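As a concrete illustration, here is a minimal, runnable sketch of that tolerance check using the tick values from the question (the class and variable names are mine; the one-minute threshold is the answer's):

    using System;

    class ToleranceDemo
    {
        static void Main()
        {
            // Tick values taken from the question.
            DateTime fromDb   = new DateTime(634276952242500000); // currentApp.Updated
            DateTime fromFeed = new DateTime(634276952242511865); // app.Updated

            TimeSpan tsUpdated = fromFeed - fromDb;

            // The difference is 11865 ticks, about 1.19 ms, which is far
            // below the one-minute tolerance, so the update branch is skipped.
            if (tsUpdated.TotalMinutes > 1)
                Console.WriteLine("Feed entry is newer; update the DB row.");
            else
                Console.WriteLine("Timestamps match within tolerance.");
        }
    }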

According to the book "Microsoft SQL Server 2005 T-SQL Programming", SQL Server rounds the value to the nearest 1/300 of a second (i.e., to increments of .000, .003, or .007 seconds).

So, I think you should change your if statement to

if (currentApp.Updated - app.Updated < TimeSpan.FromMilliseconds(-3.34))
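To see why a threshold of about -3.34 ms works, note that the difference between the two values from the question is only -11865 ticks (about -1.19 ms), which is inside SQL Server's 1/300-second rounding error. A minimal runnable sketch (names are mine, not from the question):

    using System;

    class RoundingToleranceDemo
    {
        static void Main()
        {
            DateTime fromDb   = new DateTime(634276952242500000); // value rounded by SQL Server
            DateTime fromFeed = new DateTime(634276952242511865); // original feed value

            // fromDb - fromFeed = -11865 ticks = about -1.19 ms, which is
            // greater than -3.34 ms, so the condition is false: the values
            // differ only by rounding, not because the feed entry is newer.
            if (fromDb - fromFeed < TimeSpan.FromMilliseconds(-3.34))
                Console.WriteLine("Feed timestamp is genuinely newer.");
            else
                Console.WriteLine("Difference is within SQL Server's rounding error.");
        }
    }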

.NET provides a SqlDateTime structure to reflect the different ways SQL Server and .NET store times. From the documentation for SqlDateTime:

"Represents the date and time data ranging in value from January 1, 1753 to December 31, 9999 to an accuracy of 3.33 milliseconds to be stored in or retrieved from a database. The SqlDateTime structure has a different underlying data structure from its corresponding .NET Framework type, DateTime, which can represent any time between 12:00:00 AM 1/1/0001 and 11:59:59 PM 12/31/9999, to the accuracy of 100 nanoseconds. SqlDateTime actually stores the relative difference to 00:00:00 AM 1/1/1900. Therefore, a conversion from "00:00:00 AM 1/1/1900" to an integer will return 0."

This agrees with your observation: the value you get back from SQL Server is only accurate to about 3.33 milliseconds, while the original DateTime kept the full 100-nanosecond tick precision.

I'd suggest converting both times to SqlDateTime before doing your comparison.

     using System;
     using System.Data.SqlTypes; // SqlDateTime lives here

     DateTime currentApp = new DateTime(634276952242500000);
     DateTime app = new DateTime(634276952242511865);

     // The implicit conversion rounds each value to SQL Server's
     // 1/300-second precision.
     SqlDateTime currentAppSql = currentApp;
     SqlDateTime appSql = app;

     if (currentAppSql < appSql) {
         // Not reached: after rounding, both values compare equal.
     }
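With the tick values above, both DateTime values map to the same 1/300-second count (both become ...44.250 after conversion), so currentAppSql and appSql compare equal, the condition is false, and the update is correctly skipped.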
