
Which is the “best” data access framework/approach for C# and .NET?

(EDIT: I made it a community wiki as it is more suited to a collaborative format.)

There are a plethora of ways to access SQL Server and other databases from .NET. All have their pros and cons and it will never be a simple question of which is "best" - the answer will always be "it depends".

However, I am looking for a high-level comparison of the different approaches and frameworks in the context of different levels of systems. For example, I would imagine that for a quick-and-dirty Web 2.0 application the answer would be very different from an in-house Enterprise-level CRUD application.

I am aware that there are numerous questions on Stack Overflow dealing with subsets of this question, but I think it would be useful to try to build a summary comparison. I will endeavour to update the question with corrections and clarifications as we go.

So far, this is my understanding at a high level - but I am sure it is wrong... I am primarily focusing on the Microsoft approaches to keep this focused.

ADO.NET Entity Framework

  • Database agnostic
  • Good because it allows swapping backends in and out
  • Bad because it can hit performance and database vendors are not too happy about it
  • Seems to be MS's preferred route for the future
  • Complicated to learn (though, see 267357)
  • It is accessed through LINQ to Entities, so it provides an ORM and allows abstraction in your code (see the sketch after this list)
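
For illustration, a minimal LINQ to Entities query might look roughly like this (MyEntities and Orders are hypothetical names for a generated context and entity set; this is a sketch, not a prescribed pattern):

using System;
using System.Linq;

// Query a hypothetical Orders entity set through LINQ to Entities.
using (var context = new MyEntities())
{
    var cutoff = DateTime.Today.AddDays(-7);

    var recentOrders = context.Orders
        .Where(o => o.OrderDate >= cutoff)
        .OrderByDescending(o => o.OrderDate)
        .ToList();   // the SQL is generated and executed here

    foreach (var order in recentOrders)
        Console.WriteLine(order.OrderId);
}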

LINQ to SQL

"Standard" ADO.NET

  • No ORM
  • No abstraction so you are back to "roll your own" and play with dynamically generated SQL
  • Direct access, allows potentially better performance
  • This ties in to the age-old debate of whether to focus on objects or relational data, to which the answer of course is "it depends on where the bulk of the work is", and since that is an unanswerable question hopefully we don't have to go into that too much. IMHO, if your application is primarily manipulating large amounts of data, it does not make sense to abstract it too much into objects in the front-end code; you are better off using stored procedures and dynamic SQL to do as much of the work as possible on the back-end. Whereas, if you primarily have user interaction which causes database interaction at the level of tens or hundreds of rows, then an ORM makes complete sense. So, I guess my argument for good old-fashioned ADO.NET would be the case where you manipulate and modify large datasets, in which case you will benefit from the direct access to the back-end.
  • Another case, of course, is where you have to access a legacy database that is already guarded by stored procedures (a rough sketch of this style follows after this list).
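
As a rough sketch of the "roll your own" style described above (the stored procedure and parameter names are illustrative only):

using System;
using System.Data;
using System.Data.SqlClient;

// connectionString is assumed to be loaded from configuration elsewhere.
// Push the heavy lifting to the database via a hypothetical stored procedure.
using (var connection = new SqlConnection(connectionString))
using (var command = new SqlCommand("dbo.ArchiveOldOrders", connection))
{
    command.CommandType = CommandType.StoredProcedure;
    command.Parameters.AddWithValue("@CutoffDate", DateTime.Today.AddYears(-1));

    connection.Open();
    int rowsAffected = command.ExecuteNonQuery();   // the work stays on the back-end
}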

ASP.NET Data Source Controls

Are these something altogether different or just a layer over standard ADO.NET? - Would you really use these if you had a DAL or if you implemented LINQ or Entities?

NHibernate

  • Seems to be a very powerful and mature ORM?
  • Open source

Some other relevant links:

  • NHibernate or LINQ to SQL
  • Entity Framework vs LINQ to SQL

I think LINQ to SQL is good for projects targeted at SQL Server.
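
For illustration, a minimal LINQ to SQL query might look roughly like this (NorthwindDataContext and Customers are hypothetical names for a designer-generated DataContext and table):

using System;
using System.Linq;

// Query a hypothetical Customers table through a generated DataContext.
using (var db = new NorthwindDataContext())
{
    var londonCustomers = from c in db.Customers
                          where c.City == "London"
                          select c;

    foreach (var customer in londonCustomers)
        Console.WriteLine(customer.CompanyName);
}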

ADO.NET Entity Framework is better if we are targeting different databases. Currently a lot of providers are available for the ADO.NET Entity Framework - providers for PostgreSQL, MySQL, esql, Oracle and many others (check http://blogs.msdn.com/adonet/default.aspx).

I don't want to use standard ADO.NET anymore because it's a waste of time. I always go for ORM.

Added for new technologies:

With Microsoft SQL Server for Linux out in beta right now, I think it's OK not to be database agnostic. The .NET Core and MS SQL route allows you to run on Linux servers like Ubuntu entirely, with no Windows dependencies.

As such, IMO, a very good flow is to not use a full ORM framework or data controls, and instead leverage the power of SSDT (SQL Server Data Tools) Visual Studio projects and a micro-ORM.

In Visual Studio you can create a SQL Server project as a legitimate Visual Studio project. Doing so allows you to create the entire database via table designers or raw query editing right inside Visual Studio.

Secondly, you get SSDT's Schema Compare tool, which you can use to compare your database project to a live database in Microsoft SQL Server and update either side. You can sync your Visual Studio project to the server, so that updates in your project go out to the server; or you can sync the server to your project, so that your source code gets updated. Via this route you can easily pick up changes the DBA made in maintenance last night, and push out your new development changes for a new feature, all with one simple tool.

Using that same tool, you can also compute the migration script without actually running it. If you need to pass that off to an operations department and submit a change order, it works for that flow too.

Now, for writing code against your MS SQL database, I recommend PetaPoco.

PetaPoco works perfectly in line with the above SSDT solution. It comes with T4 text templates you can use to generate all your data entity classes, and it generates the bulk of the data layer classes for you.

The catch is, you have to write queries yourself, which isn't a bad thing.

So you end up with something like this:

var people = dbContext.Fetch<Person>("SELECT * FROM People WHERE Username LIKE @0", "%bob%");

PetaPoco automatically handles parameterizing @0 for you, and it also has the handy Sql class for building queries.
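
As a rough sketch of what that looks like in practice (the connection string name and the Person/People names are illustrative; this assumes the classic PetaPoco Database and Sql.Builder API):

// Open a PetaPoco Database against a named connection string from config.
var dbContext = new PetaPoco.Database("MyConnectionStringName");

// Build the query with the fluent Sql builder; parameters are added as @0, @1, ...
var sql = PetaPoco.Sql.Builder
    .Select("*")
    .From("People")
    .Where("Username LIKE @0", "%bob%")
    .OrderBy("Username");

var people = dbContext.Fetch<Person>(sql);   // Person is a T4-generated POCO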

Furthermore, PetaPoco is an order of magnitude faster than EF6 and 8+ times faster than EF7.

So, in total, this solution involves using SSDT for schema management and PetaPoco for code integration, with the gain of high maintainability, customization, and very good performance.

The only downfall to this approach is that you're tying yourself hard to Microsoft SQL Server. However, IMO, Microsoft SQL Server is one of the best RDBMSs out there.

It's got Database Mail, SQL Agent jobs, CLR object capabilities, and on and on. Plus, the integration between Visual Studio and MS SQL Server is phenomenal, and you don't get any of that if you choose a different RDBMS.

Having worked on 20+ different C#/ASP.NET projects, I always end up using NHibernate. I often start with a completely different stack - ADO.NET, ActiveRecord, hand-rolled weirdness. There are numerous reasons why NHibernate can work in a wide range of situations, but the absolute stand-out for me is the saving in time, especially when linked to code generation. You can change the data model and the entities get rebuilt, but most/all of the other code doesn't need to be changed.
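
As a rough sketch of what day-to-day NHibernate usage looks like once the mappings are in place (Person is a hypothetical mapped entity and sessionFactory an already-built ISessionFactory; configuration and mappings are omitted):

using System.Linq;
using NHibernate;
using NHibernate.Linq;

// Query a mapped entity through NHibernate's LINQ provider.
using (ISession session = sessionFactory.OpenSession())
using (ITransaction tx = session.BeginTransaction())
{
    var bobs = session.Query<Person>()
                      .Where(p => p.Username.Contains("bob"))
                      .ToList();

    tx.Commit();
}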

MS does have a nasty habit of pushing technologies in this area that parallel existing open source, and then dropping them when they don't take off. Does anyone remember ObjectSpaces?

I must say that I never used NHibernate because of the immense time needed to get started... time wasted on the XML setup.

I recently did a web application in MVC2, where I chose the ADO.NET Entity Framework and I use LINQ all the time.

I must say I was impressed with the speed, and our site was getting around 35,000 unique visitors per day and around 60 GB of bandwidth per day (I radically reduced that 60 GB figure by hosting all static files on Amazon S3 - great .NET wrapper they have, I must say).

I will always go this way. It's easy to start (just add a new data item, choose your tables and that's it! For every change in the database we just need to refresh the model - done automatically in just two clicks) and it's fun to use - LINQ rules!
