I need to move some data daily from a SQL Server database on one server to a SQL Server database on another server. I have full read access on the origin server. The destination database is queried by an accounting system, which runs its own transforms against it. I have to transform the data and load it through stored procedures on the destination database. After my data is loaded into the destination database, a transform is triggered and my data is altered/moved.
We only want data changes sent to the destination database, so we intend to use a temp database (on a different SQL Server) to compare against before sending anything.
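As a rough illustration of that compare-before-send step, the diff against the cached copy might look like the sketch below. The `Row` shape and `Diff` helper are hypothetical stand-ins; a real comparison would cover every transferred column and handle deletes as well:

```csharp
using System.Collections.Generic;
using System.Linq;

// Hypothetical row shape; real code would mirror the actual table columns.
public record Row(int Id, string Name);

public static class ChangeDetector
{
    // Returns only rows that are new or differ from the cached snapshot,
    // so only genuine changes get sent to the destination database.
    public static List<Row> Diff(IEnumerable<Row> source, IEnumerable<Row> cached)
    {
        var snapshot = cached.ToDictionary(r => r.Id);
        return source
            .Where(r => !snapshot.TryGetValue(r.Id, out var old) || old != r)
            .ToList(); // records compare by value, so != is a field-by-field compare
    }
}
```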
We were thinking about using Entity Framework for reading and caching, but I'm worried this would mean creating two different models and comparing them before saving. That would be a pain, but it would let us transform/modify the data as objects and would greatly simplify our business logic.
Is it recommended to have separate processes for this portion or to continue with two different data models?
To simplify: should the comparison run as a separate process, or should we model both databases and compare them in code?
If I choose the first option, is it still worth using Entity Framework at all? Am I overthinking this, and is there a better way around this entirely?
1: Create two database context models, say DataContext1 and DataContext2.
2: Create a DTO (Data Transfer Object) that closely mirrors the tables being transferred from DataContext1 to DataContext2.
3: Use AutoMapper to map the entities of DataContext1 to the DTO, and likewise to map the DTO to the entities of DataContext2.
4: Use a service layer with functions such as `public Dto ReadData(...)` and `public void SaveData(Dto dto)`.
5: Read the data through DataContext1 and save it through DataContext2.
This way, if anything changes in database1 or database2, the change is easy to handle: only the affected context and its mappings need updating, while the DTO and business logic stay the same.
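The steps above can be sketched roughly as follows. All entity, DTO, and context names here (`SourceCustomer`, `DestCustomer`, `CustomerDto`, etc.) are hypothetical, the connection-string setup is omitted, and the compare-against-temp step is left out for brevity; this assumes EF Core and AutoMapper's `MapperConfiguration` API:

```csharp
using System.Collections.Generic;
using System.Linq;
using AutoMapper;
using Microsoft.EntityFrameworkCore;

// --- Entities, one per model (hypothetical shapes) ---
public class SourceCustomer { public int Id { get; set; } public string Name { get; set; } }
public class DestCustomer   { public int Id { get; set; } public string Name { get; set; } }

// --- Shared DTO that both models map to/from ---
public class CustomerDto { public int Id { get; set; } public string Name { get; set; } }

// --- One context per server; connection strings assumed configured elsewhere ---
public class DataContext1 : DbContext
{
    public DbSet<SourceCustomer> Customers => Set<SourceCustomer>();
}
public class DataContext2 : DbContext
{
    public DbSet<DestCustomer> Customers => Set<DestCustomer>();
}

// --- Service layer: read from one side, save to the other ---
public class TransferService
{
    private static readonly IMapper Mapper = new MapperConfiguration(cfg =>
    {
        cfg.CreateMap<SourceCustomer, CustomerDto>();
        cfg.CreateMap<CustomerDto, DestCustomer>();
    }).CreateMapper();

    public List<CustomerDto> ReadData(DataContext1 source) =>
        source.Customers
              .AsNoTracking()          // read-only: no change tracking needed
              .ToList()                // materialize, then map in memory
              .Select(Mapper.Map<CustomerDto>)
              .ToList();

    public void SaveData(DataContext2 dest, IEnumerable<CustomerDto> rows)
    {
        foreach (var dto in rows)
            dest.Customers.Add(Mapper.Map<DestCustomer>(dto)); // DTO -> entity
        dest.SaveChanges();
    }
}
```

If the schemas drift apart later, only the affected `CreateMap` configuration and entity class change; the service layer and DTO are untouched.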