
Linq2Sql: How do I manage large result sets?

Let's say I have a query with a very large result set (100,000+ rows) and I need to loop through the results and perform an update:

var ds = context.SomeTable.Where(/* query */).Select(e => new { /* fields */ }); // SomeTable stands in for a Table<T> on the DataContext

foreach(var d in ds)
{
//perform update
}

I'm fine with this process taking a long time to execute, but I have a limited amount of memory on my server.

What happens in the foreach? Is the entire result fetched at once from the database?

Would it be better to use Skip and Take to do the update in portions?

Yes, the best way is to use Skip and Take, and make sure that after each batch of updates you dispose the DataContext (by wrapping it in a "using" block).
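
A minimal sketch of that pattern, assuming a LINQ to SQL DataContext subclass called MyDataContext with a Customers table whose entities have Id and Name columns (all placeholder names, not from the question; requires using System.Linq and System.Data.Linq):

const int batchSize = 1000;
int processed = 0;

while (true)
{
    // A fresh DataContext per batch keeps the identity map (and thus memory use) small.
    using (var context = new MyDataContext())
    {
        // A stable ORDER BY is required for Skip/Take to page deterministically.
        var batch = context.Customers
                           .OrderBy(c => c.Id)
                           .Skip(processed)
                           .Take(batchSize)
                           .ToList();

        if (batch.Count == 0)
            break;

        foreach (var c in batch)
        {
            c.Name = c.Name.Trim();   // the per-row update goes here
        }

        context.SubmitChanges();      // flush this batch before the context is disposed
        processed += batch.Count;
    }
}

Because each batch gets its own DataContext, the change tracker never holds more than batchSize entities at a time. Be aware that Skip-based paging can miss rows if the update changes whatever the query filters or orders on.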

You could check out my question; it has a similar problem with a nice solution: Out of memory when creating a lot of objects C#

You are basically abusing LINQ2SQL - it is not made for that.

  • All results are loaded into memory.
  • Your changes are written out in one go, after you are done.

This will be slow, and it will use a huge amount of memory. With a limited amount of memory on the server, it is simply not possible.

Do NOT load all the data at once. Run multiple queries with partial result sets (1000-2500 items each), for example by paging on the primary key as sketched below.

ORMs are not made for mass manipulation.
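
One hedged sketch of the "partial result sets" advice above: instead of Skip, page by primary key so each batch query stays a cheap range seek even deep into the table. MyDataContext, Orders, Id and Status are assumed placeholder names:

int lastId = 0;
const int batchSize = 2000;   // within the suggested 1000-2500 range

while (true)
{
    using (var context = new MyDataContext())
    {
        var batch = context.Orders
                           .Where(o => o.Id > lastId)
                           .OrderBy(o => o.Id)
                           .Take(batchSize)
                           .ToList();

        if (batch.Count == 0)
            break;

        foreach (var o in batch)
        {
            o.Status = "Processed";   // the per-row update goes here
        }

        context.SubmitChanges();
        lastId = batch[batch.Count - 1].Id;   // resume after the last processed key
    }
}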

Could you use a stored procedure to update everything at once?
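
If the update really can be expressed as a single set-based statement, one possible sketch uses DataContext.ExecuteCommand (a stored procedure mapped on the DataContext would work the same way); it runs entirely on the server and never materializes entities. The table and column names below are placeholders:

using (var context = new MyDataContext())
{
    // Executes one parameterized UPDATE on the server; no rows are loaded into memory.
    context.ExecuteCommand(
        "UPDATE Orders SET Status = {0} WHERE Status = {1}",
        "Processed", "Pending");
}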
