Is it possible to stream a large SQL Server database result set using Dapper?
I have about 500K rows I need to return from my database (please don't ask why).
I will then need to save these results as XML (more URGH) and then FTP that file to somewhere magical.
I also need to transform each row in the result set.
Right now, this is what I'm doing with, say, the TOP 100 results: using Dapper's Query<T> method, which throws the entire result set into memory. This works fine for 100 rows, but I get an Out Of Memory exception with AutoMapper when trying to convert the 500K results to a new collection.
So, I was wondering if I could do this...
I'm trying to stop throwing everything into RAM. My thinking is that if I can stream stuff, it's more memory efficient as I only work on a single result set of data.
> using Dapper's Query<T> method, which throws the entire result set into memory
It is a good job, then, that one of the optional parameters is a bool that lets you choose whether to buffer or not ;p
Just add buffered: false to your existing call to Query<T>.