
When a stored procedure returns 17 million rows, it throws "out of memory" while accessing the dataset in Delphi

I'm using Delphi 6 to develop a Windows application and have a stored procedure which returns around 17 million rows. It takes 3 to 4 minutes to return the data in SQL Server Management Studio.

And I'm getting an "out of memory" exception while I'm trying to access the result dataset. I'm thinking that sp.execute might not have executed fully. Do I need to follow any steps to fix this, or should I use sleep() to fix this issue?

  • Delphi 6 can only compile 32-bit executables.
  • 32-bit executables running on 32-bit Windows have a memory limit of 2 GiB. This can be extended to 3 GiB with a boot switch.
  • 32-bit executables running on 64-bit Windows have the same memory limit of 2 GiB. Using the "large address aware" flag they can address at most 4 GiB of memory.
  • 32-bit Windows executables emulated via WINE under Linux or Unix should not be able to overcome this either, because 32 bits can store at most the number 4,294,967,295 = 2³² - 1, so the logical limit is 4 GiB in any possible way.
  • Wanting 17 million records in the currently available 1.9 GiB of memory means that 1.9 * 1024 * 1024 * 1024 = 2,040,109,465 bytes divided by 17,000,000 gives a mean of just 120 bytes per record (see the sketch right after this list). I can hardly imagine that is enough. And that would only be the gross payload; memory for variables is still needed. Even if you manage to put the data into large arrays, you'd still need plenty of overhead memory for variables.
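The division in the last bullet can be sanity-checked with a few lines of Delphi. This is only an illustration of the per-record budget; the 1.9 GiB figure is an assumed usable address space, not a measured value.

    { Back-of-the-envelope check of the bullet above: assumes roughly 1.9 GiB
      of usable address space and 17,000,000 records. Illustration only. }
    program RowBudget;

    {$APPTYPE CONSOLE}

    uses
      SysUtils;

    var
      UsableBytes, Rows: Int64;
    begin
      UsableBytes := Round(1.9 * 1024 * 1024 * 1024); // ~2,040,109,466 bytes
      Rows := 17000000;
      // ~120 bytes available per record before any per-variable overhead
      WriteLn(Format('%.0f bytes per record', [UsableBytes / Rows]));
    end.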

Your software design is wrong. As James Z and Ken White already pointed out: there can't be a scenario where you need all those records at once, much less one where the user views them all at once. I feel sorry for the poor souls who have had to use that software - who knows what else is misconceived there. The memory consumption should remain at sane levels.
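One way to keep memory consumption at sane levels, given that the rows cannot all live in the client anyway, is to stream the result instead of buffering it. The following is a minimal sketch, not the original poster's code: it assumes ADO (the ADODB unit that ships with Delphi 6), an already-connected TADOConnection, and a hypothetical procedure name dbo.usp_BigResult. A server-side, forward-only cursor lets the client walk the rows in small batches rather than materializing all 17 million of them in its 32-bit address space.

    { Minimal sketch: stream the stored procedure's rows through a server-side,
      forward-only cursor instead of loading the whole result set client-side.
      'dbo.usp_BigResult' is a hypothetical procedure name. }
    unit BigResultStreaming;

    interface

    uses
      ADODB;

    procedure StreamBigResult(Conn: TADOConnection);

    implementation

    procedure StreamBigResult(Conn: TADOConnection);
    var
      SP: TADOStoredProc;
    begin
      SP := TADOStoredProc.Create(nil);
      try
        SP.Connection := Conn;
        SP.ProcedureName := 'dbo.usp_BigResult'; // hypothetical name
        SP.CursorLocation := clUseServer;        // do not cache the whole set on the client
        SP.CursorType := ctOpenForwardOnly;      // read once, front to back
        SP.CacheSize := 500;                     // fetch rows in small batches
        SP.Open;
        while not SP.Eof do
        begin
          // aggregate, export or summarize the current record here;
          // never try to hold or display 17 million records at once
          SP.Next;
        end;
      finally
        SP.Free;
      end;
    end;

    end.

Even better is to not send 17 million rows to the client at all: add paging or filter parameters to the stored procedure, or let the server do the aggregation and return only the summary the user actually needs.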
