
Rails: large amount of data in a single insert — ActiveRecord gave out

So I have, I think, around 36,000 records (to be safe), a number I wouldn't have thought too large for a modern SQL database like MySQL. Each record has just two attributes.

So I collected them all into one single insert statement:

sql = "INSERT INTO tasks (attrib_a, attrib_b) VALUES (c1,d1),(c2,d2),(c3,d3)...(c36000,d36000);"

ActiveRecord::Base.connection.execute sql

from C:/Ruby/lib/ruby/gems/1.8/gems/activerecord-2.3.5/lib/active_record/connection_adapters/abstract_adapter.rb:219:in `log'
from C:/Ruby/lib/ruby/gems/1.8/gems/activerecord-2.3.5/lib/active_record/connection_adapters/mysql_adapter.rb:323:in `execute_without_analyzer'
from c:/r/projects/vendor/plugins/rails-footnotes/lib/rails-footnotes/notes/queries_note.rb:130:in `execute'
from C:/Ruby/lib/ruby/1.8/benchmark.rb:308:in `realtime'
from c:/r/projects/vendor/plugins/rails-footnotes/lib/rails-footnotes/notes/queries_note.rb:130:in `execute'
from (irb):53
from C:/Ruby/lib/ruby/gems/1.8/gems/activesupport-2.3.5/lib/active_support/vendor/tzinfo-0.3.12/tzinfo/time_or_datetime.rb:242

I don't know if the above info is enough; please do ask for anything I didn't provide here. Any idea what this is about?

Thank you!

The problem is probably a timeout. I had the same kind of problem while using the Doctrine ORM; in PHP you can solve this by changing the script execution time in the php.ini file. But I don't know how to change that setting in Rails. Maybe someone here can help you.

Well, without more information, I'd hazard a guess that you're exceeding the MySQL server's `max_allowed_packet` value.
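If that guess is right, you can check and raise the limit on the MySQL side. This is a hedged sketch (the 16 MB value is illustrative, not from the original question); raising it via `SET GLOBAL` requires the SUPER privilege and does not survive a server restart:

```sql
-- Inspect the current packet-size limit (old MySQL defaults were as low as 1 MB)
SHOW VARIABLES LIKE 'max_allowed_packet';

-- Raise it for the running server; new connections pick up the new value
SET GLOBAL max_allowed_packet = 16 * 1024 * 1024;
```

To make the change permanent, set `max_allowed_packet = 16M` under the `[mysqld]` section of my.cnf and restart the server.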

I found a solution:

Inspired by Piemesons's suggestion, I chopped the values for the single insert into groups of 10,000, therefore issuing `(n / 10000.0).ceil` inserts.
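The batching can be sketched in plain Ruby. This is a minimal illustration, not the asker's exact code: `rows`, `batched_insert_statements`, and the sample data are hypothetical, and in the real app each generated statement would be passed to `ActiveRecord::Base.connection.execute` (with proper value quoting for non-numeric attributes):

```ruby
BATCH_SIZE = 10_000

# Split the (attrib_a, attrib_b) pairs into slices of at most BATCH_SIZE
# and build one multi-row INSERT statement per slice.
def batched_insert_statements(rows, batch_size = BATCH_SIZE)
  rows.each_slice(batch_size).map do |batch|
    values = batch.map { |a, b| "(#{a},#{b})" }.join(",")
    "INSERT INTO tasks (attrib_a, attrib_b) VALUES #{values};"
  end
end

# 36,000 illustrative numeric pairs => (36000 / 10000.0).ceil == 4 statements
rows = (1..36_000).map { |i| [i, i * 2] }
statements = batched_insert_statements(rows)
puts statements.size  # => 4
```

Each statement stays well under the server's packet limit, which is why the chunked version succeeds where the single 36,000-row insert failed.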

And it worked!

It's still speedy, and I'm still not sure which configuration limit was (and is) preventing me from doing a single insert of that size. If anyone knows, please offer it as a comment so we can all learn from this.

Best,
