How to seed the production database using the Capistrano gem?
I am using Ruby on Rails 3.0.9 and I would like to seed the production database in order to add some records without rebuilding the whole database (that is, without deleting the existing records, just adding some that do not exist yet). I would like to do that because the new data is needed to make the application work.
So, since I am using the Capistrano gem, I ran the cap -T command in the console in order to list all available commands and to find out how to accomplish my goal:
$ cap -T
=> ...
=> cap deploy:seed # Reload the database with seed data.
=> ...
I am not sure about the word "Reload" in the sentence "Reload the database with seed data." So, my question is: if I run the
cap deploy:seed
command in the console on my local machine, will the seeding process delete all existing data in the production database and then repopulate it, or will that command just add the new data, as I intend?
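What the question asks for (add only the records that are missing) is usually handled by writing db/seeds.rb itself idempotently. A minimal pure-Ruby sketch of the pattern, where the array stands in for a database table and the membership check stands in for something like ActiveRecord's find_or_create_by! (the role names are illustrative):

```ruby
# Idempotent seeding sketch: each run inserts only the records that
# are missing and never deletes anything. In a real db/seeds.rb the
# array would be a table and the membership check would be e.g.
# Role.find_or_create_by!(name: name).
SEED_ROLES = ["admin", "editor", "viewer"]

def seed(db)
  SEED_ROLES.each do |name|
    db << name unless db.include?(name) # insert only when missing
  end
  db
end

db = ["admin", "custom_role"] # pre-existing production data
seed(db)
seed(db) # a second run changes nothing
# db => ["admin", "custom_role", "editor", "viewer"]
```

Because the seed only ever adds missing entries, it is safe to run on every deploy.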
If you are using bundler, then the capistrano task should be:
namespace :deploy do
  desc "reload the database with seed data"
  task :seed do
    run "cd #{current_path}; bundle exec rake db:seed RAILS_ENV=#{rails_env}"
  end
end
and it might be placed in a separate file, such as lib/deploy/seed.rb, and included in your deploy.rb file using the following command:
load 'lib/deploy/seed'
This worked for me:
task :seed do
  puts "\n=== Seeding Database ===\n"
  on primary :db do
    within current_path do
      with rails_env: fetch(:stage) do
        execute :rake, 'db:seed'
      end
    end
  end
end
Capistrano 3, Rails 4
Using Capistrano 3, Rails 4, and SeedMigrations, I created a Capistrano seed.rb task under /lib/capistrano/tasks:
namespace :deploy do
  desc 'Runs rake db:seed for SeedMigrations data'
  task :seed => [:set_rails_env] do
    on primary fetch(:migration_role) do
      within release_path do
        with rails_env: fetch(:rails_env) do
          execute :rake, "db:seed"
        end
      end
    end
  end
  after 'deploy:migrate', 'deploy:seed'
end
My seed migrations are now completely separate from my schema migrations, and run right after the db:migrate process. What a joy! :)
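The idea behind seed migrations can be sketched in plain Ruby (the version names and the ledger array are illustrative; the SeedMigrations gem keeps such a ledger in a database table):

```ruby
# Each seed script is applied at most once: a ledger records which
# versions already ran, so running db:seed after every deploy is safe.
SEEDS = {
  "20140101_add_roles" => ->(db) { db << "roles" },
  "20140215_add_plans" => ->(db) { db << "plans" },
}

def run_pending(db, applied)
  SEEDS.each do |version, script|
    next if applied.include?(version) # already applied, skip
    script.call(db)
    applied << version
  end
end

db = []
applied = []
run_pending(db, applied)
run_pending(db, applied) # second run is a no-op
# db => ["roles", "plans"]
```

This is why hooking deploy:seed after deploy:migrate is harmless: already-applied seed versions are skipped on every subsequent deploy.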
Try adding something like this in your deploy.rb:
namespace :deploy do
  desc "reload the database with seed data"
  task :seed do
    run "cd #{current_path}; rake db:seed RAILS_ENV=#{rails_env}"
  end
end
After a discussion with the capistrano-rails gem authors, I decided to implement this kind of task in a separate gem. I think this helps to follow the DRY idea instead of implementing the same task over and over again.
I hope it helps you: https://github.com/dei79/capistrano-rails-collection
cap deploy:seed should basically be a reference to rake db:seed. It should not delete existing data, unless you specified it to do so in your seed.rb.

The best assumption for the word "Reload" is that :seed is a stateless command: it does not automatically know where it left off, like regular Rails migrations do. So technically you would always be "reloading" the seed, every time you run it.

...A wild guess, but it sounds good, no?
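To illustrate "unless you specified it to do so": db:seed is only destructive if the seed file itself contains a destructive call. A pure-Ruby sketch, where the clear call stands in for the kind of line (e.g. Role.delete_all in ActiveRecord terms) a seeds.rb author would have to write deliberately:

```ruby
# A seed file only wipes data if it explicitly says so; the clear
# call below is the explicit destructive step a seeds.rb author
# would have to add on purpose.
def destructive_seed(db, records)
  db.clear # the explicit destructive step
  records.each { |r| db << r }
  db
end

db = ["precious_production_row"]
destructive_seed(db, ["a", "b"])
# db => ["a", "b"] — the old row is gone only because we cleared it
```

Without such a line, repeated seeding only ever appends data.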
Please see Javier Vidal's answer below.