
How to selectively export MySQL data for a GitHub repo

We're an open-source project and would like to collaboratively edit our website through a public GitHub repo.

Any ideas on the best way to export the MySQL data to GitHub, given that MySQL can hold some sensitive info, and on how we can version the changes that happen to it?

The answer is that you don't hold data in the repo. You may want to hold your DDL, and maybe some configuration data, but that's it. If you want to version control your data, there are other options; Git isn't one of them.

Use a blog engine "backed by Git", forget about MySQL, commit on github.com, push and pull, dominate!

Here is a list of the best:

  1. http://jekyllrb.com/
  2. http://nestacms.com/
  3. http://cloudhead.io/toto
  4. https://github.com/colszowka/serious

And just in case, there is also a simple, Git-powered wiki with a sweet API and local frontend.

Assuming that you have a small quantity of data that you wish to treat this way, you can use mysqldump to dump the tables that you wish to keep in sync, check that dump into git, and push it back into your database on checkout.

Write a shell script that does the equivalent of:

mysqldump [options] database table1 table2 ... tableN > important_data.sql

to create or update the file. Check that file into Git, and when your data changes in a significant way you can do:

mysql [options] database < important_data.sql

Ideally that last step would be in a Git post-receive hook, so you'd never forget to apply your changes.
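A minimal sketch of such a hook, assuming a bare repository at /srv/site.git, a working copy at /srv/site, and placeholder credentials and database name (all of these are assumptions you would replace with your own):

#!/bin/sh
# hooks/post-receive in the bare repository; runs after every push
# check the pushed files out into the working directory...
git --git-dir=/srv/site.git --work-tree=/srv/site checkout -f
# ...then reload the versioned tables into MySQL
mysql -u dbuser -p'secret' database < /srv/site/important_data.sql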

So that's how you could do it. I'm not sure you'd want to do it. It seems pretty brittle, especially if Team Member 1 makes some laborious changes to the tables of interest while Team Member 2 is doing the same. One of them is going to check in their changes first, and in the best case you'll have some nasty merge conflicts. In the worst case, one of them loses all their changes.

You could mitigate those issues by always making your changes in the important_data.sql file, but the ease or difficulty of that depends on your application. If you do this, you'll want to play around with the mysqldump options so you get a nice, readable, Git-mergeable file.
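As a rough example of options that help with that (these are standard mysqldump flags; pick whatever suits your data): --skip-extended-insert writes one INSERT per row so diffs stay line-oriented, --order-by-primary keeps rows in a stable order, and --skip-dump-date stops the header comment from changing on every run:

mysqldump --skip-extended-insert --order-by-primary --skip-dump-date [options] database table1 table2 ... tableN > important_data.sql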

It seems that dbdeploy is exactly what you are looking for.

You can export each table as a separate SQL file; then only the tables that have changed need to be committed and pushed again.
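A small sketch of that, assuming a database called mydb, placeholder credentials, and a dumps/ directory inside the repository:

# dump every table to its own file; Git only records new content for files that actually differ
for t in $(mysql -N -u dbuser -p'secret' -e 'SHOW TABLES' mydb); do
    mysqldump -u dbuser -p'secret' mydb "$t" > "dumps/$t.sql"
done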

If you were talking about configuration, then I'd recommend SQL dumps or similar to seed the database, as per Ray Baxter's answer.

Since you've mentioned Drupal, I'm guessing the data concerns users/content. As such, you really ought to be looking at having a single database that each developer connects to remotely, i.e. one single version. This is because concurrent modifications to MySQL tables will be extremely difficult to reconcile (e.g. two new users both with user.id = 10, each making a new post with post.id = 1 and post.user_id = 10, etc.).

It may make sense, of course, to back this up with an SQL dump (potentially held in version control) in case one of your developers accidentally deletes something critical.
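For example, a nightly script run from cron could keep that safety-net dump current and committed; the paths, credentials, and backup/ directory below are placeholders:

#!/bin/sh
# backup.sh -- run from cron, e.g. "0 3 * * * /srv/site/backup.sh"
mysqldump -u dbuser -p'secret' drupaldb > /srv/site/backup/drupaldb.sql
cd /srv/site && git add backup/drupaldb.sql && git commit -m "nightly database backup" && git push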

If you just want a partial dump, phpMyAdmin will do that. Run your SELECT statement, and when it's displayed there will be an export link at the bottom of the page (the one at the top does the whole table).

You can version mysqldump files, which are simply SQL scripts, as stated in the prior answers. Based on your comments, it seems that your primary interest is to give the developers a basis for a local environment.

Here is an excellent ERD for Drupal 6 (the "Drupal ERD"). I don't know what version of Drupal you are using, or whether there have been changes to these core tables between v6 and v7, but you can check that using a dump, phpMyAdmin, or whatever other tool you have available that lets you inspect the database structure.

Based on the ERD, the data that would be problematic for a Drupal installation is in the users, user_roles, and authmap tables. There is a quick way to omit those, although it's important to keep in mind that content that gets added will have relationships to the users that added it, and Drupal may have problems if there aren't rows in the user table that correspond to what has been added.

So to script the mysqldump, you would simply exclude the problem tables, or at the very least the users table.

mysqldump -u drupaldbuser --password=drupaluserpw --ignore-table=drupaldb.users drupaldb > drupaldb.sql
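Since --ignore-table can be repeated, you could leave out all three of the sensitive tables mentioned above in one go (the table names here simply follow the list above; check them against your actual schema):

mysqldump -u drupaldbuser --password=drupaluserpw --ignore-table=drupaldb.users --ignore-table=drupaldb.user_roles --ignore-table=drupaldb.authmap drupaldb > drupaldb.sql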

You would need to create a mock user table with a bunch of test users with known name/password combinations that you would only need to dump and version once, but ideally you want enough of these to match or exceed the number of real drupal users you'll have that will be adding content. This is just to make the permissions relationships match up.
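That one-off dump could be as simple as running the following against a development copy of the database that has already been seeded with the test accounts (the file name is just a suggestion), committing the resulting file, and having each developer load it after importing drupaldb.sql:

mysqldump -u drupaldbuser --password=drupaluserpw drupaldb users > mock_users.sql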
