
How to import a MySQL dump file into a Docker MySQL container

Greetings, and thanks in advance. I'm new to Docker and docker-compose; so far I've been watching a lot of videos, reading a lot of articles, and trying things out.

I've got a front-end container and a back-end container that each build and run on their own from a Dockerfile as well as in a docker-compose setup.

(I've been building with a Dockerfile first and then integrating the containers into docker-compose to make sure I understand things correctly.)

I'm at the point where I need my database. Since I'll use docker-compose, as I understand it the database should run on the same network as the React front end and Django back end.

I have a backup MySQL dump file that I'm working with. What I think I need to do is have a container running a MySQL server and serving out my tables (like I have working locally). I haven't been able to figure out how to import the backup into my Docker MySQL container.

Any help is appreciated.

What I've tried so far is using docker on the command line to outline the pieces I'll need in the Dockerfile, and then what to move into docker-compose as mentioned above:

docker run -d --name root -e MYSQL_ROOT_PASSWORD=root mysql # to create my db container

Then I've tried a bunch of commands and permutations of commands in the CLI; here are some of my most recent trials and errors:

docker exec -i root mysql -uroot -proot --force < /Users/homeImac/Downloads/dump-dev-2020-11-10-22-43-06.dmp

ERROR 1046 (3D000) at line 22: No database selected

docker exec -i f803170ce38b sh -c 'exec mysql -uroot -p"root"' < /Users/homeImac/Downloads/dump-dev-2020-11-10-22-43-06.dmp

ERROR 1046 (3D000) at line 22: No database selected

docker exec -i f803170ce38b sh -c 'exec mysql -uroot -h 192.168.1.51 -p"root"' < /Users/homeImac/Downloads/dump-dev-2020-11-10-22-43-06.dmp

ERROR 1045 (28000): Access denied for user 'root'@'homeimac' (using password: YES)
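From the 1046 errors, my guess is that the dump never selects a database (no CREATE DATABASE or USE statement by line 22), so I probably need to create one and name it on the command line. Something like this is what I'm imagining (the database name app is just a placeholder):

docker exec root mysql -uroot -proot -e "CREATE DATABASE IF NOT EXISTS app" # create the target db first
docker exec -i root mysql -uroot -proot app < /Users/homeImac/Downloads/dump-dev-2020-11-10-22-43-06.dmp # then name it when importing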

I've scoured the web and I'm not sure where to go next. Have I got the right idea? If anyone has an example of how to import a database dump (in .dmp or .dmp.gz), once I get that working, I'll actually do it in the docker-compose file.

Thinking about it, I just have to create the container and import the dump, so I might not even need a Dockerfile. I'll cross that bridge when I get there. This is what I'm thinking, though:

db:
    image: mysql:5.7
    restart: always
    environment:
      MYSQL_DATABASE: 'app'

etc etc
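For reference, the fuller service I have in mind looks roughly like this (the credentials are placeholders I'd swap out):

db:
    image: mysql:5.7
    restart: always
    environment:
      MYSQL_DATABASE: 'app'        # created automatically on first start
      MYSQL_USER: 'app'            # placeholder credentials
      MYSQL_PASSWORD: 'app'
      MYSQL_ROOT_PASSWORD: 'root'
    ports:
      - '3306:3306'                # exposed so I can poke at it from the host while testing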

I've learned a lot super fast, maybe too fast. Thanks for any tips!

The answer to your question is given on the Docker Hub page for MySQL.

Initializing a fresh instance

When a container is started for the first time, a new database with the specified name will be created and initialized with the provided configuration variables. Furthermore, it will execute files with extensions .sh, .sql and .sql.gz that are found in /docker-entrypoint-initdb.d. Files will be executed in alphabetical order. You can easily populate your mysql services by mounting a SQL dump into that directory and provide custom images with contributed data. SQL files will be imported by default to the database specified by the MYSQL_DATABASE variable.

In your docker-compose.yml use:

volumes:
  - ${PWD}/config/start.sql:/docker-entrypoint-initdb.d/start.sql

and that's it.
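Note that the entrypoint only picks up files ending in .sh, .sql or .sql.gz, so a .dmp file would need to be mounted under a .sql name. Adapting the path from your question, a sketch would be:

volumes:
  - /Users/homeImac/Downloads/dump-dev-2020-11-10-22-43-06.dmp:/docker-entrypoint-initdb.d/dump.sql

As the docs above say, this only runs when the container initializes a fresh, empty data directory, and the dump is imported into the database named by MYSQL_DATABASE.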

Here's the answer that worked for me, after working with two colleagues at my job who know the back end better than I do.

It was pretty simple, actually. I created a directory in my repo that would stay empty in version control. I added *.sql and *.dmp to my .gitignore so the dump files would not increase the size of my image.

Using docker-compose, that directory is mounted as a volume under the mysql service:

volumes:
      - ~/workspace/app:/workspace/app

The dump file is placed there and is imported into the MySQL service when I run:

mysql -u app -papp app < /path/to/the/dumpfile

I can go in using docker exec and verify that not only is the database there, but the tables from my dump file are there as well.
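To spell that step out, I run the import from inside the container, since the dump is visible there through the volume mount (the service name db and the dump file name below are just examples; adjust them to yours):

docker-compose exec db sh -c 'mysql -u app -papp app < /workspace/app/dumpfile.sql' # dumpfile.sql is a placeholder for your actual dump
docker-compose exec db mysql -u app -papp -e 'SHOW TABLES' app # quick check that the tables made it in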

For me, I also had to create a new superuser in my back-end container through our Django app.

python3 manage.py createsuperuser
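I ran that inside the back-end container, e.g. (assuming the service is named backend in docker-compose.yml):

docker-compose exec backend python3 manage.py createsuperuser # opens an interactive prompt for the credentials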

With that, after logging in at localhost:8000/api, everything was linked up between the MySQL, back-end, and front-end containers.

Hope this helps. I'm sure not all the details are the same for everyone, but by using volumes I didn't have to copy any dump file into the container, and the data ended up imported and served. That was my big issue.
