
Is copying /node_modules inside a Docker image not a good idea?

Most or all examples I see online copy package.json into the image and then run npm install within the image. Is there a deal-breaker reason not to run npm install on the build server and then just copy everything, including the node_modules/ folder, into the image?
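For reference, the conventional pattern the question describes looks roughly like this (base image tag, lockfile, and entrypoint are illustrative):

```dockerfile
# Conventional pattern: copy the manifests first so the install
# layer stays cached until package.json / the lockfile change.
FROM node:20-alpine
WORKDIR /app
COPY package.json package-lock.json ./
RUN npm ci
COPY . .
CMD ["node", "index.js"]
```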

My main motivation for doing this is that we are using a private npm registry with authentication, and running npm from within an image would require figuring out how to securely embed credentials. Also, we are using yarn, and we could leverage the yarn cache across projects if yarn runs on the build server. I suppose there are workarounds for these, but running yarn/npm on the build server, where everything is already set up, seems very convenient.
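For completeness: if the install does have to happen inside the build, BuildKit secret mounts can pass registry credentials to a single RUN step without baking them into a layer. A sketch, assuming the credentials live in an `.npmrc` on the build host (the secret id and paths are illustrative):

```dockerfile
# syntax=docker/dockerfile:1
FROM node:20-alpine
WORKDIR /app
COPY package.json package-lock.json ./
# The .npmrc holding the registry token is mounted only for the
# duration of this RUN step and is never written to an image layer.
RUN --mount=type=secret,id=npmrc,target=/root/.npmrc npm ci
COPY . .
```

Built with something like `docker build --secret id=npmrc,src=$HOME/.npmrc .`, so the token stays out of the image history.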

Thanks.

Public Dockerfiles try to provide a generalized solution. Having the dependencies declared in package.json makes it possible to share a single Dockerfile without depending on anything that is not publicly available.

But at runtime Docker does not care how files got into the container. So how you get the needed files into your image is up to you.

PS: Consider layering. If you copy files under node_modules/, do it in one step, so that only one layer is used.
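Concretely, the copy-prebuilt-dependencies approach could look like this (assuming yarn/npm install already ran on the build server; tag and entrypoint are illustrative):

```dockerfile
FROM node:20-alpine
WORKDIR /app
# node_modules/ was populated on the build server; a single COPY of
# the whole tree keeps it to one image layer.
COPY . .
CMD ["node", "index.js"]
```

One caveat worth noting: dependencies with native addons are compiled for the build server's platform, so it must match the image's OS and architecture (e.g. glibc vs. musl on Alpine), or those modules will fail to load at runtime.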
