In this Node.js tutorial on Docker:
https://nodejs.org/en/docs/guides/nodejs-docker-webapp/
What is the point of COPY package*.json ./ ? Isn't everything copied over with COPY . . ?
The Dockerfile in question:
FROM node:8
# Create app directory
WORKDIR /usr/src/app
# Install app dependencies
# A wildcard is used to ensure both package.json AND package-lock.json are copied
# where available (npm@5+)
COPY package*.json ./
RUN npm install
# If you are building your code for production
# RUN npm install --only=production
# Bundle app source
COPY . .
EXPOSE 8080
CMD [ "npm", "start" ]
This is a common pattern in Dockerfiles (in all languages). The npm install step takes a long time, but you only need to run it when the package dependencies change. So it's typical to see one step that just installs dependencies and a second step that adds the actual application, because it makes rebuilding the image go faster.
You're right that this is essentially identical if you're building the image once; you get the same filesystem contents out at the end.
Say this happens while you're working on the package, though. You've changed some src/*.js file, but haven't changed package.json. You run npm test and it looks good. Now you re-run docker build. Docker notices that the package*.json files haven't changed, so it reuses the image layer it built the first time without re-running anything, and it also skips the npm install step (because it assumes running the same command on the same input filesystem produces the same output filesystem). So the second build runs faster.
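For comparison, here is a minimal sketch of the alternative ordering (assuming the same node:8 base and app layout as the tutorial's Dockerfile): with everything copied before npm install, any source edit invalidates the COPY layer and forces the install to run again on every build.
FROM node:8
WORKDIR /usr/src/app
# Copying the whole build context first means this layer changes
# whenever any source file changes...
COPY . .
# ...so Docker can never reuse the cached result of this step after a source edit
RUN npm install
EXPOSE 8080
CMD [ "npm", "start" ]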
When building an image, Docker uses a layer-based architecture: each instruction in the Dockerfile produces a layer, and each layer is cached. Copying package*.json first is an optimization of that cache. Once a layer's inputs change, Docker rebuilds that layer and every layer after it; it never skips ahead to reuse later cached layers. By copying only package*.json into the image filesystem before running npm install, the npm install layer is invalidated only when a dependency is actually added or changed, and the rest of the source is copied afterwards. If you copied the entire host directory first, every source-code change would invalidate the cache and npm install would run on every build.
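To make the caching behaviour concrete, a rough rebuild sequence might look like this (the my-app tag and the edited file names are placeholders, not from the tutorial):
docker build -t my-app .   # first build: COPY package*.json and npm install both run
# edit src/server.js only, then rebuild:
docker build -t my-app .   # package*.json layer unchanged, npm install layer reused from cache
# add a dependency to package.json, then rebuild:
docker build -t my-app .   # cache invalidated at COPY package*.json, npm install runs again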