NPM & Docker: Sharing volumes

Post by Saul Shanabrook

Update: This is probably a better idea, though I haven't tried it.

A lot of people have had trouble working with node and Docker. The basic issue is this:

I want to run npm install in my Dockerfile so that when I do a docker run ... it doesn't have to reinstall my modules. I also want to mount my local folder in the container, so that I don't have to rebuild the image after every source code change (docker run -v $PWD:/app ...). If I just mount my whole folder, however, the bind mount hides the node_modules that npm install created in the image, because that directory doesn't exist on my host machine.

There are a bunch of solutions. For a while I had been making a symlink from ./node_modules to /install/node_modules and then installing my node modules into that folder in the docker container. I would check the node_modules symlink into git. This did work, but it meant the project was tied exclusively to Docker: if someone wanted to install without Docker, they had to delete that symlink first. Also, it felt a bit weird to have that file sitting there.
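
For what it's worth, the symlink trick can be sketched like this, with made-up paths (/tmp/install standing in for the container's install directory, /tmp/myproject for the project root):

```shell
# Stand-in for the directory npm installs into inside the container
mkdir -p /tmp/install/node_modules
# Stand-in for the project root on the host
mkdir -p /tmp/myproject
# The checked-in symlink: dangling on the host until the target exists,
# but it resolves inside the container
rm -f /tmp/myproject/node_modules
ln -s /tmp/install/node_modules /tmp/myproject/node_modules
ls -l /tmp/myproject/node_modules
```

In the real setup the symlink target only exists inside the container, which is exactly why it felt fragile on the host.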

Then I tried telling npm to install everything globally.

So something like this:

FROM node

# to properly install node-gyp package
ENV USER root

# so that npm doesn't install into the code path, so we can share that
# directory and still keep node_modules out of it
RUN mkdir -p /install/  
# so that executable from modules are added to the path
ENV PATH /install/bin:$PATH  
# so that you can require('..') a global module
ENV NODE_PATH /install/lib/node_modules/  
# so that it installs global modules into /install/lib/node_modules
ENV NPM_CONFIG_PREFIX=/install/  
# so that it installs into the global location by default
ENV NPM_CONFIG_GLOBAL=true


RUN npm install tape

WORKDIR /src/  
COPY ./package.json /src/package.json  
RUN npm install  
COPY . /src/

CMD npm test  
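
As an aside, the NODE_PATH part can be sketched outside Docker. This assumes node is on your PATH; the greeter module and all paths here are made up:

```shell
# A fake "globally installed" module (illustrative path and name)
mkdir -p /tmp/npmglobal/lib/node_modules/greeter
echo "module.exports = function () { return 'hi'; };" \
  > /tmp/npmglobal/lib/node_modules/greeter/index.js
# With NODE_PATH set, require('greeter') works even with no local node_modules
NODE_PATH=/tmp/npmglobal/lib/node_modules node -e \
  "console.log(require('greeter')())"
# prints: hi
```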

However, I then couldn't require the dependencies of my own package, because they ended up in /install/lib/node_modules/cerebral-baobab/node_modules instead of /install/lib/node_modules. Apparently when npm installs a package globally, it doesn't hoist that package's dependencies to the top level.
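
Roughly, the layout looked like this (sketched with made-up paths and a hypothetical dependency called some-dep):

```shell
# npm install -g keeps a package's own deps nested under the package,
# not at the top of the global tree (paths and names are illustrative)
mkdir -p /tmp/global/lib/node_modules/cerebral-baobab/node_modules/some-dep
# NODE_PATH pointed at the top level, so require('some-dep') looked for
# /tmp/global/lib/node_modules/some-dep, which doesn't exist:
ls /tmp/global/lib/node_modules
# prints: cerebral-baobab
```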

So instead I went with installing into that folder directly (not globally), and setting NODE_PATH so you can require from it:

FROM node

# to properly install node-gyp package
ENV USER root

# so that npm doesn't install into the code path, so we can share that directory and still keep node_modules out of it
RUN mkdir -p /install/  
# so that executable from modules are added to the path
ENV PATH /install/node_modules/.bin:$PATH  
# so that you can `require` any installed module
ENV NODE_PATH /install/node_modules/

COPY ./package.json /install/package.json  
RUN cd /install && npm install


WORKDIR /src/  
COPY . /src/

CMD npm test

Then build and run it:

docker build -t cerebral-baobab .
docker run --rm -it -v $PWD:/src/ cerebral-baobab
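
In case it's not obvious why PATH gets /install/node_modules/.bin: npm links package executables into .bin, so putting that directory on PATH makes them runnable by name. A sketch with a fake executable and made-up paths:

```shell
# A fake package executable where npm's .bin would put it (made-up paths)
mkdir -p /tmp/install/node_modules/.bin
printf '#!/bin/sh\necho ok\n' > /tmp/install/node_modules/.bin/hello
chmod +x /tmp/install/node_modules/.bin/hello
# With .bin prefixed onto PATH, the command is available by name
PATH=/tmp/install/node_modules/.bin:$PATH
hello
# prints: ok
```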

Lemme know in the comments if you find an easier way or if this doesn't work for you.