As you may already be aware, resin.io offers native support for node.js: you can push an ordinary node project and resin.io will build and run it on your devices without any modification.
In contrast, a non-node.js project requires one to add a Dockerfile to the project to instruct resin.io how to build and run the container and, subsequently, your application.
So how do we handle node.js projects without a Dockerfile?
Fairly simply, actually. If a node project is pushed to resin's build server and it does not contain a Dockerfile, we auto-generate and run a Dockerfile similar to this:
# AUTOGENERATED DOCKER FILE
FROM resin/#{arch}-node:#{nodeVersion}-onbuild
# Create and link directories
RUN mkdir -p /usr/src/app && ln -s /usr/src/app /app
WORKDIR /usr/src/app
COPY package.json /usr/src/app/
RUN DEBIAN_FRONTEND=noninteractive JOBS=MAX npm install --unsafe-perm
COPY . /usr/src/app
CMD [ "npm", "start" ]
The first thing we do, in the FROM command, is grab a base image for the detected application type with node pre-installed. The node version is determined from the "engines" key in your package.json, or defaults to 0.10.22 when none is specified.
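For example, pinning the node version might look like this in your package.json (the project name and version here are just placeholders):

```json
{
  "name": "my-app",
  "version": "1.0.0",
  "engines": {
    "node": "0.10.38"
  }
}
```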
Then we create a directory /usr/src/app and COPY your package.json into that working directory we've just created. We also link /usr/src/app to /app, since many resin.io apps place their source in /app instead of /usr/src/app.
We then install your node modules with npm install and COPY the remainder of your project files into the working directory of the container.
It's important to note that we copy only package.json before the npm install, and copy the rest of your source afterwards. This lets us take advantage of the way caching in Docker works.
We use standard Docker caching, which works by comparing the instructions in the current Dockerfile with the ones in the previous build. If the instructions have changed, the cache is invalidated. This works slightly differently for the ADD and COPY instructions, however. For these, Docker examines the contents of the files being put into the image. If there are any changes, even in the file metadata, then the cache is invalidated for that instruction, as well as for all subsequent steps.
So by COPYing only package.json first, then running npm install, and only then COPYing your remaining project files, you avoid invalidating the cached npm install every time you make a change to your project files. Of course, changing package.json will invalidate the cache, which is something you'd want anyway in order to install any newly added project dependencies.
All that's left to do is start your application with npm start. This will run whatever is declared as the start script in your package.json:
"scripts": {"start": "node server.js"}
Caching Caveats
Resin.io's native node.js support can't cache your npm install step when your package.json contains install, preinstall, and/or postinstall scripts. This is because these scripts likely come from your own source. As a result, we have to generate a different Dockerfile that COPYs your source along with the package.json before the npm install runs. So instead of this:
COPY package.json /usr/src/app/
RUN DEBIAN_FRONTEND=noninteractive JOBS=MAX npm install --unsafe-perm
COPY . /usr/src/app
We generate this:
COPY . /usr/src/app
RUN DEBIAN_FRONTEND=noninteractive JOBS=MAX npm install --unsafe-perm
As a consequence, any change to your source invalidates the cache for COPY . /usr/src/app, resulting in a rebuild of the npm install on every push.
Similarly, your cache is invalidated if your project contains:
- a file named wscript
- a file ending in .gyp
If your project meets any of these caching exceptions, it will be rebuilt every time you push, which means longer iteration cycles and considerably less fun.
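As a hypothetical illustration, a package.json like the following would fall into this category, since its preinstall script (deps.sh is an invented name here) runs code from your own source tree:

```json
{
  "name": "my-app",
  "scripts": {
    "preinstall": "bash deps.sh",
    "start": "node server.js"
  }
}
```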
So how do I ensure caching on node projects with install scripts?
Luckily, it's fairly straightforward: create your own Dockerfile. Resin will detect this and skip the Dockerfile auto-generation, meaning you'll have total control over building your container and, subsequently, over the caching too.
For a quick example of how to write a Dockerfile for a node project, I looked for a native node project with install scripts. I didn't have to look too far: our demo project text2speech has a preinstall script, meaning it never caches the npm install! Oopsy-daisy!
To remedy this, I've created a forked version with a Dockerfile.template. A Dockerfile.template is the same as a Dockerfile, but it allows one to use certain resin.io variables, like %%RESIN_ARCH%%. This variable allows you to automatically pull the correct base image for your application's device architecture, e.g. armv7hf. Neat!
Now I have complete control over the COPY commands, meaning my cache is safe! Take a look at the Dockerfile.template I created for text2speech, and pay special attention to the COPY commands.
# Use base image for device arch with node installed
FROM resin/%%RESIN_ARCH%%-node:0.10.38
# Create src dir
RUN mkdir -p /usr/src/app/
# Set as WORKDIR
WORKDIR /usr/src/app
# Only package.json and the preinstall script here, for caching purposes
COPY package.json deps.sh ./
# Install deps
RUN JOBS=MAX npm install --unsafe-perm --production && npm cache clean
# Copy the rest of the files here, for caching purposes
COPY . ./
# npm start will run server.js by default
CMD npm start
In summary, if your package.json doesn't have any install scripts, it's simpler to push a plain node project. However, if it does have install, preinstall, and/or postinstall scripts, it's best to include a Dockerfile or Dockerfile.template to ensure you're optimizing your caching across builds.
Hope this clears things up!
Have questions, or just want to say hi? Find us on our community chat.