Docker: For development setup

LiftOff LLC
4 min read · Jan 19, 2021

Summary

Docker can expedite local development environment setup. If done right, setting up a development environment can be fast, really fast, really really fast.

The Premise

In one of our recent projects, we had a requirement to convert WebM video to MP4. We used the ffmpeg binary to do this. To use the ffmpeg binary, you install ffmpeg (on whatever OS you are on) and run something like

ffmpeg -i input.webm output.mp4

This is quite straightforward on your laptop (or whatever computer you work on). However, it gets complicated on at least two occasions: first, when you have to deploy the solution; second, when you bring in another developer to share the misery. It's an even bigger misery when the two of you work on different operating systems. Basic working knowledge of Docker will be helpful for following the rest of this article.

In Action

Docker makes it easy to solve some of these issues. There are many ways you can go about it; one of them is the four-step workflow below (a command-level sketch follows the list).

  1. Create a Dockerfile
  2. Build the Docker image
  3. Push the image to a remote repository
  4. Others pull the Docker image.
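
As a rough sketch, the steps map to commands like the following. The your-user/docker-node-hello tag is a placeholder for your own registry account and repository, not something the article prescribes.

docker build -t your-user/docker-node-hello .
docker push your-user/docker-node-hello
docker pull your-user/docker-node-hello

docker push defaults to Docker Hub, so you would docker login first; any private registry works the same way with a prefixed image name.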

Instant gratification

We will cover how to go about this in the implementation, but for instant gratification, if you have Docker installed, run the following

docker run -it --rm ch4nd4n/docker-node-hello node

If everything went through fine, it should bring up the Node.js REPL.

This may not look like much, but behind the scenes you pulled a Docker image and ran Node.js inside a container.
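
For instance, you can poke around in the REPL to convince yourself it is running inside the Linux-based container rather than directly on your host (the exact prompt and version depend on the image):

> 1 + 1
2
> process.platform
'linux'

Exit with .exit or Ctrl+D, and the --rm flag cleans up the container for you.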

Implementation

For the sake of brevity, let's illustrate this with a very simple Node.js example (instead of ffmpeg).

Create a JavaScript file

touch index.js

Fill it in with a console log.

// index.js
console.log('Hello');

Run it with node

node index.js

Push the code to Git, and the rest of the developers can contribute to it and run it locally as long as they have Node.js installed. That's one logical way to go about it, until you start adding dependencies.

Let’s get this rolling with Docker.

  1. Create a Dockerfile
  2. Create index.js
  3. Build the Docker image
  4. Run the Docker container

1. Create a Dockerfile

echo "FROM node:alpine" > Dockerfile

2. Create index.js

echo "console.log('hello')" > index.js

3. Build the Docker image

docker build -t tmp-node-app .

4. Run the Docker container

docker run -v $PWD:/home/node --rm tmp-node-app node /home/node/index.js

Source: https://github.com/ch4nd4n/docker-node-hello/tree/main
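
As an aside, the Dockerfile above is a single line because the code is mounted into the container at run time. If you instead wanted a self-contained image (closer to the deployment scenario mentioned earlier), a sketch could look like the following; this is illustrative and not what the demo repo uses:

FROM node:alpine
WORKDIR /home/node
COPY index.js .
CMD ["node", "index.js"]

With an image built from that, docker run --rm tmp-node-app would work without the volume mount.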

At this point, if this set of code is shared with other developers or systems that have Docker installed, they can run it with docker run as shown above. The target system doesn't need Node.js installed at all. That was the primary intent of this article, though it can be taken further. Read on.

Sprucing it up with docker-compose

Let’s use fastify to create a very simple server.

  1. Add package.json with fastify and nodemon as dependencies
  2. Update Dockerfile
  3. Add docker-compose.yml

Check out the source at https://github.com/ch4nd4n/docker-node-hello/tree/fastify

The Dockerfile defines the base image it extends

FROM node:lts-alpine

docker-compose.yml defines the configuration

version: "3"
services:
  fastify:
    build:
      context: .
    container_name: fastify
    working_dir: /home/app
    volumes:
      - .:/home/app
    ports:
      - "3000:3000"
    command: sh -c "npm install && npm run dev"

index.js

const fastify = require('fastify')({ logger: true })

// Declare a route
fastify.get('/', async (request, reply) => {
  return { hello: 'world' }
})

// Run the server!
const start = async () => {
  try {
    await fastify.listen(3000, '0.0.0.0')
    fastify.log.info(`server listening on ${fastify.server.address().port} or is it?`)
  } catch (err) {
    fastify.log.error(err)
    process.exit(1)
  }
}
start()

package.json defines dependencies and scripts.

{
  "name": "docker-node-hello",
  "version": "1.0.0",
  "description": "Docker fastify gratification",
  "main": "index.js",
  "dependencies": {
    "fastify": "3.9.2"
  },
  "devDependencies": {
    "nodemon": "^2.0.6"
  },
  "scripts": {
    "dev": "nodemon index.js"
  },
  "keywords": [],
  "author": "",
  "license": "ISC"
}

If you have everything in place, run it locally with

docker-compose up
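
Once it is up, a quick sanity check from another terminal should return the hello-world JSON defined in the route above (response shown as a comment):

curl http://localhost:3000/
# {"hello":"world"}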

If you change index.js in your editor, nodemon will restart the app automatically. To extend this further, let's say we need Redis for local development; we can extend the compose file to include Redis.

The change set goes something like the following. First, include the redis dependency in package.json.
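
For example, assuming the callback-style node-redis 3.x client (which is what the index.js snippet further below uses), the dependencies block would grow to something like:

"dependencies": {
  "fastify": "3.9.2",
  "redis": "^3.0.0"
}

The exact redis version in the demo repo may differ; any release with the classic callback API matches the code shown here.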

Add the following to the services section of docker-compose.yml

  redis:
    image: redis:alpine

index.js (the '/' route below replaces the earlier hello-world route)

const redis = require('redis');
const client = redis.createClient('redis://redis:6379');

// add current date as a key when app starts
client.set('foo', new Date());

fastify.get('/', (request, reply) => {
  client.get('foo', (err, redRply) => {
    console.log({ redRply })
    return reply.send({
      msg: redRply
    })
  });
});

docker-compose up should bring up both services (note that it may run into an error if the Node.js app boots up faster than Redis).
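
One way to soften that race is to declare the dependency in docker-compose.yml, a sketch of which is below; keep in mind that depends_on only waits for the Redis container to start, not for Redis to be ready to accept connections:

services:
  fastify:
    depends_on:
      - redis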

curl http://localhost:3000/

If everything goes fine, the above should return the date persisted as a key in Redis.
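
The response shape is whatever the route sends, something along these lines, with the value being whatever new Date() produced when the app started:

{"msg":"<date string captured at app start-up>"}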

Redis changes can be viewed at https://github.com/ch4nd4n/docker-node-hello/tree/fastify-redis

What we achieved here is an environment where developers don't need to set up Node.js or Redis natively; Docker takes care of that. This setup is a coarse-grained demo of what Docker can do for local development. Docker can handle far more complex setups, where the development environment depends on multiple services.

The original ffmpeg story was not nearly that simple. It in fact never made it natively onto anyone's box; it had its own set of quirks. But running it within Docker made it dead simple to run on any box (well, almost any box).
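
The pattern is the same one shown above. As a sketch (not the project's actual setup), where some-ffmpeg-image is a placeholder for any image that has ffmpeg installed (for example, one built from an Alpine base with apk add ffmpeg):

docker run --rm -v $PWD:/work -w /work some-ffmpeg-image ffmpeg -i input.webm output.mp4

The volume mount gives the container access to input.webm, and the converted output.mp4 lands back in your working directory.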


LiftOff LLC

We are a business accelerator working with startups and entrepreneurs to build products and launch companies.