How to Dockerize a Node.js REST API

The process of deploying and running applications in different environments can be a hassle, since there are many factors to consider, from setting up environment variables to configuring the necessary dependencies and specific versions of different software packages.
However, by using Docker’s containerization technology, you can deploy applications in different environments with minimal effort, since all the necessary dependencies are bundled into the Docker image. This means you don’t have to worry about environment-specific configuration, which makes deploying and running applications in different environments a breeze.
What Is Docker?
Docker is a development platform that provides the tools and environment to package applications as portable images that can be run as self-contained executable components in containers.
These containers bundle the application’s code together with all the dependencies it needs to run successfully in different runtime environments without any issues.
Before you get started, install Docker on your local machine. Check the platform-specific prerequisites and installation instructions from the official documentation.
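With Docker installed, you can quickly confirm that both the Docker engine and Docker Compose are available from your terminal before continuing:
docker --version
docker-compose --version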
Create a Node.js REST API
To get started, create a Node.js web server.
Next, install the required packages for this project. (If your web server setup didn’t already include Express, install it here as well, since the API below depends on it.)
npm install morgan pg knex
The pg package is used to establish a connection with a PostgreSQL database. knex, on the other hand, provides a simple API for interacting with PostgreSQL — you will use it to write SQL queries.
Lastly, you will use morgan, a middleware that logs HTTP requests and responses on the console, to debug and monitor your application running in a Docker container.
Finally, open the index.js file, and add the code below that implements a simple REST API with three routes.
const express = require("express")
const morgan = require("morgan")
const app = express()
const db = require('./db')
const PORT = process.env.PORT || 5000

app.use(morgan('dev'))
app.use(express.json())
app.use(express.urlencoded({ extended: true }))

app.get("/", (req, res) => res.send('Hello World!'))

app.get('/users', async (req, res) => {
    const users = await db.select().from('users')
    res.json(users)
})

app.post('/users', async (req, res) => {
    const user = await db('users').insert({ name: req.body.name }).returning('*')
    res.json(user)
})

app.listen(PORT, () => console.log(`Server up at PORT:${PORT}`))
Configure the Database Connection
The REST API will interact with the PostgreSQL instance running in Docker; however, you first need to configure the database connection in your application. In the root directory of your project folder, create a db.js file and add the code below.
const knex = require('knex')

module.exports = knex({
    client: 'postgres',
    connection: {
        host: 'db',
        user: 'testUser',
        password: 'mypassword123',
        database: 'testUser',
    },
})
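Hard-coding credentials is fine for a quick demo, but in practice you would usually read these values from environment variables so the same image can run against different databases. A minimal sketch of that approach is shown below; the DB_HOST, DB_USER, DB_PASSWORD, and DB_NAME variable names are just examples, not part of this tutorial.
const knex = require('knex')

// Same connection as above, but with optional overrides from environment variables.
// DB_HOST, DB_USER, DB_PASSWORD, and DB_NAME are example names, not part of this tutorial.
module.exports = knex({
    client: 'postgres',
    connection: {
        host: process.env.DB_HOST || 'db',
        user: process.env.DB_USER || 'testUser',
        password: process.env.DB_PASSWORD || 'mypassword123',
        database: process.env.DB_NAME || 'testUser',
    },
})
If you adopt this pattern, the variables can be passed to the server service through the environment key of the Docker Compose file created later in this guide.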
Set Up the migrate.js and seed.js Files
These two files will make it possible to create a table in the database and populate it with test data that you can then retrieve via the API. Create a new folder, scripts, in the root directory of your project and add two files: migrate.js and seed.js.
In the migrate.js file, add the code below:
const db = require('../db');

(async () => {
    try {
        await db.schema.dropTableIfExists('users')
        await db.schema.withSchema('public').createTable('users', (table) => {
            table.increments()
            table.string('name')
        })
        console.log('Created users table!')
        process.exit(0)
    } catch (err) {
        console.log(err)
        process.exit(1)
    }
})()
This code will create a users table with an auto-incrementing id column and a name column in the database.
Next, in the seed.js file, add the code below:
const db = require('../db');

(async () => {
    try {
        await db('users').insert({ name: 'Test User1' })
        await db('users').insert({ name: 'Test User2' })
        console.log('Added dummy users!')
        process.exit(0)
    } catch (err) {
        console.log(err)
        process.exit(1)
    }
})()
This code implements an asynchronous function that will insert two users into the PostgreSQL database.
Finally, add these scripts to your package.json file.
"scripts": {
"start": "node index.js",
"migrate": "node scripts/migrate.js",
"seed": "node scripts/seed.js" },
Since you don’t have a database client configured, you will run these two files as npm scripts (npm run migrate and npm run seed) to set up and seed the database when testing the API.
Set Up a Dockerfile
A Dockerfile defines the instructions required by the Docker engine to build a Docker image. In the root directory of your project, create a new file and name it Dockerfile. Then, add the following instructions to build a Docker image for the Node.js application.
FROM node:16.3.0-alpine3.13
WORKDIR /app
COPY package*.json ./
RUN npm install
COPY . .
EXPOSE 5000
CMD [ "node", "index.js" ]
Let’s break it down:
- FROM – This instruction sets the base image for the application, which is the Node.js Alpine image, a lightweight version of the Node.js image that can be found in the Docker registry.
- WORKDIR – sets /app directory as the working directory.
- COPY package*.json ./ – instructs Docker to copy package.json and package-lock.json from the project directory into the image’s working directory (/app).
- RUN – runs the given command, npm install, while the image is being built, installing the application’s dependencies inside it.
- COPY . . – copies the source files into the /app folder.
- EXPOSE – documents the port the application inside the container listens on; the port still has to be published to the host machine (here, via Docker Compose) to be reachable from outside.
- CMD – specifies the command to be executed when the Docker container is created from the image.
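Since COPY . . copies everything in the project directory into the image, it’s also a good idea (though not strictly required for this tutorial) to add a .dockerignore file next to the Dockerfile so locally installed dependencies and other clutter stay out of the build context:
node_modules
npm-debug.log
.git
This keeps the image smaller and ensures the dependencies come from npm install inside the image rather than from your machine.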
Create the Docker Compose File
For the Node.js application to interact with Docker’s PostgreSQL instance, the two applications need to run in Docker containers within the same network environment.
For this reason, you need to define and build both the application’s image and the PostgreSQL instance using Docker Compose — a tool that allows you to build and manage multiple Docker containers.
Simply put, with Docker Compose you can define the services that make up your application as a single unit; in this case, the Node.js REST API and the PostgreSQL database.
Create a new file, docker-compose.yml, in the root directory and add the code below:
version: '3.9'

services:
  server:
    build: .
    ports:
      - '5000:5000'
    depends_on:
      - db
  db:
    image: 'postgres'
    ports:
      - '4321:5432'
    environment:
      POSTGRES_PASSWORD: 'mypassword123'
      POSTGRES_USER: 'testUser'
    volumes:
      - data:/var/lib/postgresql/data

volumes:
  data:
This file defines two services that will run as Docker containers. For the first one, server, Docker Compose builds the image from the Dockerfile in the project directory and publishes port 5000 to the host.
It also specifies that the server container depends on the db container, meaning the db container is started first so the server can connect to it.
The second service is the PostgreSQL database container. You don’t need to write a Dockerfile for it, since it is created from the official postgres image on Docker Hub.
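One caveat: depends_on only controls start order; it does not wait for PostgreSQL to actually be ready to accept connections, so the API may log connection errors if it starts too quickly. If that happens, one common workaround, supported by recent versions of Docker Compose, is to give the db service a healthcheck and make the server wait for it. A rough sketch, showing only the parts that change from the file above:
services:
  server:
    depends_on:
      db:
        condition: service_healthy
  db:
    healthcheck:
      test: ['CMD-SHELL', 'pg_isready -U testUser']
      interval: 5s
      timeout: 5s
      retries: 5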
Build the Docker Images
Use the Docker Compose command to build the images and start the two containers.
docker-compose up -d
Once the process completes successfully, Docker Compose will report that both containers have been created and started.
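You can also confirm that both containers are running, and check their port mappings, with:
docker-compose ps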
Test the REST API
Run the command below to execute the migration script inside the server container; it will create the users table in the PostgreSQL database. The container name is derived from your project folder and the service name, so run docker ps and adjust the command if your container isn’t named exactly as shown here.
docker exec docker_node-server-1 npm run migrate
You should see a log message confirming that the users table was created.
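From here you can seed the table and call the API itself. The commands below assume the same container name as above and the 5000:5000 port mapping from the Compose file, and use an example user name; adjust them to your setup:
docker exec docker_node-server-1 npm run seed
curl http://localhost:5000/users
curl -X POST -H "Content-Type: application/json" -d '{"name": "Jane"}' http://localhost:5000/users
The first curl call should return the two seeded users as JSON, and the POST request should return the newly inserted row.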
Sharing the Docker Images
The final step is pushing the Docker image for your Node.js application to Docker Hub. This is similar to pushing your projects to GitHub.
- Head over to Docker Hub, sign up for an account, and log in to the user dashboard.
- Next, click on Create a Repository. Provide the name of your repository and set its visibility to either Public or Private and then click Create.
- To push your application’s Docker image to Docker Hub, you first need to log into your account via the terminal and then provide your username and password.
docker login
- Next, tag your Docker image so its name matches the format <your docker username>/<repo name> (run docker images to find the name of the image that Docker Compose built). Run the command below to make this change:
docker tag <image> <your docker username>/<repo name>
- Finally, push your Docker image.
docker push <your docker username>/<repo name>
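Once the image is on Docker Hub, you (or anyone you share the repository with) can pull it and run the API on another machine, for example mapping the same port used in this tutorial:
docker pull <your docker username>/<repo name>
docker run -p 5000:5000 <your docker username>/<repo name>
Keep in mind that on its own the container still expects a reachable PostgreSQL host named db, so in practice you would run it alongside a database, for example with a Compose file like the one above.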
Using Docker in Development
This guide only touched on a fraction of what Docker can offer. However, you can now use Docker’s containerization technology to package any application, along with all its dependencies, as images that can be deployed across different development and production environments, such as the cloud, without any hiccups.