MongoDB REST API interface with Docker Compose

In this post I will show you how easy it is to create a RESTful API interface on top of MongoDB using Docker and Docker Compose.
The advantage of using Docker is that all your services are installed, configured and run automatically inside their own containers.

Technologies used: Docker, Docker Compose, Git & GitHub.

We will install everything locally, but you can run it anywhere you have Docker and Docker Compose installed, such as on a DigitalOcean droplet.
If you don’t want to read through, you can just clone the GitHub repository.

1. Setting up

I am running macOS Sierra, so the instructions will be for Mac users, but you should be able to follow along regardless of your OS.

2. Describing the services

We will need one Docker container for our MongoDB database, one Docker data volume to persist the MongoDB data, and one Docker container to host our REST API Node.js service.

First let’s create a folder for our project. Inside it, we will create a docker-compose.yml file in which we will define our services.

From here on we will be building our docker-compose.yml file, so you can open it in your favourite text editor.

2.1. Data Volume for persistent data

The first thing we’ll add to our Docker compose file will be the data volume. We do this so we can persist the data even if we destroy and rebuild the MongoDB container.
So the docker-compose.yml file should now look something like this:
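A minimal sketch of that stage, assuming Compose file format version 2 and a volume named mongo-data (both are my choices here, not requirements):

```yaml
version: '2'

# Named data volume so MongoDB's files survive container rebuilds
volumes:
  mongo-data:
    driver: local
```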

2.2. The MongoDB service

We are ready to build our services now.

The first service that we will define will be the MongoDB database. For this we will use the alexpunct/mongo image that we created in a previous post.

The docker-compose.yml file should look like the one below. I’ve added #comments next to each instruction to explain what it does.
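A sketch of how that service block might look. The port and data path are standard MongoDB defaults, while the environment variable name is a placeholder, so check the alexpunct/mongo image's documentation for the real ones:

```yaml
services:
  mongodb:                         # our MongoDB database service
    image: alexpunct/mongo         # the image built in the previous post
    ports:
      - "27017:27017"              # expose MongoDB's default port
    volumes:
      - mongo-data:/data/db        # persist data in the named volume
    environment:
      - ADMIN_PASS=${ADMIN_PASS}   # placeholder name; see the image's docs
```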

The environment variables will be set later on, don’t worry about them now.

2.3. The REST API service

Now that we have the database, we can add our REST API service that will allow us to insert records in the database by making HTTP requests to it.

For this container we will use the linuxenko/mongo-rest Docker image. Open the link and have a quick read through the environment variables and options the image supports first; it will help you understand what we’re doing next.
If, however, you want to change the way the API works, for example by adding more security, you can clone the repo, make your changes, build your own Docker image and push it to Docker Hub or wherever you store your images.

So our docker-compose.yml file should finally look like this:
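Putting the pieces together, the complete file could look like the sketch below. The rest-api service's container port and environment variable names are illustrative placeholders, so use the ones documented on the linuxenko/mongo-rest page:

```yaml
version: '2'

volumes:
  mongo-data:                      # named volume for MongoDB data
    driver: local

services:
  mongodb:
    image: alexpunct/mongo
    volumes:
      - mongo-data:/data/db        # persist data in the named volume

  rest-api:
    image: linuxenko/mongo-rest
    links:
      - mongodb                    # let the API reach MongoDB by hostname
    ports:
      - "${SERVER_PORT}:3000"      # assumed container port
    environment:
      - MONGO_HOST=mongodb         # placeholder names; see the image's docs
      - API_KEY=${API_KEY}
```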

3. Running the services locally

3.1. Environment variables

As you saw while building the docker-compose.yml file, we referenced environment variables in a few places. These are simply configurable values that we pass in when running the containers. While you could hardcode the values directly in the file, that’s bad practice, since you might want to use the same file in different places or share it with other people.
For this reason, we define them on the machine responsible for creating the containers, in this case our local environment.

An easy approach for demo purposes is to include a .env file in the project folder, which Docker Compose will pick up automatically when running the .yml file.
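For example, a .env file with placeholder values (pick your own):

```
SERVER_PORT=3000
API_KEY=secretkey123
```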

The .gitignore file simply instructs Git not to commit the .env file when we push the code to GitHub later on:
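A one-line .gitignore is enough here:

```
.env
```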

3.2. Launching the services

We’re all good now, let’s start the engines! All we have to do is run docker-compose up in the console (your current working directory must be the one containing the docker-compose.yml file):
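That is:

```shell
docker-compose up
```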

The images will be downloaded, then the containers will be run using the instructions in our docker-compose.yml file. See below (some rows were removed to save space).

3.3 Testing our API

Let’s test that our API is working and that the data is being saved in the MongoDB database. For this I will use Postman to make HTTP requests.
The URL to send the request to (as per the mongo-rest instructions) should be in the following format: http://SERVER_HOST:SERVER_PORT/api/collection/?apiKey=API_KEY, with the placeholders filled in from the environment variables we’ve set.
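Assuming the placeholder values from our .env (localhost, port 3000 and the secretkey123 key are my examples, not fixed values), the same request could also be made with curl instead of Postman:

```shell
# POST a dummy JSON document into test_mongo_collection
curl -X POST \
  -H "Content-Type: application/json" \
  -d '{"name": "John Doe", "email": "john@example.com"}' \
  "http://localhost:3000/api/test_mongo_collection/?apiKey=secretkey123"
```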

The test_mongo_collection is the collection we’re inserting into; MongoDB will create it if it doesn’t exist.

It’s just a JSON POST request with some dummy data. We got a 201 response with the inserted value echoed back, and it has an id, which means it was inserted successfully!

Let’s open up Robomongo and inspect the database to double check the data is there.

4. Wrap-up

4.1. Freeing up resources

Now that everything is tested and working, let’s see what was added and how we can get rid of it to free up disk space.

4.1.1. Stopping the services

Since our terminal is running in interactive mode, we can just press Ctrl+C to abort the session. Now let’s actually stop the services.
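With Compose that’s a single command:

```shell
docker-compose stop
```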

4.1.2. Removing the containers

Even though we stopped the containers, they are still present, waiting to be restarted with the docker-compose up command we used above.
To actually remove the containers, run the docker-compose rm command:
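It will ask for confirmation before deleting the stopped containers:

```shell
docker-compose rm
```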

4.1.3. Removing the data storage volumes

Remember how we created a data volume to persist data in step 2.1? Well, that’s still present:
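We can confirm with:

```shell
docker volume ls
```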

To remove the volumes Docker created for us, we run the docker volume prune command:
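It asks for confirmation, then deletes every local volume not used by at least one container:

```shell
docker volume prune
```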

4.1.4. Removing the cached Docker images

Docker caches the images from which it creates the containers, so it can reuse them when building or rebuilding containers in the future. Let’s list them with the docker images command:
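```shell
docker images
```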

To remove them we need to run docker image rm on each image. We can do it in one command, though, with a little bash trickery:
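One way to do it (note this removes every local image, so use it with care):

```shell
docker image rm $(docker images -q)
```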

4.2. Pushing our code to Github

Let’s initialize an empty repository:
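```shell
git init
```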

Add, commit and push to github:
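Something like the following, with the remote URL being a placeholder for your own repository:

```shell
git add .
git commit -m "MongoDB REST API with docker-compose"
git remote add origin git@github.com:<username>/<repo>.git
git push -u origin master
```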

Conclusions and next steps

Now we have the ability to create a MongoDB database and a REST API interface with just one file! We can deploy it to any server with Docker and Docker Compose installed and run the services with a single command.

I recommend using these images only for rapid prototyping; I don’t recommend running them in production without adding extra security layers to the Node app and serving the API over HTTPS.
