[Day 2] JS in Pipeline (2): Docker and Local Development Environment (2)


The goal of this series is to introduce some best practices for the local development environment and to create a CI/CD pipeline for NodeJS applications.

Previous Article - JS in Pipeline (1): Docker and Local Development Environment (1): https://www.coderbridge.com/@jeanycyang/b3a9d680b61348f7bda41c97ab77fd1b

GitHub repo for this article series: https://github.com/jeanycyang/js-in-pipeline

In this article, we are going to dive deeper into how to use Docker in a local development environment.


Start your Docker container

So, we have built an image that contains our application, the gift code server. Let’s see how to start a Docker container from it.

We can start a Docker container with the docker run command.

docker run [OPTIONS] IMAGE[:TAG|@DIGEST] [COMMAND] [ARG...]

Let’s try docker run -it -p 40000:3000 --name myapp giftcodeserver.
-it is shorthand for the -i and -t flags. You can use -it to run your Docker container in the foreground. See more at https://docs.docker.com/engine/reference/run/#foreground. The container’s stdout is then written directly to your terminal, and you can also interact with the container through stdin. If you prefer to run it in the background, use -d, the detach flag, instead of -it.
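As a quick sketch, here is what the two modes look like with the same image and container name (names must be unique, so only one of these containers can exist at a time):

# Foreground: logs stream to your terminal; Ctrl+C stops the container
$ docker run -it -p 40000:3000 --name myapp giftcodeserver

# Detached: the container runs in the background and only its ID is printed
$ docker run -d -p 40000:3000 --name myapp giftcodeserver

# Follow the logs of a detached container
$ docker logs -f myapp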

The -p flag is for port mapping. The container’s port 3000 is now accessible on the host’s port 40000. Make a POST request to localhost:40000/api/redeem/abc123, and you will see the magic happen.
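For example, assuming the redeem endpoint from the previous article is a plain POST route, you can try it with curl:

# Send the request to the host port; Docker forwards it to port 3000 inside the container
$ curl -X POST http://localhost:40000/api/redeem/abc123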

--name gives this Docker container a name. If you don’t name it, you can only reference it later by its container ID, for example 6064cf4d4b8f, which is hard to remember and which you would have to look up every time; therefore, always name your containers.
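Once the container has a name, you can use it anywhere a container ID is expected, for example:

# Inspect, stop, and remove the container by name instead of by ID
$ docker logs myapp
$ docker stop myapp
$ docker rm myapp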

giftcodeserver is the image name. If you don’t specify a tag, Docker will look for the image tagged as latest.
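If you want to be explicit about versions, you can build and run a specific tag instead; the 1.0 tag below is just a hypothetical example:

# Build the image with an explicit tag instead of the implicit latest
$ docker build -t giftcodeserver:1.0 .

# Run that exact tag
$ docker run -it -p 40000:3000 --name myapp giftcodeserver:1.0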

You can make requests to the host’s port (in this case, port 40000). Great! The container works perfectly!

You can learn more about Docker commands on Docker’s official website, and there are also useful Docker cheat sheets around.

So far so good?

“No…Wait…!” You might say.

“I am developing. When I make some changes, the image remains unchanged because it has already been built. I understand that Docker is good for the production environment. But… it doesn’t seem useful at all for the local development environment…”

“Doesn’t it mean that I need to stop the container, rebuild the image, then start a new container based on the latest image EVERY TIME I make some changes?”

No, of course NOT!

Let’s see how to REALLY use it in our daily development process!

Auto-Reloading

Let’s temporarily forget Docker.

When you are developing a NodeJS application, say, an Express.js server, you run node index.js. Every time you make some changes and want to see if they work as you expect, you need to restart your application again and again.
Some people might already know how to auto-reload a NodeJS application. If you already know how to use Nodemon or other hot-reload tools, feel free to skip this section :)

Nodemon is a tool that helps develop node.js based applications by automatically restarting the node application when file changes in the directory are detected.

Install Nodemon as a dev dependency.

$ npm install --save-dev nodemon

nodemon is a replacement wrapper for node. To use nodemon, replace the word node on the command line when executing your script.
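In practice, that just means swapping the command you already use; npx works here because Nodemon was installed as a local dev dependency:

# Before: restart manually after every change
$ node index.js

# After: Nodemon restarts the app whenever watched files change
$ npx nodemon index.js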

Let’s add an npm script in our package.json file.

"scripts": {
  "dev": "nodemon index.js",
  ...
}

Run npm run dev and try changing the index.js file. Auto-reloading is now live!
You can also add a nodemon.json file to the project folder. Files listed in ignore will not trigger an auto-reload even when they change; files listed in watch will trigger an auto-reload whenever they are updated.

{
  "verbose": true,
  "ignore": ["node_modules", "mysql-data"],
  "watch": [".env", "index.js", "api/**/*", "database/**/*", "middlewares/**/*", "config/**/*"],
  "ext": "js json"
}

https://github.com/jeanycyang/js-in-pipeline/blob/75816a6f17873caac859267f56f26ffb68d155ea/nodemon.json

There are also other ways to automatically reload your NodeJS apps, and even some ways to “hot-reload” (instead of reloading the whole app, only the updated modules/files are replaced), but they are beyond the scope of this article series.

Now, we will come back to our subject — Docker for Local Development Environment.

Docker Volume

So, now we know that the auto-reloading tool, Nodemon, detects file changes and then decides whether or not to automatically restart your NodeJS application.

When you are developing and changing files on your machine (the host), the running container seems to have no way to detect those changes. After all, once the image is built, all the files have already been copied into it.

This is where Docker volumes come into play. How about “temporarily” “replacing” the files inside the running container?

Docker allows you to mount files/directories into the container. That is, you can mount files from your machine (the host) into the running container!

You can use the -v flag to mount your volume.

$ docker run -it -p 40000:3000 --name myapp -v `pwd`:`pwd` -w `pwd` giftcodeserver npm run dev

-v <host directory/file>:<container directory/file>
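As a concrete sketch, you could also mount the host’s project folder directly over the image’s working directory (assuming the Dockerfile sets WORKDIR to /usr/src/app, as mentioned below), in which case -w is not needed:

# Mount the current host directory over /usr/src/app inside the container
$ docker run -it -p 40000:3000 --name myapp -v "$(pwd)":/usr/src/app giftcodeserver npm run dev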

Your working directory on your machine is now mounted into the container at the same path, and the -w flag makes that path the container’s working directory (in place of /usr/src/app, the WORKDIR set in the Dockerfile)!

And we also add npm run dev (the [COMMAND]). npm run dev overrides the CMD ["node", "index.js"] instruction in the Dockerfile.

Now the container runs npm run dev when it starts, and Nodemon can detect the file changes you make on your machine. You can think of it this way: the mounted directory inside the container now “links” to the host’s directory.
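If you want to double-check that the bind mount is in place, you can look inside the running container, for example:

# List the container's working directory; it should mirror your host directory
$ docker exec myapp ls

# Show the mounts Docker attached to the container
$ docker inspect --format '{{ json .Mounts }}' myapp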

Wooow! Now every time we make changes, our application will be reloaded automatically!


Docker is so great! We can’t wait to run a MySQL container as well!

Moreover, your architecture might become more and more complicated later on. Maybe your team will decide to use Redis for caching and/or RabbitMQ for queues.

Therefore, in the next article, we will introduce Docker Compose, which can run multiple containers and lets you define the relationships between them.

[Docker] Compose is a tool for defining and running multi-container Docker applications. With Compose, you use a YAML file to configure your application’s services. Then, with a single command, you create and start all the services from your configuration.
https://docs.docker.com/compose/

We will add MySQL to our Docker Compose setup and link the MySQL container with our gift code server container. We will also talk about best practices for configs and “stateless” applications.


Useful Links/References

https://docs.docker.com/engine/reference/commandline/run/#mount-volume--v---read-only
https://docs.docker.com/storage/bind-mounts/
https://nodemon.io/

#devops #nodejs #Node #docker #docker-compose






