Building a container-based development environment with Visual Studio Code
tl;dr: You can use Docker and VS Code's Remote - Containers extension to containerize your local dev environment. It speeds up onboarding, lets you use the same base image across all environments, provides the same editor tools to all developers, and makes standards easier to implement. It might not work for everyone: if VS Code is not your code editor of choice, you can give it a pass, unless you want to explore!
One of the challenges in setting up a local development environment for your team is ensuring that every developer has the same setup, or at least one that meets the project's requirements. The traditional approach to this problem is to lay down onboarding guidelines and expect developers to follow them. However, from version compatibility issues to differences in individual experience to not using the right tools, there are many hurdles to achieving a uniform setup.
An alternative is a development environment pre-configured with all the required libraries and dependencies that developers can spin up in a container. Developers then work inside the isolated environment the container offers. This drastically reduces the time between cloning the codebase and starting to work on it.
In addition to providing the same environment to all developers, we can ship the same set of tools, extensions, and even the theme in Visual Studio Code. Though optional, this lets us automatically install the specific extensions a project requires, which avoids inconsistent tooling and saves developers from installing them manually.
All of this is made possible by combining Docker with VS Code's Remote - Containers extension.
Setup
In this article, I will use the example of a JavaScript application running in a Node environment. Read Developing inside a Container for detailed documentation covering other tech stacks.
- Start by installing Docker and VS Code, if you do not have them already. Install the Remote - Containers extension in VS Code. Ensure Docker is running on your machine.
- Go to your project and create a folder named .devcontainer in the root directory. This new folder holds the configuration files required for the development container.
- Create a Dockerfile and a devcontainer.json inside the .devcontainer folder and add your configuration to them (a minimal example follows below).
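For reference, here is a minimal sketch of what those two files could look like for a Node.js app. The base image tag, forwarded port, commands, and extension IDs are placeholders chosen for illustration; adjust them to your project.

```dockerfile
# .devcontainer/Dockerfile
# The base image tag is a placeholder; pin the Node version your project uses.
FROM node:18

# Add any extra OS packages or global tools your team needs, for example:
RUN npm install -g npm@latest
```

```jsonc
// .devcontainer/devcontainer.json (comments are allowed in this file)
{
  "name": "my-node-app",
  "build": { "dockerfile": "Dockerfile" },

  // Make the app reachable from the host browser; 3000 is an example port.
  "forwardPorts": [3000],

  // Install dependencies once the container is created,
  // then start the dev server whenever the container starts.
  // "npm run dev" assumes your project defines such a script.
  "postCreateCommand": "npm install",
  "postStartCommand": "npm run dev",

  // Extensions installed automatically inside the container.
  "extensions": ["dbaeumer.vscode-eslint", "esbenp.prettier-vscode"]
}
```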
Once done, we need to build the container. To do this, either use “Open Folder in Container” or “Reopen in Container” from the VS Code command palette.
This initializes the dev container: VS Code pulls the Docker base image, configures the container, and starts the development server.
You should now be able to access the application in your browser and develop it as usual in VS Code. Even the hot reload works just fine! I have created a repo with a sample setup that you can try out!
Building and configuring the container is a one-time activity and takes a while. Subsequent starts are fast if nothing has changed, but if devcontainer.json or the Dockerfile changes, a rebuild is required to apply it. You will be prompted to rebuild if you try to reopen the container directly.
If you exit the container or VS Code, you can get back in using the “Reopen in Container” option. This spins up the already configured container and starts the development server again. If VS Code finds a .devcontainer configuration in a codebase, it automatically prompts you to start the container.
A few other benefits:
- The project files are shared between the container and your local machine, so changes made in either environment are immediately visible in the other.
- You can run any number of applications that require different versions of dependencies without installing or modifying anything on your computer.
- Anyone on your team, including non-technical members, can run the application on their computer to code, review, or just play around.
- It works regardless of the host operating system.
Gotchas and workarounds
- The VS Code terminal lets you run any scripts or commands since it already runs inside the container’s workspace. But if you want to use another tool, such as the macOS Terminal, you need to find the container and run docker exec against it (see the commands after this list).
- Since the application runs inside a Docker container, it has access to limited resources (CPU, memory, swap, etc.). In most cases, the default limits are fine, but depending on your application, you may need to increase them in the Docker preferences to avoid lags.
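A minimal sketch of the docker exec workflow from a host terminal, assuming a Bash shell is available inside the container; the "vsc-" name filter is only the naming convention VS Code typically uses for dev containers and may differ on your machine:

```sh
# List running containers; dev containers created by VS Code are usually named "vsc-<project>-<hash>".
docker ps --filter "name=vsc-"

# Open an interactive shell inside the container (replace <container-id> with the ID from the list above).
docker exec -it <container-id> bash
```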
Not for everyone
- While this can make a lot of things easier, it might not be ideal for apps that require extensive integrations and configuration in the environment that go beyond the scope of containers.
- Power users and seasoned developers might not like this right off the bat, especially if they prefer other code editors. The setup can be treated as optional; it does not force anyone to run the container, and they can still set up the environment manually.
- If your application is resource-heavy (consider optimizing?), running it in a Docker container could consume even more resources.
Summary
This is a relatively new concept, and there are more opportunities to explore and more limitations to tackle before the overall developer experience improves further. I am personally excited about the possibilities and would recommend this setup, even if you keep it optional. If you already use this setup, please share your experience in the comments. Otherwise, feel free to share your thoughts and suggestions.