System Design: What is the difference between throughput and latency?

In computer networks, latency and throughput both play an important role. Latency tells you how quickly the network responds, while throughput tells you how much data the network can move at once. High latency means the ping is slow, i.e. the network takes a long time to return a response; with high latency we generally say the network is lagging. Low latency means the network sends responses back quickly. Throughput indicates how many data packets can be transmitted from source to destination in a given time. High throughput means the network can transfer a large amount of data to its destination, while low throughput means the network can only move a small amount of data at a time and will struggle with large transfers. A network with low throughput and high latency will struggle to send and process high volumes of data, while a network with high throughput and low latency results in an efficient application. Throughput is typically measured in megabits or megabytes per second, while latency is measured in milliseconds.
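As a quick, hedged illustration on a Unix-like machine (the host names below are placeholders, and iperf3 needs a server you control on the other end), latency and throughput are usually measured with different tools:

# Latency: round-trip time to a host, reported in milliseconds
ping -c 5 example.com

# Throughput: data rate between this machine and an iperf3 server,
# reported in bits per second (192.0.2.10 is a placeholder address)
iperf3 -c 192.0.2.10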

What is Docker and how do you run an application in Docker?


Docker is a platform that runs a developer's application, using containerization technology, in a standalone, standardized unit called a container. It helps developers build, test, and deploy their applications using containers. To understand Docker thoroughly, we must first understand what containerization is.

What is Containerization?

Containerization is a form of virtualization that allows you to package and isolate an application and its dependencies into a single container. A container includes the application code, runtime, libraries, and system tools, and ensures that the application runs consistently across different environments.

Why Containerize an application?

  1.  Portability : Containers can run consistently across various environments, such as development, testing, and production. This portability ensures that an application behaves the same way regardless of where it is deployed.
  2.  Consistency : With containerization, developers and operations teams can work with the same environment, reducing the "it works on my machine" problem. This consistency simplifies the development and deployment processes, since all the dependencies are defined in a single Dockerfile, regardless of the host OS.
  3. Resource Efficiency : Containers are more lightweight than traditional virtual machines because they share the host operating system kernel. They start up quickly and use fewer resources.
  4. Easy setup and teardown : The services an application needs are easy to install and remove, since the application's Dockerfile is common throughout. For example, if the project uses SQL, you would normally install SQL on your system for the application and then have to remove all of it once you are done. With containerization, your system does not install SQL at all; the isolated container provides it, and you can remove it with a single command (see the sketch after this list).
  5. Makes deployment easy, as you only have to change the Dockerfile and then automate the rest.
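As a rough sketch of point 4 (the image tag, container name, and password below are assumptions, not something fixed by Docker), a database can be started and removed without ever installing it on the host:

# Start a throwaway MySQL instance inside an isolated container
# (container name, password, and image tag are placeholder values)
docker run -d --name mysql-dev -e MYSQL_ROOT_PASSWORD=devpassword -p 3306:3306 mysql:8

# Once the project is done, remove it again with a single command
docker rm -f mysql-dev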

Now that we have a proper understanding of containerization, let us look at some of Docker's main components:

Docker has three main components:

  1. Docker CLI : This is the command-line tool through which we control our images, containers, ports, application behavior, and so on.
  2. Docker Daemon (Engine) : This is the core component of Docker. All our containers are built and run by this background process, called the daemon.
  3. Docker Hub ( Registry ) : Just like GitHub for code, Docker Hub is a platform where we publish our project images. Once images are pushed to Docker Hub, AWS or any other service can pull them from the hub and run them (see the example commands after this list).
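A few everyday commands, as a hedged illustration of how the three components fit together (the image and username below are placeholders):

# Docker CLI talking to the daemon: list the containers it is running
docker ps

# The daemon pulls the image and runs the container on the CLI's behalf
docker run -d nginx:alpine

# Docker Hub (registry): pull images down, or push your own up
docker pull nginx:alpine
docker push <your-username>/my-app:latest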

What is the difference between a Docker Image and a Docker Container?

  • Docker Image : A template from which consistent containers can be created; it is the initial filesystem state of a new container. Docker images are built from a Dockerfile, which is a script describing how to build the image. An image is an immutable blueprint for running an application and is created only once.
  • Docker Container : An isolated environment in which the application and its dependencies run. A container is a dynamic, runnable instance of an image; it gets its own writable layer on top of the image, so the changes it makes do not alter the image itself. Containers can be created from the same image multiple times (see the sketch below).
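A small sketch of the difference (the image and container names are placeholders): the image is built once, and any number of containers can then be started from it:

# Build the image once from the Dockerfile in the current directory
docker build -t my-app .

# Start two independent containers from the same image
docker run -d --name my-app-1 -p 8001:8000 my-app
docker run -d --name my-app-2 -p 8002:8000 my-app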
Now that we have understood the basics of Docker and containerization, we will create our first Docker image and run our application in Docker. But before that, we need to install Docker on our system. You can read the instructions and download Docker for your operating system from here: docker.com
You can check whether Docker is installed on your system by executing the docker --version command in your terminal. It should return the installed Docker version.
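Optionally, you can also confirm that the daemon itself works by running the small official hello-world test image:

# Prints the installed client version
docker --version

# Pulls and runs a tiny test image; a success message means the daemon is working
docker run hello-world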

Creating a Docker Image

After installing Docker on your system, choose a project you want to create a Docker image of. I am using a Next.js project here.

  1. Create a new file named "Dockerfile" (this exact name is what docker build looks for by default) and add the following code inside it:
# Use a base image with Node.js pre-installed
FROM node:18-alpine AS base

# Set the working directory inside the container
WORKDIR /app

# Copy application files into the container
COPY . .

# Install dependencies
RUN npm install

# Expose the application port
EXPOSE 8000

# Define the command to run the application
CMD ["npm", "run" , "dev"]

  2. Now run docker build -t next-project . in your project's root folder. It will create a Docker image after the build finishes (see the optional .dockerignore note after these steps).


  3. Now you can check the image by executing docker images; you will see your newly created image listed there.
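One optional addition, since the Dockerfile copies the whole project directory with COPY . . : a .dockerignore file next to the Dockerfile keeps local build artifacts out of the image. A minimal sketch, assuming a typical Next.js project:

# .dockerignore — paths that should not be sent to the Docker build
node_modules
.next
.git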


Running the Docker Image

Run docker run -p 8000:8000 next-project in your project's root folder. Now you can access your application on the port exposed in the Dockerfile (here, port 8000, assuming your dev server is configured to listen on that port).
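As a follow-up sketch (the container name next-app is an assumption), you can also run the container in the background and manage it from the CLI:

# Run in detached mode with an explicit container name (name is a placeholder)
docker run -d --name next-app -p 8000:8000 next-project

# Check that it is running and inspect its logs
docker ps
docker logs next-app

# Stop and remove it when you are done
docker stop next-app
docker rm next-app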


You can also share this image with other developers using Docker Hub. They will get an exact copy of the build you created for the application.
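A minimal sketch of sharing the image (replace <your-username> with your actual Docker Hub username):

# Tag the local image with your Docker Hub namespace
docker tag next-project <your-username>/next-project:latest

# Log in and push the image to Docker Hub
docker login
docker push <your-username>/next-project:latest

# Another developer can then pull and run the exact same build
docker pull <your-username>/next-project:latest
docker run -p 8000:8000 <your-username>/next-project:latest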
