Sunday, 24 December 2017

Docker – Part 2


Docker Compose

Docker Compose is going to become your best friend. Do you remember in the previous post, we talked about running multiple containers in a manual fashion? Well, it’s time to say goodbye to that. In this post, we will look at a friendlier approach to running multiple containers.

What is Docker Compose?

So, what is Docker Compose? In the previous post, we manually pulled and started containers running WordPress and MySQL. We were able to link both containers together and test the result in a web browser. In short, we executed a number of commands followed by some magic. In this post, we will take that manual magic and put it in a YAML file. Basically, we will write a short script which will pull, start and manage all containers from a single point of contact. Writing a YAML file which does all of this handling for us is exactly what Docker Compose is about. Let’s have a look at an example.

WordPress, MySQL and Docker Compose

[Diagram: converting the manual WordPress and MySQL container setup into a single docker-compose.yml file]
As shown in the diagram above, we will take our existing solution and convert the entire thing to run from a YAML file. This means we will write a simple and short script which mimics the same commands we ran in the previous post.
Let’s have a quick look at the commands which we executed in the last post, to remind us of the magic we have to write on the command line.
docker pull mysql
docker pull wordpress
docker run --name qas-sql -e MYSQL_ROOT_PASSWORD=password -d mysql:latest
docker run --name qas-wordpress --link qas-sql:mysql -p 8080:80 -d wordpress
As you can see, we had to manually pull the images and then run them with parameters, forcing them to talk to one another.

Let’s YAMLify this

There is, however, a better way: we can convert all of the above manual commands into a YAML file, as shown below:
File Name: docker-compose.yml
version: '2'

services:
  mysql:
    image: mysql
    environment:
      MYSQL_ROOT_PASSWORD: password

  wordpress:
    image: wordpress
    ports:
      - "8080:80"
    environment:
      WORDPRESS_DB_PASSWORD: password
Looking closely at the file, you can see that we define a number of things. Firstly, we define the version. This is the version of the Compose file format that we will use, not the version of YAML itself. The benefit of using version 2 is that it automatically connects all the services (containers) outlined in the YAML file, so they can talk to one another without manual linking. It is also important to note that the indentation, case sensitivity and keywords used in the YAML file need to be correct. Extra spaces or incorrect keywords can easily result in errors when trying to execute the file.
We define containers as services. In this case we are using two services, i.e. WordPress and MySQL. Each service controls its own variables, which is great since we can configure a given service in isolation. Also, notice that the link is not written anywhere in the YAML file. This is because the link is implicit: version 2 of the Compose file format puts the services on a shared network, so WordPress can reach MySQL by its service name.
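If you prefer to be explicit about which database host WordPress should talk to, the official wordpress image also reads a WORDPRESS_DB_HOST environment variable. As an optional sketch, the wordpress service block could look like this (the value is simply the mysql service name defined above):
  wordpress:
    image: wordpress
    ports:
      - "8080:80"
    environment:
      WORDPRESS_DB_HOST: mysql          # reachable by service name on the Compose network
      WORDPRESS_DB_PASSWORD: password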

Docker Compose Commands

Now that we have a YAML file ready, let’s have a look at some of the commands which help us manage the services it defines.
docker-compose ps
docker-compose up
docker-compose stop
docker-compose start
docker-compose restart
The commands above are a little different from what we have seen to date. These commands are used to manage a Docker Compose project rather than individual containers.
DOCKER-COMPOSE PS
When working with a YAML file, one of the first commands to reach for is docker-compose ps. This lists the containers which are currently running as part of the Compose file.
DOCKER-COMPOSE UP
You have the option of starting the containers in your YAML file with either ‘up’ or ‘start’. ‘up’ will download any images defined in your YAML file which have not already been downloaded. It will then create and start the services.
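A common variation is to run everything in the background by adding the -d (detached) flag, which is part of the standard docker-compose command line:
docker-compose up -d   # pull images if needed, then create and start all services in the background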
DOCKER-COMPOSE STOP
This is a very self-explanatory command; it is used to stop any services running from a Compose file.
DOCKER-COMPOSE START
The ‘start’ command is similar to ‘up’. When you ‘start’ a Compose file, each service listed is started as a container. However, ‘start’ will not download any images if they don’t exist; it only starts containers which have already been created.
DOCKER-COMPOSE RESTART
The restart command is used to restart all the services in the YAML file. This is a useful command as it effectively combines the stop and start commands.
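Putting the commands together, a typical session might look like this (a sketch, assuming the docker-compose.yml above sits in the current directory):
docker-compose up -d      # first run: pull, create and start the services
docker-compose ps         # confirm that both services are up
docker-compose stop       # stop the containers without removing them
docker-compose start      # bring the existing containers back up
docker-compose restart    # stop and start in one go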

Docker Compose Conclusion

In this post, we have finally looked at how we can use Docker Compose to manage multiple containers from a single source. It should be clear how powerful the Docker Compose approach is and the benefits it can bring.

Multiple Docker Containers

Linking Docker containers allows a user to get one container to talk to another. This concept of linking is important as it allows for easier management of containers. Before diving into the world of container linking, it would be best to quickly read up on my blog post about containers in general.

Linking Docker Containers, Why?

We all know what a container is by now; if not, then check out my previous post on containers. Moving on, why is it important that we are able to link multiple containers together? Well, as discussed, a container packages up a specific piece of software along with everything it needs, such as an OS layer, applications and so on. Containers can hold servers, applications and more. Do you think it would be a good idea to bundle a backend server and a front end application into a single container?
No. Why would you want to couple two types of application into a single container? Decoupling them means that you can hook up a different backend server (provided by one container) to a different front end application (provided by another). The beauty of keeping each type of application in its own container is that you can mix and match which containers you hook together.

WordPress and MySQL

Let’s have a look at a real example where we link a WordPress container to a MySql container:
[Diagram: a WordPress container linked to a MySQL container]
In the image above, we are hooking up a WordPress container to a MySQL container. They both have their own variables and are also able to talk to each other. For instance, let’s follow the instruction set below:
docker pull mysql
docker pull wordpress
docker run --name qas-sql -e MYSQL_ROOT_PASSWORD=password -d mysql:latest
docker run --name qas-wordpress --link qas-sql:mysql -p 8080:80 -d wordpress
In the above commands, we first pull both images onto our local drive. We then run the MySQL image as a container and set some additional properties, such as the database password and the tag (version) of the image. This gives us control over the MySQL server in isolation.
We then run the WordPress image as a container and link it to the already running MySQL container. Without the database server, the WordPress container would not work. We also map port 8080 on the host to port 80 of the WordPress site.
If you now navigate to http://localhost:8080, you should be taken to your running instance of WordPress.
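If the page does not load, a couple of standard Docker commands help confirm that both containers came up (using the container names from the commands above):
docker ps                  # qas-sql and qas-wordpress should both be listed as running
docker logs qas-wordpress  # print the WordPress container output to look for errors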

What Have We Learned?

This blog post is really an extension of the Containers post. In this post, we discussed the topic of linking Docker containers. More importantly, we discussed the benefits of decoupling containers.
Well, what do you think?

Docker Container

A Docker Container is an encapsulated solution for running a machine with applications and software. It internalizes all dependencies, allowing anyone to run a container on any machine, any cloud and so on, which makes the whole process much easier.
First of all, consider quickly going through ‘Installing Docker in Windows’ before continuing with this post.

Docker Container

Docker has many commands which can be used to manage containers. Some of these are:
docker --help
docker search <image>
docker pull <image>
docker images
docker run <image>
docker ps -a
docker rm <containerID>
docker rmi <image>
DOCKER --HELP
‘docker --help’ is a great way of seeing a list of all the available options and commands.
DOCKER SEARCH
The ‘search’ command can be used to locate an image in the Docker Hub. This can be useful when trying to find a specific image to either pull or run.
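For example, searching the Docker Hub for Ubuntu images (any image name can be used here):
docker search ubuntu   # lists matching images, along with stars and whether they are official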

Docker Image Management

DOCKER PULL
‘pull’ is used to create an exact copy of an image from the Docker Hub on your local machine. In other words, ‘docker pull’ copies the image to your hard drive. This gives you the option of running the image on the machine as a container.
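For example, pulling an image with a specific tag instead of the default ‘latest’ (ubuntu is just used as an illustration):
docker pull ubuntu          # pulls the ubuntu image with the default ‘latest’ tag
docker pull ubuntu:16.04    # pulls a specific tagged version of the same image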
DOCKER IMAGES
After an image has been pulled, it is possible to see a list of all the local images. ‘docker images’ can be used to see a list of all the images currently on a machine.
DOCKER RUN
‘docker run’ can be supplied with the name of an image. This effectively tries to run the image as a container on a Docker instance.
You can supply more parameters with the ‘docker run’ command, for instance:
docker run -it ubuntu bash
This will run an Ubuntu image as a container and give you control of a terminal inside Ubuntu.
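As a further sketch (the nginx image and the container name my-web are simply illustrative choices), ‘docker run’ can also start a container in the background and publish a port:
docker run -d -p 8080:80 --name my-web nginx   # detached, host port 8080 mapped to container port 80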
DOCKER PS -A
Have you ever wondered how you can see a list of all the containers on your machine? Well, try typing ‘docker ps -a’; this will return all of the containers on your machine, including stopped ones.
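For comparison, running the command without the flag narrows the list down:
docker ps        # running containers only
docker ps -a     # all containers, including stopped ones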
DOCKER RM 
It is possible to remove a container by supplying the container ID to ‘docker rm’. If the container is still running when you try to remove it, you will have to stop it first. You can stop and then remove a container with:
docker stop <containerID>
docker rm <containerID>
DOCKER RMI
For a given image, if there are no container instances using it, you can remove the image by typing ‘docker rmi <image>’. This will remove the image from your machine.
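As a quick recap, a full clean-up of a container and its image could look like this (the ID and image name are placeholders):
docker stop <containerID>   # stop the running container
docker rm <containerID>     # remove the stopped container
docker rmi <image>          # remove the image once no containers use it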

Conclusion

Docker has a number of commands. These commands can be used to search for an image, to pull the image and to ultimately deploy an image. This gives a Docker user the control to manage an image and to turn it into a container.

Installing Docker

Did you know that you can install Docker on Windows? Well, let’s have a look.
Before you install Docker on Windows, you may want to check out my previous post, an introduction to Docker:
Introduction to Docker – Docker for Testing

Install Docker in Windows

Before you install Docker, let’s go through the requirements. Firstly, you must own a legal copy of Windows 10 Professional or Enterprise, 64-bit. This will allow you to install Docker for Windows from the MSI installer. However, if you don’t fully meet the requirements above, you might be able to get away with installing the Toolbox version instead.
DOCKER TOOLBOX
Navigate to www.docker.com and follow the links to download the Docker Toolbox. Alternatively, you can access the download directly from the link below:
https://download.docker.com/win/stable/DockerToolbox.exe
Follow the instructions below to complete the installation.
01 – CLICK NEXT WHEN PROMPTED TO START THE SETUP WIZARD
02 – SELECT YOUR INSTALLATION DESTINATION AND SELECT NEXT
03 – SELECT ALL COMPONENTS (EXCEPT GIT) AND SELECT NEXT
04 – SELECT ANY ADDITIONAL TASKS AND SELECT NEXT
05 – CONFIRM ALL YOUR SELECTIONS AND SELECT INSTALL
Once the installation is complete, you will have both Kitematic and the Docker Terminal installed. Run the Docker Terminal and wait until it has finished starting up.

Docker Hello World Example

Let’s run through a quick Docker ‘hello world’ example to ensure your installation is complete. In your terminal, type in the following command:
docker run hello-world
This should pull the hello-world image from the Docker Hub and print a message confirming that Docker is working.
That message confirms that you have successfully installed Docker on your machine: you were able to pull an image from the Docker Hub, deploy it and run it.
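If you want a couple of extra sanity checks, these standard commands should also work from the same terminal:
docker --version   # prints the installed Docker client version
docker images      # the hello-world image should now appear in the local list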
Congratulations, you have now installed Docker on Windows and run a hello world example.

Introducing Docker

To Docker or not to Docker, that is the question. Docker is an open source project but you already know that, right? It is used to deploy containers which can contain software and applications. I assume you also know that? The biggest question here is why you should use it? Or better yet, what value can it bring to testing?

Docker, what is it?

Docker is a tool which allows you to deploy containers. A container can hold a version of an OS, an application and its supporting software. Let’s take a second to understand why this is even important. In other words, so what if Docker does this? Just what problem does Docker solve?
Have you ever come across the scenario below:
Developer: Hey, run the code below.
QA: It does not run on my machine :(
Developer: Well, it works on mine!
The above scenario can be caused by external dependencies, libraries, humans and so on. Containers, on the other hand, actually contain everything needed to run them. This means that as long as both the Developer and the QA have an instance of Docker running, they should both be able to run the container.
Let’s examine the structure of a Docker container in more detail by looking at the image below:
[Diagram: virtual machines compared with Docker containers]
The image above describes the difference between a VM and a container. A VM contains the full OS, applications, software and so on. However, it is not always possible to share VMs and have them run reliably on more than one machine, for the reasons described above. A Docker container, however, runs on the Docker Engine, which makes it far more likely that the container behaves the same way everywhere.
With all of this said, why should you bother learning and running Docker containers, especially for test?

Why should you use it?

Docker for testing, here are the advantages:
* It’s free
* Environment
The free part makes sense, but what’s the deal with the ‘Environment’?
Let’s take a step back for a moment. When running tests against an environment, you are subject to the stability of that environment. This means that if your environment is unstable, your tests will fail, and that is clearly not the fault of your tests. What if you could control the environments? What if you could ensure your environments are more stable?
With Docker, you have that option. Using Docker, you will be able to run your environments in one container with your tests running in another. Learning to use and run containers will give you more control over your testing environments.
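As a rough sketch of that idea (the image names app-under-test and test-runner are hypothetical placeholders, not real images), a Compose file could wire an application container and a test container together:
version: '2'

services:
  app:
    image: app-under-test         # hypothetical image of the system being tested
    ports:
      - "8080:80"

  tests:
    image: test-runner            # hypothetical image containing the test suite
    environment:
      TARGET_URL: http://app:80   # the app service is reachable by its service name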
Source: http://www.thetestroom.com/category/development-operations/docker/