One step further into the Docker world, the biggest question I have to ask myself is “where do I want to go?“ What do I know about Docker? How could I explain it in a single sentence?
From what I have learned so far, Docker gives me a box (a container) in which I can run something in an isolated environment. When I do not need it, I can throw it away. And when I need it again, I can rebuild it quickly.
Given that I have a box, and I can open it, what will I put inside? What should I do with the box?
As a developer, the first thing I want to try is to put some code inside the box and be able to run it. Yes! That’s right! I will put ASP.NET Core web application code into the box, and run it.
In this post, I will go through everything I have learned and collected so far, and put it into practice.
Hey, Boss! How Do I Get The Source, Docker Asked?
Calm down! Calm down, Docker! Let me show you where to get the stuff you need.
External – Mounted Volume
Docker can refer to the source from a system external to itself. In a development environment, that is our host system: the folder where we put our code.
I will use an image specialized for .NET Core. It should be easy: head over to Docker Hub, find Microsoft, and take the microsoft/aspnetcore image. Time to type some fun Docker commands in PowerShell (I have started to love the CLI). Because I am going to use it for a while, I will pull the aspnetcore image from the registry.
Type docker pull microsoft/aspnetcore and hit Enter. Boom! You have the image ready to serve you.
As a good habit, run docker images and docker inspect to look into the image, just to make sure everything is in place.
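That first session might look like this (a sketch; the image name is the one from Docker Hub mentioned above):

```shell
# Pull the ASP.NET Core image from Docker Hub
docker pull microsoft/aspnetcore

# Verify the image arrived and look at its metadata
docker images
docker inspect microsoft/aspnetcore
```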
To connect the two, Docker introduces the Data Volume concept. You can read the full documentation on the Docker website; however, I would like to start with the basic approach.
Think of a Data Volume as a means to define a mapping between a location inside the container and a location on the host machine.
We have
A Docker container that can host and run a website.
A folder where we develop our source code.
A Data Volume allows a running container to access that folder. When the container finishes, the folder is still intact; all the changes the container made persist there. Let’s try it out.
I want to run a container that will start my website. Breaking it down into small steps, when running, the container must
Start the container instance in background mode
Connect to the folder where my code is hosted (local drive)
Start a Kestrel server to run my application
While experimenting, it turned out I need the aspnetcore-build image instead, because I need the .NET Core SDK to build and run my application. I jump over to PowerShell and run my very first command with a smile. Boom! It does not work.
Let’s make it work first, and then I will explain every single part of the command.
The first thing to do when investigating an issue is to look into it; the docker inspect command is your friend.
Something is wrong with the volume mapping.
After some investigation, here is the correct syntax I should use.
In plain language, the above command says:
I want to run a container from the microsoft/aspnetcore-build image and name it coconut. Once started, create a volume mapping between the current host folder (via the ${pwd} keyword) and the app folder in the container, and a port mapping between port 5000 (host machine) and port 80 (container). Finally, run the command dotnet run under the working folder app.
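Reconstructed from that description, the command looks roughly like this (a sketch, run from the project folder in PowerShell; coconut is just the container name I picked):

```shell
# -d: background mode; -v: mount the current folder into /app;
# -w: set the working folder; -p: map host port 5000 to container port 80
docker run -d --name coconut -v ${pwd}:/app -w /app -p 5000:80 microsoft/aspnetcore-build dotnet run
```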
Just like before, I hit another error, saying that I should share the drive with Docker. The solution is described here.
I must admit that things are not as easy as they seem. After struggling for a while, I finally managed to get it working.
Pay attention to
bash -c “dotnet restore && dotnet run”
The command instructs the container to restore NuGet packages and then run the application.
And
-p 5000:80
Once the application starts in the container, the website is exposed via port 80. Note: if you start the application directly on your host machine, it is exposed via port 5000. That is also why I chose port 5000 on the host machine.
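Putting the volume mapping, the port mapping, and the bash -c command together, the full working command might look like this (a sketch under the same assumptions as before: run from the project folder in PowerShell, container named coconut):

```shell
# Mount the current folder, map host port 5000 to container port 80,
# then restore NuGet packages and run the app inside the container
docker run -d --name coconut -v ${pwd}:/app -w /app -p 5000:80 microsoft/aspnetcore-build bash -c "dotnet restore && dotnet run"
```

After that, the site should be reachable at http://localhost:5000 from the host browser.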
Initially, I wanted to cover both the mounted data volume and the Dockerfile in one post. However, the more I get my hands dirty, the more I learn. It is better to recap what I have reaped.
Recap
To run a container that starts an ASP.NET Core application, use docker run with the mappings below.
The port mapping (-p 5000:80) is very important. It allows you to access your website from the host browser. To know the port that dotnet exposes after running the website, you should look at the output in the container console.
To map your current working directory to a folder in the container, use -v ${pwd}:/app. In PowerShell, ${pwd} resolves to the current directory.
Because I made many mistakes, I learned to use the docker stop/start/rm commands. Now they are mine 🙂
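For reference, with a container named coconut (my name from earlier), the lifecycle commands look like:

```shell
docker stop coconut    # stop the running container
docker start coconut   # start it again later
docker rm coconut      # remove a stopped container (add -f to force-remove a running one)
```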
Evil is everywhere. You should use docker inspect to look inside the container.
One of the biggest achievements is the feeling of having gotten started, of getting into it. I can now use many commands at my disposal.
What’s Next?
Finish what I have started – Use Dockerfile to build and ship my applications.
Now that I have finished the flow of pushing code from my laptop up to the cloud, and have started to play around with Docker and a bit of ASP.NET Core, I have many dots. The next challenge, as always, is to connect them and make sense of all the pieces. Let’s focus a bit more on Docker and create a connection between Docker and ASP.NET Core.
There are many materials about Docker. It is easy to start playing around with it, as I wrote here. Pretty basic stuff. It is easy to find out how to run a command, and the Docker website has extensive documentation that tells you everything you need to know.
But, to me, it is useless if I cannot make sense of “the why”. After playing around with commands and watching some courses (mostly on Pluralsight), I have to find the answers to these questions:
Why do I need it?
What problems does it solve?
I try to create an image of it in my mind. It is such an important step that if I fail, I cannot continue. After all, how can you run if you do not know why or where to run?
So let’s give it a try before moving deeper into the implementation details.
Why Docker? Container Approach
I would like to call it the “Container Approach“. Below are some benefits that, in my opinion, give me good reasons to invest.
Improve Cooperation
If we develop small systems that are easy to set up and deploy, the need for containers might not be obvious. As developers, we can check out the code, hit F5, and we are good to go. If we want to deploy such a system to a QA machine or some central server, it is still a trivial task.
However, when systems are big, with many services and complex infrastructure requirements, things get complicated quickly and cost a lot of time. Take the example of a web application. Assume the system consists of:
Front end application: Built on top of AngularJS. It connects to a backend service via Web API endpoints.
Backend service: Built with ASP.NET Core. It supplies the endpoints that the front end consumes.
Database: A SQL-family server. It can be SQL Server, SQL Express, SQLite, or MySQL. Entity Framework serves as the data access layer (ORM).
The team consists of a front-end developer, a back-end developer, and a tester. Each might have to install the same infrastructure to run the application, given that we do not have a central deployment place yet. It means that:
The front-end developer has to install software and infrastructure that only the back-end developer requires.
The back-end developer has to install the things that only the front-end developer needs.
The tester has to install a development environment even though they do not know about coding. They might have to set up a local IIS.
When a new member joins the team, there is a lot of repetitive work.
What if
The front-end developer takes the API component and loads it on his machine without installing anything. He focuses on building the cool front end and connects to that API.
The tester takes the application as a whole and runs it with a single click, without installing anything.
With a proper setup, Docker can help.
Develop Faster, Test Faster
Docker gives us the minimum required infrastructure we need to build and test an application. For example, when I want to develop a system running on MongoDB, instead of downloading and installing MongoDB locally, I can get a prebuilt Docker image with MongoDB installed.
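As a sketch, getting a MongoDB instance this way takes two commands (mongo is the official image name on Docker Hub; dev-mongo is just a name I picked):

```shell
# Grab a prebuilt MongoDB image instead of installing MongoDB locally
docker pull mongo

# Run it in the background and expose the default MongoDB port
docker run -d --name dev-mongo -p 27017:27017 mongo
```

When the experiment is over, docker rm -f dev-mongo throws the whole database away.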
The ability to switch infrastructure gives developers a lot of flexibility.
Develop Componentized Thinking
This might not be true for others; however, it works for me. It shapes my thinking process. A system is composed of many small components, and each component might run in a container. This, in turn, forces me to think about what I should put in a single container.
What’s Next?
I have tried to make sense of Docker, to explain it to myself. I do not make any suggestion or judgment regarding its benefits. But with all of that in mind, I am ready to invest more in the journey.
Docker helps
Improve cooperation
Develop faster, test faster
Develop componentized thinking
The next step is, as usual, to try things out: connect the dots between Docker and ASP.NET Core. I intended this post to be about that; however, I changed direction as I wrote.
Given that you have to develop the full lifecycle of a web application, where will we start? Working in a company, where there are processes for developing and releasing products, the story might be different. But what if we want to develop our own pet project, a web application? How could we do that, and at what minimum cost?
Ask a web developer and they might come up with many things, many pieces. They are dots. The question is: how do we connect the dots?
As a developer, most of the time I play around with business code, solving business requirements with domain logic. In short, it is non-infrastructure code. When I started my learning journey this year, I wanted to play around with the infrastructure.
After being able to build a minimal ASP.NET Core web application, I learned from Scott Allen’s Developing with Azure course. It is a wonderful course, as always. I decided to try it on my own with the newest .NET tools I have at hand. Try-and-write is a good way of learning; it makes sure that I get the best value out of my time.
Note: This is a long post. I wrote it as I experimented, on the go.
Let’s explore and find answers to these questions:
How do you build a product on your laptop and deploy it to the cloud? How much does it cost? How do you connect the dots?
Overview
I will need these tools/components to complete the work:
VS 2017 Community Edition + .NET Core 1.1: To code the application. Not strictly required: with .NET Core, you can code in Notepad and it still works.
GitHub Account: I want to host my code on GitHub. Obviously, it is not a requirement, but I want to try it out. Alternatively, you can host code on Visual Studio Team Services.
Visual Studio Team Service (VSTS): To host code (if not hosted on GitHub), build and release.
Azure Account: To host the web application.
We could, however, deploy directly from VS 2017 or PowerShell to Azure, but I will not explore that option in this post because it rarely happens in real projects.
I want to set up this flow – I like to draw on paper (even if it does not look good 😛 )
Next, I want to make a build in VSTS that builds the code hosted in my GitHub account. But first, I want to introduce you to Visual Studio Team Services (VSTS).
Take a look at its pricing scheme. You will choose the Free model. With the free model, you have
That gives you pretty much everything you need to develop a pet project with your friends. Explore more when you have time. Here, I want to pay attention to the build capacity.
If your build and deploy take 2 minutes, you have 120 builds per month, or 4 builds per day. That should be enough for a personal project. I suggest you create your own account if you have not already.
I have created a “Production” project. I originally meant to host my production code there (or something like that), but now I will use it to take advantage of the build capacity, which means the project has everything except code.
Add a new build definition and choose the “ASP.NET Core (PREVIEW)” template. Once created, take a close look at the generated build definition.
At a minimum, the process definition has 6 steps:
Get sources
Restore
Build
Test
Publish
Publish Artifact
Get sources
To build, we have to tell the build engine where to get the source code. I want to build code hosted on GitHub. There are 3 steps involved:
Connect to GitHub using a proper authentication method. I used a Personal Access Token.
Specify the repository to get the code from; I used thaianhduc/coconut.
And finally specify the branch: master.
After setup, it should look like the above.
Because I do not have the Azure setup or unit tests yet, I will disable some steps. They can be enabled later when I need them.
For any step, you can click on the Control Options, and uncheck the “Enabled” checkbox.
The other steps (Restore, Build) use the default settings. They will just work.
Test the Build
Should I just sit back and enjoy the setup? No, time to test the integration.
Pretty simple with just a single click
A dialog shows up; simply click OK. We will not tweak any configuration values. Wait a few seconds and the build is triggered. You see a nice console output that tells you what is going on.
Not everything is as easy as it seems. I got an error message at the restore step:
error: Invalid input ‘d:\a\1\s\Code\Coconut.Web\Coconut.Web.csproj’. The file type was not recognized.
Error: C:\Program Files\dotnet\dotnet.exe failed with return code: 1
Dotnet command failed with non-zero exit code on the following projects : d:\a\1\s\Code\Coconut.Web\Coconut.Web.csproj
Let’s find out how to fix the problem. At this point, I depend on Google. We all depend on Google, don’t we? It turns out that many people have had this problem. The solution is given here on Stack Overflow. In short, I have to use the Hosted VS2017 agent (instead of just “Hosted”).
Azure Cloud
So far, I am halfway to the cloud. Playing around with Azure makes me a bit nervous because a credit card is involved. I hesitated to play with it because I did not understand it.
Not anymore! Let’s go ahead and play with it (I was empowered by the Azure course 😛 ).
All I need is a cheap App Service; maybe I can even get one for free. For each App Service, it is good practice to have at least 2 slots: staging and production.
Log in to the Azure Portal and start creating a new App Service. Choose the default Web App from the gallery.
Hit “Create” and configure it with the minimum requirements.
There are a couple of concepts I have to grasp when working with Azure. I will try to explain them in a way that I can understand. I hope the explanation helps you too.
First, take a look at the final result of my App Service
When you first visit the Azure Portal, it looks confusing. There are so many items on the portal, so many places you could click. There are plenty of documents out there showing you step by step how to create an App Service. I once read them and did not understand them. Of course, I could follow along step by step, but blindly.
There are 3 basic concepts we should understand: App Service, Resource Group, and App Service Plan.
Resource Group
Everything in Azure is a resource. Resources are organized into groups. By using groups, we can manage many resources easily.
A resource group is a logical concept; you do not know or care where or how the resources are deployed.
Imagine you are practicing developing a web application with Azure SQL. When you finish and do not want to keep the resources anymore, instead of going to each one and deleting it, you can simply delete the resource group in one click.
App Service
It is a placeholder where you deploy your services. Think of an App Service as your own computer, minus the maintenance. You do not have to install software. You do not have to do anything at all.
However, you have to define the configuration applied to the service; Azure needs that information to build a proper service instance. This is done in the first step: when you create a new App Service, Azure asks for a template, such as Web App, Web App + SQL Server, and so on. By choosing a template, you define the configuration applied to the service. The beauty is that you are done with just a single click.
Each app service is given a unique URL (mine is: http://coconuttree.azurewebsites.net)
App Service Plan or Pricing Tier
This manages the cost: the service plan tells you how much you will have to pay. You can choose the Free tier 🙂
So far, I have not had to pay for anything; it is all free. Click on the service URL to verify it exists. For now, it displays the default page generated by Azure.
I want to have 2 environments: staging and production. In Azure terms, they are called deployment slots. Azure makes it super easy to create them, but it seems that with the free service plan, it is not allowed. I think that makes sense: you cannot ask too much of a free service.
I will stick with the default deployment slot. The ultimate goal is to have a full cycle of development.
VSTS to Azure
At the conceptual level, deployment to Azure consists of 2 steps: package the build result and push the package to Azure. OK, what does that mean in the context of VSTS?
Go back to our VSTS build, and enable these 2 steps
By looking at each step configuration, we can guess what they do
Publish: publishes the web project (by running the publish command) to a location defined by $(build.artifactstagingdirectory). I do not know where that is; I just know there is such a thing.
Publish Artifact: takes the output from the Publish step and creates an artifact named “drop” with type “Server”. The output is a package that can be deployed somewhere else, Azure for example.
So far, I have just enabled them and tried to explain them based on my reasoning about what they should be doing.
To deploy the build to Azure, I will look at the Release tab. As its name suggests, its job is to release a build to a production server. Let’s check it out.
Create a release
Click to create a new definition; you can choose among many options. Right there at the top is what I need: Azure App Service Deployment.
The next step makes things very clear for me. The release takes the output from the build flow and deploys it to Azure.
Then you have to set up the connection with your Azure App Service. It is pretty straightforward.
So far so good. Let’s test things out.
Run the build; it looks OK, except…
When I took a look at the console (just out of curiosity), I found something interesting. It should be easy to fix: as the message suggests, I add a web.config file to the project.
The result looks promising. I do not understand the output log, but it seems to work as expected.
Will the release work? Let’s try it out.
Access the Release tab and kick off a new release. Here we go, a nice dialog. I will make some guesses.
It takes the input from the last build, obviously. So far so good. OK, kick it off.
Bingo! It works. It just works. One of the nice things is that you can see all the logs; you know exactly what is going on.
There are so many settings in VSTS that you can tweak. You can setup continuous integration (CI) with a checkbox.
At this point, I can claim victory. My mission is complete. It is a long post; time to summarize what I have accomplished.
Summary
First, let’s talk about the cost: it costs me nothing. I have accomplished these wonderful things with zero USD. 0 USD.
I code my application on my laptop with VS2017 Community Edition. It has all the features I need to build a web application.
I host my code on GitHub. If I want to make my code private, I can host it directly on VSTS.
I use VSTS to define my Continuous Integration (CI) and Continuous Delivery (CD). I really like the idea of separating the build and the release. It allows me to build many times to ensure the code does not break, while not having to release every time I commit code. Separation of Concerns is a powerful design concept.
I use Azure Cloud to create an App Service to host my application.
At each step, I can customize to fit my needs. All those wonderful features cost me nothing.
I am so happy to get the job done. I hope it gives you some good information to start somewhere.
The learning journey continues with the .NET Core stack, the latest framework from Microsoft. As a developer, I decided to check out ASP.NET Core.
The general path is to answer the question I asked (myself) in this post: “How do I take advantage of Docker?” .NET Core seems to be a good candidate to explore.
As a habit, I immediately headed to Pluralsight and checked out a course by the best author, Scott Allen: ASP.NET Core Fundamentals. Scott has a unique way of transferring complex stuff into your head. I highly recommend the course.
After watching the course, the biggest questions I asked were: “What do I get? How do I start from here?” Following the course step by step is too simple; I would not learn much that way, because I am experienced in coding, not a starter.
How do I get the best out of the course?
Overview
I will try to draw an image of the components that make up an ASP.NET Core application. Without an overall image, you will get lost in the details.
With a pencil, a piece of paper, and my terrible drawing skills, I drew a picture of my understanding of the ASP.NET Core pieces. These are the fundamentals for getting started with building web applications and exploring the power of ASP.NET Core.
Before explaining them, let’s take a look at the minimal code generated by VS2017.
Very neat, clean! There are 2 files: Program.cs and Startup.cs.
Startup
The Startup class is where developers build the application components. Take a look at my drawing; there are 3 major concepts:
ConfigurationBuilder
public Startup(IHostingEnvironment env)
{
    var builder = new ConfigurationBuilder()
        .SetBasePath(env.ContentRootPath)
        .AddJsonFile("appsettings.json"); // read settings from a JSON file
    var configuration = builder.Build();

    var appKey = configuration["appConfig_Hello"];
}
In the past, we had web.config, where we defined connection strings, appSettings, and so on.
ConfigurationBuilder lets developers define where and how to read configuration information. Usually it is stored in JSON files. In other words, the configuration builder lets developers wire up configuration data from static files.
ConfigureServices
For any application to run, we need services. This is the place where we set up dependency injection (DI):
// This method gets called by the runtime. Use this method to add services to the container.
// For more information on how to configure your application, visit https://go.microsoft.com/fwlink/?LinkID=398940
public void ConfigureServices(IServiceCollection services)
{
    services.AddSingleton<IMyService, MyService>(); // map the interface to a concrete implementation
}
Note: You should be familiar with DI in practice.
ApplicationBuilder
The heart of the framework is the mighty ApplicationBuilder. As its name implies, it is a tool for us to build the application. It does many things, but one of the most important is building the middleware stack.
// This method gets called by the runtime. Use this method to configure the HTTP request pipeline.
public void Configure(IApplicationBuilder app, IHostingEnvironment env, ILoggerFactory loggerFactory)
{
    loggerFactory.AddConsole();

    if (env.IsDevelopment())
    {
        app.UseDeveloperExceptionPage();
    }

    app.Run(async (context) =>
    {
        await context.Response.WriteAsync("Hello World!");
    });
}
How many middlewares do we have in the above code?
Answer: 2.
UseDeveloperExceptionPage: If there is an exception, display the exception page that developers can understand 🙂
Run: The middleware that handles all requests. It always returns the “Hello World!” text.
Middlewares are stacked, which means that order matters.
Some important behaviors of middlewares
Order matters.
A middleware can terminate the request, in which case the next one is not called.
There are many built-in middlewares, delivered via NuGet packages.
Most of them follow a naming convention. Usually, they start with UseXXX.
MVC Middleware
Let’s tweak the Run middleware a bit
// This method gets called by the runtime. Use this method to configure the HTTP request pipeline.
public void Configure(IApplicationBuilder app, IHostingEnvironment env, ILoggerFactory loggerFactory)
{
    loggerFactory.AddConsole();

    if (env.IsDevelopment())
    {
        app.UseDeveloperExceptionPage();
    }

    app.Run(async (context) =>
    {
        await context.Response.WriteAsync($"You are requesting {context.Request.Path}");
    });
}
In theory, we could build our web application by placing the logic inside the Run middleware; we have full power over the HttpContext. If we wanted to build our own MVC framework, we could do so by following these high-level steps:
Parse the request to extract information about Controller, Action, Parameters, …
Dispatch the call to proper controller/action.
Build HTML result from the controller/action result.
Or we can simply turn it on with
public void Configure(IApplicationBuilder app, IHostingEnvironment env, ILoggerFactory loggerFactory)
{
    loggerFactory.AddConsole();

    if (env.IsDevelopment())
    {
        app.UseDeveloperExceptionPage();
    }

    app.UseMvc();

    //app.Run(async (context) =>
    //{
    //    await context.Response.WriteAsync($"You are requesting {context.Request.Path}");
    //});
}
All the magic is in the line app.UseMvc() (from the Microsoft.AspNetCore.Mvc NuGet package). We have to remove the Run middleware call; otherwise, it will override the result produced by the MVC framework. Order matters, remember?
Recap
I have tried to draw a picture that I can understand and that should be easy to explain. From there, I explained my understanding of the components and concepts that build up an ASP.NET Core application. This, however, is not a “how to” post. Once you know what you want to do, you can search for the “how to” quite easily; there are thousands of valuable resources out there.
Designed this way, ASP.NET Core is a powerful framework to play with. Developers have so much power over what they put into the application; we know exactly what is in the pipeline. Wrap your head around the new concepts and take advantage of them.
I use this post as a reference for my understanding; if asked, I can refer back to it. If you are reading this post, please give me feedback if I have misunderstood something.
For anyone learning anything new, I suggest you draw a simple picture of your understanding. If you cannot, it is a sign that you have not understood it.
Having started my learning journey with Docker, I took a course on Pluralsight: Nigel Poulton’s Docker Deep Dive. It is a wonderful course. It helped me understand the overall design and principles of Docker, and it armed me with many hands-on commands that I can use immediately. And I did: I practiced them while watching the course.
Suddenly, I asked myself: hey, wait a minute! You will soon forget them all, because you will not use them in the near future, or at least you do not know when you will. I consoled myself that there is documentation on the official Docker website. As human beings, we all have many good reasons to convince ourselves not to do something. I realized that I was setting a trap for myself.
In the information age, the problem is not a lack of information; rather, it is how to get started. Take Docker as an example: head over to the documentation site, then what will you do first? Many people know where to start, but at the same time, a vast majority do not.
Welcome to the fundamentals of Docker!
By writing them out here, I can later know what to look for in detail. Instead of reading all the documentation, I simply look up the details of specific commands, which is a much smaller amount.
Docker Commands
A list of the basic commands I need to know to work with Docker:
docker version
See client and server (daemon) version information
docker info
Display system-wide information about the Docker installation (containers, images, storage driver)
docker run
Run a docker image
docker pull
Pull a docker image from docker hub to local environment
docker push
Push a local docker image to the docker hub
docker build
Build a Docker image from a Dockerfile.
docker images
Display all images
docker history {image name/id}
See history of an image
docker ps
List running containers (add -a to include stopped ones)
docker inspect
Inspect container. Very useful to dig deeper into a container
docker port {container name}
Display the port mapping between container and host
docker rmi {image id}
Remove an image
docker start {container id}
Start a container
docker stop {container id}
Stop a container
docker rm {container id}
Remove a container
docker attach {container id}
Attach (or interact) with a container
docker logs -f {container id}
Follow the log output of a running container
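To make the list concrete, a typical session chaining these commands might look like this (hello-world is the tiny test image from Docker Hub; test is an arbitrary container name):

```shell
docker pull hello-world               # fetch an image from the registry
docker run --name test hello-world    # run a container from it
docker ps -a                          # list containers, including stopped ones
docker logs test                      # read what the container printed
docker rm test                        # remove the container
docker rmi hello-world                # remove the image
```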
Docker Networking
[Things I learned through the course.]
docker0 bridge (Ethernet switch)
On host machine (Linux), install: apt-get install bridge-utils
The command brctl show docker0 lists the interfaces attached to the bridge.
icc (Inter Container Communication) and iptables: both are true by default
Next?
Learn how to take advantage of Docker in modern software development.
What you do with what you know is more important than what you know