I was able to set up a VM in a vNet and RDP to it. That is the simplest scenario for Azure IaaS. A virtual network provides isolation: related resources are grouped together and can talk to each other. But that alone is usually not how things work in the real world.
There are many services living in isolated environments, exposing endpoints that other services can communicate with. Note: I am not discussing microservices here. Whatever the term, each service sits inside a virtual machine in a virtual network. What would it take to make them talk to each other?
Following the steps from the previous post, I created another setup in Central US.
Network Peering
There are 2 virtual networks in different locations, with different address spaces.
To connect 2 virtual networks, there is Network Peering. From each virtual network, create a peering to the other.
A peering can:
Peer 2 virtual networks (of course there must be 2) in different regions.
Peer virtual networks in different subscriptions. It is possible to select a different subscription when creating a peering.
Creating a peering is pretty simple.
The above creates a peering from ps-az300-vnet to ps-vnet. To finish the job, create another one in the opposite direction, from ps-vnet to ps-az300-vnet.
The peering is ready. Let's see if these virtual machines can talk to each other: RDP to each machine and test a connection to the other. This picture makes my day.
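Beyond eyeballing it in the portal, a tiny C# console check run from inside one VM can probe the other over the peering. This is only a sketch: the address 10.2.0.4 stands in for the peer VM's private IP, and port 3389 is used because the NSG already allows RDP.

using System;
using System.Net.Sockets;

class PeeringCheck
{
    static void Main()
    {
        // 10.2.0.4 is a placeholder; replace it with the peer VM's private IP.
        using (var client = new TcpClient())
        {
            // 3389 = RDP, allowed by the NSG rule; throws SocketException if unreachable.
            client.Connect("10.2.0.4", 3389);
            Console.WriteLine($"Connected: {client.Connected}");
        }
    }
}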
So far, I am able to:
Create a virtual machine with its network setup. In more abstract terms, create an isolated environment which allows me to deploy whatever I want.
Connect the 2 isolated environments via an Azure peering resource.
Gateway, Hub-spoke topology
Another option is a gateway with a hub-spoke topology. These are somewhat advanced topics that I do not really need to grasp at the moment. There are step-by-step guides on the MS Docs site.
Azure has been around for a while. It is huge. I once said that I would study Azure. Then I started. Lost. There are so many materials out there: the wonderful MS Docs site, super Pluralsight courses, and many personal blogs. "How do I start? Where do I start?" I asked.
I took the chance to read around and tried to capture some Azure concepts, especially the mindset. Without the correct mindset, everything is a mess, and whatever I read would only confuse me more.
Almost everything in Azure is a resource. To manage resources there is the Resource Manager. A resource can be created and managed using templates, so there are Resource Templates. As a developer, that part makes sense to me.
The design is modular and component-based. At a high level, it follows the software design principles we already know.
A virtual machine deployed to the cloud, for example, has its connectivity controlled by a Network Interface (NIC), a separate resource.
Let's say we need to deploy a virtual machine in Azure, and we should be able to remote (RDP) into it. How many resources do we need? How does it look? Let's find out.
All my resources start with ps-az300. The rest were auto-generated by Azure, or are my mistakes while experimenting.
Resource group: rg-az300
Virtual network (vNet): ps-az300-vnet
Virtual machine: ps-az300
Network interface (NIC): ps-az300-nic
Network security group (NSG): ps-az300-nsg
Public IP address (PIP): ps-az300-pip
Resource Group
Resource groups are logical containers for everything. All resources used to set up our example are grouped in a resource group. Once the experiment is completed, deleting the resource group will wipe out all its resources.
Virtual Network
A virtual network (vNet) provides an isolated environment where resources inside the vNet can talk to each other. It increases security.
Network Security Group (NSG)
It is like the firewall in Windows: it defines the inbound and outbound rules. Besides the default rules generated by Azure, the inbound rule "RDP_3389" is created to allow remote desktop connections.
Network Interface (NIC)
It acts as an intermediary between a resource and the network. A virtual machine should not define its firewall directly; instead, a network interface is attached to it.
A network interface belongs to a vNet, has an NSG, and attaches to a virtual machine. It might have a public IP address, defined by a public IP address resource (ps-az300-pip).
This network interface allows the VM (ps-az300) to communicate with other resources or over the internet. What it can communicate with depends on the NSG settings.
Its public IP address is configured under Settings -> IP configurations.
The interesting thing here is the public IP address. One can create a PIP easily; just remember, in the Assignment section, to choose Static.
Public IP Address (PIP)
As seen in the NIC section above, the public IP address for the NIC is 23.101.16.27, supplied by Azure. With dynamic assignment, Azure can hand the NIC a different address after the VM is deallocated; that is why I chose static assignment.
Virtual Machine (VM)
Just go through the Azure wizard and choose the settings: network security group, network interface, virtual network. Creating virtual machines in Azure is a whole topic of its own; I hope I can write something about it soon.
Since I am learning virtual networks, the most interesting part of the virtual machine setup is the Networking section.
The virtual machine is the actual resource that hosts our business services, if we want to deploy, say, a website or an internal web service. This VM will:
Use the network interface ps-az300-nic to communicate with the outside world
Run inside the virtual network with the default subnet
Have a public IP address 23.101.16.27 and a private IP (10.1.0.13) inside its virtual network
Follow the inbound/outbound rules from the network security group ps-az300-nsg
With that setup, I can click on the Connect button and download the RDP file.
There are many things in the process that I do not understand yet, and many concepts in the images I pasted here. That's OK; things will make more sense over time.
The next challenge is to have 2 virtual machines in different virtual networks communicate with each other.
Update: I stopped for a while, and Azure has since changed tremendously. I am publishing this post so that I at least have a reference later. The content was never finished 🙁
When something happens, execute some logic and output the result (if there is one) somewhere. At the highest abstraction, Azure Functions is as simple as that. But that simplicity captures almost every scenario in real life.
I was about to write up some Azure Functions concepts. However, MS Docs is so good that you can go ahead and read it there. I posted the link below to save you some typing, and for my own reference next time.
When a result is placed into storage, it triggers another action, and then another; the chain keeps going, and it stops when the business requirement is met. This is the power of Azure Functions: a flexible tool to build a complex business process. The limit is our design skills. Yes, I know there are technical limitations, but with good design and architectural skills, you can build almost whatever you want, as sketched below.
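To illustrate such a chain, here is a minimal sketch in the same Functions v1 style as the generated code later in this post. The container name release-notes, the queue notifications, and the scenario itself are my assumptions, not from an actual project.

using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;

public static class ChainedFunctions
{
    // When a blob lands in "release-notes", push a message onto the
    // "notifications" queue; a queue-triggered function could then pick
    // it up and continue the chain.
    [FunctionName("OnReleaseNoteUploaded")]
    public static void Run(
        [BlobTrigger("release-notes/{name}")] string releaseNote,
        [Queue("notifications")] out string message,
        TraceWriter log)
    {
        log.Info($"New release note: {releaseNote.Length} characters");
        message = releaseNote;
    }
}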
I am a developer. I need to go deeper. And most importantly, I have to answer this question:
What can I do with such a powerful tool?
After reading around the internet and watching courses on Pluralsight, I headed over to the Azure portal and started my first function. Most of what I write here is not new; many have written about it before. The point is my own learning: I want to document what I learn in my own words.
Let's build a simple team cooperation process using the available tools. Say my team does the following whenever we do a release:
The biggest challenge, as a developer, is to get it done right. The infrastructure has plenty of built-in support: there is a variety of input and output bindings, and each binding has a set of convenient conversions as well as customization points at your disposal.
Azure Function in Azure Portal
Azure Function from Visual Studio
Environment: Visual Studio 2017 Enterprise, .NET Core 2.0
After creating a new Function project, here is what VS gives me:
using System.IO;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.AspNetCore.Http;
using Microsoft.Azure.WebJobs.Host;
using Newtonsoft.Json;

namespace Aduze.Functions
{
    public static class HttpTriggerFunctions
    {
        [FunctionName("HttpTriggerDemo")]
        public static IActionResult Run([HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)]HttpRequest req, TraceWriter log)
        {
            log.Info("C# HTTP trigger function processed a request.");

            string name = req.Query["name"];

            string requestBody = new StreamReader(req.Body).ReadToEnd();
            dynamic data = JsonConvert.DeserializeObject(requestBody);
            name = name ?? data?.name;

            return name != null
                ? (ActionResult)new OkObjectResult($"Hello, {name}")
                : new BadRequestObjectResult("Please pass a name on the query string or in the request body");
        }
    }
}
There are notions of AspNetCore and WebJobs here. The HttpTrigger function accepts an HttpRequest and returns an IActionResult; whoever has coded ASP.NET MVC knows what they are.
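To see it respond, run it locally or deploy it and issue a GET such as https://<your-app>.azurewebsites.net/api/HttpTriggerDemo?name=Thai&code=<function-key> (the host and key placeholders are mine). It should answer Hello, Thai; without a name in the query string or body, it returns the 400 message instead.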
Challenge for Architects
How should you architect a system with all the power you have from Azure?
At its simplest, a WebJob is a background service (think of a Windows Service) running alongside a website (web application). The idea, at the abstraction level, is that it handles the long-running jobs for the web application, freeing the web application to serve as many requests as possible.
Staying at that abstraction level, think of a scenario where there is a book management website. A user has many books. One day, he wants to download all his books in a zip file. Assume the action takes time, a matter of minutes, so you do not want your user to wait. At a high level, the design has these steps:
Record a "download all books" request.
Trigger a background job to handle the request: Read all books and create a zip file.
Email the user with a link to download the zip file.
#1 and #3 are handled by the web application. #2 is a good candidate for a WebJob. There are other ways to implement #2, but in the context of this post it is a WebJob; a sketch of it follows.
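Purely to illustrate where #2 could live, here is a minimal sketch of a queue-triggered WebJob function. The queue name zip-requests and the message shape (a user id) are my assumptions.

using System.IO;
using Microsoft.Azure.WebJobs;

public class ZipFunctions
{
    // Triggered whenever the web application records a "download all books"
    // request by dropping a message onto the queue (step #1).
    public static void ProcessZipRequest(
        [QueueTrigger("zip-requests")] string userId,
        TextWriter log)
    {
        log.WriteLine($"Creating the zip file for user {userId}");
        // Read all the user's books, build the zip file, then trigger the
        // email with the download link (step #3).
    }
}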
That is the overview, the abstract level. It looks simple and easy to understand. But, hmm, everything has a but: the devil is in the details. Let's get our hands dirty in the code.
Context
Everything has its own context. Here is mine:
Given an ASP.NET Core 2.0 website running in Azure App Service, with configuration settings and connection strings configured via the portal dashboard,
I want to be able to build a WebJob that:
Can consume those settings, so I can manage application settings in one place and change them at will without redeployment.
Takes advantage of Dependency Injection from Microsoft.Extensions (the same as in an ASP.NET Core application).
Simple as that!
Environment and Code
Visual Studio Enterprise 2017
Version 15.6.4
.NET Framework 4.7.02556
If you are using a different environment, some default settings might be different.
Before getting to the code, you must install some packages from NuGet. I prefer the Package Manager Console. Tip: press Tab for auto-completion.
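Going by the code below, the installs are likely these (a best guess on my part; exact package names can vary between SDK versions):

PM> Install-Package Microsoft.Azure.WebJobs
PM> Install-Package Microsoft.Azure.WebJobs.Extensions
PM> Install-Package Microsoft.Extensions.DependencyInjection
PM> Install-Package Microsoft.Extensions.Configuration.Json
PM> Install-Package Microsoft.Extensions.Configuration.EnvironmentVariables
PM> Install-Package Microsoft.Extensions.Options.ConfigurationExtensions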
using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;

class Program
{
    // Please set the following connection strings in app.config for this WebJob to run:
    // AzureWebJobsDashboard and AzureWebJobsStorage
    static void Main()
    {
        IServiceCollection serviceCollection = new ServiceCollection();
        ConfigureServices(serviceCollection);

        var config = new JobHostConfiguration
        {
            JobActivator = new ServiceCollectionJobActivator(serviceCollection.BuildServiceProvider())
        };

        if (config.IsDevelopment)
        {
            config.UseDevelopmentSettings();
        }

        // See full trigger extensions https://github.com/Azure/azure-webjobs-sdk-extensions/blob/master/README.md
        config.UseTimers();

        var host = new JobHost(config);
        // The following code ensures that the WebJob will be running continuously
        host.RunAndBlock();
    }

    /// <summary>
    /// https://matt-roberts.me/azure-webjobs-in-net-core-2-with-di-and-configuration/
    /// </summary>
    /// <param name="serviceCollection"></param>
    private static void ConfigureServices(IServiceCollection serviceCollection)
    {
        var config = new ConfigurationBuilder()
            .SetBasePath(Directory.GetCurrentDirectory())
            .AddJsonFile("appsettings.json", optional: true, reloadOnChange: true)
            .AddEnvironmentVariables()
            .Build();

        serviceCollection.AddOptions();
        serviceCollection.Configure<AppSetting>(config);

        // Configure custom services
        serviceCollection.AddScoped<Functions>();
    }
}
First, create a ServiceCollection and configure it with all the dependencies. Pay attention to the use of AddEnvironmentVariables().
Second, create a custom IJobActivator, ServiceCollectionJobActivator, and wire them up:
using System;
using Microsoft.Azure.WebJobs.Host;
using Microsoft.Extensions.DependencyInjection;

public class ServiceCollectionJobActivator : IJobActivator
{
    private readonly IServiceProvider _serviceProvider;

    public ServiceCollectionJobActivator(IServiceProvider serviceProvider)
    {
        _serviceProvider = serviceProvider;
    }

    public T CreateInstance<T>()
    {
        return _serviceProvider.GetService<T>();
    }
}
A very simple implementation: it tells the JobHostConfiguration to use the IServiceProvider (supplied by the ServiceCollection) to create job instances.
And now I have DI at will:
using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Options;

public class Functions
{
    private readonly AppSetting _settings;

    public Functions(IOptions<AppSetting> settingAccessor)
    {
        _settings = settingAccessor.Value;
    }

    public void FetchTogglTimeEntry([TimerTrigger("00:02:00")] TimerInfo timer, TextWriter log)
    {
        log.WriteLine("Toggl job settings: {0}", _settings.TogglJobSettings.Url);
    }
}

public class AppSetting
{
    public TogglJobSettings TogglJobSettings { get; set; }
}

public class TogglJobSettings
{
    public string Url { get; set; }
    public string SecretKey { get; set; }
}
The Functions class accepts IOptions<AppSetting> injected into its constructor, just like an MVC controller.
I wanted to have only TogglJobSettings; however, it does not work when IOptions<TogglJobSettings> is injected directly.
Looking at the key syntax (the __ separators in the keys), the binding should have been correct. But looking at the environment variables, everything in Application settings carries an APPSETTING_ prefix.
Go to the Azure website, access the Console, and type env to see all the environment variables.
A pretty cool tool 🙂
By creating a top-level class AppSetting (think of appSettings in web.config), things just work out of the box.
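Concretely, a portal Application setting named TogglJobSettings__Url maps to the configuration path TogglJobSettings:Url, so binding the whole configuration root to a top-level AppSetting class that has a TogglJobSettings property picks it up.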
Wrap Up
Once everything is set up, I can start writing business code. The WebJobs SDK and its extensions supply many ways of triggering a job. The DI infrastructure setup might happen once and be reused (copied and pasted) many times in other WebJobs; still, I gained so much confidence and knowledge by getting my hands on the code. It is, as always, the right way to learn anything.
If you are learning Azure, I suggest you open Visual Studio and start typing.
I am learning Entity Framework Core as part of my Azure journey. The database is an important part of an application. In the old days, developers wrote raw SQL queries. Later, we had ADO.NET. More recently we have ORMs. I have had the chance to work with (or at least know) the 2 big guys: NHibernate and Entity Framework.
An ORM does more than map between an object model and its database representation (SQL tables, columns). Each ORM framework comes with plenty of features and supports a variety of scenarios. An ORM helps you build a better application. Let's discover some of this with the latest ORM from Microsoft: Entity Framework Core.
I was amazed when visiting the official documentation site. Everything you need to learn is there, in well-written, understandable pages. For my learning, I started with the Pluralsight courses by Julie Lerman. If you happen to have a Pluralsight account, go ahead and watch them; it is worth your time. Then I read the EF documentation on the official site.
It is easy to say "Hey, I know Entity Framework Core". Yes, I understand it. But I need the skill, not just a mental understanding. To make sure I build real EF skill, I write blog posts and write code. That is also my advice to you, developers.
Define a simple domain model and hook it up with EF Core in an ASP.NET Core + EF Core project
Migration: from code to database
API testing with Postman or Fiddler (I do not want to spend time building a UI)
Unit testing with in-memory and real databases
Running on Azure with Azure SQL
Retry strategy
1 – Domain Model
To get started, I have only this super simple domain model:
namespace Aduze.Domain
{
    public abstract class Entity
    {
        public int Id { get; set; }
    }

    public class User : Entity
    {
        public string LoginName { get; set; }
        public string FullName { get; set; }
        public Image Avatar { get; set; }
    }

    public class Image : Entity
    {
        public string Uri { get; set; }
    }
}
A User with an avatar (Image).
Next, I have to set up the DbContext:
using Aduze.Domain;
using Microsoft.EntityFrameworkCore;

namespace Aduze.Data
{
    public class AduzeContext : DbContext
    {
        public DbSet<User> Users { get; set; }

        public AduzeContext(DbContextOptions options)
            : base(options)
        {
        }

        protected override void OnConfiguring(DbContextOptionsBuilder optionsBuilder)
        {
        }

        protected override void OnModelCreating(ModelBuilder modelBuilder)
        {
            base.OnModelCreating(modelBuilder);
        }
    }
}
Pretty simple, just like the example on the documentation site. A quick note: I organize domain classes in the Domain project and the data access layer in the Data project. I do not like the term Repository very much.
Then, in the web project, just call the extension method AddDbContext and you are done. God damn simple!
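For reference, a minimal sketch of that registration in Startup.ConfigureServices; the connection string name AduzeSqlConnection is the one that appears later in this post:

// In Startup.cs of the web project; Configuration is the IConfiguration injected into Startup.
public void ConfigureServices(IServiceCollection services)
{
    services.AddDbContext<AduzeContext>(options =>
        options.UseSqlServer(Configuration.GetConnectionString("AduzeSqlConnection")));

    services.AddMvc();
}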
2 – Migration
The system cannot work unless there is a database. There are 2 possible approaches:
Use your SQL skills and create the database with the correct schema.
Use what EF offers.
I did the former for many years. Let's explore the latter.
With VS 2017 open, access the Package Manager Console window.
Add-Migration
Default project: Aduze.Data, where the DbContext is configured.
Add-Migration: a PowerShell command supplied by EF Core. Tip: type Get-Help Add-Migration to ask for help.
InitializeUser: the migration name. One can give it whatever name makes sense.
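Put together, the call is simply:

PM> Add-Migration InitializeUser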
After it executes, a "Migrations" folder is added to the Data project. Visit the EF Core documentation to understand what it does and the syntax.
Script-Migration
So what does the SQL script look like?
PM> Script-Migration
IF OBJECT_ID(N'__EFMigrationsHistory') IS NULL
BEGIN
    CREATE TABLE [__EFMigrationsHistory] (
        [MigrationId] nvarchar(150) NOT NULL,
        [ProductVersion] nvarchar(32) NOT NULL,
        CONSTRAINT [PK___EFMigrationsHistory] PRIMARY KEY ([MigrationId])
    );
END;
GO

CREATE TABLE [Image] (
    [Id] int NOT NULL IDENTITY,
    [Uri] nvarchar(max) NULL,
    CONSTRAINT [PK_Image] PRIMARY KEY ([Id])
);
GO

CREATE TABLE [Users] (
    [Id] int NOT NULL IDENTITY,
    [AvatarId] int NULL,
    [FullName] nvarchar(max) NULL,
    [LoginName] nvarchar(max) NULL,
    CONSTRAINT [PK_Users] PRIMARY KEY ([Id]),
    CONSTRAINT [FK_Users_Image_AvatarId] FOREIGN KEY ([AvatarId]) REFERENCES [Image] ([Id]) ON DELETE NO ACTION
);
GO

CREATE INDEX [IX_Users_AvatarId] ON [Users] ([AvatarId]);
GO

INSERT INTO [__EFMigrationsHistory] ([MigrationId], [ProductVersion])
VALUES (N'20180420112151_InitializeUser', N'2.0.2-rtm-10011');
GO
Cool! I can take the script and run it in SQL Server Management Studio. Having the scripts ready, I can use them to create the Azure SQL database later on.
Update-Database
This command allows me to create the database directly from the Package Manager Console (which is PowerShell). Let's see:
PM> Update-Database -Verbose
With Verbose turned on, it logs everything to the console. The result: my database, created.
It is very smart. How does it do that? It will:
Read the startup project, Aduze.Web, and extract the connection string from appsettings.json.
Run the migrations created by the Add-Migration command.
3 – API Testing
So far nothing has really happened yet. Let's put an API in front of it:
using System.Threading.Tasks;
using Aduze.Data;
using Aduze.Domain;
using Microsoft.AspNetCore.Mvc;
using Microsoft.EntityFrameworkCore;

namespace Aduze.Web.Controllers
{
    public class UserController : Controller
    {
        private readonly AduzeContext _context;

        public UserController(AduzeContext context)
        {
            _context = context;
        }

        [HttpPost]
        public async Task<IActionResult> Create([FromBody]User user)
        {
            _context.Add(user);
            await _context.SaveChangesAsync();

            return Json(user);
        }

        [HttpGet]
        public async Task<IActionResult> Index()
        {
            var users = await _context.Users.ToListAsync();
            return Json(users);
        }
    }
}
A typical Web API controller.
Create: inserts a user. There is no validation, no mapping between request and domain, … It is not production code.
Index: lists all users.
Here is the test using Postman
If I invoke the /user endpoint, the user is in the list.
Hey, what was going on behind the scenes?
There is plenty of information you can inspect from the Debug window. When inserting a user, these are the queries sent to the database (you should see the one that inserts the avatar image).
So far so good. I have gone from a domain model to a full-flow API endpoint. How about unit testing?
4 – Unit Test
One of the biggest concerns when unit testing is the database dependency. How can EF Core help? It has an In-Memory provider. But first, I have to refactor my code, since I do not want to test the API controller directly.
using System.Collections.Generic;
using System.Threading.Tasks;
using Aduze.Domain;
using Microsoft.EntityFrameworkCore;

namespace Aduze.Data
{
    public class UserData
    {
        private readonly AduzeContext _context;

        public UserData(AduzeContext context)
        {
            _context = context;
        }

        public async Task<User> Create(User user)
        {
            _context.Add(user);
            await _context.SaveChangesAsync();

            return user;
        }

        public async Task<IEnumerable<User>> GetAll()
        {
            return await _context.Users.ToListAsync();
        }
    }
}

using System.Threading.Tasks;
using Aduze.Data;
using Aduze.Domain;
using Microsoft.AspNetCore.Mvc;

namespace Aduze.Web.Controllers
{
    public class UserController : Controller
    {
        private readonly UserData _userData;

        public UserController(UserData userData)
        {
            _userData = userData;
        }

        [HttpPost]
        public async Task<IActionResult> Create([FromBody]User user)
        {
            return Json(await _userData.Create(user));
        }

        [HttpGet]
        public async Task<IActionResult> Index()
        {
            return Json(await _userData.GetAll());
        }
    }
}
That should do the trick. Then just register the new UserData service with the IoC container.
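A one-line sketch of that registration in Startup.ConfigureServices, using a scoped lifetime to match the DbContext:

services.AddScoped<UserData>();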
Because my refactored UserData uses async methods, it seemed to have a problem with the MSTest runner; but the behavior is the same when testing directly against AduzeContext. The recipe for the In-Memory provider:
Use DbContextOptionsBuilder to tell EF Core that the context will use the In-Memory provider.
Pass the options to the DbContext constructor.
Having the power to control which provider will be used is a powerful design. One can have a test suite that is independent of the provider. Most of the time we test with the In-Memory provider; when the time comes to verify that the database schema is correct, we can switch to a real database. A sketch of such a test follows.
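Here is a minimal sketch of those two steps as an MSTest test. The test data and database name are mine, and it assumes the Microsoft.EntityFrameworkCore.InMemory package is installed:

using System.Threading.Tasks;
using Aduze.Data;
using Aduze.Domain;
using Microsoft.EntityFrameworkCore;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class UserDataTests
{
    [TestMethod]
    public async Task Create_PersistsUser()
    {
        // Step 1: tell EF Core to use the In-Memory provider.
        var options = new DbContextOptionsBuilder<AduzeContext>()
            .UseInMemoryDatabase(databaseName: "Create_PersistsUser")
            .Options;

        // Step 2: pass the options to the DbContext constructor.
        using (var context = new AduzeContext(options))
        {
            var userData = new UserData(context);
            await userData.Create(new User { LoginName = "thai", FullName = "Thai Anh Duc" });
        }

        // A fresh context against the same named database verifies the write.
        using (var context = new AduzeContext(options))
        {
            Assert.AreEqual(1, await context.Users.CountAsync());
        }
    }
}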
5 – Azure SQL
Time to grow up … to the cloud, with these simple steps:
Publish the web to Azure
Create Azure SQL database
Update connection string
Run the script (remember the Script-Migration command?) to create the database schema
Just add the connection string AduzeSqlConnection (the one defined in appsettings.json for local development).
Test again with Postman. Oh yeah, baby. It works like a charm.
6 – Retry Strategy
This topic is not something I want to explore at this stage of my learning journey. But it is important to be aware of it, so let me at least note down the reference link: Connection Resiliency.
Wrap Up
None of this is new or complicated if we look only at the surface. However, when I get my hands dirty with the code and the writing, I learn so much. Knowing how to define a DbContext is easy; understanding why it was designed that way is a completely different story.
But is that all there is to EF Core? No. It is just a beginning. There are many things developers will only look at when they hit problems in real projects. The documentation is there, the community is there. And, oh, Stack Overflow has all the answers.
What I will look at next is how EF Core supports developers doing DDD (Domain-Driven Design).