One day my local development environment stopped working. I had set it up a year ago for a specific project, and it uses certificates for client-server communication. I knew right away what was wrong: both the ADFS token-signing certificate and the self-signed SSL certificate had expired.
A small problem! But the real problem was that I could not remember what to do, step by step. A year is long enough to forget an occasional task. Not anymore! This time I am documenting it here for … the years to come.
On a normal day of developer life, I was hunting a performance issue and a memory leak. It sounds mysterious, but it was just another bug, another issue to solve, after all.
When it comes to performance and memory issues, one should reach for PerfView. The tool gives a very detailed picture of what is going on in memory, at a level a developer can reasonably understand.
The system is a WCF service built on data contracts. From the profiler, I found out that a returned value of 10MB costs the OS around 50MB, several times the payload size in extra cost. And that does not count the memory the WCF framework itself consumes to serialize the contract.
Note that I am not judging the architecture as good or bad. There were good reasons it was designed that way.
A very simplified version looks like this:
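The original code is not included here, so the following is a minimal sketch under my own naming (`DownloadResponse` and `IFileService` are hypothetical); the essential point is the `byte[]` member produced by the `BinaryDataContractSerializer.Serialize` helper mentioned below:

```csharp
using System.IO;
using System.Runtime.Serialization;
using System.ServiceModel;

// Hypothetical contract names; the real system has hundreds of operations.
[DataContract]
public class DownloadResponse
{
    // Every response carries its payload pre-serialized into a byte array.
    [DataMember]
    public byte[] Data { get; set; }
}

[ServiceContract]
public interface IFileService
{
    [OperationContract]
    DownloadResponse DownloadFile(string fileName);
}

// Serializes the object graph into a MemoryStream and then returns a
// full copy of the internal buffer via ToArray().
public static class BinaryDataContractSerializer
{
    public static byte[] Serialize(object graph)
    {
        var serializer = new DataContractSerializer(graph.GetType());
        using (var stream = new MemoryStream())
        {
            serializer.WriteObject(stream, graph);
            return stream.ToArray(); // a second full copy of the payload
        }
    }
}
```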
With that simple setup in place, a simple console app consumes the service:
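The consumer side is likewise not shown; assuming a generated WCF client proxy named `FileServiceClient` (hypothetical), it would be something like:

```csharp
using System;
using System.IO;

class Program
{
    static void Main()
    {
        // FileServiceClient is the generated WCF proxy (hypothetical name).
        var client = new FileServiceClient();

        // The whole payload arrives as one byte[] inside the response.
        DownloadResponse response = client.DownloadFile("big-report.pdf");
        File.WriteAllBytes("big-report.pdf", response.Data);

        Console.WriteLine($"Downloaded {response.Data.Length} bytes");
    }
}
```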
Here is the result of downloading a 74MB file. The total memory consumed in the heap is 146MB.
Where is that extra cost coming from? It comes from the BinaryDataContractSerializer.Serialize method.
First, there is the memory consumed by the DataContractSerializer.
Second, there is the memory consumed by the MemoryStream used to return an array of bytes.
In many cases, with modern hardware, this is not a big problem. The Garbage Collector takes care of reclaiming the memory, and if both request and response are small, you will not even notice. Unless, of course, one day in production there are many requests.
There are a couple of potential issues with consuming this much memory:
If an allocation is larger than 85K (85,000 bytes), it ends up on the Large Object Heap (LOH), which is collected together with Gen 2. I suggest you read more about memory allocation, especially the LOH; I am too much of an amateur to explain it properly.
It causes memory fragmentation: memory keeps growing, and the GC has a very hard time reclaiming it.
Either way, the system is not in good shape.
How could we solve the problem without changing the design, and with as little impact as possible?
We know that some operations consume lots of memory, such as downloading a file or returning a data set. Instead of returning a byte array, we extend the response to carry the object itself. We could do that for all operations and get rid of the byte array entirely. However, there are hundreds of operations, and we want to keep the contract simple, with as few changes as possible.
So an improved version looks like this:
Run the application and inspect the memory again:
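The actual change is not shown, but based on the description above (extend the response to carry the object instead of a pre-serialized byte array), a sketch might be:

```csharp
using System.Runtime.Serialization;

[DataContract]
[KnownType(typeof(Document))] // register each concrete payload type (hypothetical)
public class DownloadResponse
{
    // Kept as-is so the hundreds of existing operations are untouched.
    [DataMember]
    public byte[] Data { get; set; }

    // New member: memory-heavy operations put the object here and let WCF's
    // own DataContractSerializer write it straight to the transport,
    // skipping the intermediate MemoryStream and its ToArray() copy.
    [DataMember]
    public object Payload { get; set; }
}
```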
Comparing the two, there is a big win: 2784 objects vs 366 objects; 146MB vs 73MB.
With the increasing power of hardware, RAM and disk are rarely a problem anymore. With the support of a managed runtime (such as C#/.NET), developers write code without caring much about memory and memory allocation. Not all developers, of course; but I believe many do not care much about the issue.
It is about time we cared about every single line of code we write, shall we? We do not have to learn and understand every detail of these topics. The following are good enough to start:
Memory allocation on the heap: Gen 0, Gen 1, and Gen 2.
Memory fragmentation, just like disk fragmentation.
Memory profilers at an abstract level, such as dotMemory and PerfView.
The Garbage Collector. Getting a feel for how it works is a good start.
I am sure you will be surprised by how fun it is and how far it takes you.
I ran into a piece of code that implemented a kind of retry pattern: it retries a call if the first call throws an exception. A simplified version looks like this (not production or real code):
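Since the real code cannot be shown, here is a simplified stand-in that captures the shape of the helper (the names are mine):

```csharp
using System;

public static class ServiceCallHelper
{
    public static T Invoke<T>(Func<T> call)
    {
        try
        {
            return call();
        }
        catch (TimeoutException)
        {
            // The first failure is swallowed here; the consumer never sees it.
            return call();
        }
    }
}
```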
One problem with the above code is that when the first timeout exception occurs, the consumer cannot see it anywhere. At the very least, ServiceCallHelper should give consumers a chance to deal with the exception.
How do we do that with as little impact on the design as possible? We should strive for a solution that does not introduce a new dependency. I have seen a tendency to inject a logger. It might, and will, work. However, now you have a dependency on the logger, and the consumer has to know about it.
Now take a look at the new version I propose:
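A sketch of the proposed version, with an optional `Action<Exception>` callback as the extension point (again, my own naming):

```csharp
using System;

public static class ServiceCallHelper
{
    // onRetry is the extension point: consumers can log, count, or rethrow
    // without the helper taking a dependency on any particular logger.
    public static T Invoke<T>(Func<T> call, Action<Exception> onRetry = null)
    {
        try
        {
            return call();
        }
        catch (TimeoutException ex)
        {
            onRetry?.Invoke(ex); // surface the first failure to the consumer
            return call();
        }
    }
}
```

A consumer that wants visibility simply passes a callback, for example `ServiceCallHelper.Invoke(() => client.Download(id), ex => Console.WriteLine(ex));`, and a consumer that does not care omits it.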
With that simple change, we have introduced an extension point that consumers will be happy with, and we ensure that our API does not hide exceptions away.
If you are designing an API, consider your extension points, and whether the API gives consumers a chance to be aware of exceptions.
The .NET Framework comes with the powerful built-in delegate types Func and Action. Taking advantage of the two can help you simplify your design.
Update: I stopped writing this for a while, and Azure has changed tremendously since. I am publishing this post so that I at least have a reference later. The content was never finished 🙁
When something happens, execute some logic and output the result (if there is one) somewhere. At the highest abstraction, Azure Functions is as simple as that. Yet that simplicity captures almost every real-life scenario.
I was about to write up some Azure Functions concepts. However, the Microsoft docs are so good that you should go ahead and read them directly. I posted the link below to save you some typing, and for my own reference next time.
When a result is placed into storage, it triggers another action, and then another; the chain keeps going and stops when a business requirement is met. That is the power of Azure Functions: it gives us a flexible tool for building complex business processes. The limit is our design skill. Yes, I know there are limitations in terms of technical challenges, but with good design and architectural skills you can build almost whatever you want.
I am a developer. I need to know it more deeply. And most importantly, I have to answer this question:
What can I do with such a powerful tool?
After reading around the internet and taking some Pluralsight courses, I headed over to the Azure portal and started my first function. Most of my writing here is not new; many have written about it before. The point of writing it up is my own learning: I want to document what I learn in my own words.
Let's build a simple team cooperation process using available tools. Say that in my team, whenever we do a release:
The biggest challenge, as a developer, is to get it done right. The infrastructure has a lot of built-in support: there is a variety of input and output bindings, and each binding has a set of convenient conversions as well as customization at your disposal.
Azure Function in Azure Portal
Azure Function from Visual Studio
Environment: Visual Studio 2017 Enterprise, .NET Core 2.0
After creating a new Function project, here is what VS gives me:
using System.IO;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.AspNetCore.Http;
using Microsoft.Azure.WebJobs.Host;
using Newtonsoft.Json;

namespace Aduze.Functions
{
    public static class HttpTriggerFunctions
    {
        [FunctionName("HttpTriggerDemo")]
        public static IActionResult Run([HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)]HttpRequest req, TraceWriter log)
        {
            log.Info("C# HTTP trigger function processed a request.");

            string name = req.Query["name"];

            string requestBody = new StreamReader(req.Body).ReadToEnd();
            dynamic data = JsonConvert.DeserializeObject(requestBody);
            name = name ?? data?.name;

            return name != null
                ? (ActionResult)new OkObjectResult($"Hello, {name}")
                : new BadRequestObjectResult("Please pass a name on the query string or in the request body");
        }
    }
}
There are notions of AspNetCore and WebJobs here. The HttpTrigger function accepts an HttpRequest and returns an IActionResult; anyone who has coded ASP.NET MVC knows what those are.
Challenge for Architects
How should you architect a system with all the power you have from Azure?
In its simplest form, a WebJob is a background service (think of a Windows Service) running alongside the website (or web application). The typical usage is to have it handle long-running jobs for the web application, freeing the web application to serve as many requests as possible.
Thinking at the abstract level, consider a scenario with a book management website. A user has many books. One day, he wants to download all his books in a zip file. Assume the action takes time, on the order of minutes, so you do not want your users to wait. At a high level, the design has these steps:
Record a download all books request.
Trigger a background job to handle the request: Read all books and create a zip file.
Email the user with a link to download the zip file.
#1 and #3 are handled by the web application. #2 is a good candidate for a WebJob. There are other ways to implement #2, but in the context of this post, it is a WebJob.
That is the overview, the abstract level. It looks simple and easy to understand. But (everything has a but) the devil is in the details. Let's get our hands dirty in the code.
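Step #2 maps naturally onto a queue-triggered WebJob. A sketch, assuming the web application in step #1 drops a message on a hypothetical `zip-requests` queue:

```csharp
using System.IO;
using Microsoft.Azure.WebJobs;

public class BookFunctions
{
    // Runs whenever the web app enqueues a user id on "zip-requests" (#1).
    // Queue name and method body are illustrative only.
    public static void ZipAllBooks(
        [QueueTrigger("zip-requests")] string userId,
        TextWriter log)
    {
        log.WriteLine($"Zipping all books for user {userId}");
        // Read the user's books, write the zip file to blob storage,
        // then enqueue a message that triggers the notification email (#3).
    }
}
```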
Context
Everything has its own context. Here is mine:
Given an ASP.NET Core 2.0 website running in Azure App Service, with configuration settings and connection strings configured via the portal dashboard.
I want to be able to build a WebJob that:
Can consume those settings, so I can manage application settings in one place and change them at will without redeployment.
Takes advantage of dependency injection from Microsoft.Extensions (the same as in an ASP.NET Core application).
Simple as that!
Environment and Code
Visual Studio Enterprise 2017
Version 15.6.4
.NET Framework 4.7.02556
If you are using a different environment, some default settings might be different.
Before we get to the code, you must install the required packages from NuGet; I prefer the Package Manager Console. Tip: press Tab for auto-completion.
class Program
{
    // Please set the following connection strings in app.config for this WebJob to run:
    // AzureWebJobsDashboard and AzureWebJobsStorage
    static void Main()
    {
        IServiceCollection serviceCollection = new ServiceCollection();
        ConfigureServices(serviceCollection);

        var config = new JobHostConfiguration
        {
            JobActivator = new ServiceCollectionJobActivator(serviceCollection.BuildServiceProvider())
        };

        if (config.IsDevelopment)
        {
            config.UseDevelopmentSettings();
        }

        // See full trigger extensions https://github.com/Azure/azure-webjobs-sdk-extensions/blob/master/README.md
        config.UseTimers();

        var host = new JobHost(config);
        // The following code ensures that the WebJob will be running continuously
        host.RunAndBlock();
    }

    /// <summary>
    /// https://matt-roberts.me/azure-webjobs-in-net-core-2-with-di-and-configuration/
    /// </summary>
    /// <param name="serviceCollection"></param>
    private static void ConfigureServices(IServiceCollection serviceCollection)
    {
        var config = new ConfigurationBuilder()
            .SetBasePath(Directory.GetCurrentDirectory())
            .AddJsonFile("appsettings.json", optional: true, reloadOnChange: true)
            .AddEnvironmentVariables()
            .Build();

        serviceCollection.AddOptions();
        serviceCollection.Configure<AppSetting>(config);

        // Configure custom services
        serviceCollection.AddScoped<Functions>();
    }
}
First, create a ServiceCollection and configure it with all the dependencies. Pay attention to the use of AddEnvironmentVariables(): it is what allows the WebJob to pick up the settings configured in the Azure portal, which are exposed to the process as environment variables.
Second, create a custom IJobActivator: ServiceCollectionJobActivator.
Then wire them up:
public class ServiceCollectionJobActivator : IJobActivator
{
    private readonly IServiceProvider _serviceProvider;

    public ServiceCollectionJobActivator(IServiceProvider serviceProvider)
    {
        _serviceProvider = serviceProvider;
    }

    public T CreateInstance<T>()
    {
        return _serviceProvider.GetService<T>();
    }
}
A very simple implementation: it tells the JobHostConfiguration to use the IServiceProvider (built from the ServiceCollection) to create job instances.
And I have DI at will:
public class Functions
{
    private readonly AppSetting _settings;

    public Functions(IOptions<AppSetting> settingAccessor)
    {
        _settings = settingAccessor.Value;
    }

    public void FetchTogglTimeEntry([TimerTrigger("00:02:00")] TimerInfo timer, TextWriter log)
    {
        log.WriteLine("Toggl job settings: {0}", _settings.TogglJobSettings.Url);
    }
}

public class AppSetting
{
    public TogglJobSettings TogglJobSettings { get; set; }
}

public class TogglJobSettings
{
    public string Url { get; set; }
    public string SecretKey { get; set; }
}
The Functions class accepts an IOptions<AppSetting> injected into its constructor, just like an MVC controller.
I originally wanted to inject only TogglJobSettings. However, injecting IOptions<TogglJobSettings> did not work.
Looking at the key syntax (with __ separating the sections), the binding should have worked. However, everything in Application settings is exposed with an APPSETTING_ prefix, as you can see by inspecting the environment variables.
Go to the Azure website, open the Console, and type env to see all the environment variables.
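For example, assuming the standard double-underscore convention for nested sections, a key entered in the portal and the variable the process actually sees would differ like this (illustrative names, not real values):

```
# Key entered in the portal's Application settings:
TogglJobSettings__Url

# What `env` shows inside the App Service sandbox:
APPSETTING_TogglJobSettings__Url
```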
Pretty cool tool 🙂
By creating a top-level class, AppSetting (think of appSettings in web.config), things just work out of the box.
Wrap Up
Once everything is set up, I can start writing business code. The WebJobs SDK and its extensions supply many ways of triggering a job. The DI infrastructure setup happens once and can be reused (copied and pasted) in other WebJobs. Still, I gained a lot of confidence and knowledge by getting my hands on the code; hands-on is always a good, right way to learn anything.
If you are learning Azure, I suggest you open Visual Studio and start typing.