Immutable data types after the .NET 5 release

Just a couple of weeks ago, Microsoft released the RC of .NET 5, which is (unfortunately) not going to be an LTS (Long Term Support) release but, on the other hand, comes with some great features in it (yep yep).

One of them comes as part of the new C# 9.0 release (part of the .NET 5 release): immutable objects and properties (records and init-only properties). Quite a smart concept in my opinion …

Recap on immutable data types

An immutable data type is a data type whose value cannot be changed after creation.

How does it look in reality?

Well, once an immutable-typed object is created, the only way to change its value is to create a new instance with a copy of the previous instance’s value.
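A quick sketch of what this means in practice, using String and DateTime (both immutable in the CLR) – the variable names are just for illustration:

```csharp
using System;

var s = "immutable";
var upper = s.ToUpper();          // does NOT change s; returns a new string
Console.WriteLine(s);             // immutable
Console.WriteLine(upper);         // IMMUTABLE

var day = new DateTime(2020, 11, 10);
var nextDay = day.AddDays(1);     // returns a new DateTime; 'day' is untouched
Console.WriteLine(day.Day);       // 10
Console.WriteLine(nextDay.Day);   // 11
```

Every "modifying" method on these types hands you back a fresh instance; the original value stays as it was.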

What are the current (and most commonly used) immutable data types in the .NET CLR?

Primitive types

  • Byte and SByte
  • Int16 and UInt16
  • Int32 and UInt32
  • Int64 and UInt64
  • IntPtr
  • Single
  • Double
  • Decimal

Others

  • All enumeration types (enum, Enum)
  • All delegate types
  • DateTime, TimeSpan and DateTimeOffset
  • DBNull
  • Guid
  • Nullable
  • String
  • Tuple<T>
  • Uri
  • Version
  • Void
  • Lookup<TKey, TElement>

As you can see, we have quite a few to choose from already. How is this list going to look after the full .NET 5 release in November 2020?

Well, my two cents: it’s going to be a revolutionary change.

Principally, any object using the .NET 5 runtime (and C# 9.0) can be immutable and also implement its own immutable state – and that is a HOT feature.

The syntax of init-only (immutable) properties looks like this:

public class ObjectName
{
    public string FirstProperty { get; init; }
    public string SecondProperty { get; init; }
}

On the other hand, the syntax of an immutable object (called a record) looks like this:

public record ObjectName
{
    public string FirstProperty { get; init; }
    public string SecondProperty { get; init; }
}

As you can see, the syntax is very clear and intuitive to use.
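To make it concrete, here is a small sketch (the Person type and its values are made up for illustration) showing init-only assignment, non-destructive mutation with a with-expression, and the value equality records give you for free:

```csharp
using System;

// Init-only properties: settable during object initialization, read-only afterwards.
var p1 = new Person { FirstName = "Ada", LastName = "Lovelace" };
// p1.LastName = "King";          // compile-time error: init-only property

// Records support non-destructive mutation: 'with' copies p1 into a NEW instance.
var p2 = p1 with { LastName = "King" };

Console.WriteLine(p1);            // Person { FirstName = Ada, LastName = Lovelace }
Console.WriteLine(p2);            // Person { FirstName = Ada, LastName = King }

// Records also get value-based equality for free.
Console.WriteLine(p1 == new Person { FirstName = "Ada", LastName = "Lovelace" }); // True

public record Person
{
    public string FirstName { get; init; }
    public string LastName { get; init; }
}
```

The with-expression is the idiomatic way to "change" a record: you never touch the original, you derive a new one from it.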

More details about new C# 9.0 features can be found here https://docs.microsoft.com/en-us/dotnet/csharp/whats-new/csharp-9#record-types.

How to provision Azure Function in Azure by using Terraform CLI

Choosing the right way to keep infrastructure versioned and well maintained in source code is becoming quite a big issue these days. There are several options on the market to choose from, and it’s easy to get trapped in a never-ending research cycle. For those working with Azure services only, ARM Templates are the obvious answer – but what if you want more flexibility and to go beyond the Azure boundaries?

You may be wondering: why should I use anything else but ARM templates?

The answer is simple. ARM templates may become a bottleneck for IT solutions using different cloud providers (multicloud solutions). In that case, managing infrastructure as code may become quite a tricky (and ugly) thing to do over time. The thing is that every infrastructure management tool has its “own ways” of working, and that comes with a necessary knowledge base every production team must have beforehand. As an implication of this, choosing the right tool for your infrastructure management (including deployment) is very important.

Terraform is a great way to face this challenge. Just as proof of its simplicity, what I would like to show you here is a short demonstration of how easy it is to provision an Azure Function in the Azure cloud by using the HashiCorp Configuration Language (HCL) and the TF (Terraform) CLI utility in PowerShell.

You might be wondering: what features does TF have over ARM templates? Well, these infrastructure management tools have “the same” set of functionality, but TF has other perks on top of that which make it a more secure and convenient tool for DevOps (besides multicloud support).

Key features are:

  1. HCL (HashiCorp Configuration Language) – a high-level configuration syntax; a well-structured and intuitive language (TF also supports configuration using JSON for the JS geeks)
  2. Execution Plans – shows you exactly what is going to happen to the infrastructure before the change gets executed
  3. Resource Graph – a visual understanding of the infrastructure, and in my opinion, Terraform has done a very good job on this feature (don’t forget that it’s open source!)
  4. Change Automation – yes, every change needed on the infrastructure can be automated -> that means less human interaction -> and less room for human errors, YAY!

If you’re new to Terraform and want to get a feel for what it is, have a look at this introduction video with Co-Founder and CTO Armon Dadgar.

Prerequisites (before we start)

  • Terraform utility downloaded and configured on the local environment (a guide on how to do it … here)
    for Win10 users, in case of an issue with WSL2, I recommend following this article to get over it
  • Azure CLI installed and ready to roll (a guide on how to do it … here)

Steps to follow

For those following this guide exactly step by step: any resource name starting with ‘ms‘ needs to be globally unique. I recommend using some other characters as a prefix just to be sure that this exercise goes smoothly on your side. You won’t get far with the copy&paste technique here – oops!

1. Log in to Azure by using the Azure CLI (Azure Command Prompt) or PowerShell
az login

2. If you have only one subscription, you can skip this step. Otherwise, list them all out by running the command below and pick the one you want to use (subscription_id)

az account list
Subscription details after login

3. Find out the latest supported AzureRM provider version here (at the time of writing this post, 2.29.0). This step is not mandatory, but I would highly recommend doing it this way: the AzureRM API might change in the future, so it’s better to have the provider version pinned in the configuration file.

4. Create a folder and a main.tf file within it (mine is located at c:/Temp/terraform-test/)

5. Add this snippet of code at the beginning of the file. This configures Azure CLI authentication in Terraform

provider "azurerm" {
  version = "=2.29.0"
  subscription_id = "<your Azure subscription id from the step 1 or 2>"
  features {}
}

6. Append the rest of the script below to the file. For this exercise, the data centre in Australia Central is going to be used (change it if you like), and the new Azure Function is going to use a consumption service plan and run on Windows OS (this is the default option anyway – change it to Linux if you wish)

resource "azurerm_resource_group" "example" {
  name     = "azure-functions-cptest-rg"
  location = "australiacentral"
}

resource "azurerm_storage_account" "example" {
  name                     = "msfunctionsapptestsa"
  resource_group_name      = azurerm_resource_group.example.name
  location                 = azurerm_resource_group.example.location
  account_tier             = "Standard"
  account_replication_type = "LRS"
}

resource "azurerm_app_service_plan" "example" {
  name                = "azure-functions-test-service-plan"
  location            = azurerm_resource_group.example.location
  resource_group_name = azurerm_resource_group.example.name
  kind                = "FunctionApp"

  sku {
    tier = "Dynamic"
    size = "Y1"
  }
}

resource "azurerm_function_app" "example" {
  name                       = "mstest-azure-functions"
  location                   = azurerm_resource_group.example.location
  resource_group_name        = azurerm_resource_group.example.name
  app_service_plan_id        = azurerm_app_service_plan.example.id
  storage_account_name       = azurerm_storage_account.example.name
  storage_account_access_key = azurerm_storage_account.example.primary_access_key
}

7. Navigate to the folder with the main.tf file you created, open Command Prompt or PowerShell and type the command below

terraform init

This action will create the selections.json file at .terraform\plugins\ and download the AzureRM plugin into the .terraform\plugins\registry.terraform.io\hashicorp\azurerm\2.29.0\windows_amd64 directory. From now on, the CLI utility works against these local files – a perfect isolation approach from the running environment (although the CLI utility itself may be quite hungry for disk space!).

The selections.json file content

8. You can skip this step if in a hurry, but continue reading if you want to know more about how to generate the infrastructure change plan … Open Command Prompt or PowerShell and run the command below to see the infrastructure plan before executing the change. I can strongly recommend using an advanced IDE like VS Code for working with TF (because the text editor and the command console are integrated into one app), as opposed to switching from the text editor back to the command console – that can be annoying…

terraform plan

The plan should look similar to the screenshot below. For those using VS Code, I would recommend downloading the HashiCorp Terraform extension to accelerate your further IaC development – I found it very useful for time efficiency!

Terraform infrastructure change plan

9. Let’s get ready for D-day. Type this command to apply and execute the changes to Azure

terraform apply

This command generates the infrastructure change plan and prompts the user for confirmation – I am happy with the planned changes, so I type yes.

The Terraform confirmation message
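Tying back to the Change Automation perk mentioned earlier: in a CI/CD pipeline you usually don’t want an interactive prompt at all, and Terraform has a flag for that:

```shell
# Skip the interactive "yes" confirmation (use with care in automation pipelines)
terraform apply -auto-approve
```

Handy for unattended runs, but locally I’d keep the prompt as a last safety net.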

10. …and if everything has finished successfully, you should be able to see this message at the end

Resources successfully created
Resource group with all resources created in Azure portal

Entire main.tf file content

provider "azurerm" {  
  version = "=2.29.0"
  subscription_id = "834b29c3-9626-408d-88e0-12e92793d1f5"
  features {}
}

# Azure functions using a Consumption service plan on Windows OS (default option)
resource "azurerm_resource_group" "example" {
  name     = "azure-functions-cptest-rg"
  location = "australiacentral"
}

resource "azurerm_storage_account" "example" {
  name                     = "msfunctionsapptestsa"
  resource_group_name      = azurerm_resource_group.example.name
  location                 = azurerm_resource_group.example.location
  account_tier             = "Standard"
  account_replication_type = "LRS"
}

resource "azurerm_app_service_plan" "example" {
  name                = "azure-functions-test-service-plan"
  location            = azurerm_resource_group.example.location
  resource_group_name = azurerm_resource_group.example.name
  kind                = "FunctionApp"

  sku {
    tier = "Dynamic"
    size = "Y1"
  }
}

resource "azurerm_function_app" "example" {
  name                       = "mstest-azure-functions"
  location                   = azurerm_resource_group.example.location
  resource_group_name        = azurerm_resource_group.example.name
  app_service_plan_id        = azurerm_app_service_plan.example.id
  storage_account_name       = azurerm_storage_account.example.name
  storage_account_access_key = azurerm_storage_account.example.primary_access_key
}

Also available on GitHub https://github.com/stenly311/Terraform-AzureFunction-InAzure

Overall Terraform CLI rating

  • Cloud provider portability
  • Fewer lines needed to achieve the same infrastructure configuration compared to Azure ARM Templates
  • Intuitive and fast to learn
  • OpenSource with a wide collection of “get-started” production like examples
5/5 Rambo rating

How to find out PowerShell version quickly

Every Developer/DevOps has looked for this command at least once a year (including myself). The truth is that we all like PowerShell and sometimes forget to put a running-engine pre-conditional check inside a script (whatever script you’re producing, always make sure that it’s transferable onto another environment) while referencing some function which is not in the (default) installed version.

Anyway… take this post as a reminder reference guide.

3 ways to quickly find out what version of PowerShell you have installed

1. $PSVersionTable.PSVersion

This is my preferred way over the others. Why? Because it works locally as well as on a remote station.

2. (Get-Host).Version

3. $host.Version
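And as a bonus, tying back to the pre-conditional check mentioned above: a minimal sketch of a version guard at the top of a script (the minimum version 5.1 is just an example, pick whatever your script actually needs):

```powershell
# Declarative guard: PowerShell refuses to run the script on an older engine.
#Requires -Version 5.1

# Or a runtime check, if you want a custom error message:
if ($PSVersionTable.PSVersion.Major -lt 5) {
    throw "This script requires PowerShell 5+, found $($PSVersionTable.PSVersion)"
}
```

One of these two lines at the top of every shared script saves the “function not found” surprise on someone else’s machine.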

Summing this up

That’s it. Hope you like this short reminder and let me know your preferred way of doing this. Cheers!

How to build and provision Azure Cognitive Search service in 20 minutes

This Azure service has been here for a while now, but it lately got a few improvements which make integrating and using it even easier and more seamless than before.

Just before going any further: if you haven’t read anything about it, I recommend starting with this article first https://docs.microsoft.com/en-us/azure/search/search-what-is-azure-search as I am not going to dive too much into the details today. This post is about my personal experience of getting a Cognitive Search service provisioned with a bunch of data (from one source) connected to it.

Personally, I like to look at problems through my business lens. What that means is focusing on building content (business value) rather than building a search engine (a feature). I am not saying that compromising on (non-business) system features which help users enhance their experience is a good thing to do. All I am saying is: stop re-inventing the wheel!

Hey Devs, don’t give me that wiggle face saying things like: “c’mon, it’s not that hard to do it by yourself!”. Yes, but it actually is hard from the time-complexity point of view … Building a great search engine with features your audience is going to like would take many weeks of man-hours: auto-completion, geospatial search, filtering and faceting capabilities for a rich UX, OCR (ideally backed by AI), key phrase extraction, hit highlighting for text found in images – and all of that with the ability to scale the service as needed and add as many (different) data sources as needed.

Can you see my point now? Did one of your eyebrows just lift up? :)) Anyway… let’s jump into it and see how long this is going to take me to build in the Azure portal.

Steps to build your first Cognitive Search as a service

1) Go to the Azure portal, search for Cognitive Services and add a new one called “Azure Cognitive Search”

Adding Azure Cognitive Search to Cognitive Service library in the Azure portal

2) As for all services in Azure, you need to fill in which Subscription and Resource Group this service will belong to, and then the preferred URL, the geographic location of the data centre and the pricing tier. I am choosing the free tier (it should be enough for this exercise) and a location close to NZ. The next step is to click on Validate, and then on the Create button.

Filling up the service initials

3) The first step in the wizard is the “Connect to your data” tab, where you can connect to multiple data sources. As you can see from the picture below, quite a few options are available to choose from (most likely covering all of the use-case scenarios). For this exercise, I am going to take “Samples” and the SQL database. You can add as many data sources as you want (with respect to the limitations of the selected service tier).

Adding a connection to the data source

4) At the “Add cognitive skills” tab I decided to add a bunch of additional Text Cognitive Skills, even though this step is optional. My reasons are purely investigative: I would like to see how the @search.score field in the returned result sets is going to look when searching my documents by any of these fields from the enriched data set.

Adding extra source fields for cognitive skills run

5) In the next step, “Customize target index” (sometimes referred to as a “pull model“), I am going to leave all pre-populated settings as they are, as I am happy with them for now. In this step you can configure things like the level of data exposure, data field types, filtering, sorting, etc.

Just to give you a better understanding of what the search index is in this context: in relational database terms, a search index equates to a table. And we also have documents, which are the items of the index – think of them as roughly equivalent to the rows in a table.

Also, remember to keep the Key field as the Edm.String data type. This is a mandatory prerequisite.

Customizing the target indexes
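To make the “index = table, documents = rows” analogy concrete: under the hood, the wizard produces an index schema like this minimal sketch (the index and field names here are made up for illustration, not the ones from the sample database):

```json
{
  "name": "hotels-sample-index",
  "fields": [
    { "name": "HotelId", "type": "Edm.String", "key": true },
    { "name": "HotelName", "type": "Edm.String", "searchable": true, "sortable": true },
    { "name": "Description", "type": "Edm.String", "searchable": true },
    { "name": "Rating", "type": "Edm.Double", "filterable": true, "facetable": true }
  ]
}
```

Note the key field is Edm.String – the mandatory prerequisite – and every uploaded document is simply a JSON object carrying these fields.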

6) In the “Create an Indexer” tab (the way to index data in a scheduled manner) I am not allowed to configure how often the index should be rebuilt. The reason is that the sample SQL database I am using in this exercise does not use any change tracking policy (for example, the SQL Integrated Change Tracking Policy). Why is it needed? Well, basically Cognitive Search needs to know when data changes (deletes included) happened so it can address them. You can read more about it here.
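For reference, enabling the SQL Integrated Change Tracking Policy on your own database would look roughly like this (a sketch – SampleHotelsDb and dbo.Hotels are made-up names for illustration):

```sql
-- Turn on change tracking at the database level
ALTER DATABASE SampleHotelsDb
SET CHANGE_TRACKING = ON
(CHANGE_RETENTION = 2 DAYS, AUTO_CLEANUP = ON);

-- Then enable it for each table the indexer reads from
ALTER TABLE dbo.Hotels
ENABLE CHANGE_TRACKING
WITH (TRACK_COLUMNS_UPDATED = ON);
```

With this in place, the indexer can pick up inserts, updates and deletes on a schedule instead of re-reading the whole table.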

For now, I am going to submit this form and move on.

The service starts provisioning itself (this should not take long to finish) and after a couple of minutes, I should have everything ready for testing.

Create an indexer tab

Testing the Search Service

Now, let’s have a look at the “Search explorer” in the service-level main top menu and craft some queries. My first query was the word “Bacheor-Wohnung”, which nicely got populated into the URL query as the value of the &search element by itself…

Data result set from an example query

From now on, it is all about knowing how to use the query syntax (and you can really go hard on this). For more search query examples, visit the MS documentation https://docs.microsoft.com/en-us/azure/search/search-explorer?WT.mc_id=Portal-Microsoft_Azure_Search
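A few examples of what such queries can look like against the REST endpoint (a sketch – the service name, index name and field names are placeholders, and the api-version may differ over time):

```text
# Full-text search with a result count
GET https://<service>.search.windows.net/indexes/<index>/docs?api-version=2020-06-30&search=wohnung&$count=true

# Filter and sort on non-search criteria
GET .../docs?api-version=2020-06-30&search=*&$filter=Rating gt 4&$orderby=Rating desc

# Select only some fields and page through the results
GET .../docs?api-version=2020-06-30&search=apartment&$select=HotelName,Description&$top=10&$skip=10
```

The Search explorer builds exactly these URLs for you, so anything you craft there can later be called from your own code unchanged.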

I have to say that building this service took me about 20 minutes (as someone who already has some experience) to go from nothing to an easy-to-configure and scalable search engine. After reading this post, anyone should be able to build their first Cognitive Search service in a similar time.

If you have any questions or want to know more about this service, visit this site built by Microsoft at https://docs.microsoft.com/en-us/azure/search/. These people did a really great job documenting all of it. This material should help you elevate your skills to a more advanced level.

What is the Azure Cognitive Search tier pricing?

| Tier | Storage | Max indexes per service | Scale out limits | Document cracking: image extraction (price per 1,000 images) | Private endpoint related charges | Price per unit |
|---|---|---|---|---|---|---|
| Free | 50 MB | 3 | N/A | N/A (only 20 documents supported) | N/A | Free |
| Basic | 2 GB | 15 | Up to 3 units per service (max 1 partition; max 3 replicas) | $1.512 (0–1M) / $1.210 (1M–5M) / $0.983 (5M+) | Additional charges may apply | $0.153/hour |
| Standard S1 | 25 GB (max 300 GB per service) | 50 | Up to 36 units per service (max 12 partitions; max 12 replicas) | $1.512 (0–1M) / $1.210 (1M–5M) / $0.983 (5M+) | Additional charges may apply | $0.509/hour |
| Standard S2 | 100 GB (max 1 TB per service) | 200 | Up to 36 units per service (max 12 partitions; max 12 replicas) | $1.512 (0–1M) / $1.210 (1M–5M) / $0.983 (5M+) | Additional charges may apply | $2.033/hour |
| Standard S3 | 200 GB (max 2 TB per service) | 200, or 1000/partition in high density mode | Up to 36 units per service (max 12 partitions; max 12 replicas); up to 12 replicas in high density mode | $1.512 (0–1M) / $1.210 (1M–5M) / $0.983 (5M+) | Additional charges may apply | $4.065/hour |
| Storage Optimized L1 | 1 TB (max 12 TB per service) | 10 | Up to 36 units per service (max 12 partitions; max 12 replicas) | $1.512 (0–1M) / $1.210 (1M–5M) / $0.983 (5M+) | Additional charges may apply | $5.805/hour |
| Storage Optimized L2 | 2 TB (max 24 TB per service) | 10 | Up to 36 units per service (max 12 partitions; max 12 replicas); up to 12 replicas in high density mode | $1.512 (0–1M) / $1.210 (1M–5M) / $0.983 (5M+) | Additional charges may apply | N/A |

Azure Cognitive Search tier pricing

Overall Azure service rating

  • it is very easy to create your own search SaaS in a couple of minutes
  • intuitive way to integrate new data sources into the service
  • easy to leverage cognitive capabilities in features like OCR
  • CONVENIENCE – zero coding required on the service side; all search service settings can be configured in the Azure portal
5/5 Rambo rating

What type of coding challenge to expect on technical interview with Google

It has been a while since I was interviewed by Google and got to answer a lot of technical questions. The essence of being successful is to be prepared! Especially now, in these difficult Covid-19 times, when getting a job is even harder than before for young developers with no professional network or working experience (hey, YOU are not alone in this!)

And so, I am writing this post for you, the NEXT DEV GENERATION! But just to be clear, this post is not about leaking hiring questions to the public. It is about giving YOU an idea of what sort of coding challenges you may get along the way.

The most common (and trickiest) questions are about how to solve problems efficiently with algorithms. You as a Dev must show an understanding of what time complexity is, how to work with data structures and how to write readable (and concise) code – all of that while the people on the other side of the conference meeting are WATCHING! (feel the stress but stay CALM, stay COOL)

Remember that this task was given to me a couple of years ago, so don’t rely on getting exactly the same coding task on your D-Day. The assignment will be different, but the level of complexity of the solution is most likely going to be the same.

Coding challenge

Task assignment

You have a collection of numeric item values, among which are some zeros. Build an algorithm which shifts all zeros to the end of the array with the best time complexity possible. You are not allowed to use any additional data structures in the solution. Also, keep the items in the same order as they are.

Design

Always do design first!

Normally, it is good practice to ask as many questions as possible to clarify all the requirements at the beginning (these all score positive points). Some of them can be (ones not explicitly written in the task description):

  • should my solution be structured as for production use?
  • do you want me to write a unit test, too?
  • can I use Google? – NOPE, don’t ask this one. All good companies usually structure the technical questions in a way that any (capable) candidate should be able to answer them. Don’t take it personally if you fail; you are just not there yet.

Coding

The solution I used was based on shifting items within the array:

using System;

namespace TestApp
{
    class Program
    {
        static void Main(string[] args)
        {
            var array = new int[] { 1, 0, 4, 5, 0, 4, 5, 3, 0 };

            var iterations = 0;
           
            int j = 0;
            for (int i = 0; i < array.Length; i++)
            {
                ++iterations;
                if (array[i] != 0)
                {
                    array[j++] = array[i];
                }
            }

            while (j < array.Length)
            {
                ++iterations;
                array[j++] = 0;
            }

            Console.WriteLine($"Array: '{string.Join(",", array)}'");
            Console.WriteLine($"Time complexity: O({iterations})"); 
            Console.ReadLine();
        }
    }
}

Let’s examine the code.

As you can see, I have two loops. In the first one (for), I look for a non-zero value in each iteration. If a non-zero value is found, the value at the current index gets copied over to the pivot index and the pivot index gets incremented by 1. If a zero is found, the pivot index stays where it is and the loop moves on to the next item.

The second loop (while) fills zeros into all indexes between the pivot index and the last array index (that many zeros have to be placed back into the array).

What is the time complexity of this solution? Let’s do an analysis of it.

The first loop (for) goes over the 9 items of the array. The array has 3 zero values (while loop). The total number of iterations is: 9 + 3 = 12 => O(12) => O(n)

Linear time complexity? THIS IS PRETTY GOOD TO ME! But did I want to gain an extra point (and I WANTED to) by building a slightly different approach with fewer loop iterations?

So, I asked the Google hiring technical manager whether I could relax the last requirement and reorder the non-zero values in the array a little bit. He agreed…

Why am I doing this?! The answer is optimization … As you can see, the while loop no longer needs to be part of the solution if we go through the array in reverse and swap the zero-value items with the one sitting at the last (unexamined) array index:

using System;

namespace TestApp
{
    class Program
    {
        static void Main(string[] args)
        {
            var array = new int[] { 1, 0, 4, 5, 0, 4, 5, 3, 0 };

            var iterations = 0;

            var end = array.Length - 1;
            var index = end;
            for (int i = end; i >= 0; i--)
            {
                ++iterations;
                if (array[i] == 0)
                {
                    var left = array[index];
                    var right = array[i];

                    array[i] = left;
                    array[index--] = right;
                }
            }

            Console.WriteLine($"Array: '{string.Join(",", array)}'");
            Console.WriteLine($"Time complexity: O({iterations})");
            Console.ReadLine();
        }
    }
}

What is the time complexity now?

The algorithm uses one loop (for) and goes over the 9 items of the array => O(9) => O(n)

Not a bad approach and another plus point going towards my credit bank (Yep, Yep!).

Conclusion

The second approach might not sound like a huge performance achievement (and it is NOT for such a small dataset), but it shows the technical recruiter your way of thinking! Remember, it can be just a good impression that stands between you being chosen over the tens/hundreds of other candidates applying for the same role.

Wishing you good luck, and let me know in the comments below how your technical interview went!

You can download the code from my GitHub repo here: https://github.com/stenly311/Moving-Zero-Value-Items-To-The-End-Of-Array

A few tips on what to look at before going to a technical interview

Why every Software Developer should learn these 3 languages

We are living in a very fast and dynamic world now. The days when a software developer could get by with a narrow skill set are gone, and in order to “do good” on the market, everyone must adapt.

It does not have to be a radical adaptation process (phew!), but having reasonably good knowledge of certain development languages, patterns, frameworks and “ways of doing stuff” à la trends is a must.

That is why YOU as a software developer should know these languages to at least an intermediate level – able to code some basics without googling.

Alright, enough of the initial sauce of words; let’s get into these three languages according to the 2020 Stack Overflow Developer Survey.

Must-learn languages

Python

1. Believe it or not, the best option for you is Python. I am not going to describe this language in detail, but in brief: this interpreted, high-level, general-purpose language has become integrated into almost any type of solution you can think of (cross-platform). Well, that is not surprising to me, as it has been with us for almost 3 decades now (since 1991). What is more interesting is its actual philosophy, which stands on these points:

  • Beautiful is better than ugly
  • Explicit is better than implicit
  • Simple is better than complex
  • Complex is better than complicated
  • Readability counts

You can read more about the language here https://en.wikipedia.org/wiki/Python_(programming_language)
You can get into it quickly by looking at this repo (my recommendation) https://github.com/microsoft/c9-python-getting-started

JavaScript

2. Honestly, I am surprised that JavaScript made it to second place (and not to the top). I personally think that this multi-paradigm language has a lot of potential for the future, and so every developer should learn it.

Read more about JavaScript here https://en.wikipedia.org/wiki/JavaScript
My recommendation on how to get started is to follow these sites:
* https://developer.mozilla.org/en-US/docs/Learn/Getting_started_with_the_web/JavaScript_basics
* https://www.w3schools.com/js/

Go (Golang)

3. And probably my favourite of the three is Go (Golang). Not because of my experience (I have just started to learn it) but because of what it is capable of in a very short time (hey, I am a C# dev, I know what I am talking about!). It would not surprise me if Go makes its way to the top of the ladder in the next 3 years.

Read more about it here https://golang.org/
And after that go and install Go on your desktop, and hit this page of how to get started https://golang.org/doc/tutorial/getting-started
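If you want a taste before committing, a first Go program is as small as this (greet is just a throwaway example function, not from the tutorial):

```go
package main

import "fmt"

// greet returns a friendly message for the given name.
func greet(name string) string {
	return fmt.Sprintf("Hello, %s!", name)
}

func main() {
	fmt.Println(greet("Gopher")) // Hello, Gopher!
}
```

Save it as main.go and run it with `go run main.go` – no project scaffolding required.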

Used data source

Survey conducted among 65k tech geeks


For more information have a look at this page https://insights.stackoverflow.com/survey/2020#technology-most-loved-dreaded-and-wanted-languages-wanted

Just remember that this data has been collected from the active community contributing to Stack Overflow. That means these results do not EXACTLY reflect the market situation globally, nor in your region. Always do your homework and look at different data sources related to the place you live (and are going to live for the next 5 years).

New Zealanders, this does not apply to you. You cannot go wrong with these three. Just for reference, the NZ-based company Rocket Lab is constantly hiring Software Engineers with Golang experience https://www.rocketlabusa.com/careers/positions/.

Overall Stackoverflow survey rating

  • it’s great to have actual IT pros participating in this survey
  • Stack Overflow holds a big audience
  • in my opinion, the data was collected from the younger generation as opposed to the older one, and so the segmented datasets might not be in the required balance for the reports
3/5 Rambo rating

How much does ACR (Azure Container Registry) cost?

Well, believe it or not, this Azure service has no free tier. The ‘cheapest’ one is about $0.252/day with a total of 10 GiB of storage and 2 web hooks. Unfortunately, there is no support for geo-replication at that tier.

As pricing can change over time, this site should give you the most up-to-date details: https://azure.microsoft.com/en-us/pricing/details/container-registry/

| | Basic | Standard | Premium |
|---|---|---|---|
| Price per day | $0.252 | $1.008 | $2.520 |
| Included storage (GiB) | 10 | 100 | 500 |
| Total web hooks | 2 | 10 | 500 |
| Geo-replication | Not supported | Not supported | Supported ($2.520 per replicated region) |

Premium also offers enhanced throughput for docker pulls across multiple, concurrent nodes.

Azure Container Registry pricing

Do I like ACR?

Yes and no …

For projects big in size, where the biggest proportion of the solution’s services gets provisioned in Azure – yes, definitely. The level of convenience of having ‘everything’ (source code, tool-set, hosting environment, …) in one place plays a big role here. The assumption is that if Devs/DevOps are happy with the tool-set within the same platform, the overall progress on the project should be faster, as there is no need for extra system-integration work or for shaping a diametrically different skill set (theory, but it works in many cases).

And for projects hungry for disk space and tight on budget – no. There are cheaper alternatives on the market, for example Docker.com (with one private repository in the Free plan – whoop, whoop!). Pricing starts as low as USD $5/month (with an annual plan), which is insanely CHEAP! So if Azure is not where your solution lives, Docker.com would be my pick.

More details about Docker pricing (and most updated) can be found here: https://www.docker.com/pricing

Docker pricing and subscriptions

Overall rating of this technology

5/5 Rambos

Committing and Pushing Docker image changes to Azure Container Registry

Docker image

If you made it this far, then you must know something about Docker (containerization). That is great, because this post is not about what Docker really is, but about how to work with image revisions in conjunction with the Azure Container Registry (aka the repo).

I am assuming you already have your own repository created in Azure and know the basic commands for spinning up a container (or leave a comment below).

Also, that Docker Desktop is installed on your PC and you have a docker image ready to be used for this exercise.

User story

You, as a developer, want to create a starting (base) image out of a container running on your localhost (the image type is irrelevant for this exercise) for your co-workers. The image is going to be parked in ACR for easy access. The initial version is going to have the tag ‘v1’.

Steps to follow

1. Download MS Azure Command Prompt; the latest version can be found here, or just use Google search


2. Validate that the installation was successful by starting the MS Azure Prompt and running

az --version
If you see this, then you did well!


3. Log in to Azure by using the command below (you should be redirected to the browser with portal.azure.com as the URL). Now, use your user credentials and wait for the callback redirect back to the terminal (MS ACP)

az acr login --name <your ACR name> 

Example:

az acr login --name webcommerce22


4. Commit the latest changes on top of the running docker container (Docker Desktop) with tag v1 (this operation creates a new image). Remember that only these characters are allowed in naming: ‘a-z0-9-_.’

docker commit <docker container hash id> <repository URL>[/<new image name>][:<tag name>]

Example:

docker commit 2cbbb6f54f4b webcommerce22.azurecr.io/web-api-pricing:v1
The created new image out of running container


5. Push the changes to ACR

docker push <repository URL>[/<image name>][:<tag id>]

Example:

docker push webcommerce22.azurecr.io/web-api-pricing:v1
Pushing image to Azure Container Registery
New container repository is created with v1 image in it


And here you are. A great way to keep your container image changes revisioned.

For more details about the docker commands, I recommend following this URL https://docs.docker.com/engine/reference/commandline/docker/

Overall feature rating

5/5 Rambo rating