.NET Framework 1.0 to .NET Core

Introduction

Launched in 2002, the .NET Framework has been one of the most popular development platforms for building enterprise apps, and with Microsoft’s backing it had the stability and maturity that made it an instant hit. Over the years it has been forked into many subsets targeting specific platforms: Mono for Linux, .NET Compact Framework and Windows Phone for mobile, Silverlight for web UI and many more. This has worked well, but with the advent of mobile, IoT, cloud and the like, the list of disparate platforms to support has kept growing.

Problem

All these forks of the .NET Framework have caused a lot of fragmentation, which makes sharing code across the different subsets difficult. The .NET Framework being a machine-wide install, and a prerequisite in most scenarios, also brings its own challenges on the newer platforms: you will not always have permission to install or update the existing .NET Framework, and disk space can be an issue on some platforms when installing the full .NET platform.

In most cases applications running on these platforms use only a very small subset of what is available in the .NET Framework, so if a small essential subset along with the runtime can be packaged with the application itself, deployment becomes much simpler, with a smaller size and possibly a smaller memory footprint.

Solution

.NET Core and .NET Standard are Microsoft’s solution to the problems stated above.

Released in 2016, .NET Core is a cross-platform implementation of the .NET Framework with support for web and console apps on Linux, macOS and Windows. Any libraries needed in addition to the core itself can be pulled in via NuGet. There is no need for .NET Core to be pre-installed on the target machine; with a self-contained deployment, everything needed to run the application can be included in the build output.

This feature alone makes .NET Core the preferred choice for building my future .NET apps: applications targeting .NET Core 1.0, 2.0 or any future version can all run side by side on the same machine, without having to worry about which framework version is installed or whether an upgrade is needed.

.NET Core also has a smaller footprint in terms of memory and build size, making it easier to deploy to devices with limited storage, especially in IoT-based solutions. Being open source and cross-platform makes it a great choice not just for big enterprises but for everyone.

Coming to .NET Standard, it is a specification of .NET APIs created with the intention of helping share code across different .NET platforms. Every .NET implementation, whichever platform it is meant for, targets a specific version of .NET Standard, which lets developers write code that runs across platforms. For example, if I am building a DLL that is meant to be consumed from .NET Framework 4.6.1, .NET Core 1.0 and Mono 4.6, then my DLL should be built targeting .NET Standard 1.3, and this will ensure it runs smoothly on all the required platforms. Microsoft has published a simplified table to help decide which .NET Standard version to choose.
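To make that concrete, here is a minimal sketch of such a shared library, assuming an SDK-style project; the namespace and method names are my own placeholders, not from any real code.

// A hypothetical .NET Standard 1.3 class library; the project file would contain
// <TargetFramework>netstandard1.3</TargetFramework>, and the resulting DLL could then
// be referenced from .NET Framework 4.6.1, .NET Core 1.0 and Mono 4.6 projects.
using System;

namespace SharedUtilities
{
    public static class Greeter
    {
        // Uses only APIs available in .NET Standard 1.3, so it should work on every
        // platform that implements that version of the standard.
        public static string Greet(string name)
        {
            if (string.IsNullOrWhiteSpace(name))
                throw new ArgumentException("A name is required.", nameof(name));

            return $"Hello, {name}!";
        }
    }
}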

Lastly, I would say that with .NET Core and .NET Standard Microsoft has hit the nail on the head, and we should see .NET getting even more popular over the years, with adoption increasing in the open source community as well. And with Azure doing so great, there is so much to learn and build 🙂

I will try and share some sample .NET Core apps I am planning to work on soon.

 


Quickly Testing SQL Connectivity

TL;DR

Create an empty text file on your Windows system and change its extension from ‘txt’ to ‘udl’. Open the file and a SQL connectivity window opens for you to enter the details and test the connection.

Background

Recently I got into a situation where an old app, running perfectly for the last few years, suddenly stopped working, complaining about connectivity to SQL Server. The logs generated by the app were not capturing any details other than a ‘SQL Exception has occurred’. No changes had been made to the app in some time, so the obvious doubt was whether SQL Server was fine or not.

Basic verification established that there was nothing wrong with SQL Server either. Ping from the app server to the machine hosting SQL Server was working and port 1433 was also open, which meant this was not a network problem but something deeper.

The next step in my mind was to write a quick console app that tries to query sample data from this SQL Server, add extensive traces to capture details and give it a spin on the app server.
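Something along these lines is what I had in mind; a minimal sketch, with the server name, database and credentials below being placeholders.

// Quick-and-dirty connectivity check; dump everything the driver reports.
using System;
using System.Data.SqlClient;

class SqlConnectivityTest
{
    static void Main()
    {
        // Placeholder connection string; replace with the real server and credentials.
        var connectionString =
            "Server=MySqlHost,1433;Database=TestDb;User Id=testuser;Password=secret;";

        try
        {
            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand("SELECT @@VERSION", connection))
            {
                connection.Open();
                Console.WriteLine("Connected. Server version:");
                Console.WriteLine(command.ExecuteScalar());
            }
        }
        catch (SqlException ex)
        {
            // The app’s own logs hid the details, so print the full exception here.
            Console.WriteLine("SqlException: " + ex);
        }
    }
}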

While I was waiting for Visual Studio to open I tried to google whether there was another way to test a SQL connection, and I stumbled upon Test remote SQL connectivity EASILY!. Going through the link I recalled how easy it is to test a remote SQL connection this way, and that I had used it umpteen times while starting out as a developer.

Steps to test SQL Connection

1. Create an empty text file anywhere on your system
2. Open this text file, go to the menu and select File => Save As
3. At the bottom of the dialog change the “Save as type” drop-down to All Files and set the file name to test.udl

SaveAs-UDL
4. Press Save and now you should see a new file with an icon different from the Notepad one associated with it.

UDL-Icon

5. Open the file and a SQL connectivity window opens for you to enter the details and test the connection.

Setup Local SMTP Server

Introduction

We have all been in situations where our application is sending out mails and we want to test the output. This seems simple, but in my case the SMTP server was in a different network and our corporate firewall restrictions were making it complicated and time consuming. I thought of finding a simpler solution and after some googling stumbled upon PaperCut.

It is a simple receive-only SMTP server which is super easy to set up. Once started, it captures all outgoing emails from your system that are sent via the localhost SMTP server and shows them in a nice UI. It allows you to inspect everything about your email, from its body to its headers to MIME types.

Setting up PaperCut

 

  1. Set up your code to send mail after overriding the following SMTP details (see the sketch after this list)
    • SMTP Server : localhost
    • SMTP Port : 25
    • EnableSSL : False
  2. Download PaperCut and start the exe
  3. Send an email and wait for a few seconds; it should appear in the PaperCut window (Fig. 1) and you can inspect all the details you like about your email message
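A minimal sketch of such a test sender, using the standard System.Net.Mail client; the addresses below are placeholders.

// Sends a test mail through the local SMTP server (PaperCut) on localhost:25.
using System.Net.Mail;

class SmtpTest
{
    static void Main()
    {
        using (var client = new SmtpClient("localhost", 25))
        using (var message = new MailMessage("sender@example.com", "receiver@example.com",
                                             "PaperCut test", "Hello from my machine."))
        {
            client.EnableSsl = false; // PaperCut listens without SSL
            client.Send(message);
        }
    }
}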

It does give you options to target a particular IP address and Port if you so desire. I haven’t tried it though.

Azure: Storage Accounts

Azure storage is one of the fundamental concepts to grasp before any non-trivial deployment can be done in Azure. In this post I have tried to explain all the available storage options offered by Azure.

To get started with storing our data in Azure we need a valid subscription [free or pay-as-you-go] within our Azure account. Once you have an active subscription you can choose a storage account type out of the two options offered by Azure.

  • Blob
  • General Purpose

Each storage account has an upper limit of 500 TB, and any data stored in these accounts gets replicated to ensure durability and high availability. Which replication is applied depends on the choice the user makes and the type of data.

  • LRS (Locally redundant storage): 3 copies in the same data center.
  • ZRS (Zone-redundant storage): 3 copies spread across data centers within one or two regions.
  • GRS (Geo-redundant storage): 6 copies, 3 in the primary region and 3 in a secondary region.
  • RA-GRS (Read-access geo-redundant storage): same as GRS but with read access to the copies in the secondary region.

Now, back to the two storage account types.

Blob Storage account: As the name suggests, these accounts are specialized in storing blobs. Azure categorizes blobs into block blobs, append blobs and page blobs [the last not supported by a Blob Storage account]. All blobs are divided into chunks to make upload/download easy. This account type also supports the concept of Hot and Cool tiers, where the Hot tier is optimized for data with frequent reads/writes and the Cool tier for storing infrequently accessed data. A short upload sketch follows the list below.

  • Block Blob: The default type for storing large objects, with each block up to 100 MB in size and up to 50,000 blocks per file, allowing a single file of max size 4.76 TB.
  • Append Blob: Useful for files which grow incrementally, like log files, with new content getting appended to the end of the file. The max block size for append blobs is 4 MB with up to 50,000 blocks per file, allowing a single file of max size 195 GB.
  • Page Blob: Not offered in a Blob Storage account, as it is used for storing VHD files for VMs. Each VHD file has to define its max size at creation time, with an upper limit of 8 TB. The file consists of 512-byte pages optimized for random reads and writes.
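Here is a minimal upload sketch, assuming the classic WindowsAzure.Storage NuGet package on the full .NET Framework; the connection string, container and blob names are placeholders.

// Uploads a small piece of text as a block blob.
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class BlobUploadSample
{
    static void Main()
    {
        // Placeholder connection string; take the real one from the Azure portal.
        var account = CloudStorageAccount.Parse(
            "DefaultEndpointsProtocol=https;AccountName=mystorageaccount;AccountKey=<key>");

        CloudBlobClient client = account.CreateCloudBlobClient();
        CloudBlobContainer container = client.GetContainerReference("samples");
        container.CreateIfNotExists();

        // Larger files are split into blocks by the SDK automatically.
        CloudBlockBlob blob = container.GetBlockBlobReference("hello.txt");
        blob.UploadText("Hello from Azure Blob storage");
    }
}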

General Purpose Storage Account: This is the default type for the majority of storage scenarios, with 4 different types of storage available in this account.

  • Blob Storage: The same blob storage with page, block and append blob options as described above, but with all 3 available for use; however, there is no hot/cool tier on offer.
  • File Storage: A general purpose file share which is accessible everywhere using the SMB 3.0 protocol. It is similar to a Windows file share in usage and can be mapped to a Windows drive letter with the “net use” command for quick and easy access.
  • Queue Storage: Azure storage queues are a simple message store for reliably storing messages up to 64 KB in size for a maximum of 7 days. The max queue size is only limited by the storage space available in the storage account. It does not offer broadcast or other advanced messaging services; for those, Azure Service Bus is the right choice.
  • Table Storage: Azure tables are a structured, non-relational data store. Each table stores a set of entities and each entity can have a max of 255 properties to store data. Out of the 255 properties, 3 are pre-defined by Azure, namely Timestamp, which is populated by Azure, and PartitionKey and RowKey, which need to be populated by the user to uniquely identify each row in the table. Table storage is highly scalable, with the capability to store terabytes of data in a single table, and allows really performant queries [although this requires partitioning]. A small entity sketch follows this list.
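As an illustration of the PartitionKey/RowKey model, here is a minimal sketch, again assuming the classic WindowsAzure.Storage NuGet package; the entity type, table name and values are my own placeholders.

// Defines a table entity and inserts one row.
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Table;

// PartitionKey and RowKey together uniquely identify the row; Timestamp is set by Azure.
public class DeviceReading : TableEntity
{
    public DeviceReading() { }

    public DeviceReading(string deviceId, string readingId)
    {
        PartitionKey = deviceId;
        RowKey = readingId;
    }

    public double Temperature { get; set; }
}

class TableSample
{
    static void Main()
    {
        // Placeholder connection string; take the real one from the Azure portal.
        var account = CloudStorageAccount.Parse(
            "DefaultEndpointsProtocol=https;AccountName=mystorageaccount;AccountKey=<key>");

        CloudTable table = account.CreateCloudTableClient().GetTableReference("readings");
        table.CreateIfNotExists();

        var reading = new DeviceReading("device-01", "reading-0001") { Temperature = 21.5 };
        table.Execute(TableOperation.Insert(reading));
    }
}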

Lastly, there is an option of choosing standard (HDD) or premium (SSD) disks for storing your VHD files [page blobs] in a General Purpose storage account, along with the concept of managed disks. They are primarily used while creating VMs, which I will try to cover in a future blog post.

Azure : Getting Started

Introduction

Microsoft Azure is one of the two leading cloud computing platforms available to us today, the other being Amazon AWS.
Both of these are battle-tested platforms with a host of services available at competing prices. In this article we discuss the basics of Azure.

Cloud Types

  • Public Cloud: The public cloud is the cloud system offered over the internet by vendors like Microsoft and Amazon. This is what people generally refer to when talking about the cloud; you are the consumer here, paying for the service provided by the cloud vendor.
  • Private Cloud: You own and manage everything; vendors like Microsoft help you with the software which runs on top of your infrastructure to build your private cloud, which is available only to people on your intranet. It has a very high upfront cost and upkeep requirements similar to on-premises data centers. This is never a good option unless you are bound by some law or secrecy pact to keep everything in a closed environment.
  • Hybrid Cloud: This is a mix of both: your public cloud connected with your private cloud over a secure, encrypted connection. Keep in mind that an on-premises server connected to your public cloud cannot be categorized as hybrid. This makes sense for organisations which have already decided on a private cloud but want to keep the public cloud as an extension for failover scenarios or for handling sudden spikes. Also keep in mind that creating a workload for a hybrid cloud would be complicated compared to the other two.

Services Offered

Azure has a long list of services and products on offer, ranging from Compute, Networking, IoT and Storage to Web + Mobile, but in order to consume these services we need to decide on the cloud model that is best for us.

  • SaaS (Software as a Service): In this model you are just the consumer of the software, with nothing to do with where it is hosted or installed, how it is built, or any other detail. All you are concerned about is using the software for a small fee. To use such software you need some kind of thin client, which in most cases is your browser with a working internet connection. Examples of Azure SaaS offerings are Office 365 and Outlook.com.
  • PaaS (Platform as a Service): In this model you are responsible for your application and its data; everything else, from the runtime, OS and networking to the actual hardware, is taken care of by the cloud provider. If you think about it, we can call it SaaS for developers. The most popular PaaS offering from Azure is Azure App Service, which we will discuss more later.
  • IaaS (Infrastructure as a Service): This model gives you the most flexibility but is the most complex and costly of the 3. You are responsible not only for the application and its data but also for the runtime, middleware and OS. You need to do the upkeep of your servers along with the application, which is additional work. PaaS should be the way to go unless you have a good reason for choosing IaaS.