All posts in Architecture

Mocking web services

Categories: .net, Architecture, web

For one of the projects I manage, I have two teams: in the end, one implements web services for the other to consume. In production and integration testing, things tend to go well. But when we are faced with debugging or testing the side that consumes the web services, we need something more.

I love to discover new things or new ways of doing things. Enter EasyMock (https://github.com/CyberAgent/node-easymock). It is a small Node.js web server that returns any file on disk, with extra options in its config.json file.

You install it with:

$ npm install -g easymock
And you start it within your work directory with:
$ easymock
Server running on http://localhost:3000
Listening on port 3000 and 3001
Documentation at: http://localhost:3000/_documentation/
Logs at: http://localhost:3000/_logs/

If you wanted to mock a CurrentUserCount REST web service located at /api/CurrentUserCount, all you need to do is create an “api” directory with a file named “CurrentUserCount_get.json” inside it. Here is what that could look like:
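A minimal sketch of such a mock file (the payload shape is made up for illustration; EasyMock simply returns the file’s contents as the response body):

{
    "currentUserCount": 42
}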

There is even a handy, automatically generated documentation page at /_documentation/.

Happy mocking!

Custom nuget feeds with Visual Studio Online Build

Categories: .net, Architecture

So I was working on DayTickler, and we suddenly decided to start using Xamarin controls from Telerik (Progress) and Syncfusion. Traditionally, that meant downloading the installer and then referencing the proper assemblies from the local drive. Another workflow was to copy the assemblies into some sort of “lib” folder in the project directory so that they could be used in a CI environment.

Fast forward to 2016 and we have something called NuGet, so I tried using it to achieve the same objective. The first step was to add the two NuGet feeds in Visual Studio’s NuGet configuration screen. Easy enough, and from there I was able to provide my Telerik credentials (their feed is private) and install the packages. Yay!

But when you commit to Visual Studio Online, the build fails on package restore, because the build server knows nothing about those feeds.

The solution is to:

1 – Create a nuget.config file in the solution so that the build server knows where to download NuGet packages from:
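A minimal sketch of such a nuget.config (the vendor feed URLs are illustrative placeholders; use the ones Telerik and Syncfusion give you):

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <add key="nuget.org" value="https://api.nuget.org/v3/index.json" />
    <add key="Telerik" value="https://nuget.telerik.com/nuget" />
    <add key="Syncfusion" value="https://<feed-url-from-syncfusion>" />
  </packageSources>
</configuration>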

2 – Open your build definition and, in the “Feeds and authentication” section, point it to your nuget.config file:

3 – Press “+” near “Credentials for feeds outside the account/collection” and add the appropriate details.


That’s it! It worked like a charm, and there is a “Verify connection” button to ensure all is good.

Application Insights Availability

Categories: Architecture, Azure

I just helped put a rather big system into production. There are always a lot of things to do, and we recently turned on availability monitoring with Azure Application Insights.

Since we knew we were going to use this feature, we had added a “ping” controller to our API tier. This controller has a single method called “IsAlive”, which returns a 200 if it can access the database and a 400 if not.
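A minimal sketch of what that controller could look like (ASP.NET Web API is an assumption here, and the database probe is reduced to a placeholder):

using System.Web.Http;

public class PingController : ApiController
{
    // GET: 200 when the database answers, 400 when it does not
    [HttpGet]
    public IHttpActionResult IsAlive()
    {
        return CanReachDatabase()
            ? (IHttpActionResult)Ok()
            : BadRequest("Database unreachable");
    }

    // Placeholder: a real probe would open a connection and run a trivial query
    private bool CanReachDatabase() => true;
}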

Five minutes after we turned on monitoring, we were able to visualise the latency from five spots on the internet likely to have clients using the system. If something fails, alerts tell us when the system goes down and when it comes back online.

Doing this a few years ago would have required specialised tools; now it is a few clicks away on all of the major platforms. Use it: it takes five minutes to ensure you are warned if a problem arises.

Azure and Memory Leaks

Categories: Architecture, Azure

I know, I know: how can I have memory leaks, you’ve got a garbage collector in that fancy .NET, don’t you?

So this is the second project I have worked on where we found memory leaks in production. The first one was a third-party library that didn’t release a bitmap when generating reports; the second one holds on to Entity Framework contexts after they’ve been used.

Not much I can do about the first case, because I don’t have access to the objects and can’t release them, but the second one is directly in our code.

The thing is, as soon as an Entity Framework context goes out of scope, it should release its memory. It’s all managed, and IDisposable just releases the resources deterministically.

Now the fun part: with IoC and DI containers, we tend to receive our references from the container, and the container itself manages their lifetime. So if our container is badly configured, we depend on the container to release objects (so they can be garbage collected) and we just lost the ability to get garbage collection based on going out of scope…
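A minimal sketch of the difference, using Microsoft.Extensions.DependencyInjection for illustration (the container we actually used is not named here, and AppDbContext is a made-up context):

using Microsoft.EntityFrameworkCore;
using Microsoft.Extensions.DependencyInjection;

public class AppDbContext : DbContext { }

public static class ContainerSetup
{
    public static System.IServiceProvider Build()
    {
        var services = new ServiceCollection();

        // Leaky: one context for the whole application lifetime, so every
        // entity it tracks stays rooted and is never garbage collected.
        // services.AddSingleton<AppDbContext>();

        // Correct: one context per scope (e.g. per web request); when the
        // scope ends, the context can be collected like any other object.
        services.AddScoped<AppDbContext>();

        return services.BuildServiceProvider();
    }
}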

DI containers are fun and fancy, and a real plus when you are testing, but PLEASE ensure they are configured correctly!

The first memory leak was in a system hosted on premises, with IT people who had no idea how to take a memory snapshot; that was hard.

Azure, on the other hand, has all of this stuff built in. I was completely blown away by how many diagnostic tools are built into the platform. It’s called “Diagnostics as a Service”. Go discover it!

SQL Dynamic Data Masking

Categories: Architecture, Azure

When working with databases, we sometimes have to back them up and restore them in order to bring them to a test environment. When this happens on premises, there is a good chance the backup gets restored to a known machine.

With the cloud and outsourcing, the backup could be restored miles away on another continent.

Go read this: SQL Azure Data Masking

OK, the concept is easy enough to grasp, but it is based on which user you log in as. Most web applications use a generic user to connect to the database and implement security at the application level, not at the database layer.

Who knows how we can keep using application-level security, but pass a special token or something when we need to read the actual data?

Is there a way to apply the dynamic masking within the database at backup time, so that people getting a copy of it get a safe copy?

There are two use cases: the first is scrambling data, the second is just not showing it at all (e.g. Er** instead of Erik). Do the WHERE clauses apply to the real value or to the mask?
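For reference, the partial mask from that example could be declared like this (a minimal T-SQL sketch; the table, column and user names are made up):

-- Show the first 2 characters and pad with "**": Erik becomes Er**
ALTER TABLE dbo.Users
    ALTER COLUMN FirstName ADD MASKED WITH (FUNCTION = 'partial(2,"**",0)');

-- Privileged accounts can be allowed to see the real values
GRANT UNMASK TO AdminUser;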


Great things with this technology; I just have to start testing it out…

SQL Azure Sizing

Categories: Architecture, Azure

Well, one of my teams just went to production with a rather big project. The whole thing is hosted on Microsoft Azure.

Like all projects, you always learn a couple of things; there are always good things, bad things, and things that could have been done better.

For this project, we chose Azure SQL Database to hold the data. An S0 instance gives us plenty of space, and since we actually spent time benchmarking the system, our SQL queries were optimised and we had caching where it counted. I thought we were pretty good and that there would have to be a massive amount of concurrent users on the system to kill it…

Turns out all you need is 60, because although an S0 can take a lot of parallel inserts at a time, it can only handle 60 connections at a time. That is a real bummer.

The chances that we hit 60 connections at the same moment are still pretty slim, because connections are only open for the lifetime of a request. Still, I brought our system up to S1, which gives us 90 connections, just in case…

Another fun thing: we implemented the required SQL error retry logic in EF, so if we get denied because no connection is available, the system will simply retry after a certain timeout. That should prevent the code from failing, at the expense of longer execution times.
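Assuming EF6, that retry behaviour comes from an execution strategy; a minimal sketch (the retry count and delay are illustrative, not the values from the project):

using System;
using System.Data.Entity;
using System.Data.Entity.SqlServer;

public class AppDbConfiguration : DbConfiguration
{
    public AppDbConfiguration()
    {
        // Retry transient SQL Azure failures up to 5 times,
        // waiting at most 30 seconds between attempts.
        SetExecutionStrategy(
            "System.Data.SqlClient",
            () => new SqlAzureExecutionStrategy(5, TimeSpan.FromSeconds(30)));
    }
}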

I thought I understood the whole DTU thing, but I never took the connection limit into account… Oh well, live and learn.

Refer to this page for information on these limits.

Power BI wowness!

Categories: Architecture, web

I am sitting in a meeting where I am looking at beautiful Power BI dashboards built on top of a system we built that is going live soon.

The architecture we had planned called for using Power BI, but we didn’t realise just how good it is.

The best part: there is a whole gallery of custom visualisations (https://app.powerbi.com/visuals/) for those times you want to express something that wasn’t included in the box.

Lastly, how did we get the data into Power BI? Entity Framework + Restier + OData! If the Power BI team is listening, let us do custom data sources!
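Wiring that stack up amounts to mapping a RESTier route over an EF context. A minimal sketch (RESTier’s namespaces moved between its early releases, and AppDbContext is a made-up context name):

using System.Web.Http;
using Microsoft.Restier.Providers.EntityFramework;

public static class WebApiConfig
{
    public static void Register(HttpConfiguration config)
    {
        // Exposes the DbSets on AppDbContext as an OData endpoint under /api,
        // which Power BI can then consume as an OData feed.
        config.MapRestierRoute<EntityFrameworkApi<AppDbContext>>(
            "AppApi", "api").Wait();
    }
}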

Azure Functions

Categories: Architecture, Azure

By now you’ve probably heard of something called ServerLess architecture. Now let’s get something straight: this is not about creating apps without servers, but rather about building apps where we don’t think about servers as the deployment unit.

Non-ServerLess architectures (wow, that sounds weird) are our traditional way of building apps: we compile something into a DLL (or something), then package a few of these things together into a unit. Each unit has a bunch of responsibilities and usually depends on other units (a SQL database, a web service…). Each unit is then deployed on a server of some sort, and then we have to think about scaling these things correctly and maintaining connections between the different parts… sounds crazy if you ask me 🙂 When you scale, you end up scaling all of the functionality included in the unit being scaled.

ServerLess architectures focus on small functionalities (think micro-services). Each function is deployed independently and probably depends on other functions. The thing is, each function is scalable on its own: one function that is used often might be scaled up, while another function that gets called once a day stays calm. Under the hood there are servers, but we just tend not to think about them too much. There are usually two kinds of functions: ones that are triggered (an HTTP call, an event…) and ones that are timed (on a schedule).
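For example, a minimal HTTP-triggered Azure Function could look like this (a sketch using the C# class-library model; the function name and route are made up):

using System.Net;
using System.Net.Http;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;

public static class PingFunction
{
    // Runs on every HTTP GET to /api/ping and scales independently
    // of any other function in the app.
    [FunctionName("Ping")]
    public static HttpResponseMessage Run(
        [HttpTrigger(AuthorizationLevel.Function, "get", Route = "ping")]
        HttpRequestMessage req)
    {
        return req.CreateResponse(HttpStatusCode.OK, "pong");
    }
}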

Azure Functions is one of many offerings (AWS Lambda is another big player, and I love the folks at WebTask.IO) that allow you to deploy systems using this kind of architecture.

The last thing I really like about these systems is the extension concept they provide. For example, let’s say a search aggregator requires an algorithm from each provider to verify whether a user can click on a link. Instead of coding each rule in the main system, each provider can easily host a “Function” that provides the answer, and the search aggregator calls them just in time…

The next big thing I do will use these concepts in one way or another. Watch out, servers out there, because I will start forgetting you exist!!!

Azure Service Bus vs Event Hub

Categories: Architecture, Azure

I love queuing mechanisms. They allow you to decouple the input of a system from its output, basically so the two can be processed at different speeds relative to each other. They also usually come with retry mechanisms, just because of the way they are built.

Service Bus is basically a FIFO queue: things go in, and things come out in the order they went in. A consumer takes an item from the queue, processes it, and then the item gets removed. Azure also lets you define “Topics” instead of “Queues” so that you can have multiple readers (the pub-sub concept) for a given item, but it’s still the same relative process: read, process, remove.
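A minimal sketch of that read-process-remove cycle, using the classic Microsoft.ServiceBus.Messaging SDK (the connection string and queue name are placeholders):

using System.Threading.Tasks;
using Microsoft.ServiceBus.Messaging;

class QueueDemo
{
    static async Task Run()
    {
        var client = QueueClient.CreateFromConnectionString(
            "<connection-string>", "orders");

        // Producer side: the item goes to the back of the queue.
        await client.SendAsync(new BrokeredMessage("new order"));

        // Consumer side: read, process, then remove.
        BrokeredMessage message = await client.ReceiveAsync();
        // ... process the item ...
        await message.CompleteAsync(); // only now is it removed from the queue
    }
}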

Event Hub is a much nicer thing if you ask me, because it is based on the concept of a stream, and removal of items from the stream is independent of consumers processing them. Basically, publishers insert items into the stream, and consumers are free to read from it from any starting point. Each reader has to maintain a pointer that indicates where it is in the stream, and it is free to move this pointer forwards or backwards. This is great for cases where you would like to replay items, say because you flushed a database somewhere.

The catch is that you have to keep the stream at a manageable size. With storage so cheap now, it might be easier for some systems to just never delete items. If it’s a stream for IoT devices with millions of items per day, then maybe only the last 48 hours are relevant, and readers who are not quick enough simply “miss out on some data”, which could be okay for some scenarios.
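And a minimal sketch of a reader keeping its own pointer into the stream, with the same classic SDK (the partition id and stored offset are placeholders; the hub itself does not track where a reader is):

using Microsoft.ServiceBus.Messaging;

class StreamReaderDemo
{
    static void Run()
    {
        var client = EventHubClient.CreateFromConnectionString(
            "<connection-string>", "telemetry");

        // Resume partition 0 from an offset this reader persisted earlier;
        // passing a different offset replays or skips items at will.
        EventHubReceiver receiver = client
            .GetDefaultConsumerGroup()
            .CreateReceiver("0", "12345");

        EventData item = receiver.Receive();
        // ... process, then persist item.Offset as the new pointer ...
    }
}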

I am still new to this, so if I have made some oversimplifications or errors, please let me know.

I can’t wait to build my first Azure Event Hub app that is not a prototype!