My intro to Docker - Part 5 of something

In the previous posts, I have talked about everything from what Docker is, to how you can set up a stack of containers using docker-compose, but all of the posts have been about setting up containers on a single machine. That can be really useful, but imagine being able to deploy your containers just as easily to a cluster of machines. That would be awesome!

With Docker Swarm, this is exactly what you can do. You can set up a cluster of Docker hosts, and deploy your containers to them in much the same way that you would deploy to a single machine. How cool is that!

Creating a cluster, or swarm

To be able to try out what it’s like working with a cluster of machines, we need a cluster of machines. And by default, Docker for Windows/Mac includes only a single Docker host when it’s installed. However, it also comes with a tool called docker-machine that can be used to create more hosts very easily.
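As a sketch of where this is heading, creating a couple of hosts with docker-machine and turning them into a swarm looks something like this (the machine names and the VirtualBox driver are just examples, and the worker token is a placeholder for the one printed by the init command):

```shell
# Create two Docker hosts (names and driver are just examples)
docker-machine create --driver virtualbox manager1
docker-machine create --driver virtualbox worker1

# Initialize a swarm on the first host...
docker-machine ssh manager1 "docker swarm init --advertise-addr $(docker-machine ip manager1)"

# ...and join the second host as a worker, using the token from the init output
docker-machine ssh worker1 "docker swarm join --token <worker-token> $(docker-machine ip manager1):2377"
```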

More...

My intro to Docker - Part 4 of something

In the previous posts about Docker, here, here and here, we’ve looked at what Docker is, how to set up a basic container and how to set up a stack of containers using docker-compose. One thing we haven’t talked about is the fact that most projects use some form of persistent data store, and the most common store, at least in my world, is a relational database of some sort. So this time, I want to cover something that might seem slightly odd…setting up an MS SQL Server…on Linux…in Docker.

Yes, you heard me right… I’m going to show you how to set up an MS SQL Server instance in a Linux-based Docker container. That wouldn’t have been possible in any way not too long ago, but Microsoft “recently” released a version of MS SQL Server that runs on Linux, which is really cool. And running it in Docker just makes sense!

Running MS SQL Server in a Docker container

Starting a SQL Server instance in a Docker container isn’t that hard, but there are a couple of things that need to be set up for it to work.
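To give a taste of it, a minimal sketch looks something like this (the image name is the one from the time of writing, and the password is just an example; check Docker Hub for the current image and required settings):

```shell
# The EULA must be accepted and an SA password set, or the container won't start
docker run -d --name mssql \
  -e "ACCEPT_EULA=Y" \
  -e "SA_PASSWORD=MySuperSecret1!" \
  -p 1433:1433 \
  microsoft/mssql-server-linux
```

Note that the SA password has to meet SQL Server’s complexity requirements, or the container will shut down right after starting.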

More...

My intro to Docker - Part 3 of something

So far in this little blog series about Docker, I have covered what Docker is, how it works, how to get a container up and running using pre-built images, as well as your own images. But so far, it has all been about setting up a single container with some form of application running inside it. What if we have a more complicated scenario? What if we have a couple of different things we want to run together? Maybe we want to run our ASP.NET Core app that we built in the previous post behind an nginx instance instead of exposing the Kestrel server to the internet… Well, obviously Docker has us covered.

However, before we go any further, I just want to mention that I will only be covering something called docker-compose in this post. It can be used to create a stack of containers that are started and stopped together. I will not be covering distributing the application across several nodes this time. There will be more about that later. And even if that is probably the end goal in a lot of cases, being able to just run on a single host can be useful as well. Especially while developing stuff.

What is docker-compose?

When you installed Docker for Windows or Docker for Mac, you automatically got some extra tools installed. One of them is docker-compose, which is a tool for setting up several containers together as a stack, while configuring their networking and so on. Basically, setting up and configuring a set of containers/apps that work together.
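Just to give an idea of the shape of it, here is a minimal docker-compose.yml sketch along the lines of the nginx-in-front-of-Kestrel scenario mentioned earlier (the service names are made up for the example, and the web service assumes a Dockerfile in the current directory):

```yaml
version: '3'
services:
  web:
    build: .            # builds the ASP.NET Core app from a Dockerfile in this directory
    expose:
      - "5000"          # reachable by other containers in the stack, not by the host
  proxy:
    image: nginx
    ports:
      - "80:80"         # nginx is the only thing exposed to the outside
    depends_on:
      - web
```

With a file like this in place, `docker-compose up` starts both containers together, and `docker-compose down` tears them down again.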

More...

Going green…a.k.a. “Hello tretton37!”

As of this Monday, I’m officially working as a ninja at tretton37 in Stockholm. And for those of you who don’t speak Swedish, tretton37 means thirteen37, which is just an awesome name for an IT company.

So what is a ninja? Well, it is pretty much just an IT consultant, but the company doesn’t like that word. At least in Sweden, it has basically been reduced to meaning the same thing as a hired resource. The word consultant isn’t about consulting, giving advice and offering knowledge anymore. So to mitigate that, the common name for a person who works at tretton37 is “ninja”. The focus is to give our clients more than just a resource that can code. It’s about more than that. It’s about giving tips and ideas, going beyond just building something, and taking a bigger responsibility for the solution. Making sure that we do our best to give the client what they need and not just what they ask for. It’s about listening to what they want to accomplish, and not what they want us to build.

This view of what a consultant should be doing correlates well with my own. So I’m very excited to be here, and hopefully there are some cool projects for me here in the future. With some happy clients at the end…

While I wait for that though…I’ll take this chance to catch up on some blogging and things, with the goal being that my blog will once again be a living thing that actually provides value.

My intro to Docker - Part 2 of something

In the previous post, I talked a bit about what Docker is, how it works, and so on. And I even got to the point of showing how you can create and start containers using existing images from Docker Hub. However, just downloading images and running containers like that is not very useful. Sure, as a Microsoft dev, it's kind of cool to start up a Linux container and try out some leet Linux commands in bash. But other than that, it is a little limiting. So I thought I would have a look at the next steps involved in making this an actually useful thing…

Creating something to host in a container

The first step in using Docker is to have something to run inside our containers. And since I am a .NET developer, and .NET Core has Linux support, I thought I would write a small ASP.NET Core application to run in my container.

Note: Before you can run this, you need to install the .NET Core SDK
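Just to sketch where this is heading, a Dockerfile for an app like that might look something like this (the image names and tags are the ones from around the time of writing, and MyApp.dll is a placeholder for the actual assembly name; newer SDK images use different names):

```dockerfile
# Build the app using the SDK image
FROM microsoft/aspnetcore-build:2.0 AS build
WORKDIR /app
COPY . .
RUN dotnet publish -c Release -o /out

# Run it on the smaller runtime-only image
FROM microsoft/aspnetcore:2.0
WORKDIR /app
COPY --from=build /out .
ENTRYPOINT ["dotnet", "MyApp.dll"]
```

This is a multi-stage build, so the final image only contains the published output and the runtime, not the full SDK.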

More...

My intro to Docker - Part 1 of something

It’s been quite a while since I blogged, and there are several reasons for this. First of all, I haven’t really had the time, but I also haven’t really found a topic that I feel passionate enough about to blog about. But having played around with Docker, I now have! So I thought I would jot down some stuff about Docker… If nothing else, it gives me a way to come back to what I used to know when I have forgotten all of it again…

What is Docker?

The first thing to cover is what Docker really is. I have seen a lot of explanations, both of what it is, and why it is so good. But I have had a hard time grasping it in the way that it has been explained to me. So here is my explanation. And by my explanation, I mean the way I think about it. It might not be 100% correct from an implementation point of view, but it is the way I see it…

More...

Setting Up Continuous Deployment of an ASP.NET App with Gulp from VSTS to an Azure Web App using Scripted Build Definitions

A few weeks ago, I wrote a couple of blog posts on how to set up continuous deployment to Azure Web Apps, and how to get Gulp to run as a part of it. I covered how to do it from GitHub using Kudu, and how to do it from VSTS using XAML-based build definitions. However, I never got around to doing a post about how to do it using the new scripted build definitions in VSTS. So that is what this post is going to be about!

The Application

The application I’ll be working with is the same one that I have been using in the previous posts. So if you haven’t read them, you might want to go and have a look at them. Or at least the first part of the first post, which includes the description of the application in use. Without that knowledge, this post might be a bit hard to follow…

More...

Uploading Resources to Blob Storage During Continuous Deployment using XAML Builds in Visual Studio Team Services

In my last blog post, I wrote about how we can set up continuous deployment to an Azure Web App for an ASP.NET application that was using Gulp to generate client-side resources. I have also previously written about how to do it using GitHub and Kudu (here and here). However, just creating the client-side resources and uploading them to a Web App is really not the best use of Azure. It would be much better to offload those requests to blob storage, instead of having the web server handle them. For several reasons…

So let’s see how we can modify the deployment from the previous post to also include uploading the created resources to blob storage as part of the build.

More...

Setting Up Continuous Deployment of an ASP.NET App with Gulp from VSTS to an Azure Web App using XAML Build Definitions

I recently wrote a couple of blog posts (first post, second post) about setting up CD of an ASP.NET web app with Gulp from GitHub to an Azure Web App. However, what if we aren’t using GitHub? What if we are using Visual Studio Team Services (formerly Visual Studio Online)? Well, in that case, it is a whole different ballgame. There is actually not much that is the same at all…

More...

Uploading Resources to Blob Storage During Continuous Deployment using Kudu

In my last post, I wrote about how to run Gulp as part of your deployment process when doing continuous deployment from GitHub to an Azure Web App using Kudu. As part of that post, I used Gulp to generate bundled and minified JavaScript and CSS files that were to be served to the client.

The files were generated by using Gulp, and included in the deployment under a directory called dist. However, they were still part of the website, so they were still taking up resources on the web server, which had to serve them. And they were taking up precious connections from the browser to the server… By offloading them to Azure Blob Storage, we can decrease the number of requests the web server gets, and increase the number of connections the browser can use to retrieve resources. And it isn’t that hard to do…
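As an illustration of the end goal, this is the kind of upload we want to happen during the build. Done by hand with today’s Azure CLI it would look something like this (the storage account and container names are placeholders; the post itself wires this into the Kudu deployment instead):

```shell
# Upload everything under dist/ to a blob container (names are examples)
az storage blob upload-batch \
  --account-name mystorageaccount \
  --destination static-resources \
  --source ./dist
```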

More...