In my last blog post, I wrote about how we can set up continuous deployment to an Azure Web App for an ASP.NET application that uses Gulp to generate client side resources. I have also previously written about how to do it using GitHub and Kudu (here and here). However, just creating the client side resources and uploading them to a Web App is really not the best use of Azure. It would be much better to offload those requests to blob storage, instead of having the web server handle them. For several reasons…
So let’s see how we can modify the deployment from the previous post to also include uploading the created resources to blob storage as part of the build.
Creating a Script to Upload the Resources
The first thing we need to get this going is some form of code that can do the actual uploading of the generated files to blob storage. And I guess one of the easiest ways is to just create a PowerShell script that does it.
So by calling in some favors, and Googling a bit, I came up with the following script
[CmdletBinding()]
param(
    [Parameter(Mandatory = $true)]
    [string]$LocalPath,

    [Parameter(Mandatory = $true)]
    [string]$StorageContainer,

    [Parameter(Mandatory = $true)]
    [string]$StorageAccountName,

    [Parameter(Mandatory = $true)]
    [string]$StorageAccountKey
)

function Remove-LeadingString
{
    [CmdletBinding()]
    param (
        [Parameter(Mandatory = $true, ValueFromPipeline = $true)]
        [AllowEmptyString()]
        [string[]]
        $String,

        [Parameter(Mandatory = $true)]
        [string]
        $LeadingString
    )

    process
    {
        foreach ($s in $String)
        {
            if ($s.StartsWith($LeadingString, $true, [System.Globalization.CultureInfo]::InvariantCulture))
            {
                $s.Substring($LeadingString.Length)
            }
            else
            {
                $s
            }
        }
    }
}

function Construct-ContentTypeProperty
{
    [CmdletBinding()]
    param (
        [Parameter(Mandatory = $true, ValueFromPipeline = $true)]
        [System.IO.FileInfo]
        $file
    )

    process
    {
        switch ($file.Extension.ToLowerInvariant()) {
            ".svg"  { return @{"ContentType"="image/svg+xml"} }
            ".css"  { return @{"ContentType"="text/css"} }
            ".js"   { return @{"ContentType"="application/javascript"} }
            ".json" { return @{"ContentType"="application/json"} }
            ".png"  { return @{"ContentType"="image/png"} }
            ".html" { return @{"ContentType"="text/html"} }
            default { return @{"ContentType"="application/octet-stream"} }
        }
    }
}

Write-Host "About to deploy to blob storage"

# Check if the Azure PowerShell module is available
if ((Get-Module -ListAvailable Azure) -eq $null)
{
    throw "Windows Azure PowerShell not found! Please install from http://www.windowsazure.com/en-us/downloads/#cmd-line-tools"
}

$context = New-AzureStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $StorageAccountKey

# Create the target container if it doesn't already exist
$existingContainer = Get-AzureStorageContainer -Context $context | Where-Object { $_.Name -like $StorageContainer }
if (!$existingContainer)
{
    $newContainer = New-AzureStorageContainer -Context $context -Name $StorageContainer -Permission Blob
}

$dir = Resolve-Path ($LocalPath)
$files = (Get-ChildItem $dir -Recurse | Where-Object { !($_.PSIsContainer) })

foreach ($file in $files)
{
    $prop = Construct-ContentTypeProperty $file
    # The blob name is the file's path relative to the upload directory
    $blobName = ($file.FullName | Remove-LeadingString -LeadingString "$($dir.Path)\")
    Set-AzureStorageBlobContent -Blob $blobName -Container $StorageContainer -File $file.FullName -Context $context -Properties $prop -Force
}
Yes, I needed to get some help with this. I have very little PowerShell experience, so getting some help just made it a lot faster…
Ok, so what does it do? Well, it isn’t really that complicated. It takes four parameters: the local path of the folder to upload, the name of the container to upload the files to, the name of the storage account, and the key to the account. All of these parameters will be passed in by the build definition in just a little while…
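If you want to try the script out locally first, a call might look something like this (the script file name and the key are placeholders, and I’m using the same account name as later in this post)
# Hypothetical local test run; the account key is obviously a placeholder
.\BlobUpload.ps1 -LocalPath ".\dist" -StorageContainer "dist" -StorageAccountName "deploymentdemo" -StorageAccountKey "<your-storage-account-key>"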
Next, it declares a function that can remove the beginning of a string if it starts with a specified prefix, as well as a function that figures out the content type of the file being uploaded, based on its file extension.
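Just to illustrate what those two helpers do, here are a couple of hypothetical calls (the paths are made up)
# Strips the leading upload directory from the path, returning "css\site.css"
"C:\project\dist\css\site.css" | Remove-LeadingString -LeadingString "C:\project\dist\"

# Returns @{"ContentType"="text/css"} for a .css file
Construct-ContentTypeProperty (Get-Item "C:\project\dist\css\site.css")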
After these two functions have been created, it verifies that the Azure PowerShell cmdlets are available. If not, it throws an exception.
It then creates an Azure storage context, which is basically how you tell the cmdlets you are calling what credentials to use.
This context is then used to create the specified target container, if it doesn’t already exist. After that, the script recursively walks through all files and folders in the specified upload directory, uploading one file at a time as it is found.
Not very complicated at all… It could probably do with some optimization in the form of parallel uploads etc., but I couldn’t quite figure that out, and I had other things to solve as well…
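If you feel like experimenting with that yourself, one possible approach would be to push each upload onto a background job and throttle the number of concurrent jobs. This is just a rough, untested sketch (it also skips the content type handling for brevity), and since each job runs in its own process, it needs its own storage context
# Rough, untested sketch of parallel uploads using background jobs
foreach ($file in $files)
{
    # Throttle to at most 4 concurrent upload jobs
    while ((Get-Job -State Running).Count -ge 4)
    {
        Start-Sleep -Milliseconds 250
    }

    $blobName = ($file.FullName | Remove-LeadingString -LeadingString "$($dir.Path)\")
    Start-Job -ArgumentList $file.FullName, $blobName, $StorageContainer, $StorageAccountName, $StorageAccountKey -ScriptBlock {
        param($path, $blob, $container, $accountName, $accountKey)
        # Each job runs in a separate process, so it needs its own context
        $ctx = New-AzureStorageContext -StorageAccountName $accountName -StorageAccountKey $accountKey
        Set-AzureStorageBlobContent -Blob $blob -Container $container -File $path -Context $ctx -Force
    } | Out-Null
}

# Wait for the remaining uploads to finish, then clean up the jobs
Get-Job | Wait-Job | Remove-Job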
Calling the Script During Deployment
Once the script is in place, it needs to be called during the build. Luckily, this is a piece of cake! All you need to do is add the following target to the XXX.wpp.targets file that was added in the previous post.
<Target Name="UploadToBlobStorage" AfterTargets="RunGulp" Condition="'$(Configuration)' != 'Debug'">
  <Message Text="About to deploy front-end resources to blob storage using the script found at $(ProjectDir)..\..\Tools\BlobUpload.ps1" />
  <PropertyGroup>
    <ScriptLocation Condition=" '$(ScriptLocation)'=='' ">$(ProjectDir)..\..\Tools\BlobUpload.ps1</ScriptLocation>
  </PropertyGroup>
  <Exec Command="powershell -NonInteractive -executionpolicy bypass -command &quot;&amp;{&amp;'$(ScriptLocation)' '$(ProjectDir)dist' 'dist' '$(StorageAccount)' '$(StorageAccountKey)'}&quot;" />
</Target>
As you can see, it is another Target that runs after the Target called RunGulp. It also has the same condition that the NpmInstall target had, making sure that it isn’t run when building in Debug mode.
The only complicated parts are the syntax for calling the PowerShell script, which is a bit wonky, and the magical $(StorageAccount) and $(StorageAccountKey) properties, which are properties that I have added at the top of the targets file like this
<PropertyGroup>
  <StorageAccount></StorageAccount>
  <StorageAccountKey></StorageAccountKey>
</PropertyGroup>
However, as you can see, they are empty. That’s because we will populate them from the build definition, so that we don’t have to check in our secrets into source control.
Modifying the Build Definition to Set the Storage Account Values
The final step is to edit the build definition and make sure that the storage account name and key are set, so that the correct values get passed to the PowerShell script.
To do this, you edit the build definition, go to the Process part of it, and expand the “5. Advanced” section.
Under this section, you will find an “MSBuild arguments” item. In here, you can set the values of the properties used in the targets file. This is done by adding a /p:[PropertyName]="[PropertyValue]" for each value you want to set. So in this case, I add
/p:StorageAccount="deploymentdemo" /p:StorageAccountKey="XXXXXXXX"
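If you want to verify the whole thing locally before queuing a build, you can pass the same properties to MSBuild yourself. Something like this should work (the project file name is just a placeholder, and remember that the target is skipped in Debug)
# Hypothetical local build; the project file name and key are placeholders
msbuild .\MyWebApp.csproj /p:Configuration=Release /p:StorageAccount="deploymentdemo" /p:StorageAccountKey="XXXXXXXX"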
That’s it! If you just check in the modified targets file and the PowerShell script, you should be able to queue a new build and have it upload the generated files to blob storage for you.
Finally, to make sure you add the correct includes to the actual web page, I suggest having a look at this blog post. It is about doing this with Kudu from GitHub, but the part about “Changing the website” in that post is just as valid for this scenario. It enables you to easily switch between local resources, different blob storage accounts, and even a CDN.
Cheers!