DarksideCookie

Come to the dark side...we have cookies!

A way to upload files to Windows Azure Mobile Services

Ok, so it is time for another Mobile Services post, I believe. My previous posts on the subject have covered the basics, as well as authentication, when it comes to Mobile Services. But so far, I have only been doing the simplest of tasks, such as adding and reading data from a SQL Database. However, I have mentioned that Mobile Services is supposed to be sort of a layer on top of more of Microsoft’s cloud offerings, such as the Service Bus, storage, etc. So in this post, I want to demo how you can use Mobile Services to upload files to blob storage.

There are probably a lot of different ways to do this, but two stood out for me: the one I am about to describe, which routes the upload through the Mobile Service itself, and using shared access signatures (SAS). So before going about it “my way”, I am going to explain SAS, and why I don’t like it, even though it might be a “cleaner” way to do it.

Blob storage access is restricted by default, which I like, and which is why I won’t even bother talking about public containers. But if uploading files to a publicly accessible location works for you, then that is easier than what I am about to talk about…

So…private containers… To be able to access private containers, you need to sign your requests to Azure. This signature requires a key, which should be kept private for obvious reasons. So including it in a client application, like the ones using Mobile Services, would be a massive security issue. The solution to this is that you can create a special key (SAS) that makes it possible to access blob storage for a limited time. The SAS is generated server-side and can then be handed to the client to grant it access to upload files. More information can be found in the Azure storage documentation.

Ok, so why don’t I like this? Well, I just find that it means that I have to task the client application with doing the actual upload. This means that if I ever change storage, or even the storage structure, I will have to update the client. Besides, in Win8 and WP8, the client is supposed to do one thing great, and not 100 things “so so”. So tasking my recipe app with communicating with blob storage just because I want a picture of the user for personalization seems a bit off. (No I am not building a recipe app, it was just an example…)

And besides, I would still need some form of server-side code to get me the SAS. It would have been a completely different thing if including blob access code in the client meant not needing server-side code, and thus saving money, but I still need it. So even if it is a “nicer” solution, it gives me no real added benefit beyond adding stuff to the client that I don’t believe should be there.

Anyhow, time to move on to my solution instead of dwelling on why I don’t like the other one.

So… I already have a Mobile Service up and running from before, so I will just skip that part. I also have a Windows 8 Store App client from before, so I will keep using that. All I need to do is add the code needed to select an image and then post it to my Mobile Service.

I decided to do this quick and dirty in the code-behind of my application, but that’s just to keep it simple… So I added the following XAML:

<StackPanel>
    <TextBlock x:Name="txtFileName" />
    <Button Content="Select File" Click="SelectFile" />
    <Button Content="Send File" Click="SendFile" />
</StackPanel>

And the following C#

private StorageFile _file;

private async void SelectFile(object sender, RoutedEventArgs e)
{
    var dlg = new FileOpenPicker();
    dlg.ViewMode = PickerViewMode.Thumbnail;
    dlg.FileTypeFilter.Add(".jpg");
    var file = await dlg.PickSingleFileAsync();

    if (file == null)
        return;

    _file = file;
    txtFileName.Text = _file.DisplayName;
}

private async void SendFile(object sender, RoutedEventArgs e)
{
    var msg = new ImageUpload { fileName = "Microsoft.jpg" };
    await msg.SetImageData(_file);
    await App.MobileService.GetTable<ImageUpload>().InsertAsync(msg);
    await new MessageDialog("Done").ShowAsync();
}

As you can see, there are two event handlers. The first one does the file selection using a FileOpenPicker, which is basic stuff. The second one creates a new ImageUpload object, which I will cover in a little bit, and then uses the Mobile Services proxy to send it to the cloud. The real stuff is going on in the ImageUpload class though…

[DataTable(Name = "images")]
public class ImageUpload
{
    public async Task SetImageData(StorageFile file)
    {
        // ToArray() on IBuffer requires a using for System.Runtime.InteropServices.WindowsRuntime
        var content = await FileIO.ReadBufferAsync(file);
        var bytes = content.ToArray();
        image = Convert.ToBase64String(bytes);
    }

    public int id { get; set; }
    public string fileName { get; set; }
    public string image { get; set; }
}

Ok, so the “real stuff” is still pretty simple. All the ImageUpload class does, besides holding the values that are to be sent to the cloud, is take the contents of the file and convert it to a Base64 encoded string. That way, I can push my file to the cloud as just a string, which Mobile Services already supports.
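Just to make the transport encoding concrete, here is a minimal round-trip sketch in Node.js (not part of the app code, and using hypothetical sample bytes): the client sends the file’s bytes as a Base64 string, and the server-side script can decode that string back into the original bytes with a Buffer.

```javascript
// Round-trip sketch of the transport encoding: bytes -> Base64 string -> bytes.
// The insert script later does the decoding step with new Buffer(file, 'base64')
// (Buffer.from in modern Node).
var original = Buffer.from([0xFF, 0xD8, 0xFF, 0xE0]); // e.g. the start of a JPEG file
var wireString = original.toString('base64');          // what travels as JSON over HTTP
var decoded = Buffer.from(wireString, 'base64');       // what the server-side script sees

console.log(wireString);               // "/9j/4A=="
console.log(decoded.equals(original)); // true
```

Base64 inflates the payload by roughly a third, which is worth keeping in mind for large images.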

So now that the ImageUpload class has been created and pushed to the cloud, what happens there? Well, there are a couple of things that have to happen. First of all, the image I am uploading has to be converted from a Base64 encoded string back to my actual image, and then that image has to be sent to blob storage. But let’s just take it one step at a time.

The first thing is to create a new table in my Mobile Service called “images”. Next I need to create an insert script, which is where all the stuff will be happening.

The first part of the script looks like this

function insert(item, user, request) {
    var azure = require('azure');
    var blobService = azure.createBlobService('DefaultEndpointsProtocol=https;AccountName=XXXXX;AccountKey=YYYYY');

    createContainerIfNotExists(blobService, function(error) {
        if (error) {
            request.respond(500);
            return;
        }
        uploadFile(blobService, item.image, item.fileName, function(error) {
            if (error) {
                request.respond(500);
                return;
            }
            delete item.image;
            request.execute();
        });
    });
}

Ok, so what is happening in there? Well, the first thing that happens is that the script gets a reference to the Node.js module called “azure”. This is used for accessing Azure resources…duh… Next, that module is used to create a proxy client for my blob storage.

This proxy is then used to create the container if it doesn’t exist. If that method fails, the script returns an HTTP 500. If not, it uploads the file using another helper method. And once again, if that fails, it returns an HTTP 500. Otherwise, it removes the image property so that it isn’t stored in the table, and then executes the request, inserting the rest of the entity’s properties into the table.

That part isn’t very complicated…so let’s look at the helper methods. First up is createContainerIfNotExists.

It takes a blob storage proxy as a parameter and uses it to ensure that the target container exists, using a publicAccessLevel of “blob”.

function createContainerIfNotExists(blobService, callback) {
    console.log('creating container if needed');
    blobService.createContainerIfNotExists('democontainer', { publicAccessLevel: 'blob' }, function(error) {
        if (error) {
            console.log(error);
            callback(error);
            return;
        }
        console.log('container created');
        callback();
    });
}

As you can see, I am doing quite a bit of logging as well. This helps when something goes wrong…

The next helper is the uploadFile method. It looks like this:

function uploadFile(blobService, file, filename, callback) {
    console.log('uploading file');
    var fileBuffer = new Buffer(file, 'base64');
    blobService.createBlockBlobFromStream('democontainer'
        , filename
        , new ReadableStreamBuffer(fileBuffer)
        , fileBuffer.length
        , { contentTypeHeader: 'image/jpg' }
        , function(error) {
            if (error) {
                console.log(error);
                callback(error);
                return;
            }
            console.log('file uploaded');
            callback();
        });
}

It basically just forwards the file information to the blob storage proxy’s createBlockBlobFromStream method. However, there are a few interesting bits in here. First of all, it takes the “file”, which is really the Base64 encoded string, and puts it inside a Buffer that is told that the content is Base64 encoded. So now I have my file content as a Buffer instead of a string, which is a good start. However, the method I am calling is called createBlockBlobFromStream, which means that it requires a Stream object, not a Buffer. Unfortunately, this is not .NET, so there isn’t some neat implicit cast or extension method that solves this. And I couldn’t even find an implementation of Stream, which is an abstract base class, that wraps a Buffer. So after collecting some tips and code snippets from around the web, I built my own. It isn’t actually that complicated, but it becomes a lot of lines of code…

var stream = require('stream');
var util = require('util');

var ReadableStreamBuffer = function(fileBuffer) {
    var that = this;
    stream.Stream.call(this);
    this.readable = true;
    this.writable = false;

    var frequency = 50;
    var chunkSize = 1024;
    var size = fileBuffer.length;
    var position = 0;

    var buffer = new Buffer(fileBuffer.length);
    fileBuffer.copy(buffer);

    var sendData = function() {
        if (size === 0) {
            that.emit("end");
            return;
        }

        var amount = Math.min(chunkSize, size);
        var chunk = new Buffer(amount);
        buffer.copy(chunk, 0, position, position + amount);
        position += amount;
        size -= amount;

        that.emit("data", chunk);
    };

    this.size = function() {
        return size;
    };

    this.maxSize = function() {
        return buffer.length;
    };

    this.pause = function() {
        if (sendData) {
            clearInterval(sendData.interval);
            delete sendData.interval;
        }
    };

    this.resume = function() {
        if (sendData && !sendData.interval) {
            sendData.interval = setInterval(sendData, frequency);
        }
    };

    this.destroy = function() {
        that.emit("end");
        clearInterval(sendData.interval);
        sendData = null;
        that.readable = false;
        that.emit("close");
    };

    this.setEncoding = function(_encoding) {
    };

    this.resume();
};
util.inherits(ReadableStreamBuffer, stream.Stream);

Ok, if you are interested, I suggest you look through that piece of code. If not, I will give you a quick rundown of what it does.

It basically takes a Buffer, which is pretty much like a byte[], wraps it, and keeps track of the current position inside it. It then uses a timer to emit a chunk of bytes through the “data” event every so often, as long as there is data to push.

That’s actually all there is to it!

Posted: Dec 13 2012, 11:49 by ZeroKoll | Comments (3) |

Comments (3) -

Michael McCabe United Kingdom said:

Hi.

This is a fantastic demo and exactly what I was looking for. However it wasn't actually doing anything. No logs were being created and I was getting no error.

I decided to try and remove all the insert javascript and just do a log at the top, then commit it, and I get this error:

Request body maximum size limit was exceeded.

Have you hit this problem at all? I'm doing it on a Windows Phone (Nokia 920), so I can assume it's trying to pass quite a large image! Have you done any kind of compression?

# April 17 2013, 18:09

ZeroKoll Sweden said:

Hi Michael!
Sorry about that. I never uploaded anything large, but I assume that there is a request size limit for Mobile Services, as you say. The solution could be to have a look at my other post about uploading files. It uses shared access signatures instead, which would solve your problem.
Cheers,
Chris

# April 22 2013, 09:01

Danaraj Malaysia said:

Hi ZeroKoll,

This is cool. I'm translating this reference to a Windows Phone 8 app instead. However, can I get more reference on downloading the image back from Azure and view it on WP8 ?

Thank you in advance.

# June 08 2013, 17:09

Pingbacks and trackbacks (1)+
