Upload Large Files to Azure Storage Account using C#

If you are running a web application in Azure then you have most likely used the Storage Account service. It is a commonly used service for storing binary files, for example images, documents, and videos.

Uploading files is a common use case for any web application. Uploading small files is easy, but uploading large files, for example 1 GB or 1.5 GB, is more challenging, and the upload can fail for various reasons such as network timeouts.

This article discusses a step-by-step approach to uploading large files. First, complete the following two tasks to set up the prerequisites:

  1. Create a console application named BlobStorageFileUpload in Visual Studio 2022
  2. Add the NuGet package Azure.Storage.Blobs

At this point, build your project’s solution to make sure that it builds without any issues.

Open the Program.cs file and add the code shown in the steps below.

1 – Define the variables

using System.Diagnostics;
using Azure.Core.Pipeline;
using Azure.Storage;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;

// Storage connection string
var connectionString = "<Storage Connection String>";

// Storage container name
var containerName = "<Container Name>";

// Array of files to be uploaded
// You can use any big file of your choice
var fileNames = new string[] { "1.3 GB.xlsx" };

2 – Blob Upload and Client Options Settings

// Blob upload options
BlobUploadOptions upOptions = new BlobUploadOptions
{
    TransferOptions = new StorageTransferOptions
    {
        // Upload the file in 4 MB blocks; a 1.3 GB file is split
        // into roughly 333 such blocks
        MaximumTransferSize = 4 * 1024 * 1024,
        // Size of the first request before the transfer is chunked
        InitialTransferSize = 4 * 1024 * 1024
    }
};

// Blob client options: raise the HTTP and network timeouts so that
// slow transfers of large blocks are not cancelled prematurely
BlobClientOptions blobClientOptions = new BlobClientOptions
{
    Transport = new HttpClientTransport(
                             new HttpClient
                             {
                                 Timeout = TimeSpan.FromSeconds(240)
                             }),
    Retry = { NetworkTimeout = TimeSpan.FromSeconds(240) }
};
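For a long-running upload it can help to see how far along the transfer is. As a hedged sketch (the percentage math and the file name here are illustrative; ProgressHandler and MaximumConcurrency are real members of BlobUploadOptions and StorageTransferOptions), the options above could be extended like this:

```csharp
// Sketch: report progress while a large blob uploads.
// Assumes the file size is known up front so a percentage can be shown.
long totalBytes = new FileInfo("1.3 GB.xlsx").Length;

BlobUploadOptions optionsWithProgress = new BlobUploadOptions
{
    TransferOptions = new StorageTransferOptions
    {
        MaximumTransferSize = 4 * 1024 * 1024,
        InitialTransferSize = 4 * 1024 * 1024,
        // Optional tuning knob: upload up to 8 blocks in parallel
        MaximumConcurrency = 8
    },
    // Called by the SDK with the cumulative number of bytes sent
    ProgressHandler = new Progress<long>(bytesTransferred =>
        Console.WriteLine($"Uploaded {bytesTransferred * 100 / totalBytes}%"))
};
```

Passing optionsWithProgress to UploadAsync instead of upOptions prints a rough percentage as blocks complete.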

3 – Perform Upload

Console.WriteLine("Upload Started");

// Create the container client once and reuse it for every file
var container = new BlobContainerClient(connectionString,
                                        containerName,
                                        blobClientOptions);
foreach (var fileName in fileNames)
{
    Stopwatch stopWatch = new Stopwatch();
    stopWatch.Start();
    var blob = container.GetBlobClient(fileName);
    // Dispose the file stream once the upload finishes
    using (var stream = File.OpenRead(fileName))
    {
        await blob.UploadAsync(stream, upOptions);
    }
    stopWatch.Stop();
    TimeSpan ts = stopWatch.Elapsed;

    string eT = String.Format("{0:00}:{1:00}:{2:00}.{3:00}",
           ts.Hours, ts.Minutes, ts.Seconds,
           ts.Milliseconds / 10);
    Console.WriteLine($"File : {fileName} Upload Time: {eT}");
}
Console.WriteLine("Upload Completed");
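Once the loop finishes, you can confirm that the blobs actually landed in the container. A minimal sketch, reusing the connectionString and containerName variables defined earlier:

```csharp
// Sketch: list the blobs in the container and print their sizes,
// to verify the uploads completed.
var verifyContainer = new BlobContainerClient(connectionString, containerName);
await foreach (BlobItem item in verifyContainer.GetBlobsAsync())
{
    Console.WriteLine($"{item.Name} : {item.Properties.ContentLength} bytes");
}
```

The reported ContentLength should match the size of the local file.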

4 – Test Upload

To test the code, put a large file, for example “1.3 GB.xlsx”, at the location ..\BlobStorageFileUpload\bin\Debug\ (or whatever the root location of the executable is) and add the file name to the “fileNames” array.

You can also put multiple large files at the executable location and update the “fileNames” array so that the files are uploaded one after another without timing out.
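The loop in step 3 uploads the files sequentially. If you prefer, the files can also be uploaded concurrently; this is a hedged sketch (not part of the original demo) that assumes the same connectionString, containerName, fileNames, and upOptions variables defined earlier:

```csharp
// Sketch: start one upload task per file and wait for all of them.
// Each task gets its own stream and blob client.
var uploadTasks = fileNames.Select(async fileName =>
{
    var blobClient = new BlobContainerClient(connectionString, containerName)
                         .GetBlobClient(fileName);
    using var fileStream = File.OpenRead(fileName);
    await blobClient.UploadAsync(fileStream, upOptions);
});
await Task.WhenAll(uploadTasks);
```

Sequential uploads are simpler to reason about and easier on bandwidth; concurrent uploads can finish sooner when the network is not the bottleneck.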

Upload Output

Conclusion: The code shown here has been tested with multiple big files and worked without any issues. You have seen how easily large files can be uploaded to Azure Storage.

This demo is for learning purposes. As a friendly reminder, please be mindful before using the code in a production-like environment.

If you have any suggestions/feedback, please put them in the comment box.

Happy Learning 🙂
