The archive tier of the v2 storage account is gaining more and more attention from customers. Why? Because of the price, and Microsoft recently announced a price reduction of up to 50% for some regions. In addition, some new features have been launched which, at the time of writing, are in public preview.
You can now request a high priority retrieval from the archive tier. You may already be aware that there is a rehydration period of up to 15 hours when requesting data to be recalled from the archive tier. This is probably fine for most cases, where the data is rarely needed and there is no immediate panic for it when it is. Now there is an option (for an additional fee) to flag a request as high priority, which prioritises the rehydration. Microsoft will endeavour to rehydrate blobs under 10GB within one hour, but states that if there is a large volume of high priority requests this can still take between one and five hours.
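I haven't needed this myself yet, but as a rough sketch of how you might trigger it over REST: rehydration is a Set Blob Tier call (the comp=tier operation), and the preview adds an x-ms-rehydrate-priority header. Everything in square brackets below is a placeholder, and the SAS token should be appended without its leading '?':
#Sketch: request high priority rehydration of an archived blob via Set Blob Tier
$uri = "https://[storage account name].blob.core.windows.net/[container name]/[blob name]?comp=tier&[copy your SAS token here]"
$headers = @{
    'x-ms-version'            = '2019-02-02'
    'x-ms-access-tier'        = 'Hot'   #the tier to rehydrate the blob back to
    'x-ms-rehydrate-priority' = 'High'  #'Standard' is the default
}
Invoke-RestMethod -Uri $uri -Method Put -Headers $headers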
One of the other features announced in preview is the ability to upload blobs directly to the archive tier. Up until now, a user would have to first upload to either the Hot or Cool tier and then move the blob(s) to the archive tier. This can incur additional costs, such as storage transaction costs for moving between tiers and the minimum 30-day charge for using the cool tier.
The preview feature lets you do this with the native Azure Storage REST API, using the Put Blob operation.
I managed to get this working using a nice REST API browser extension called ‘Restlet Client’, which provided a nice GUI to test this feature. I had to provide the URI of the blob container where I wanted to upload my blob and append a Shared Access Signature (SAS) token to the URI. This token can be easily generated from the Shared access signature section of the storage account in the Azure portal.
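If you would rather script the token than use the portal, here is a minimal sketch using the Az.Storage PowerShell module (assuming it is installed; the account name, key and container name are placeholders):
#Sketch: generate a container-level SAS token with Az.Storage
$ctx = New-AzStorageContext -StorageAccountName "[storage account name]" -StorageAccountKey "[account key]"
#Write permission is sufficient for Put Blob; the token here expires in 4 hours
$sas = New-AzStorageContainerSASToken -Name "[container name]" -Permission "w" -ExpiryTime (Get-Date).AddHours(4) -Context $ctx
$sas
Depending on the module version, the returned string may or may not include the leading ‘?’, so check before appending it to the URI.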
I also needed to include some HTTP headers in the request.
Specifically these are:
x-ms-version: 2019-02-02
(This needs to be set to this API version for the x-ms-access-tier header below to work)
x-ms-blob-type: BlockBlob
x-ms-access-tier: Archive
(Can be set to Hot, Cool or Archive. Here we want to put directly to Archive)
Finally, I add the file I wish to upload to my storage account. Restlet provides a handy drag and drop feature for this.

Once I received a 201 (Created) response code, I could check my blob container in the Azure portal and confirm that my uploaded file was sitting in the Archive tier.
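You can also verify the tier without leaving the command line. As a rough sketch, a Get Blob Properties request (a HEAD against the blob URI, with a SAS token that has read permission) returns the tier in the x-ms-access-tier response header:
#Sketch: check the access tier of the uploaded blob via Get Blob Properties
$uri = "https://[storage account name].blob.core.windows.net/[container name]/[blob name]?[copy your SAS token here]"
$response = Invoke-WebRequest -Uri $uri -Method Head -Headers @{ 'x-ms-version' = '2019-02-02' }
#Should print 'Archive'
$response.Headers['x-ms-access-tier']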

That’s all fine for a single file (blob) upload, but what if I have multiple files to upload? This could take a while. No doubt there are several ways to execute this, but one way is to use PowerShell. The below script will upload all files in a given local folder to your blob container and place them into Archive storage. Note, this script will not recurse into subfolders or maintain directory structures, but I’m sure someone who is better at PowerShell than me can write a more comprehensive script to achieve this (a rough sketch of one approach follows the script).
#Source path, e.g. "C:\Users\akinane\downloads\archive\*"
$directoryPath = "[enter source path]"
#HTTP headers for the Put Blob operation
$headers = @{
    'x-ms-version'     = '2019-02-02'
    'x-ms-blob-type'   = 'BlockBlob'
    'x-ms-access-tier' = 'Archive'
}
#Iterate through the folder (files only, so directories don't break the upload)
foreach ($file in Get-ChildItem $directoryPath -File) {
    #The target URI, using the file name as the blob name, with the SAS token appended
    $uri = "https://[storage account name].blob.core.windows.net/[container name]/$($file.Name)?[copy your SAS token here]"
    #Upload the file directly to the Archive tier
    Invoke-RestMethod -Uri $uri -Method Put -Headers $headers -InFile $file.FullName
    #Output to screen for each file uploaded
    Write-Output "Uploaded file: $($file.FullName)"
}
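As promised above, here is a rough, untested sketch of a recursive variant that preserves the directory structure by using each file's relative path as the blob name (virtual directories in blob storage are just '/' separators in the name; the same $headers as above are assumed):
#Sketch: recurse into subfolders and keep the folder structure in the blob names
$root = Get-Item "[enter source folder]"
foreach ($file in Get-ChildItem $root.FullName -Recurse -File) {
    #Build a blob name like "subfolder/file.txt" from the path relative to the root
    $blobName = $file.FullName.Substring($root.FullName.Length).TrimStart('\') -replace '\\', '/'
    $uri = "https://[storage account name].blob.core.windows.net/[container name]/$($blobName)?[copy your SAS token here]"
    Invoke-RestMethod -Uri $uri -Method Put -Headers $headers -InFile $file.FullName
    Write-Output "Uploaded file: $blobName"
}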
I expect that in future it will be much simpler to upload directly to the archive tier, but if this is something you need to do today, you can make use of this preview feature using the method I’ve outlined in this article.