Azure DocumentDB – CRUD Operations

Started playing around with Azure DocumentDB, and I am pretty excited about it. The simplicity is just crazy. Most applications out there need CRUD operations, so I thought I would compile them in one post for quick reference.

  • To begin with, you need to create a DocumentDB database account. The link below walks through that.

  • Next, create DocumentDB collections, which will hold our documents.
  • Then add documents to the collections. The following link explains how to add and view documents using the Document Explorer; we will look at how to do the same using the SDK.
  • Next, get the DocumentDB client library package to start coding against Azure DocumentDB.

  • Finally, create a DocumentClient, from which we will execute all our CRUD operations.

You can find the DocumentDB URI and primary key in the Azure portal.
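Creating the client is a one-liner; here is a minimal sketch, where the endpoint URI and key are placeholders for the values from your own account:

```csharp
using System;
using Microsoft.Azure.Documents.Client;

class Program
{
    static void Main()
    {
        // Placeholders: substitute the URI and primary key from the Azure portal.
        var client = new DocumentClient(
            new Uri("https://<your-account>.documents.azure.com:443/"),
            "<your-primary-key>");

        Console.WriteLine("DocumentClient ready: " + client.ServiceEndpoint);
    }
}
```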


  •  Once we have a DocumentClient, we are ready to work with collections. To work with a collection we need its collection link; the easiest way to find the selflink for a collection is to use a tool called DocumentDB Studio.


Copy the selflink as part of your configuration.

As application developers we work with entities and objects and their JSON-serialized forms. Azure DocumentDB is designed from the ground up to natively support JSON and JavaScript directly inside the database engine, which makes application development much more agile. You can download the simple console application here, which covers all the basic CRUD operations on Azure DocumentDB. Also, for quick reference, the code is below.
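A condensed sketch of the four operations, assuming the placeholder endpoint, key, and collection selflink are replaced with your own values (the Person type and query are illustrative):

```csharp
using System;
using System.Linq;
using System.Threading.Tasks;
using Microsoft.Azure.Documents;
using Microsoft.Azure.Documents.Client;

public class Person
{
    public string Name { get; set; }
}

public class CrudDemo
{
    public static async Task RunAsync()
    {
        // Placeholders: use your own account URI, key, and collection selflink.
        var client = new DocumentClient(
            new Uri("https://<your-account>.documents.azure.com:443/"),
            "<your-primary-key>");
        string collectionLink = "<collection-selflink>";

        // Create: the POCO is serialized to JSON and stored as a document.
        await client.CreateDocumentAsync(collectionLink,
            new Person { Name = "John" });

        // Read: query the collection with DocumentDB SQL.
        Document doc = client.CreateDocumentQuery<Document>(collectionLink,
                "SELECT * FROM c WHERE c.Name = 'John'")
            .AsEnumerable()
            .First();

        // Update: change a property and replace the whole document.
        doc.SetPropertyValue("Name", "Jane");
        await client.ReplaceDocumentAsync(doc);

        // Delete: remove the document by its selflink.
        await client.DeleteDocumentAsync(doc.SelfLink);
    }
}
```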

  •  Resources

Azure Storage Logging

You have configured Azure Storage to work with your application. The next step is to log all the requests that come to your storage account. The steps are super easy. To enable logging on your storage account:

  • Navigate to your Azure Storage account.
  • Click on the Configure tab and find the Logging section.


  • Check all the request types you want to log: read, write, and delete on your Tables, Blobs, and Queues.
  • Set the retention policy in days, the number of days you want to keep the analytics data. Setting it to 0 means no limit. It's always good to set a retention period, since logs can grow large quite fast.
  • Once you enable logging, a blob container called $logs will be created in the same storage account (log files are saved as blobs).
  • I use the Azure Storage Explorer tool to see the contents of my log files.


  • You can also use the Azure Storage client library to work with the $logs blob container just like any other container. You can find all the code samples here.
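The same steps can be done in code. Here is a sketch using the .NET storage client library (the connection string is a placeholder, and the 7-day retention is just an example value): it enables read/write/delete logging on the Blob service and then lists the log blobs in $logs.

```csharp
using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;
using Microsoft.WindowsAzure.Storage.Shared.Protocol;

class LogsDemo
{
    static void Main()
    {
        // Placeholder: use your own storage account connection string.
        var account = CloudStorageAccount.Parse("<your-connection-string>");
        var blobClient = account.CreateCloudBlobClient();

        // Enable logging for all operations with a 7-day retention,
        // the same settings the portal's Configure tab writes.
        ServiceProperties props = blobClient.GetServiceProperties();
        props.Logging.LoggingOperations = LoggingOperations.All;
        props.Logging.RetentionDays = 7;
        blobClient.SetServiceProperties(props);

        // Read the logs like any other container.
        CloudBlobContainer logs = blobClient.GetContainerReference("$logs");
        foreach (IListBlobItem item in logs.ListBlobs(useFlatBlobListing: true))
            Console.WriteLine(item.Uri);
    }
}
```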

Azure webjobs with queue trigger

Most websites require a batch-processing job that can do resource-intensive tasks like sending emails, listening to a queue, and so on. Azure WebJobs allows us to do just that, and a WebJob is just a .NET console application which runs within the context of an Azure website.
These jobs can be run on demand, continuously, or on a predefined schedule with a recurrence.
Additionally, operations defined within WebJobs can be triggered to run either when the job runs, when a new file is created in Blob storage, or when a message is sent to an Azure queue.
To create an Azure WebJob with a queue trigger:
1. Go to Visual Studio -> Right-click your website -> Add -> New Azure WebJob project


2. Open Program.cs.
3. Modify the Program class so that it is public. If you do not make the class public, WebJobs will not detect and run your operations.
4. Inside Program.cs, add a using statement for Microsoft.Azure.WebJobs.
5. Inside Main() you will see these 2 lines of code:
JobHost host = new JobHost();
host.RunAndBlock();
which specify that the WebJob will run continuously.
6. Next, add a method to be invoked by WebJobs. The following method will be invoked when a new message appears in the queue named myqueue.
public static void TriggerFunction([QueueTrigger("myqueue")]CloudQueueMessage message)
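Filled out as a complete, self-contained class (the queue name myqueue comes from the step above; the log line and class name are illustrative), the function might look like:

```csharp
using System;
using Microsoft.Azure.WebJobs;
using Microsoft.WindowsAzure.Storage.Queue;

public class Functions
{
    // Invoked automatically by the WebJobs SDK whenever a new
    // message lands on the queue named "myqueue".
    public static void TriggerFunction(
        [QueueTrigger("myqueue")] CloudQueueMessage message)
    {
        Console.WriteLine("Processing message: " + message.AsString);
    }
}
```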


A WebJob can also be triggered when a new blob is detected within a container.
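A sketch of a blob-triggered function; the container name "input" and the parameter names are illustrative, not fixed by the SDK:

```csharp
using System.IO;
using Microsoft.Azure.WebJobs;

public class BlobFunctions
{
    // Runs whenever a new blob appears in the "input" container;
    // {name} binds the blob's name to the string parameter.
    public static void ProcessBlob(
        [BlobTrigger("input/{name}")] Stream blob,
        string name,
        TextWriter log)
    {
        log.WriteLine("New blob detected: " + name);
    }
}
```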


To run this WebJob locally, you need some local configuration. Open the App.config file and add the two canonical storage connection strings expected by the WebJobs SDK: AzureWebJobsDashboard and AzureWebJobsStorage.
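The App.config entries look something like this, with [name] and [key] as placeholders:

```xml
<configuration>
  <connectionStrings>
    <!-- Placeholders: substitute your own storage account name and key -->
    <add name="AzureWebJobsDashboard"
         connectionString="DefaultEndpointsProtocol=https;AccountName=[name];AccountKey=[key]" />
    <add name="AzureWebJobsStorage"
         connectionString="DefaultEndpointsProtocol=https;AccountName=[name];AccountKey=[key]" />
  </connectionStrings>
</configuration>
```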

Be sure to replace name and key values with the values from your own storage account.

To test your WebJob, set it as the startup project and press F5. Your WebJob is up and running and waiting for a trigger; once a new message arrives in the queue, the trigger function is triggered.

Deploy Azure Website

There are quite a few ways to deploy your Azure website using Visual Studio, but I find the following the easiest.

  • Create a vanilla website in Azure. This is the destination site where I will deploy all my changes.
  • Once you have created the website in Azure, try accessing it from your browser. You should see a welcome page.
  • The next step is to make changes and deploy them to your website.
  • Let's say you want to change the way your website looks.
  • Create a new web application using Visual Studio.
  • Once you have made your changes and are ready to deploy to Azure, navigate to the Azure portal.
  • Find your web app and click on it to go to the dashboard page.
  • Click on the "Download the publish profile" link to download the publishing profile file.
  • Move back to Visual Studio, right-click your web application, and click Publish.
  • In the dialog box that appears, click Import.
  • Browse to the publish profile file you downloaded from Azure and import it. You should see all the details auto-populate in the dialog box.
  • Click Publish to start the deployment process.
  • Navigate to your website to see the changes deployed.
  • Repeat this process for continuous deployment from Visual Studio.