Azure SQL Database Elastic Database Jobs

One of the more common concerns among database administrators considering migrating their estate to Azure SQL Database is the ability to efficiently manage the migrated workloads. In particular, the absence of SQL Server Agent introduces the challenge of finding a means to perform ad-hoc and scheduled automated management tasks across multiple databases, such as rebuilding indexes, updating reference data, or collecting customer telemetry. Fortunately, the recently introduced Elastic Database jobs (still in preview at the time of writing) provide a convenient and efficient way to address this concern. Our objective is to describe their basic functionality and implementation steps.

While their name implies that Elastic Database jobs are intended specifically for Elastic Database pools (which we have covered recently in a separate article), this is not actually the case. Their scope can include any custom-defined group of Azure SQL Databases, as well as either all of the databases in an Elastic Database pool or a subset of them (in the form of a shard set, which you provision by using the Elastic Database tools). The primary distinction between these two scenarios is that the latter is supported directly from the Azure portal, while the former requires the Azure PowerShell module or the REST API. In either case, jobs can carry out an arbitrary Transact-SQL script (assuming, of course, that it takes into account the limitations applicable to Azure SQL Database); in addition, the PowerShell-based approach allows you to execute a data-tier application (in the form of a DACPAC file) across multiple databases in the target collection.

Note that if you choose to use T-SQL scripts, you should ensure that they are idempotent, which essentially means that their successful execution will always result in the same outcome, even if they are run multiple times in a row. The reason for this is the resiliency built into Elastic Database jobs, which automatically retry their tasks in case of transient issues (the number of retries and the intervals between them are configurable). This implies that scripts should include additional logic that, for example, checks for the existence of objects they are supposed to create. Note also that job execution is logged, providing accountability and auditability, as well as simplifying troubleshooting.
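As a simple illustration of idempotency (the table and index names below are purely hypothetical), a maintenance script that creates an index can first check whether the object already exists, so that an automatic retry of the same job step completes without raising an error:

```sql
-- Create the index only if it does not already exist, so a retried
-- execution of this job step produces the same outcome as the first.
IF NOT EXISTS (
    SELECT 1
    FROM sys.indexes
    WHERE name = N'IX_Orders_CustomerId'
      AND object_id = OBJECT_ID(N'dbo.Orders')
)
BEGIN
    CREATE INDEX IX_Orders_CustomerId ON dbo.Orders (CustomerId);
END;
```

The same existence-check pattern applies to any object a job script creates or alters, such as tables, stored procedures, or reference data rows.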

Jobs are implemented by using a customer-hosted Azure PaaS cloud service (which indicates that they rely on the classic, Service Management-based deployment model). The cloud service contains one or more worker roles (you should increase this number to at least two to ensure high availability), hosted on virtual machines that provide compute resources. The amount of resources is determined by the virtual machine size, which is set to A0 by default. The worker roles rely on a control database, implemented as an Azure SQL Database, which stores all job metadata. In addition, they take advantage of an Azure Service Bus namespace as the messaging mechanism facilitating their communication with each other. Finally, the implementation of Elastic Database jobs also includes an Azure Storage account containing diagnostic logs.

Creating and configuring all of these components is handled by leveraging the Azure PowerShell module (available via the Web Platform Installer) and the NuGet command-line utility. NuGet is used to download and import the Elastic Database jobs package, which includes the InstallElasticDatabaseJobsCmdlets.ps1 PowerShell script. Executing this script provisions the Elastic Database jobs components described in the previous paragraph, allowing you to specify such parameters as the name of the target Azure resource group and its location, the number of cloud service worker role virtual machines and their size, the pricing tier of the Azure SQL Database hosting the control database, and the administrative credentials of its server instance.
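As a rough sketch of this flow (the package name, cmdlet name, and parameter names below are illustrative assumptions; consult the preview documentation for the exact values applicable to your environment):

```powershell
# Download the Elastic Database jobs package with the NuGet command-line
# utility (package name shown here is an assumption).
.\nuget.exe install Microsoft.Azure.SqlDatabase.Jobs -prerelease

# Import the cmdlets shipped in the package.
Set-Location .\Microsoft.Azure.SqlDatabase.Jobs\tools
Unblock-File .\*.ps1
.\InstallElasticDatabaseJobsCmdlets.ps1

# Provision the jobs infrastructure (cmdlet and parameter names are
# hypothetical): resource group name and location, number and size of
# worker role VMs, control database pricing tier, and administrative
# credentials for its server instance.
$cred = Get-Credential
Install-ElasticDatabaseJobs -ResourceGroupName 'ElasticJobsRG' `
    -ResourceGroupLocation 'East US' `
    -WorkerRoleCount 2 -WorkerRoleSize 'A0' `
    -ControlDatabaseServiceObjective 'S0' `
    -AdministratorCredentials $cred
```

Requesting two worker roles, as shown above, follows the high-availability recommendation mentioned earlier.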

Alternatively, if your intention is to manage jobs exclusively for an existing Elastic Database pool, then you have the option of deploying the Elastic Database jobs components directly from the Azure portal. Once you authenticate to access the Azure subscription, navigate to the Elastic Database pool blade. From there, click the Create jobs icon in the toolbar, which reveals the Install services blade. Since the solution is currently in preview, you will need to first accept the preview terms. Next, in the Install services blade, click JOB CREDENTIALS to designate the admin user name and password for the control database. At the end of this process, the job infrastructure will be deployed.

Similarly, creating and managing Elastic Database jobs can be performed either by using Azure PowerShell or the Azure portal, with the former offering considerably more flexibility and the latter having the benefit of simplicity. The Create job icon is accessible directly from the Elastic Database pool blade (after the infrastructure components providing job support have been deployed). Once you click it, you will be prompted for administrative credentials to the jobs control database. After you provide them, you will be presented with the Create job blade, where you specify a user name and password that can connect to the database targeted by the current job with sufficient permissions to carry out the corresponding tasks. You will also be able to paste or type in the T-SQL script to be executed. Job status can be evaluated by clicking the Manage jobs icon in the Elastic Database pool blade.

Creating and managing jobs by using Azure PowerShell is considerably more involved and will be the subject of one of our upcoming articles. This concludes our overview of the basic functionality and implementation steps of this new preview feature of Azure SQL Database.
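As a brief preview of the PowerShell-based approach (the cmdlet and parameter names below reflect the preview module and may change; treat them, along with the server, database, and file names, as illustrative), creating and starting a T-SQL job follows this general pattern:

```powershell
# Connect to the jobs control database created during installation.
Use-AzureSqlJobConnection -CurrentAzureSubscription

# Register an idempotent T-SQL script as reusable job content.
$scriptText = Get-Content -Raw .\RebuildIndexes.sql
$content = New-AzureSqlJobContent -ContentName 'Rebuild indexes' `
    -Script $scriptText

# Define the target database and the job itself, then start an execution.
$target = New-AzureSqlJobTarget -ServerName 'myserver.database.windows.net' `
    -DatabaseName 'MyDatabase'
$job = New-AzureSqlJob -JobName 'Nightly index rebuild' `
    -ContentName $content.ContentName -TargetId $target.TargetId
Start-AzureSqlJobExecution -JobName $job.JobName
```

We will walk through these cmdlets, including scheduling and monitoring, in more detail in the follow-up article.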

See all articles by Marcin Policht
