SQL Server 2012 Integration Services – Unattended Execution of SSIS Packages

As we have demonstrated in our recent articles published on this forum, there are several different methods of executing SQL Server 2012 Integration Services packages in an interactive manner. More commonly, though, tasks accomplished via SSIS are part of procedures that run unattended, either scheduled to launch at a particular date and time or triggered by some arbitrarily chosen event. In this presentation, we will describe the most typical approach to implementing such a scenario.

As you might recall from our earlier discussions, SQL Server 2012 Integration Services offers two distinct ways of deploying packages, referred to respectively as the Package and Project deployment models. The first of them represents the legacy approach and has remained largely unchanged in the most recent product release. Depending on your preferences, packages deployed in this fashion can be stored in .dtsx format (as part of the file system) or incorporated into the MSDB database. In either case, it is possible to manage them via the SSIS service, in which case they appear in the Stored Packages folder within the Integration Services node of the SQL Server Management Studio console. (Note, though, that this applies only to .dtsx files residing in locations corresponding to the SSIS Package Store, which are determined by the value of the <StorePath> element of the %Program Files%\Microsoft SQL Server\110\DTS\Binn\MsDtsSrvr.ini.xml file.) Packages deployed via the second deployment model (introduced in SQL Server 2012) are encapsulated into projects hosted in the Integration Services catalog named SSISDB (accessible via the Database Engine node of the SQL Server Management Studio console).
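If you want to verify where existing packages reside under each of these models, both stores can be queried directly. The following T-SQL is a minimal sketch, assuming a SQL Server 2012 instance on which the SSISDB catalog has already been created; it relies on the msdb.dbo.sysssispackages table and the SSISDB catalog views:

    -- Packages deployed under the Package Deployment Model into the MSDB database
    SELECT name, description, createdate
    FROM msdb.dbo.sysssispackages;

    -- Projects and packages deployed under the Project Deployment Model
    -- into the SSISDB catalog
    SELECT f.name AS folder_name, pr.name AS project_name, pk.name AS package_name
    FROM SSISDB.catalog.folders AS f
    JOIN SSISDB.catalog.projects AS pr ON pr.folder_id = f.folder_id
    JOIN SSISDB.catalog.packages AS pk ON pk.project_id = pr.project_id;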

The most convenient and popular method of invoking SSIS packages in an unattended fashion leverages the SQL Server Agent service. In order to implement it, you will first have to ensure that the service is enabled (with its Start Mode set to Automatic) and running (which you can verify using the SQL Server Configuration Manager console). Once this is confirmed, in the Object Explorer window of the SQL Server Management Studio console, select New->Job from the context-sensitive menu of the SQL Server Agent node (or its Jobs subfolder), resulting in the display of the New Job dialog box. On its General page, specify an arbitrary job name and, if desired, assign a job category (assuming the standard configuration, this is by default set to [Uncategorized (Local)]). Switch to the Steps page and click on the New… command button, which will present you with the New Job Step dialog box. Type an arbitrary name for the intended action in the Step name textbox and select SQL Server Integration Services Package in the Type listbox. Next, you need to choose the security context in which the individual step will execute by selecting the Run as account, which is set by default to SQL Server Agent Service Account. (We will review factors that should be taken into consideration when configuring this setting shortly.)
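The same job can also be created programmatically, which is helpful when the setup needs to be repeated across servers. The following T-SQL is a minimal sketch rather than a definitive implementation: the job name, package path, and schedule are hypothetical placeholders, and the script relies on the documented msdb procedures sp_add_job, sp_add_jobstep, sp_add_jobserver, and sp_add_jobschedule:

    USE msdb;
    GO
    -- Create the job (equivalent to the General page of the New Job dialog box)
    EXEC dbo.sp_add_job
        @job_name = N'Nightly SSIS Load',                  -- hypothetical name
        @description = N'Runs an SSIS package unattended',
        @category_name = N'[Uncategorized (Local)]';

    -- Add a step of type SQL Server Integration Services Package (SSIS subsystem);
    -- the @command value consists of dtexec-style switches, here pointing to a
    -- package stored in MSDB (adjust the path and server to your environment)
    EXEC dbo.sp_add_jobstep
        @job_name = N'Nightly SSIS Load',
        @step_name = N'Run package',
        @subsystem = N'SSIS',
        @command = N'/SQL "\MyFolder\MyPackage" /SERVER "." /REPORTING E';

    -- Target the local server and attach a daily 2:00 AM schedule
    EXEC dbo.sp_add_jobserver @job_name = N'Nightly SSIS Load', @server_name = N'(LOCAL)';
    EXEC dbo.sp_add_jobschedule
        @job_name = N'Nightly SSIS Load',
        @name = N'Daily 2 AM',
        @freq_type = 4,                                    -- daily
        @freq_interval = 1,
        @active_start_time = 020000;                       -- HHMMSS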

In the lower area of the New Job Step dialog box, on the General tab, you will find entries that allow you to designate the package to be executed. Notice that this portion of the interface depends on the choice of Package source, which can take one of the following values:

  • SSIS Catalog – applicable to the Project Deployment Model; requires that you specify the target Server where the package is hosted, the authentication method, and the name and full path (within the SSISDB folder hierarchy) of your Package. This option also exposes the Configuration tab, further subdivided into Parameters (giving you the ability to review and modify values of parameters defined within the package), Connection Managers (listing all connection managers, whose individual properties can be modified at this point), and Advanced (facilitating overriding property values, assigning a desired logging level, as well as allowing you to enforce the Dump on errors and 32-bit runtime settings) tabs. You also have an option to designate one of the pre-created environments via the Environment listbox. (For more information on these features, refer to our earlier article; a T-SQL sketch of invoking a catalog-resident package in this manner appears after this list.)
  • SQL Server – applicable to the Package Deployment Model; refers to scenarios in which packages are stored in the MSDB database and accessible via the Stored Packages\MSDB folder hierarchy of the Integration Services node, within Object Explorer in the SQL Server Management Studio console. Just as with SSIS Catalog, you need to start by specifying the Server, authentication method, and the package path (within the Stored Packages\MSDB folder, including the package name). However, unlike before, where we had access to the Configuration tab only, selecting this option exposes a considerably larger number of configurable settings, which are grouped into eight tabs (roughly corresponding to the page layout of the DTExecUI utility) labeled as follows:
    • Configurations – allows you to add one or more configuration files (typically with the extension .dtsConfig). In traditional package deployment model scenarios, configurations give you the ability to assign values to package components (including their properties and variables) using external data sources during invocation (while in general it is possible to use registry entries, database tables, environment variables, parent package variables, or XML files, only the last of these options is applicable here). Note that this supplements the ability to designate configuration files during package development (but keep in mind that those defined within the package will take precedence if they conflict with the content of configuration files specified here).
    • Command files – lets you add one or more command files (standard text files) containing command line options for the dtexec utility (which, behind the scenes, is employed by the SQL Server Integration Services step of a SQL Server Agent job to carry out its actions). One of the more common examples of its usage would be a text file containing multiple /SET switches that assign arbitrary values to package variables. (For more information regarding dtexec syntax, refer to our recent article dedicated to “Using dtexec.exe to Run File System-based Packages”).
    • Data sources – contains editable entries representing Connection Managers within the package, giving you the ability to modify their connection strings. (Incidentally, these settings correspond to the Connection Managers page of the DTExecUI utility).
    • Execution options – just as in DTExecUI, it is possible to set several options here related to package validation (in particular, Fail the package on validation warnings and Validate package without executing) and execution (such as enabling package checkpoints, including specifying the location of the checkpoint file, enforcing restart options, or selecting the Use 32 bit runtime option). It is also possible to override the MaxConcurrentExecutables property, which determines the maximum number of tasks that can run in parallel within the package (the default value of -1 indicates that the limit is calculated by adding 2 to the number of logical processors).
    • Logging – intended for assigning one or more log providers (you can choose among the SSIS log provider for Text files, SSIS log provider for SQL Server, SSIS log provider for SQL Server Profiler, SSIS log provider for Windows Event Log, and SSIS log provider for XML files) and specifying connection strings associated with them (specifics depend on the provider – for more information refer to the LogProvider.ConfigString MSDN article).
    • Set values – provides the ability to directly set values of package properties (constituting an alternative to the command files we described above). Properties are identified using property paths, whose basic syntax was described in our article dedicated to the dtexec utility.
    • Verification – allows you to specify conditions that need to be satisfied in order for the package to be executed, which may include such characteristics as a digital signature, package build (a sequential number assigned during the build process), package ID, or version ID. You can set these values during the package design stage in SQL Server Data Tools via the Properties window of a package (accessible after selecting the package on the Package Explorer tab in the Designer window).
    • Command line – displays the summary of all options selected through the remaining tabs (and represents the final list of parameters that will be applied during the step execution). You can Edit the command line manually or, if you change your mind, Restore the original options.
  • File system – applicable to the Package Deployment Model; intended for scenarios in which packages are stored as .dtsx files in local or remote file system locations that are not controlled by the SSIS service. You are required to provide the full path of the package file. Selecting this option gives you access to the same selection of settings as those available when using SQL Server as the package store.
  • SSIS Package Store – applicable to the Package Deployment Model; refers to scenarios in which packages are stored as .dtsx files in local or remote file system locations that are controlled by the SSIS service, as well as in the MSDB database (effectively, it supersedes the SQL Server option). It is possible to modify the default locations by adding extra <StorePath> entries to %Program Files%\Microsoft SQL Server\110\DTS\Binn\MsDtsSrvr.ini.xml (as described in the MSDN Library). Just as with the SQL Server entry, you need to specify the Server, authentication method, and the package path (within the SSIS Packages folder hierarchy), including the package name. In this case, you also have access to the same selection of settings as those available when using SQL Server or File system as the package store.
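When the Package source is set to SSIS Catalog, the job step ultimately runs the package through the SSISDB catalog, and the same invocation can be expressed directly in T-SQL by means of the catalog's stored procedures. The following sketch uses hypothetical folder, project, package, and parameter names, and merely illustrates the counterparts of the Parameters and Advanced tabs described above:

    DECLARE @execution_id BIGINT;

    -- Create an execution for a package deployed to the SSISDB catalog
    EXEC SSISDB.catalog.create_execution
        @folder_name = N'ETL',                      -- hypothetical folder
        @project_name = N'WarehouseLoad',           -- hypothetical project
        @package_name = N'LoadFacts.dtsx',          -- hypothetical package
        @use32bitruntime = 0,
        @execution_id = @execution_id OUTPUT;

    -- Counterpart of the Advanced tab: request the Basic logging level (1)
    EXEC SSISDB.catalog.set_execution_parameter_value
        @execution_id = @execution_id,
        @object_type = 50,                          -- system parameter
        @parameter_name = N'LOGGING_LEVEL',
        @parameter_value = 1;

    -- Counterpart of the Parameters tab: override a package-scoped parameter
    EXEC SSISDB.catalog.set_execution_parameter_value
        @execution_id = @execution_id,
        @object_type = 30,                          -- package parameter
        @parameter_name = N'BatchSize',             -- hypothetical parameter
        @parameter_value = 500;

    EXEC SSISDB.catalog.start_execution @execution_id = @execution_id;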

When running SSISDB-resident packages, details of their execution can be examined by employing the centralized logging implemented as part of the Integration Services Catalogs node. More specifically, the context-sensitive menu of the package node within the SSISDB folder hierarchy includes a Reports entry, with Standard Reports and Custom Reports cascading menu options. Both of them offer a convenient way to obtain data about package invocations, including their overview, all of the auto-generated messages, as well as performance statistics.
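The same information can be retrieved with simple queries against the SSISDB catalog views, which is handy when you want to monitor executions without opening the reports. A minimal sketch (the execution_id filter is a placeholder):

    -- Most recent executions recorded in the SSISDB catalog
    -- (status: 2 = running, 4 = failed, 7 = succeeded)
    SELECT TOP (20) execution_id, folder_name, project_name, package_name,
           status, start_time, end_time
    FROM SSISDB.catalog.executions
    ORDER BY execution_id DESC;

    -- Messages generated during a specific execution
    SELECT message_time, message_type, message
    FROM SSISDB.catalog.operation_messages
    WHERE operation_id = 12345                      -- replace with an execution_id
    ORDER BY message_time;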

If you want to examine results of running packages stored in the other three location types, you can take advantage of the Log File Viewer (accessible via the View History item in the context-sensitive menu of the job containing the SQL Server Integration Services Package step). Alternatively, the most recent job execution status is available from Job Activity Monitor (under SQL Server Agent within the Object Explorer window).
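The underlying job history is stored in the msdb database, so it can also be examined with T-SQL; the following query is a small sketch, with the job name being a hypothetical placeholder:

    -- Per-step outcome of a SQL Server Agent job
    -- (run_status: 0 = failed, 1 = succeeded, 3 = canceled, 4 = in progress)
    SELECT j.name AS job_name, h.step_id, h.step_name,
           h.run_date, h.run_time, h.run_status, h.message
    FROM msdb.dbo.sysjobhistory AS h
    JOIN msdb.dbo.sysjobs AS j ON j.job_id = h.job_id
    WHERE j.name = N'Nightly SSIS Load'             -- hypothetical job name
    ORDER BY h.instance_id DESC;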

As mentioned earlier, it is important to consider the security context in which unattended jobs will be executing. The default, relying on the SQL Server Agent service account (which employs the NT Service\SQLSERVERAGENT built-in security principal), might not have appropriate security privileges (NT Service\SQLSERVERAGENT is a member of the sysadmin fixed server role, but its rights outside of the SQL Server instance are limited, as detailed in the MSDN Library). In order to overcome these limitations and to comply with the principle of least privilege, you should consider using a separate, dedicated Windows local or domain account instead.

This is accomplished by combining two entities available in SQL Server Management Studio – namely, credentials and SQL Server Agent proxy accounts. The first of them allows you to designate a Windows security principal that is available across all SQL Server 2012 components (and can be used when accessing external resources, such as local folders or remote file shares). The purpose of the second is to define the security context in which a SQL Server Agent job – or, more specifically, an individual step targeting a particular subsystem, such as, in our case, SQL Server Integration Services Package – can run.

To create a credential, navigate to the Credentials subfolder of the Security folder (in Object Explorer within SQL Server Management Studio). From its context-sensitive menu, select the New Credential… entry. In the resulting dialog box, type in its name and use the Identity textbox to specify the corresponding Windows account, along with its password in the Password and Confirm password entries. Keep in mind that, most typically, such passwords would be set not to expire. In addition, the user account should have the Log on as a batch job user right on the SQL Server computer. Once that is completed, you are ready to put the new credential to use. To do so, right-click on the Proxies folder under the SQL Server Agent node and activate the New Proxy Account dialog box using the matching menu item. Populate the Proxy name and Credential name textboxes and, in the Active to the following subsystems section, enable the checkbox next to the SQL Server Integration Services Package entry. In order to restrict access to the proxy account, switch to the Principals page, where you can choose the Principal type (SQL Login, Msdb role, or Server role) and add entries to the list of principals allowed to use the proxy. Designated principals will be able to assign the proxy account to the Run as entry when defining a SQL Server Integration Services Package step within a SQL Server Agent job.
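The credential and proxy can likewise be created in T-SQL. The sketch below assumes a hypothetical Windows account and login; it uses CREATE CREDENTIAL together with the msdb procedures sp_add_proxy, sp_grant_proxy_to_subsystem, and sp_grant_login_to_proxy:

    -- Create a credential mapped to a dedicated Windows account
    -- (account name and password are placeholders)
    USE master;
    GO
    CREATE CREDENTIAL SSISRunCredential
        WITH IDENTITY = N'CONTOSO\ssis_runner',
             SECRET = N'ReplaceWithStrongPassword';
    GO

    -- Create a proxy based on that credential and grant it access to the
    -- SQL Server Integration Services Package subsystem
    USE msdb;
    GO
    EXEC dbo.sp_add_proxy
        @proxy_name = N'SSISRunProxy',
        @credential_name = N'SSISRunCredential',
        @enabled = 1;
    EXEC dbo.sp_grant_proxy_to_subsystem
        @proxy_name = N'SSISRunProxy',
        @subsystem_name = N'SSIS';

    -- Allow a specific login to select the proxy in the Run as list of a job step
    EXEC dbo.sp_grant_login_to_proxy
        @proxy_name = N'SSISRunProxy',
        @login_name = N'CONTOSO\etl_operators';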

This completes our overview of SSIS specific aspects of configuring SQL Server Agent jobs. For more information regarding job creation and scheduling, refer to MSDN Library.


