SQL Server 2005 – SQL Server Integration Services – Debugging Script Component – Part 15

Continuing the coverage of debugging features in SQL Server 2005 Integration
Services Data Flow, which we started in the previous
installment
of this series, we will now focus on the Script Component. As
we have already pointed out, some of the methods presented when discussing this
subject in the context of Control Flow (in particular, the ones relying on
event-generated breakpoints) are not applicable to source, destination, and
transformation Data Flow components. This disparity extends even further when
comparing Script Task with its Data Flow counterpart – Script Component – and
takes the form of syntactical and functional differences between them.

As you might recall, the main purpose of the Script Task is to execute
custom Visual Basic .NET code, which performs actions not available via the
built-in tasks and containers. This activity-oriented focus reflects its
Control Flow characteristics. Script Component, on the other hand, while using a
similar implementation approach (based on the same programming language), was
designed to satisfy the need for extending the range of Data Flow features
beyond an out-of-the-box selection of data sources, transformations, and
destinations. While SQL Server 2005 Integration Services already includes a variety
of predefined, ready-to-use components, they are simply not suitable for every
type of data or every required modification. The flexibility of the Script Component
allows you to address the challenges of processing unusually formatted data or
applying complex business rules.

You need to decide which Script Component type to create as soon as you add
it to the Data Flow area of the SSIS designer – by selecting one of three option
buttons in the Select Script Component Type dialog box:

  • Source Script Component – intended for access to non-standard
    data sources, it includes Connection Manager and Script sections, allowing you to
    specify a data store (in the form of a connection manager) from which the
    component extracts the information and to write code responsible for performing
    this activity, respectively. Its distinguishing feature is the content of the Inputs
    and Outputs section, where you can define multiple outputs, but no inputs. We
    will discuss the usage of these settings in more detail in the next article of
    our series.

  • Destination Script Component – like the Source Script Component,
    it contains identically structured Connection Manager (for designating target
    data stores) and Script sections. What makes it different is the content of the
    Inputs and Outputs section (with a single input entry already present by default
    and the possibility of creating multiple outputs), as well as the Input
    Columns section, where you specify the data columns that you want to be accessible
    from the scripting environment. Our next article will also include more
    information on this subject.

  • Transformation Script Component – obtains its data via one input
    and, after its modifications are applied, feeds it to one or more outputs. Similar
    to Destination Script Component, it contains Connection Manager, Script, Input
    Columns, as well as Inputs and Outputs sections. The last one contains a single
    Input and a single Output entry (although you have an option of adding extra
    outputs, if necessary).

When working with Input Columns (in the case of Destination and
Transformation Script Components), for each column you select, you need to assign a
Usage Type parameter (which takes on one of two possible values – ReadOnly or ReadWrite –
depending on whether you intend to alter its content). To configure output
columns, switch to the Inputs and Outputs section of the Script Transformation
Editor dialog box. Depending on your initial choice of the script component
type, you might see one (Source and Transformation Script Components) or no
(Destination Script Component) output nodes there. To add new ones, simply click on the
Add Output command button and assign a descriptive name to each. Once the desired
output node is present in the Inputs and Outputs area, expand it to display its
Output Columns subnode. At this point, you will be able to add output columns
using the Add Column command button located underneath, and modify their
properties using the Properties area to the right.

The Script section of the Script Component Editor (for source,
transformation, and destination types) resembles the one appearing in the
Control Flow Script Task. The two also share similar sets of properties (such
as ReadOnlyVariables and ReadWriteVariables, which make it possible to reference
SSIS variables from the script). As with the Script Task, clicking on the Design
Script button brings up the Microsoft Visual Studio for Applications interface
with the auto-generated code of the ScriptMain class template in it (which, in
turn, contains a sub responsible for processing individual input or output
rows, depending on the component type).
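
For reference, in the case of a transformation-type component with a single
input named Input 0, the generated template looks roughly like the sketch
below (the exact comments, and the Input0_ProcessInputRow name, which is
derived from the input name, may differ slightly in your environment):

    Imports System
    Imports System.Data
    Imports System.Math
    Imports Microsoft.SqlServer.Dts.Pipeline.Wrapper
    Imports Microsoft.SqlServer.Dts.Runtime.Wrapper

    Public Class ScriptMain
        Inherits UserComponent

        ' Called once for each row arriving at the Input 0 input
        Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
            '
            ' Add your code here
            '
        End Sub

    End Class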

As we mentioned before, the standard breakpoint functionality implemented in
Control Flow tasks and containers is not present in Data Flow components (as
you can easily verify, the Edit Breakpoints item does not appear in their
context-sensitive menu), which makes it more difficult to follow their
execution in real time. There are, however, a few workarounds that can be
applied when working with the Script Component:

  • It is possible to invoke and log individual events at arbitrarily
    chosen points of Script Component execution. This is accomplished via a two-step
    process, consisting of enabling SSIS logging and modifying the component
    code by adding calls to those methods of the ComponentMetaData object that
    generate (or "fire", in programming nomenclature) the desired event
    type (such as FireCustomEvent, FireError, FireInformation, FireProgress, or FireWarning).
    For more information on SSIS logging, refer to one of our earlier articles in
    this series. In this case, you would need to
    configure it on the Data Flow task level (from the Configure SSIS Logs:Package
    dialog box in the SSIS designer) and ensure that the type of event you selected
    to be logged matches the event fired by the method you applied. (Note also that
    there are several additional Data Flow task-specific events for which you can
    enable logging, such as BufferSizeTuning, PipelineBufferLeak, PipelineExecutionPlan,
    PipelineExecutionTrees, PipelineInitialization, or ScriptComponentLogEntry.)
    Events can be invoked via code in both the Script Task and the Script Component;
    however, the syntax varies between them. In the case of the Script Task, this is
    done using:

    Dts.Events.EventMethod(informationCode, subComponent, description, helpFile, helpContext, fireAgain)
     

    where EventMethod is the
    name of the event method (from the list provided above) followed by its
    parameters (within parentheses) describing the event in more detail, such as informationCode
    (an integer identifier of the event), subComponent (designating the
    object which raised the event), description (providing more information
    about the nature of the event), helpFile (referencing the name of a Help
    file where documentation relating to the event can be found), helpContext
    (pointing to the location within the Help file), and fireAgain
    (determining whether the event should be fired on subsequent invocations of
    this method). When dealing with the Script Component, the same can be
    accomplished by calling the above-listed event methods via the ComponentMetaData
    property of the ScriptMain class (this is done by appending the method's name
    to the "Me.ComponentMetaData." prefix and specifying the same set of
    parameters following the method name, enclosed in parentheses):

    Me.ComponentMetaData.EventMethod(informationCode, subComponent, description, helpFile, helpContext, fireAgain)

    Following the package execution, you should be able to find
    information about events invoked by the Script Component in a
    log, whose location and format depend on the configuration parameters
    specified in the Configure SSIS Logs:Package dialog box. The event
    notifications (such as FireInformation, FireWarning, FireError, or FireProgress)
    will also be displayed in real time in the Progress tab (or Execution Results
    tab, once the package run completes) of the SSIS Designer interface, as well as
    in the Output window (with the exception of the FireProgress method).

  • If logging is enabled on the Data Flow task level for the ScriptComponentLogEntry
    event on the Details tab of the Configure SSIS Logs:Package dialog box in the
    SSIS designer, you can also record custom messages by applying the Log method
    of the ScriptMain class (without the need for raising events). This will create
    entries in all destinations defined via log providers, which can then be
    reviewed after the execution completes to collect troubleshooting information.
    The syntax of the statement that implements this approach is as follows:

    Me.Log(messageText, dataCode, dataBytes)

    where messageText, dataCode, and dataBytes are
    intended for the text, numeric, and binary (a one-dimensional array of Byte
    values) data to be logged, respectively.

  • You can apply the traditional method of displaying desired information
    via the MessageBox.Show method of the System.Windows.Forms namespace. If you
    include the corresponding Imports directive, you can call the method using class.method
    notation; otherwise, you will also need to type in the name of the relevant
    namespace, as in the following example (all three techniques described in this
    list are combined in the sketch that follows it):

    System.Windows.Forms.MessageBox.Show("Kilroy was here")
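
To illustrate how these three techniques come together inside the Script
Component, below is a minimal sketch of the row-processing sub for a
transformation-type component with an input named Input 0; the message text,
the zero informationCode and dataCode values, and the empty byte array are
arbitrary placeholders rather than required values:

    Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)

        ' FireInformation takes a ByRef Boolean parameter, hence the local variable
        Dim fireAgain As Boolean = True

        ' Raise an Information event (captured if the corresponding event type
        ' is selected in the Configure SSIS Logs:Package dialog box)
        Me.ComponentMetaData.FireInformation(0, "Script Component", _
            "Processing a row", String.Empty, 0, fireAgain)

        ' Write a custom entry to all enabled log providers (requires the
        ' ScriptComponentLogEntry event to be selected for the Data Flow task)
        Me.Log("Processing a row", 0, New Byte() {})

        ' Display a message box (practical only when debugging interactively)
        System.Windows.Forms.MessageBox.Show("Kilroy was here")

    End Sub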

Most of the debugging methods that we described in the previous article of
this series, when discussing general ways of debugging
Data Flow components, also apply to the Script Component. In particular, you can
take advantage of the color-coded status indicators, the data record counters
displayed on pipeline paths, the Output window, or the Progress (as well as
Execution Results) pane of the
SSIS Designer interface. While the Script Component does not contain a
designated default Error Output, it is straightforward to create one (using the
Inputs and Outputs section of the Script Transformation Editor) and redirect
faulty rows to it (with custom code which implements the logic distinguishing
them from valid ones). By connecting this output to a Data Flow destination
component, errors can be separately stored and analyzed. Furthermore, it is
also possible to use this ability to provide additional debugging information
about errors generated by other Data Flow components (by including, within the
Script Component, code that translates an errorCode into its equivalent,
but easier to interpret, errorDescription). We will demonstrate this
functionality in our next article.
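
As a preview of the row-redirection part, here is a minimal sketch, assuming a
transformation-type component whose Inputs and Outputs section defines two
synchronous outputs named ValidOutput and ErrorOutput (both tied to the input
and assigned to the same ExclusionGroup, so that each row can be directed to
exactly one of them); the CustomerID column and the validation rule are purely
illustrative:

    Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
        ' Treat rows with a missing CustomerID value as faulty (illustrative rule only)
        If Row.CustomerID_IsNull Then
            Row.DirectRowToErrorOutput()
        Else
            Row.DirectRowToValidOutput()
        End If
    End Sub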
