package and run that as a child package using the Execute Package task in the main
SSIS package and hence avoid access to salaries data.
Probably the biggest benefit of the Execute Package task is that packages can communicate with one another and affect one another’s success or failure. To clarify, consider the container hierarchy, in which errors and events get flagged up the hierarchy with the package at the top. When we use a child package in a parent package, the Execute Package task actually contains the child package and becomes the parent container for it. Any event occurring in the child package gets passed on to the Execute Package task, which can then share it with other Execute Package tasks used in the parent package. Now think of
transactions that cause tasks to commit or roll back as a unit within a package. Using this
task, transactions can actually span across packages and can cause multiple packages to
commit or roll back as a unit. SSIS gives you the benefits of dealing with individual tasks and using them as independent units, yet at the same time also lets you integrate the packages to work together as a unit.
You can run a child package in its own process or in the process of the parent
package using the ExecuteOutOfProcess option. If you are following a modular design
for your package and want your parent and child packages to commit or fail together
as a single unit, you will be using the in-process method—i.e., you will run the child
package in the same process as that of the parent package. For this configuration,
you will not be spawning an additional process for the child package, so the memory required will, of course, be lower; there will be less context switching and less processing overhead. However, the downside is that if the child package crashes due to some problem, it may also kill the parent package. In addition, if your system has more than 4GB of memory, SSIS won’t be able to use it, as a single process on a 32-bit system can use a maximum of 2GB of virtual memory (or 3GB if you use the /3GB switch in the boot.ini file). Chapter 13 discusses memory utilization of SSIS packages and the pros and cons of using 32-bit versus 64-bit systems in more detail.
Alternatively, if you want to make use of the full memory resources available on the
system, or you want to protect the parent package from crashes in the child package due to bugs or other issues, or you want your parent package not to depend on the success or failure of a child package, then you may prefer the out-of-process method; that is, the parent and the child packages will run in their own processes. As multiple processes will be in use, you will see more context switching and the overhead of maintaining those processes, and memory usage will also be higher; in fact, total memory utilization across the processes can grow beyond 4GB even on a 32-bit system, if that much memory is available on the computer. The following exercise is designed to help you understand
how to use the Execute Package task and the implications of connecting various tasks
in a package.
Hands-On: Consolidating Workflow Packages
The packages you have developed so far to download, expand, archive, and import
files are independent, isolated units of work. You want to consolidate all the packages into one package with a defined sequence and configure them with different precedence constraints to achieve a more desirable workflow.
Method
In this exercise, you will use the Execute Package task to embed the given packages in the parent package and join them with Success constraints. In the second part of the exercise, you will change the constraints and see the effect on the execution results.
Exercise (Building Consolidated Package)
In this part, you will create a new parent package into which you will add the four child packages created in the earlier exercises, using Execute Package tasks.
1. Start BIDS and open the Control Flow Tasks project. Create a new package in
the Solution Explorer with the name Consolidating workflow packages.dtsx.
2. Drag and drop the Execute Package task from the Toolbox on to the designer
surface.
3. Double-click the Execute Package task to open the Execute Package Task Editor.
4. Type the following in the General page of the editor:
Name Downloading zipped files
Description This task executes the named package.

5. In the Location field on Package page, select File System from the drop-down
list. The other option, SQL Server, could be chosen if the package were stored
in the MSDB database of SQL Server. As your packages are stored in the file
system, you will not be using the SQL Server option here.
6. Click in the Connection field and select <New connection…> to open the File
Connection Manager Editor. Leave Existing File in the Usage Type field and
type C:\SSIS\Projects\Control Flow Tasks\Downloading zipped files.dtsx
in the File field to point the Execute Package task to the Downloading Zipped
Files package.
7. The PackageName field is available when you select SQL Server in the Location
field to allow you to choose the package from the list of packages stored in the
MSDB store. In your case, the field is disabled, as the package name has already
been provided in the Connection field.
8. You can specify the password in the Password field if the package has been
protected with a password. Leave it at the default setting for now. Package
protection levels have been covered in detail in Chapter 7.
9. As we are not using any transactions across the packages, change the
ExecuteOutOfProcess field value to True as shown in Figure 5-17. Click
OK to close the Execute Package Task Editor window.
Figure 5-17 Configuring the Execute Package task
10. From the Toolbox, drop another Execute Package task on the designer surface
just below the Downloading Zipped Files task. Stretch the green arrow from
the Downloading Zipped Files task and join it to the new Execute Package task.
Now, following Steps 3 to 9, configure this task with the following settings:
Name Expanding downloaded files
Description This task executes the named package
Location File system
Connection C:\SSIS\Projects\Control Flow Tasks\Expanding downloaded files.dtsx

ExecuteOutOfProcess True
11. Similarly, add the following packages using the Execute Package task with the
following details and connect them using the green arrows.
For the Archiving Downloaded Files task:
Name Archiving downloaded files
Description This task executes the named package
Location File system
Connection C:\SSIS\Projects\Control Flow Tasks\Archiving downloaded files.dtsx
ExecuteOutOfProcess True
For the Importing Expanded Files task:
Name Importing expanded files
Description This task executes the named package
Location File system
Connection C:\SSIS\Projects\Control Flow Tasks\Importing expanded files.dtsx
ExecuteOutOfProcess True
Your package should look like the one shown in Figure 5-18.
12. Before we run this package, make sure the zipped files are still available on the
FTP server in the Sales folder. After checking this, delete the DealerSales01
.txt and DealerSales02.txt files from the C:\SSIS\downloads folder and delete
DealerSales01.zip and DealerSales02.zip from the C:\SSIS\downloads\Archive
folder. Using SQL Server Management Studio, run the following commands to
delete all the rows from the DealerSales table:
TRUNCATE TABLE [Campaign].[dbo].[DealerSales]
Now that all the previous files and data have been deleted, run the package by pressing F5. You will see the defined packages open and execute in sequence, one after another. Once all four packages have been executed successfully, check the folders to see the files in the expected places and the rows loaded into the DealerSales table: 242,634 in total.
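If you would like to double-check the load from SQL Server Management Studio, a quick count against the same table is enough; this is only a sanity-check sketch, and the expected figure comes from the totals quoted above, assuming you imported the same two sample files:
SELECT COUNT(*) AS RowsLoaded
FROM [Campaign].[dbo].[DealerSales]
The query should return 242,634 once all four child packages have completed successfully.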

Exercise (Understanding Precedence Constraints)
In this part of the exercise, you will make changes in the ExtractFiles.bat file to fail the Expanding Downloaded Files task and then study the behavior of the package execution using the Success and Completion constraints.
13. You have seen that the packages execute successfully. Make the following changes
in the ExtractFiles.bat file to fail the Expanding Downloaded Files task:
REM DEL C:\SSIS\downloads\%1.txt
C:\SSIS\UNZIP %1.zip %1.txt
Figure 5-18 Consolidating the Workflow Packages package
As you can now understand, when the ExtractFiles.bat file is called, it won’t be able to find the unzip.exe file in the C:\SSIS folder and hence the task will fail.
Using Windows Explorer, delete the DealerSales01.zip and DealerSales02.zip files from the C:\SSIS\downloads\Archive folder, but do not delete the DealerSales01.txt and DealerSales02.txt files from the C:\SSIS\downloads folder.
14. Right-click the Consolidating Workflow Packages package in the Solution
Explorer window and choose the Execute Package command from the context
menu. You will see the Downloading Zipped Files package appear on the screen and execute successfully, followed by the Expanding Downloaded Files package, which executes but fails as expected. Note that after the failure of this child package, the parent package, Consolidating Workflow Packages, stops immediately and doesn’t execute the tasks further down the line. Stop debugging the package by pressing SHIFT-F5. Note that in the second-to-last line of the Output window, the package is declared finished with a failure. The exact message is “SSIS package ‘Consolidating workflow packages.dtsx’ finished: Failure.” (See Figure 5-19.) If you don’t see the Output window, you can open it by pressing CTRL-ALT-O.
Figure 5-19 Failing Consolidating Workflow Packages with the Success constraint
15. Having seen the package fail when using the Success constraint, you will now change the Success constraint to a Completion constraint for the Archiving Downloaded Files task to see how the package behaves. Changing this constraint specifies that the Archiving Downloaded Files task should run when the Expanding Downloaded Files task completes, regardless of the success or failure of the Expanding Downloaded Files package.
Right-click the green arrow from the Expanding Downloaded Files task to the
Archiving Downloaded Files task and click Completion in the context menu. The
green arrow will change to blue. This blue arrow signifies the On Completion
constraint for the Archiving Downloaded Files task.
Execute the Consolidating Workflow Packages package again. This time you
will see the first task, Downloading Zipped Files, completing successfully, and
then the Expanding Downloaded Files task failing as expected. But your parent
package doesn’t stop this time; instead, it goes on to run the remaining tasks successfully and loads records into the table, as the text files were still available (you didn’t delete them in Step 13). This demonstrates how the package behaves with a Completion constraint compared to a Success constraint. If you check the Output window for status, you will still see the same message you saw last time declaring the package finished with a failure, but you know for sure that this time the last two tasks ran successfully (Figure 5-20).
Review
In the first part of this exercise, you used the Execute Package task to include child
packages in the parent package and consolidated all the different modules into one
integrated package with all the features you built separately. In the second part you
learned how a package behaves with the Success constraint and then with the Completion constraint. If a task fails at run time for any reason, the downstream tasks connected to it with the Success constraint will not be executed and the package will fail immediately. On the other hand, if the failing task is connected to the following tasks with the Completion constraint, the downstream tasks get executed regardless of its success or failure, though any task that depends upon the output of the failing task may be affected by the unavailability of data that would otherwise have been provided by it. You also learned that the final message for a package might not tell you the true story about the execution status of the package, so you definitely need to configure logging for your packages to learn more about the execution status of the various tasks and the reasons for failure, if any. One more thing about precedence constraints: you can also evaluate expressions along with the constraints to determine whether the subsequent tasks in a package execute.
Send Mail Task
Using the Send Mail task, you can send messages from your packages, for example, on the basis of the success or failure of the package or of an event raised during its execution. This task uses the SMTP Connection Manager to send mail through an SMTP server. You can specify the message directly in the task, let the task read it from a file, or choose a variable to be sent as the message. You can use
this feature to pass messages or variables between SSIS packages running on different
servers. You can also use the Send Mail task to send notification messages about
success or failure of other tasks. You have already used this task in the “Contacting
Opportunities” Hands-On exercise in Chapter 4.
Figure 5-20 Failing Consolidating Workflow Packages with the Completion constraint
WMI Data Reader Task
For the benefit of those who haven’t used Windows Management Instrumentation
(WMI), it is explained here along with a brief background to give you a head start.
The Distributed Management Task Force (DMTF) is an industry consortium that develops, supports, and maintains management standards for computer systems and is behind management technologies such as the Common Information Model (CIM) and Web-Based Enterprise Management (WBEM). CIM, a standard for describing management information, allows different management applications to collect the required data from a variety of sources and is platform independent. WBEM uses browsers and applications to manage systems and networks throughout the enterprise. WBEM uses CIM as the database for information about
computer systems and network devices. Microsoft has implemented the DMTF’s
CIMV2 and WBEM standards in WMI.
The WMI schema is logically partitioned into namespaces for organizational and
security purposes. You should use the WMI Control (Server Manager | Configuration |
WMI Control | Properties) or the Wmimgmt.msc Microsoft Management Console (MMC) snap-in to view and modify the security on WMI namespaces. A namespace
actually groups a set of classes and instances that are logically related to a particular
managed environment. For example, CIMV2 groups a set of classes and instances that
relate to aspects of the local Windows environment. Though DMTF has defined a lot
of namespaces within WBEM, Microsoft has chosen to instrument the various classes
and properties that fall within the CIMV2 namespace.
The Windows operating system provides management information through the
WMI component. WMI can be used to query computer systems, networks, and applications, and can be further extended to create event-monitoring applications. Using the WMI Data Reader task, you can run WQL queries to get information from WMI such as the presence, state, or properties of hardware components, Windows event logs, and installed applications; using this, you can build some intelligence into your packages to decide, based on the results of a WQL query, whether the other tasks in the package should run.
The WMI Query Language (WQL) is a subset of ANSI SQL with minor semantic
changes to support WMI. You can write data, event, and schema queries using WQL.
Data queries are most commonly used in WMI scripts and applications to retrieve
class instances and data associations, whereas schema queries are used to retrieve class
definitions and schema associations and event queries are used to raise event notifications.
The good news is that writing a WQL query is similar to writing an SQL query because
they use a common dialect. WQL is modified to support WMI event notification and
other WMI-specific features. However, the tough bit is that WMI classes vary between
versions of Windows and the WQL queries used here may not work on your system, as
they have been tested only on a Windows Server 2008 machine.
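To make this concrete, here is a sketch of the kind of data query the upcoming Hands-On exercise calls for: it retrieves error events that Integration Services has written to the Application log since January 1, 2010. Treat it as illustrative only; the event source name SQLISPackage100 is an assumption that applies to SQL Server 2008 installations, and class properties and source names can differ between Windows and SQL Server versions.
SELECT SourceName, TimeGenerated, Message
FROM Win32_NTLogEvent
WHERE LogFile = 'Application'
AND SourceName = 'SQLISPackage100'
AND EventType = 1
AND TimeGenerated > '20100101000000.000000-000'
Here EventType = 1 restricts the results to error events, and the TimeGenerated comparison uses the DMTF datetime format that WMI expects. You would paste a query like this into the WQL query field of the WMI Data Reader task, as the exercise below shows.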

Let’s do a quick and short Hands-On exercise to demonstrate how to configure the
WMI Data Reader task and write WQL queries.
Hands-On: Reading the Application Log
You are required to copy the application log error messages for all the failing SSIS
packages to a text file.
Method
In this exercise, you will use the WMI Data Reader task to read the Application log for error messages generated by Integration Services after January 1, 2010.
Here’s the step-by-step procedure:
1. Create a WMI Connection Manager to connect to the server.
2. Write a WQL query to read the application log.
3. Complete the configuration of the WMI Data Reader task.
Exercise (Create WMI Connection Manager)
Now that you know the steps to use, let’s get going.
1. Open the Control Flow Tasks project in BIDS. Add a new package to the SSIS
Packages folder in the Solution Explorer window. Rename the package Reading
Application Log.dtsx.
2. Drag the WMI Data Reader task from the Toolbox onto the SSIS Designer.
Double-click the task icon to open the WMI Data Reader Task Editor. On the
WMI Options page, click in the WmiConnection field and choose <New WMI
Connection…> from the drop-down list box. This will open the WMI Connection
Manager Editor. Type the following in the Connection Manager Information area:
Name Localhost
Description WMI connection to localhost
Leave the following default settings in the Server and namespace area (refer to
Figure 5-21):
Server name \\localhost
Namespace \root\cimv2
Select the Use Windows Authentication check box and click Test to test the connection. When you receive the success message, click OK to close the message
and click OK again to close the WMI Connection Manager Editor.
